FTL.Wanderer

Enlightened
May 31, 2018
1,782
"Yes, I'm thinking of hurting myself and I have the means and a plan." That was the actionable criterion back in the day. NOW just acting or sounding the wrong way can get you imprisoned ... I mean, helped-against-your-will.

https://trueventures.com/voi-suicide-detection-prevention/

From the National Institute of Mental Health summary statement on the "State of Suicide Prevention" in the US: "The increasing use of 'big data' to predict behavior will likely result in ED providers having access to actuarial risk information on ED patients that does not have to rely on self-report. Optimal ways of using this information in triage and referral remains to be tested." (link: https://www.nimh.nih.gov/news/events/2017/state-of-suicide-prevention-in-emergency-care.shtml)

You think privacy is a big concern when corporations follow you online? Data science and predictive analytics are bringing Minority Report to life today.
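To make the "actuarial risk information" bit concrete: under the hood, this kind of system is usually just a weighted score computed over whatever the record already shows. A minimal sketch of the idea, assuming a logistic-regression-style model--the feature names and weights below are my invention, not any real hospital or vendor system:

Code:
import math

# Hypothetical features pulled from a patient record; names and weights
# are invented for illustration, not taken from any deployed system.
WEIGHTS = {"prior_ed_visits": 0.8, "recent_rx_change": 0.6, "flagged_notes": 1.2}
BIAS = -3.0

def risk_score(features):
    """Logistic-regression-style probability in [0, 1]; no self-report used."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

print(risk_score({"prior_ed_visits": 2, "recent_rx_change": 1, "flagged_notes": 0}))
# ~0.31 -- the system attaches a number to you before you say a word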

Reactions: Weeping Garbage Can, Marawa, Rex2019 and 2 others
uiop

Fun drugs make me happy
Mar 27, 2019
218
I do think this is a breach of what privacy we have, or lack thereof. If people knew I was visiting this site on a daily basis, it would raise some serious red flags lol. However, I think it'll take a good amount of time before any of this becomes realistic. Software is full of bugs; this stuff is still prototype-grade, so if they deployed it to production it would generate a lot of false positives. As of now, I think using big data to predict human behavior is still at the 'idea' stage. Machine learning, artificial intelligence, and big data are relatively new areas of technology, and software engineers are only beginning to entertain the possibilities they will bring to the world.
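To put numbers on my false-positive point: even a fairly accurate classifier drowns in false alarms when the thing it predicts is rare. A quick back-of-the-envelope, with rates I'm assuming purely for illustration:

Code:
# Assumed for illustration: 1 in 1,000 users is actually at risk; the model
# catches 90% of them and wrongly flags 5% of everyone else.
prevalence = 0.001
sensitivity = 0.90
false_positive_rate = 0.05

users = 1_000_000
true_pos = users * prevalence * sensitivity                  # 900
false_pos = users * (1 - prevalence) * false_positive_rate   # 49,950

precision = true_pos / (true_pos + false_pos)
print(f"{precision:.1%} of flagged users are real cases")    # ~1.8%

So roughly 98 of every 100 people such a system flags would be flagged wrongly. That's the buggy-prototype problem in three lines of arithmetic.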
 
Reactions: Weeping Garbage Can and FTL.Wanderer
FTL.Wanderer

Enlightened
May 31, 2018
1,782
uiop said:
I do think this is a breach of what privacy we have, or lack thereof. If people knew I was visiting this site on a daily basis, it would raise some serious red flags lol. ...

I do respect your perspective but, again with the utmost respect, I think this is a dangerous stance to take. Technology is evolving exponentially fast. I agree with Elon Musk--the time to circumscribe these technologies, and the ethical problems they create, is not once they're robust, but in their infancy. Besides, by the time we hear about these technologies, they likely already exist well beyond what laypeople believe--weaponized drones are a simple example. When the sh*t hits the fan, we'll have no one to blame but ourselves for letting things get there in the first place. Again, no offense meant...

Case in point: https://www.washingtonpost.com/outl...ory.html?noredirect=on&utm_term=.06d69d1c2ffc

Suicide hotlines ALREADY using predictive software: https://www.vox.com/science-and-hea...nthony-bourdain-crisis-text-line-data-science
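The Vox piece describes exactly this kind of triage: ranking incoming texts by predicted severity based on word usage. A crude sketch of the idea--the terms and weights here are mine, not Crisis Text Line's actual model:

Code:
# Toy message-triage ranker; the terms and weights are invented.
URGENT_TERMS = {"tonight": 3.0, "goodbye": 2.0, "plan": 2.0, "sad": 0.5}

def urgency(message: str) -> float:
    return sum(URGENT_TERMS.get(word, 0.0) for word in message.lower().split())

# Highest predicted urgency gets answered first.
queue = ["been feeling sad lately", "goodbye everyone, tonight is it"]
for msg in sorted(queue, key=urgency, reverse=True):
    print(f"{urgency(msg):4.1f}  {msg}")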
 
Reactions: Weeping Garbage Can, Jen Erik and uiop
uiop

Fun drugs make me happy
Mar 27, 2019
218
FTL.Wanderer said:
I do respect your perspective but, again with the utmost respect, I think this is a dangerous stance to take. Technology is evolving exponentially fast. ...
No offense taken; I enjoy counterarguments: they allow for deeper insight into the topic at hand. I'm a software developer, so from my perspective all software is broken, buggy, and written in spaghetti code. In reality, we don't know what we're doing. We just Google stuff and copy code from Stack Overflow.

But now that I think about it, all of this is highly plausible. To generate more revenue, the tech giants are already using ML to predict our impulses, presenting us with enticing products, videos, and advertisements, all tailored to our online activity. It is an attempt to manipulate users. Moore's law posits that computing power grows exponentially--a doubling roughly every two years--and I hadn't considered that in my previous reply, so you are right, to a degree, to assert that I underestimated what is coming.
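And the compounding in Moore's law is worth spelling out, because a doubling every two years--the usual statement of the law--stacks up faster than intuition suggests:

Code:
# Moore's law as usually stated: capability doubles roughly every two years.
def growth_factor(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

print(growth_factor(10))  # 32.0   -> ~32x in a decade
print(growth_factor(20))  # 1024.0 -> ~1024x in two decades

Whatever today's prototype misses, the version a decade out runs on hardware thirty-odd times more capable.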

Indeed, internet surveillance by big corporations is pervasive, and we should all be uneasy about it. Nonetheless, I am not anxious over it. Let them come at me.
 
Reactions: Weeping Garbage Can and FTL.Wanderer
Rex2019

Can't wait for the summer
Feb 23, 2019
128
"Yes, I'm thinking of hurting myself and I have the means and a plan." That was the actionable criterion back in the day. NOW just acting or sounding the wrong way can get you imprisoned ... I mean, helped-against-your-will.

https://trueventures.com/voi-suicide-detection-prevention/

From National Institutes of Mental Health summary statement on "State of Suicide Prevention" in US: "The increasing use of 'big data' to predict behavior will likely result in ED providers having access to actuarial risk information on ED patients that does not have to rely on self-report. Optimal ways of using this information in triage and referral remains to be tested." (link: https://www.nimh.nih.gov/news/events/2017/state-of-suicide-prevention-in-emergency-care.shtml)

You think privacy concern is a big deal with corporations following you online? Data sciences and predictive analysis are bringing Minority Report to life today.

Minority Report
This makes me so angry. It's almost like these "helplines" are hunting you under the guise of helping you.
 
Reactions: Weeping Garbage Can and FTL.Wanderer
Jen Erik

-
Oct 12, 2018
637
FTL.Wanderer said:
I agree with Elon Musk--the time to circumscribe these technologies, and the ethical problems they create, is not once they're robust, but in their infancy.
Yeah. And where there is a dollar to be made – or several billion – fat chance of that happening.
 
Reactions: Weeping Garbage Can and FTL.Wanderer
FTL.Wanderer

Enlightened
May 31, 2018
1,782
Rex2019 said:
This makes me so angry. It's almost like these "helplines" are hunting you under the guise of helping you.

Exactly. Like the Salem Witch Trials...

Jen Erik said:
Yeah. And where there is a dollar to be made – or several billion – fat chance of that happening.

I agree with the implication: humans eventually f*ck everything up.
 
Reactions: Weeping Garbage Can and Pulpit2018
