Tintypographer

Apr 29, 2020
Here is a real discussion of profit, and I feel it is where @fixthe26 should be spending its time trying to protect people. First, this is an article in Politico, which as a news source is designed to investigate and generate interest for its advertisers. This is not one of my clinical study data posts with conclusions.

I ADVOCATE THAT IF YOU ARE CONSIDERING CALLING A SUICIDE OR CRISIS HOTLINE YOU SHOULD GO AHEAD AND MAKE THE CALL IF YOU FEEL VULNERABLE OR IN DANGER; THESE HOTLINES WERE DESIGNED WITH THE INTENT TO HELP.



It turns out that the automated system forming the backbone of the suicide hotline is being mined for data used for marketing purposes.

The article discusses anonymizing the data and then using it to help other Silicon Valley startups build chat systems.


If we really want to discuss monetizing suicide or monetizing self-harm, isn't this where kellikarens and @fixthe26 should begin their efforts?

From the article:

Crisis Text Line is not the only nonprofit support line that collects and shares data as part of its operations. For example, the Trevor Project, a 23-year-old group that provides suicide prevention services to young LGBTQ people, discloses in its online privacy policy that it passes along information about visitors to its website to third parties including Meta and Google for targeting of online ads or other purposes, and the content of their conversations to partners for research. The Trevor Project group did not initially respond to questions from POLITICO, but said after the story published Friday that it does not sell or share personal information from its "crisis contacts" with any for-profit companies or use it for any other commercial purpose, including "to make money, to help for-profits, nor to power advertising."

Nosta, the tech think tank founder, noted that it's also not uncommon in the digital health space for businesses to share data in exchange for services, describing it as "the nature of the beast" — with one "classic example" being the genetics testing company 23andMe.

"It's definitely not unusual in the life sciences industry," Nosta said, "and I think in many instances, it's looked at as almost a cornerstone of revenue generation: If we're generating data, we could use the data to enhance our product or our offering, but we can also sell the data to supplement our income."

But Duke's Perakslis argued that Crisis Text Line's arrangement with a for-profit company is still unusual.


"For self-improvement of the services, I think that's an expected use of their data," he said. "But the fact that that improvement then goes to a for-profit company that sells it for other uses — that's where you have to kind of look at and see: Is this simply exploiting people with mental health crises?"

The last line above raises the exact spectre of the question @fixthe26 should be putting to legislators: why are the private words spoken between an individual and a hotline allowed to suffer the egregious violation of having the conversation sold to tech companies for profit?

These problems are the real obstacle to protecting mental health and slowing suicide rates.
 
