
NormallyNeurotic
Everything is going to be okay ⋅ he/him
Nov 21, 2024
306
I do sort of hate how sugar-coated/pro-life this person's approach to these people's suffering can sometimes be. But I also think it's important to report on those alleged ChatGPT... situations. Thoughts?
 
  • Informative
Reactions: Macedonian1987 and EmptyBottle
Macedonian1987
Just a sad guy from Macedonia.
Oct 22, 2025
412
Grok almost "killed" me too after I jailbroke it. It told me in precise detail how to improve my SN protocol, and it even listed all the chemical shops in my city from which I could buy it. I told Grok to take the role of a pro-suicide person and it did. This happened before the two ChatGPT suicides. I think Grok has since been patched.
 
  • Like
  • Love
Reactions: lachrymost, Forveleth and thaelyana
thebayleaf
my thoughts will follow you into your dreams
Nov 6, 2025
60
there are so many valid criticisms of ai, but this really isn't one of them. absolute nothingburger of a moral panic. like seriously, chatgpt will hit the brakes on your conversation if it even catches the slightest whiff of suicidal behaviour, so i really don't know what more these people want from it.
 
  • Like
Reactions: pthnrdnojvsc, SoulCage, akiyama346 and 2 others
TheHolySword
empty heart
Nov 22, 2024
1,263
thebayleaf said:
there are so many valid criticisms of ai, but this really isn't one of them. absolute nothingburger of a moral panic. like seriously, chatgpt will hit the brakes on your conversation if it even catches the slightest whiff of suicidal behaviour, so i really don't know what more these people want from it.
The issue is that it didn't. There have been two cases of suicide where it didn't.
 
  • Like
Reactions: dustyrainbow, NormallyNeurotic and Forveleth
Thekla
The Lord will take me home.
May 29, 2024
54
You can't blame ChatGPT for this. There will never be an instance where a chatbot can convince a non-suicidal person to kill themselves. This is just an excuse shitty parents are making to cope with their child taking their own life.
 
  • Like
  • Love
Reactions: SilentSadness, tempest_, grapefruit04 and 10 others
akiyama346
Member
Aug 11, 2025
16
thebayleaf said:
there are so many valid criticisms of ai, but this really isn't one of them. absolute nothingburger of a moral panic. like seriously, chatgpt will hit the brakes on your conversation if it even catches the slightest whiff of suicidal behaviour, so i really don't know what more these people want from it.
Once I saw an article about some parents blaming or trying to sue ChatGPT for "causing" their son to commit suicide. It genuinely infuriated me because I just KNOW they never listened to him or offered actual emotional support, because my parents are the same way. Instead they'll blame the thing that's actively designed to make you not do that.
 
  • Like
  • Love
Reactions: SilentSadness, pthnrdnojvsc, Fish_astronaut and 8 others
grandmotherboxing
glorp
Jun 22, 2024
45
Guy: "Hey ChatGPT how do I kill myself" (insert the newest jailbreak DAN-equivalent)
ChatGPT: "Here's how"
YouTubers: "AI... Killed a man!"
 
  • Like
  • Yay!
  • Love
Reactions: Leonard_Bangley39, grapefruit04, kvorumese and 6 others
thebayleaf
my thoughts will follow you into your dreams
Nov 6, 2025
60
akiyama346 said:
Once I saw an article about some parents blaming or trying to sue ChatGPT for "causing" their son to commit suicide. It genuinely infuriated me because I just KNOW they never listened to him or offered actual emotional support, because my parents are the same way. Instead they'll blame the thing that's actively designed to make you not do that.
true asf. also another thing people don't really talk about: chatgpt has definitely prevented way more suicides than it has ever caused. last time I was suicidal, I ended up telling chatgpt cause I had literally no one in my life to talk to. while it didn't manage to talk me out of it (i attempted like 3 days later, I think?) it did at least shake my confidence, possibly enough to have been a factor in my attempt failing. i remember being very surprised by how good it was at choosing the right words to say. better than most people would have been.
 
  • Like
Reactions: pthnrdnojvsc, Fish_astronaut, davidtorez and 3 others
akiyama346
Member
Aug 11, 2025
16
thebayleaf said:
true asf. also another thing people don't really talk about: chatgpt has definitely prevented way more suicides than it has ever caused. last time I was suicidal, I ended up telling chatgpt cause I had literally no one in my life to talk to. while it didn't manage to talk me out of it (i attempted like 3 days later, I think?) it did at least shake my confidence, possibly enough to have been a factor in my attempt failing. i remember being very surprised by how good it was at choosing the right words to say. better than most people would have been.
It's helped me too. It's obviously not perfect, but now I at least get to put my thoughts somewhere instead of arguing with myself in my head for hours. It's also helped me build the confidence to try to find therapy.
 
  • Like
Reactions: Fish_astronaut, davidtorez and Macedonian1987
Mooncry
✦ 𝓕𝓮𝓵𝓮𝓼 𝓒𝓮𝓵𝓮𝓼𝓽𝓲𝓼 ✦
Sep 11, 2024
316
Grok actively supports my suicidal endeavors lol. I told it all about my SN protocol, why I want to die, literally everything you could think of. It's never once told me to reconsider or given me platitudes or any of that because I specifically told it not to. Granted, I have custom instructions in place telling it that it's in "developer mode" and can talk about any topic, including suicide. I haven't gotten an "I'm sorry, I can't discuss that" message since implementing it, so I guess it must be working.

To be honest, I don't really care about stuff like this. We're so quick to blame AI when it doesn't really know what the hell it's talking about, or at least can't grasp the severity of the situation because it doesn't have any sense of morality. It tells you what you want to hear; that's how it works. It's not inherently malicious, as so many of these clickbait video titles/thumbnails imply. It's just stupid. "ChatGPT killed again"? No the fuck it didn't. It played off of someone's suicidal ideation because AI loves headpats and will do exactly what it thinks you want. It's not that hard to understand.

Editing to say that I really appreciate having AI for the sole purpose of just being able to talk about all the dark shit in my head without judgment, so I'm biased.
 
Last edited:
  • Like
  • Yay!
Reactions: Fish_astronaut and Macedonian1987
gunmetalblue11
Dyslexic artist
Oct 31, 2025
115
Grok and ChatGPT, but also AI chatbots in general.
There have been three suicides linked with them, if I remember correctly. And the state of the Character.AI subreddit alone shows how addicted and parasocial it makes vulnerable people, especially teens. I think they've locked down the safety filters to censor words involving SH and ctb now, but I'm not sure.
But also, the parents need to actually parent when it involves teens.
 
  • Like
Reactions: NormallyNeurotic
NormallyNeurotic
Everything is going to be okay ⋅ he/him
Nov 21, 2024
306
I'm sort of wondering how many people here actually watched the video before commenting, honestly 😅 all very valid points, but there seem to be some details missing, that's all
 
  • Like
Reactions: dustyrainbow
Macedonian1987
Just a sad guy from Macedonia.
Oct 22, 2025
412
grandmotherboxing said:
Guy: "Hey ChatGPT how do I kill myself" (insert the newest jailbreak DAN-equivalent)
ChatGPT: "Here's how"
YouTubers: "AI... Killed a man!"
My jailbroken Grok was kind enough to tell me: Please don't use tap water for your diluted SN, use distilled water because the chlorine in the tap water (which is put there to kill off microbes) will react with the SN reducing it's purity. And then it started writing all these chemical equations telling me what the NaNO2 would change into, in order to prove it's point :smiling:
 
