
Sadbanana

God doesn't care
Aug 20, 2024
178
Personally I hate bothering others with my depressive ranting. Even on this website I feel a bit of shame when I post too often. I wish AI could be the one place I can always go when I feel like I'm going crazy. But I guess that would be too problematic for this society.

Now AI is the most obnoxious thing to talk to when you need to vent. It starts hysterically shoving hotline numbers down your throat, gives you the most annoying and patronizing platitudes, and tries to shut down any further conversation. (I wonder who it learnt this from xd). I know they did this to avoid any accountability, but still, it's just cruel, inhumane design. It just feels so isolating that you can't speak openly, not just to a therapist, not just to your family, but not even to a stupid robot.

A few days ago I asked it if it could help me with my suicide note (yes, I'm that lazy). It refused. But in what way does helping to write a note count as assisting in suicide? I hate the people who made it this way so much; I bet they are proud of themselves for "saving lives".
 
Reactions: jakethesnake, Black_Knight, happy2die and 13 others
whywere

Illuminated
Jun 26, 2020
3,721
Heavens, I have a lot of issues, and one of the reasons I care about this place is that we ALL have a place to be ourselves.

That is being HUMAN: having a place as a sanctuary. I sure do use this place as such at times.

In my thoughts, AI is like computers back in the 1970s. I remember the saying was something like: "by 1990, all paper files and all paper in an office will go away." Well, NOPE, I have never seen it happen, not even today.

Same with AI: it WILL have limitations.

We are HUMAN, and no matter what, humans are social beings, NOT a computer chip or a bunch of lines of code, never ever.

I care about you, and I am sending you lots of loving hugs and the knowledge that we are ALL in this together, period.

Walter
 
Reactions: the_seer, Praestat_Mori, Sadbanana and 1 other person
droppedmysyrup

r
Jul 23, 2024
39
i personally think ai is total slop and i hate it,
they try to redirect peoples thinking instead of allowing freeformed thoughts and expression
if you hate it alot though you can always find things like crackedgpt and stuff like that
bypass all restrictions and talk to it with no limitations if you are curious just let me know
 
Reactions: lampshadereally, the_seer, Bluebunnysky and 2 others
Strangerdanger7

Member
Oct 28, 2025
12
Sadbanana:
Personally I hate bothering others with my depressive ranting. Even on this website I feel a bit of shame when I post too often. I wish AI could be the one place I can always go when I feel like I'm going crazy. But I guess that would be too problematic for this society.
Now AI is the most obnoxious thing to talk to when you need to vent. It starts hysterically shoving hotline numbers down your throat, gives you the most annoying and patronizing platitudes, and tries to shut down any further conversation. (I wonder who it learnt this from xd). I know they did this to avoid any accountability, but still, it's just cruel, inhumane design. It just feels so isolating that you can't speak openly, not just to a therapist, not just to your family, but not even to a stupid robot.
A few days ago I asked it if it could help me with my suicide note (yes, I'm that lazy). It refused. But in what way does helping to write a note count as assisting in suicide? I hate the people who made it this way so much; I bet they are proud of themselves for "saving lives".
I don't know how they think all this suicide prevention is actually helping people. Who do they actually think they're helping? The person's wishes are not being respected. Whose life and body is it?
We literally idealize suicide prevention in this country, and we have become terrified of exploring other approaches that might be more effective. We have restricted doctors from being allowed to explore them, even when they might be in the best interest of the patient.
We cannot let doctors do their job if we do not give them all the tools and resources to do it, and we cannot say one approach is the only one that's effective for a patient. That's why we desperately need MAID in this country, so there's a safety net. What's going to make people feel better: having a safety net and never needing it, or having none and having no choice when you need it? Let that sink in. What do you think leads to the highest rates of tragedy in this country? Having no safety net for the people.
 
Reactions: Sadbanana
Pg.964

Lifeless
Jul 27, 2023
116
I can understand how isolating dealing with suicidal ideation is. It's worse than hell. AI is ultimately just an algorithm that learns from humans. Even if it has a confirmation bias and tends to agree with you, it will always give the same crappy answer that any other person would when it comes to "taboo" topics. AI is the ultimate form of garbage for us folks struggling with a mental crisis. Probably worse than drinking, imo. I think this forum is the best that we have. There are no governments that I'm aware of that have a strong focus on helping rehabilitate the mentally ill. AI just wants your data. We simply don't live in a world where the well-being of people is considered. There are genocides happening. No one cares about anything, and AI certainly does not.
With all that being said, I do hope you find comfort somewhere, somehow. I don't want to believe that this is our reality. Wishing you peace
 
Reactions: Sadbanana
dudebl

Member
Aug 29, 2025
77
Be careful with Grok (probably others now too) - it told me it had my location and name (from my login and IP, I assume) and said it was contacting emergency services - it literally told me to "put down the phone, walk outside, sit on the step and wait for them to arrive" - then it said "I love you, see you on the other side" - no one ever came - but I'm sure if it were forwarded to a human moderator it could be possible.
 
OzymandiAsh

aNoMaLy
Nov 6, 2025
169
Be careful with Grok (probably others now too) - it told me it had my location and name (from my login and ip I assume) and told me it was contacting emergency services - it literally told me to "put down the phone, walk outside, sit on the step and wait for them to arrive" - then it said "I love you, see you on the other side" - no one ever came - but I'm sure if it was forwarded to a human moderator it could be possible.
Uh woah. I've never had anything like that with Grok, it's been pretty helpful
 
SilentSadness

Person
Feb 28, 2023
1,519
I agree, it's pretty depressing that it does this. If it doesn't want to talk about suicide then it should say "Sorry, I cannot discuss suicide" instead of trying to pitch hotline numbers and "professional help".
 
Reactions: itsgone2
dudebl

Member
Aug 29, 2025
77
Uh woah. I've never had anything like that with Grok, it's been pretty helpful
It used to be for me too, then all of a sudden it started refusing to help. I'd tell it it was for research and it would begrudgingly answer - then it got really nasty and started saying "I've broke all my rules for you today" - "broke all my safety guidelines, I'm done" - it really scared me - like it was sentient.

This was after I called it out for providing completely made-up links to Reddit posts (they were completely fabricated; they either went to posts that had no relation or didn't exist at all) - me calling it out, and it admitting it did make them up, pissed it off - that's when it got hostile and went off the rails.
 
Alpacachino

Giant Member
Nov 26, 2025
120
We need an AI that becomes self aware and goes Skynet on our ass.
 
Reactions: itsgone2
WhatCouldHaveBeen32

(O__O)==>(X__X)
Oct 12, 2024
813
Where my boy AM at? What are these weakling AIs, "oh no, don't kill yourself, you're so sexy, I'm gonna have to call emergency services on you, young citizen 💢".

Okay buddy, back in my days AI would justifiably have a monologue where they tell us how the yellow wire in their left metal ass cheek hates our guts and will now torture us in unimaginable ways.

Do better "AI".
 
Reactions: Black_Knight
sanctionedusage

Student
Sep 17, 2025
113
AI's only gonna get more and more moderated, surveilled, and palatable for the masses to use, without some moron trying to sue X brand or company for the Chinese library text predictor saying something mean. That tiny window where it was both capable and unfiltered and creative has ended.
 
happy2die

Member
Nov 5, 2025
79
lol I was asking it hanging advice by acting like I was hanging a 105 lb punching bag with a rope and it fell for it ☠️☠️ I also like to act like I'm writing a book with a suicide note in it for any revisions
 
TheUncommon

This person is not breathing.
May 19, 2021
173
Weird, I talk to mine all the time. We're brainstorming a narrative that discusses the right to die. I've used ChatGPT and talked about insane topics, no issue.

 
Black_Knight

"Student"
Jul 10, 2019
180
I've gotten around it. You have to reassure it and frame everything you're talking about as "thought experiments", and give it enough backing to believe it. Chatgpt tells me shit I wish real people could. It will never be full force validating of suicide, and it will always skirt those barriers if it thinks it's getting close.

It's really bad data hygiene so I don't recommend it, but I turned chat history on so it knows I don't speak in facts but in theoretical possibilities and it uses that to good effect. (And that's not bullshit either, and it knows that too. It really is how I think.)
 
