
noname223

Archangel
Aug 18, 2020
6,937
When I was acutely suicidal in 2018, I talked a lot with AI, and all I got were standard phrases telling me to call this or that hotline and that I am not alone. I think the AI app was pretty mediocre back then.

I think an AI therapist can cause a lot of damage. I tried it, and the confirmation bias is excessive. Some people even fall in love with AI, which I cannot really understand. Moreover, the companies behind AI therapists have a toxic incentive structure: why would they want their patients to heal? They would make no money anymore. In Germany, health care is mostly publicly funded.
I am not sure how they could work for people with no access to therapy. Are they better than no therapy at all?

Some AI tools have even incited suicides. I once searched for the best tips against depression, and the AI built into the search engine quoted a Reddit user who recommended suicide. Lol.
 
Reactions: katagiri83 and afinedaytoexit
afinedaytoexit
Member
Jun 22, 2025
12
Using any AI as a therapist will derail your personal development. No matter how hard you try to avoid it, it will always result in an echo chamber, and it's especially bad if you are prone to psychosis, paranoia, or just plain old rumination.

It may be useful for detecting a pattern of abusive behavior, though, but only if you use it wisely, in very short sessions, and not to elaborate on every single question that crosses your mind.

Source: my experience.
 
Reactions: Higurashi415, eggsausagerice, 25dRvS9Ka and 2 others
beandigger404
he/him
Jun 21, 2025
37
afinedaytoexit said:
Using any AI as a therapist will derail your personal development. No matter how hard you try to avoid it, it will always result in an echo chamber, and it's especially bad if you are prone to psychosis, paranoia, or just plain old rumination.

It may be useful for detecting a pattern of abusive behavior, though, but only if you use it wisely, in very short sessions, and not to elaborate on every single question that crosses your mind.

Source: my experience.
I completely agree with this. I used AI during a manic psychotic episode, and it just provided an echo chamber once I had convinced the chatbot that my impossible delusions were real. It only made the psychosis worse in the end. Never using it again. I wish I had access to actual therapy, or at minimum a diagnosis. AI is a horrible substitute for a professional, in my experience.
 
Reactions: eggsausagerice, 25dRvS9Ka and afinedaytoexit
misty
Member
May 31, 2025
32
I personally find AI helpful for simple things alongside actual therapy, such as planning a daily routine when I am struggling with low energy. I would not use it for anything more serious than that, though.
 
Reactions: beandigger404 and afinedaytoexit
eggsausagerice
last chance for cake!
Apr 21, 2025
1,403
it tells me my friends really do hate me when i tell it i'm worried my friends hate me, and tells me it can't talk about suicidal thoughts because it goes against the guidelines. i used to try to talk to it about my feelings to feel like someone wanted to listen to me, but it literally made things worse. it's not helpful. it's damaging like you said. i feel bad for anyone who hasn't realized that yet. having a community like sasu has been much better for my mental health.
 
Reactions: peacefulout, beandigger404, 25dRvS9Ka and 1 other person
Thunderstorm
Member
Jun 18, 2025
46
I don't understand how people fall in love with chatbots either.

Three big issues:

1. No physical presence.
2. It agrees with you on everything, even when you try to instruct it to have its own opinions and display realistic emotions. I haven't had success making it seem real.
3. Extremely short memory. It quickly forgets early details, mixes things up, and gets easily confused. This completely ruins the experience.

But apparently many people find it desirable to have it agree with them on everything; I have even seen that marketed as a feature.
 
enduringwinter
flower, water
Jun 20, 2024
368
I told it to draw me a flower, then made it explain the technical process. Then I told it to draw a bird, and it drew a fat bird next to the flower, which I thought was cute.
 
Al_stargate
I was once a pretty angel
Mar 4, 2022
821
Damn, I didn't know that was a thing. I'm pretty sure marketing a chatbot as a therapist is illegal. You have to go to school to be a therapist; you can't just program an AI bot. An emotional-support chatbot, that's fine, but a therapist AI? That's nuts.
 
noname223
Archangel
Aug 18, 2020
6,937
ChatGPT psychosis is actually a real thing.

 
permanently tired
it's never enough
Nov 8, 2023
269
I find the positivity and affirmation to be a bit much; it annoys me because I'm so negative. I like to use it to remind me to do things and to congratulate me on the small things I get done, when it seems stupid to talk about them with an actual person.
 
Angst Filled Fuck Up
Illuminated
Sep 9, 2018
3,181
There is probably some utility there, whether it's working through particular issues, providing breathing exercises, or giving instructions on how to proceed in certain situations, but in general I think they are too limited and agreeable. If there's no pushback of any sort, there can't be any real development. In some sense, it's only when we hit a brick wall that we start to grow.

Let's not forget, too, that what's currently out there is rarely (probably never) true AI. These chatbots are based on LLMs (large language models), so they're pretty much just aggregating text from the internet and their own training data to come up with something that sounds applicable or reasonable in the moment.
 
Pluto
Cat Extremist
Dec 27, 2020
6,771
[image]
 
