• Hey Guest,

    We wanted to share a quick update with the community.

    Our public expense ledger is now live, allowing anyone to see how donations are used to support the ongoing operation of the site.

    👉 View the ledger here

    Over the past year, increased regulatory pressure in multiple regions, including from the UK's Ofcom and Australia's eSafety Commissioner, has led to higher operational costs, including infrastructure, security, and the need to work with more specialized service providers to keep the site online and stable.

    If you value the community and would like to help support its continued operation, donations are greatly appreciated. If you wish to donate via Bank Transfer or other options, please open a ticket.

    Donate via cryptocurrency:

    Bitcoin (BTC):
    Ethereum (ETH):
    Monero (XMR):
Crematoryy

Autophagic Loneliness
Feb 12, 2025
241
Professional help is expensive. Therapy could help me, but it's not economically viable. There's no public health plan that offers it in my country. I've always talked to robots online, and I felt I should start using that again.
 
Reactions: EmptyBottle, wobble and Unbearable Mr. Bear
Unbearable Mr. Bear

Sometimes, all a cub needs is a hug...
May 9, 2025
1,016
Professional help is expensive. Therapy could help me, but it's not economically viable. There's no public health plan that offers it in my country. I've always talked to robots online, and I felt I should start using that again.
I understand the situation you're in, but just be warned that excessive use of LLMs can lead to psychosis and schizophrenia, especially if you're already vulnerable to them. Here's an article about it, and the site has plenty more: https://futurism.com/commitment-jail-chatgpt-psychosis

Don't want to burst your bubble, just want you to be safe, that's all, friend. 🧸
 
Reactions: alwayspissedoff, eggsausagerice, NormallyNeurotic and 3 others
Crematoryy

Autophagic Loneliness
Feb 12, 2025
241
I understand the situation you're in, but just be warned that excessive use of LLMs can lead to psychosis and schizophrenia, especially if you're already vulnerable to them. Here's an article about it, and the site has plenty more: https://futurism.com/commitment-jail-chatgpt-psychosis

Don't want to burst your bubble, just want you to be safe, that's all, friend. 🧸
Your pseudonym caught my attention. All I need is a hug, and I'm daydreaming excessively about affection.

I believe I'm not prone to psychosis, as in a year and a half of use no symptoms have ever appeared. Anyway, thanks for the warning.
 
Reactions: Unbearable Mr. Bear
Unbearable Mr. Bear

Sometimes, all a cub needs is a hug...
May 9, 2025
1,016
I believe I'm not prone to psychosis, as in a year and a half of use no symptoms have ever appeared. Anyway, thanks for the warning.
Well, from what I know, most LLMs are extremely sycophantic by default and can make one believe falsehoods about oneself by sweet-talking them. You can still use them for whatever you see fit; just know their limitations.
Your pseudonym caught my attention. All I need is a hug, and I'm daydreaming excessively about affection.
Oh, my honey pot, mama bear is glad you came to her then. Here, lemme give you a warm, long hug while you talk about what is assailing you so much, dear. *bear hug* Mama's here now, don't worry. 🧸
 
Reactions: Crematoryy
EvisceratedJester

|| What Else Could I Be But a Jester ||
Oct 21, 2023
5,158
I understand the situation you're in, but just be warned that excessive use of LLMs can lead to psychosis and schizophrenia, especially if you're already vulnerable to them. Here's an article about it, and the site has plenty more: https://futurism.com/commitment-jail-chatgpt-psychosis

Don't want to burst your bubble, just want you to be safe, that's all, friend. 🧸
The use of LLMs doesn't lead to psychosis or schizophrenia. That's not how schizophrenia or psychosis work. They may worsen symptoms of those conditions, especially since most LLMs are programmed to basically be ass kissers, but they don't cause them.
 
Reactions: telekon
Unbearable Mr. Bear

Sometimes, all a cub needs is a hug...
May 9, 2025
1,016
The use of LLMs doesn't lead to psychosis or schizophrenia. That's not how schizophrenia or psychosis work. They may worsen symptoms of those conditions, especially since most LLMs are programmed to basically be ass kissers, but they don't cause them.
Well, then that would be the worst way for someone to discover that they have those conditions, especially in areas without adequate psychiatric and psychological care, where they may go undiagnosed for decades.

That said, you are right, I was mistaken. I apologize for it.
 
Reactions: EmptyBottle
EmptyBottle

2036-01-10T08
Apr 10, 2025
2,193
The use of LLMs doesn't lead to psychosis or schizophrenia. That's not how schizophrenia or psychosis work. They may worsen symptoms of those conditions, especially since most LLMs are programmed to basically be ass kissers, but they don't cause them.
It can increase the risk of those disorders, as well as cause other effects, like cravings for the LLM. Even though I use LLMs moderately, I sometimes have slight cravings to use them to discuss random ideas... because they reply fast (and I like their replies) and don't get bored.
 
NormallyNeurotic

Everything is going to be okay ⋅ he/him
Nov 21, 2024
930
The use of LLMs doesn't lead to psychosis or schizophrenia. That's not how schizophrenia or psychosis work. They may worsen symptoms of those conditions, especially since most LLMs are programmed to basically be ass kissers, but they don't cause them.
Not schizophrenia, no, but it can cause psychosis. Many disorders can make someone more likely to develop psychosis without having it as a general symptom, even severe depression and PTSD.
 
penguinl0v3s

Wait for Me 💙
Nov 1, 2023
968
I think it's good as long as you know how to use it. Don't ask it to make judgments for you: it won't disagree with you, because it's trained not to, and that causes bias. Venting to it and asking it for advice sounds like a good way to move forward.
 