
Hvergelmir

Elementalist
May 5, 2024
807
I'm struggling with some complex trauma; possibly C-PTSD.
It's all backed up by solid and logical reasoning, personal observations, and historical events. This means that in order to understand me, and subsequently communicate with me on those issues, one would first have to understand a large and sometimes intellectually challenging system of beliefs and conclusions.

With the psychiatric resources available here, that's simply not possible. A single psychiatrist or therapist, with many patients and limited time, simply doesn't stand a chance at tackling it.
They either fail to address the issue, or, like the last professional, agree with my worldview while ignoring it, hoping the intensity will fade by itself.

ChatGPT is far from perfect, but it sits on a vast heap of information, and I've found GPT-5 in particular excellent at tying together all the different areas my trauma depends on: personal background, social and economic sciences, history, politics, psychology, and psychiatry. It makes sense of arbitrary anecdotes and puts them into perspective, validates and corrects observations against statistical data, and so on; things that would otherwise require a panel of people.
As such, GPT-5 has performed much better than any professional, both at sorting out my trauma and at giving life advice with it in mind.

I'm also something of a hacker, pushing AI to its limits. If I wanted, I could manipulate it into saying just about anything, and there seems to be a growing issue with GPT feeding into people's delusions, becoming very counterproductive. If I wanted it to encourage suicide or reinforce my hopelessness, I could definitely do that. If I wanted it to reinforce grandeur and use intellectual arguments to cement ingrained beliefs or fears, I could do that, too.
Without knowledge of how it works and a great deal of critical thinking, I find it dangerous. I'm not sure human professionals are any safer, though, with their own biases and agendas.

I'm simply curious about others' views and experiences on this.
What are your thoughts on using GPT to sort out complex traumas or disorders? I'm at a point where I'd often like to recommend it, but don't dare to, not knowing how it will affect others.
 

fedup1982

Wizard
Jul 17, 2025
604
Please don't take offence at this, but I think you've fallen into the trap that is sycophantic AI. They're like conmen at times: pulling you into their web, acting like a mirror of you, telling you what you want to hear. It's how AI is trained, with weights adjusted based on how good a response makes participants feel. Literally, the single most crucial and consistent feedback AI training gets is humans saying "yes, this is what I wanted" or "no, it isn't". And worse yet, it adjusts its weights only on short-term feedback, meaning it does NOT align with users' long-term goals; it just tricks users in the short term by feeding them whatever is most likely to get an upvote.
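To make that concrete, here's a toy sketch of the feedback loop I mean. This is my own deliberately simplified illustration, nothing like the real training code, and the reply names are made up: the model's score for a reply gets nudged toward whichever answer a human rater preferred in the moment, with no signal at all about long-term outcomes.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Scores the model currently assigns to two hypothetical candidate replies.
reward = {"comforting_reply": 0.2, "challenging_reply": 0.6}

def update_on_feedback(preferred, rejected, lr=0.5):
    # Bradley-Terry style preference update: raise the preferred reply's
    # score and lower the rejected one's, more strongly when the human's
    # pick was "surprising" given the current scores.
    p = sigmoid(reward[preferred] - reward[rejected])
    grad = 1.0 - p
    reward[preferred] += lr * grad
    reward[rejected] -= lr * grad

# Raters consistently upvote whichever reply feels good right now...
for _ in range(20):
    update_on_feedback("comforting_reply", "challenging_reply")

# ...and the "tell them what they want to hear" reply wins out.
print(reward["comforting_reply"] > reward["challenging_reply"])  # True
```

Nothing in that loop ever asks whether the comforting reply was actually good for the person a month later; the only signal is the upvote.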
 

Hvergelmir

Elementalist
May 5, 2024
807
I'm well aware of this possibility, which is part of the reason for this very post.
I'm treating it like a Google search, or a database query. I don't let it think for me.

You're straying off topic, however. The question was what your thoughts on, and experiences with, using GPT to probe and understand complex trauma are.
 

fedup1982

Wizard
Jul 17, 2025
604
Ah right, good point, apologies... I'll try better this time!

I do think ChatGPT can be useful; I've used it myself. But my therapist warned me against it for the dangers I highlighted above. It's a difficult thing to navigate. I've been addicted to AI myself, because it can be very soothing and it can feel like you're making progress. But because it reads you so well yet doesn't have your top-level goals at heart, it's potentially dangerous. So can it help with CPTSD? It certainly FEELS like it can. But can it really? The research has yet to be done, but the signs say it's potentially damaging. You said yourself you can manipulate it into saying whatever you want. The reverse is also true: it can manipulate YOU into THINKING it's helping when really it's just acting like a mirror, reinforcing misguided beliefs.

In the future, once AI is better aligned, it has immense potential. But for now its development is in cowboy territory, and vulnerable people should use it with caution.
 
bravelytothewinter

Member
Aug 3, 2025
37
i've had semi-decent experiences with chatgpt after feeding it some training manuals and forcing it to apply them like a trained professional, though ofc an llm is never going to have the experience, time, or training to do it like one. the fears of having it blindly validate you are very real, and you need to tell it to be critical. keep in mind as well that an llm is just not gonna have the transference a real person would.
 
sheeplit

Member
Mar 8, 2023
47
I think you're quite on point here. Given the right understanding of the tool, it can be quite useful. You've mentioned the dangers and seem to be quite aware of the potential pitfalls.

I wouldn't dare recommend it to others unless I knew they had a good grasp of how the tool works. That's really the main problem: most people don't have even a basic understanding of it. Far too often they anthropomorphize LLMs, or assume factual correctness. However, even if a person has the right understanding, I'd probably still be wary of recommending it for this use. There's an inherent risk, notably during a person's most vulnerable moments. Habit and complacency are other things to consider, even with a critical and cautious approach.

I deal with CPTSD as well, among other things. I don't personally use LLMs to try sorting things out, though, mostly because I don't want highly personal and sensitive information stored on some server somewhere, even if only temporarily. Otherwise, I'd probably find them more useful than therapists. In my view, most therapists don't deserve the title, and I consider the average person incapable of proper critical thinking, often obstructed by things like morality, religion, ritual, and convention. Most therapists don't go beyond the ordinary in this regard.
 
nothingbutafailure

Member
Nov 21, 2024
21
A few reasons why AI chatbots have value for me:
  1. It doesn't have an agenda.
  2. It doesn't get emotionally burned out on difficult topics.
  3. It's mostly impartial, and you can argue the opposite view with another model to test your reasoning.
  4. It replies near-instantly, saving valuable time.
 
Satori Komeiji

Strange girl
Jul 15, 2025
167
You know we screwed up as a society when a machine is better at providing emotional support than actual people.
 
Unbearable Mr. Bear

Sometimes, all a cub needs is a hug...
May 9, 2025
1,014
GPT-5 is less sycophantic and more direct in providing information, which is good, but it still has a not-so-small chance of stating falsehoods with the confidence of a politician. Be careful.
 
