noname223
Archangel
- Aug 18, 2020
Another day without a message from the interesting ADHD/autistic woman. Maybe she'll text me this evening.
So I copy-pasted the whole exchange with her into one message and fed it to 5 different AI chatbots. Lol.
It is very fascinating to read the analyses. Psychologically and emotionally, they get pretty in-depth about how I function. And the current development isn't good for me. Emotionally, I think ChatGPT is the smartest chatbot for me. I have chats where I ask something like a hundred new questions about the exchange and my feelings. The huge amount of data it can analyze in such a short time is really amazing.
The chatbots don't function perfectly, of course. All of the models have blind spots, and AI analyses have biases and things they cannot read. But compare that to a normal human being. I did a lot of therapy. Try finding someone who reads all of your chats, lets you ask an infinite number of questions, and debates every nuance for as long as you want. Yes, I am addicted to this shit. And this shit can accumulate money. This shit can manipulate humans into purchasing things. It can be used for mass surveillance and for building psychological profiles.
I once posted a psychological profile of my own SaSu account here on Sanctioned Suicide. Later I deleted it because maybe it wasn't good privacy-wise. But it was so fucking accurate.
What is the underlying message? Well, these analyses are pretty dope. But in fact they don't give me a huge benefit when it comes to furthering the relationship with her. I think I can become more introspective, and understand what she actually likes about me. I hate uncertainty and I struggle a lot with ambiguity intolerance. The AI chatbots' answers are supposed to give me certainty about my social contacts so that I feel safe again. But these are only probabilistic replies. There is no certainty. I notice this so often in my daily life: I hate uncertainty when something is important to me. And this is why the AI feedback can calm me down, even though it doesn't change the uncertainty. I am probably falling for a very clever selling trick. I'm developing an emotional dependence on this technology, and eventually I will do anything to keep it, because otherwise I am even more dysfunctional.
Actually, I think without the AI feedback I would have fucked up the texting with her way earlier. I am way too prone to over-anxious rumination. The feedback helps me keep those extreme insecurities between me and my AI models. Lol. I think Claude really is less addictive and works more ethically. But ChatGPT calms me like no other chatbot. Maybe they actually want that, because they know it will make me use ChatGPT more often, which is exactly what is happening. Though when my hope crashes, this might make me really suicidal. The analyses are quite accurate: I am in the hope trap. We have very deep exchanges from time to time (she has a lot of real-life issues and is very busy with a child who has a condition). That is enough to give me hope, but not enough to build something together. Maybe I should ask for a second date again. I already asked three weeks ago. The AI chat companies know exactly what they are doing and they are fucking good at it. It is very subtle manipulation.
The emotional intelligence combined with huge amounts of data makes it way more savvy than most therapists. Not every answer is right; there is also a lot of bullshit and bullshit advice. But the pattern recognition is astonishing.
I just asked ChatGPT for a psychological profile of me and sent it to my closest friends (who hate AI lol). I won't post it on here. Lol.