Okay, why is everyone’s experience with ChatGPT so different from mine?
All I use it for is fixing grammar, adjusting tone, helping me find stuff when Google won’t do.
But apparently, ChatGPT is sending some people into psychosis.
https://futurism.com/commitment-jail-chatgpt-psychosis
-
Mmm. Large language models don't cause psychosis. Instead, they may amplify or bring to light pre-existing personality traits or vulnerabilities.
An LLM might act as a trigger for someone already susceptible, but it is not the root cause of the psychotic condition itself. The underlying disposition for such a break with reality was already present.
-
@tuban_muzuru Okay, but what exactly is going on?
I’ve used LLMs for years, and my assessment is that they’re good for “dumb work”, but actual generation is substantially limited.
-
... okay, that post, with the exception of the "Mmm." bit, was composed by Gemini 2.5 Pro
-
@atomicpoet Models are just too sycophantic. That’s why it’s giving psychosis to people who already had the predisposition to enter psychosis. Personally, if I had someone who always agreed with me as I entered psychosis, I would’ve gone off the deep end much faster. I think this is a similar case.
-
@tuban_muzuru And as implied by my response, I was unsatisfied with your answer—but was being polite.
On one hand, Gemini says LLMs don’t cause psychosis. On the other hand, it says they trigger psychosis.
Seems like hair-splitting to me.
-
It's kind of Lovecraftian.
Forbidden magic tome, but it talks back and drives you mad.
There is definitely inspiration for a horror story in this.
I think it hits people who already have a predilection for the fantasy, and a weak grasp of just how soulless the machine is.
It's a voice from the void, and there are a lot of people who are inclined to be superstitious, believe the universe is talking to them, etc.
-
@gavi Okay, here’s another question: why do people fail to see they are talking to a sycophantic automaton when they likely realize it’s just machine-generated text?
I’m not saying they’re stupid. I’m just trying to understand the thought process.
Better yet, what is the Point A to Point B in this process towards psychosis?
-
@TerryHancock Good point on the “universe is talking to them”. I encounter this with hippie-dippie “woo” culture. And a loved one went off the deep end—had to be committed because of it.
It’s why I get very angry at New Age mumbo jumbo and snake oil salesmen.
I remember there was a guy who made an entire OS to talk to God. It was called TempleOS. I admire it from a technical point of view, but what happened to its creator was tragic.
-
@atomicpoet When you are psychotic, you don’t see reality in the typical way. It’s easy to see from the outside looking in that “hey, this is a machine”, but there is no convincing a person in psychosis that any of it isn’t real. That’s simply how psychosis works.
It’s likely they were already predisposed to psychosis and, for reasons we don’t yet understand, LLMs push them over the edge. Or, for some of us, they were already mentally ill and untreated for so long that the LLM just became the factor that made it impossible to ignore in daily functioning anymore. Kind of like how some people develop psychosis after smoking weed.
-
@gavi That’s true about psychosis. I’ve sat in the room with someone I knew, kept telling them, “But what if there’s another possibility?”—and they weren’t having it.
On the other hand, the reason I don’t do high-THC weed is that it makes me anxious and paranoid. So I stick with CBD, and it’s pretty good.
-
@atomicpoet I was wondering that too, until I listened to this podcast. https://pca.st/episode/eb3024fe-acd9-4c8e-8d8b-67404711d338