Conveniently, 2 of my posts today dovetail nicely.
-
Conveniently, 2 of my posts today dovetail nicely. This post ...
Strypey (@strypey@mastodon.nzoss.nz)
If you don't believe that ongoing interaction with proprietary digital systems has any power to shift people's thinking in unhealthy ways, you're not paying attention; https://www.psychologytoday.com/us/blog/urban-survival/202507/the-emerging-problem-of-ai-psychosis Which would be very unusual for Cory, so I'm puzzled as to why he keeps pushing this talking point. Maybe it's his case for why people need to read Chokepoint Capitalism or Enshittification if they've already read Zuboff's book? They do, but there are better ways to make that argument. (2/2) #AI #MOLE
Mastodon - NZOSS (mastodon.nzoss.nz)
... linked to a Psychology Today article about AI psychosis, which says;
"This phenomenon, which is not a clinical diagnosis, has been increasingly reported in the media and on online forums like Reddit, describing cases in which AI models have amplified, validated, or even co-created psychotic symptoms with individuals.*
The Emerging Problem of "AI Psychosis"
AI may be fueling psychotic delusions in a phenomenon known as "AI psychosis" or "ChatGPT psychosis." New research explains the risks.
Psychology Today (www.psychologytoday.com)
(1/2)
-
@strypey Iatrogenic symptom amplification, particularly for depressive (including suicidal[!]) symptoms, is a well-known problem in HUMAN psychotherapy. It was a serious issue with Rogerian "nondirective" therapy, in which the therapist frequently repeats back paraphrased versions of the patient's own words. This was intended to facilitate the patient's own problem-solving, but in practice depressed patients tended to interpret it as agreement with, and validation of, their self-disparaging thoughts, leading to a vicious cycle of more and more pessimistic thinking. The same thing can easily happen with AI therapy.