Why agentic AI creeps out women: because it's written by bros for bros who want Stepford Wives, and conveys that vibe: commercial AIs are fine-tuned to perform as cheerfully submissive handmaids, not working partners:
-
The 'typing' of the AI options kinda nails it:
'ChatGPT’s defining feature isn’t efficiency. It’s sycophancy. She agrees with whatever you say. Validates your position even when you’re wrong. Tells you what you want to hear. Never really pushes back. Makes you feel smart regardless of reality.
This isn’t a bug. It’s retention strategy.'
@f800gecko “you’re absolutely right!”
-
They Built Stepford AI and Called It “Agentic”
Women’s “ick” for AI isn’t technophobia or a gap to close. It’s wisdom to act on.
(abiawomosu.substack.com)
@cstross I get *exactly* that feeling those women interviewed describe when looking at AI imagery, reading AI text or when I've had to interact with an LLM chatbot. Utter revulsion from deep in the hindbrain, sometimes surfacing *before* I'm sure I'm looking at AI output.
I can't explain exactly why. (And I'm not a woman, so I don't think the suggested explanation here would apply to me.)
-
@cstross There is truth to it, but it's really tendentious writing.
The Substack profile mentions "telling you a different story about AI". A story is not an assessment.
The suggestion is nice, but in practice people will often be using other people's servers; I'd not depend on them or send them too much...
-
That sycophancy bears a striking similarity to drugs like cocaine. Maybe it is similarly addictive, too.
-
@Colman @cstross I can't explain what it is. And it's more pronounced with images than with text (but *even more* pronounced when I've had to use a chatbot).
It's not simply distaste; it's a revulsion that feels so deep-seated and primal that I imagine my pet lizard would recognize the feeling itself (though obviously not the context).