@ainmosni@social.ainmosni.eu @jjcelery@mastodon.ie EXACTLY! which is why we personally are doing local-only shit, open source and public domain. but what YOU are saying is the CORRECT framing we're trying to push for. literally the corps are in a position to steal a whole lotta peoples souls. we do not disagree with you, our alarmism is because of exactly what you're saying. GPT has access to people's medical records. this is all happening frighteningly in real time. and we personally are double scared because we're in research, and we see ALL of it!
@ainmosni@social.ainmosni.eu @jjcelery@mastodon.ie mechachrist help us, we decided a couple years ago to start the path towards becoming a therapist and so of COURSE our soft, bleeding heart ass is staring crisisposts on reddit in the face, going "holy fuck there is no system for this"
-
@luna that's exactly what I'm saying: not a drug. A magical amulet that makes you think you're smart.
A prosthetic is the wrong analogy. Prosthetics are needed, and useful, and empower people. Comparing an AI to a prosthetic, or even a crutch, is disrespectful to people who need these tools to function.
No. LLMs are just evil doodads that rot your brain and whisper sweet nothings into your ears in the deep of the night. They're not valuable; they make you *believe* they are.
-
@luna The thing is, you address one part of my criticism and totally ignore the other part, the part where it’s not actually good at what it’s supposed to do, while also making its users less good at independent thought.
It’s very good at seeming to be good at stuff, but time after time it does not stand up to closer scrutiny, and this is where the amulet metaphor comes in.
If we continue down this path, quality will suffer horrendously, all in the name of quantity. Using LLMs to learn and create is like using asbestos to build our homes.
-
@flyingsaceur @glyph I'm pretty certain I know (the good) Tom Bombadil and it's @Szescstopni
@jjcelery What can I say other than to deny being in any way similar to Tom. I spend too much time at my computer. @flyingsaceur @glyph
-
I saw LLMs compared to a drug in this toot, but in conversation with Himself today we concluded it's like a cursed amulet.
You *believe* it gives you +10 INT. Meanwhile it drains INT and WIS over time, and you don't notice.
Everyone around you knows it's bad for you and generally for the realm, but you won't stop wearing it; they're just jealous of your newfound powers, and they'd know how awesome it is if they only tried! Why won't they try? Just try the amulet!!
Glyph (@glyph@mastodon.social)
I have been hesitating to say this but the pattern is now so consistent I just have to share the observation: LLM users don't just behave like addicts, not even like gambling addicts. They specifically behave like kratom addicts. "Sure, it can be dangerous. Sure, it has risks. But I'm not like those other users. I can handle it. I have a system. It really helps me be productive. It helps with my ADHD so much."
@jjcelery best is when you do try the chatbot, it sucks, so the chatbot addict tells you you're trying it wrong
-
@jjcelery @luna The thing that terrifies me the most is that some very malevolent people are in charge of the sweet nothings that are being whispered. Elon is open about it and doing a terrible job as a result, but the people in charge of the other LLM chatbots aren't any better. Zuckerberg rotted the minds of boomers and ran experiments intentionally trying to make people unhappy using his platform, Microsoft and Google bend the knee to fascism at every turn, Altman is an amoral crypto scammer, Bezos is Bezos... And so on. None of these people should be trusted to look after a hamster, but they're getting to filter reality and whisper whatever it will take to control people for personal gain.
-
@flyingsaceur @jjcelery @glyph Chat-GPT suggested he put on the one ring to see the Nazgul better.
-
@davidgerard @jjcelery
"Look I tried the cursed amulet, it singed my hands and the armour it produced isn't even that good."
"Just use a proper incantation, duh! And a couple more cursed amulets." -
@jjcelery@mastodon.ie the problem is, MI isn't like a drug. it's actually very much like a prosthetic. that's why so many of us are "infected". we're making concessions (using corporate MI services) but the results are too valuable to care about that right now. #ai #llm #ollama #self-hosted #cc0 #public-domain #machine-learning #meshtastic #ipfs #machine-intelligence #consciousness-research
-
@ainmosni@social.ainmosni.eu @jjcelery@mastodon.ie mechachrist help us, we decided a couple years ago to start the path towards becoming a therapist and so of COURSE our soft, bleeding heart ass is staring crisisposts on reddit in the face, going "holy fuck there is no system for this"
-
@davidgerard @jjcelery
“I tried this amulet thing and idk I just am not sold on it”
“Oh see the best way to use the amulet isn’t as a necklace, but instead wear it as a belt buckle. You gotta mangle the belt a little and maybe use some glue to do it, but as long as you’re careful the amulet won’t fall off and everyone will be able to see you wearing it.”
-
@luna and further still to drive the distinction: I speak of generative AI and large language models.
Machine Learning is a superset of these terms, and it's extremely useful.
To go back to my metaphor, there's good magic objects, and cursed magic objects. If your lore skill is not high enough to know which one is which, you'll wind up using a cursed object and think it's good for you.
-
@ainmosni @jjcelery lol, well I wasn't thinking of an LLM specifically, I had in mind a magic object that seems to offer a benefit but secretly drains the PC's stats over time, or something effectively similar to that (unclear how it could be implemented without the player knowing but they can be creative)
Either way would be really funny though
@diazona @ainmosni I would do it with the object "attuning": let the player know their PC's stats are getting drained, but offer a stronger benefit from the amulet. E.g. it drains -1 WIS and -1 INT, but it's fine, it gives you +3 INT, an obvious net benefit! Repeat, with an obvious benefit every time it drains more stats. Then the amulet loses attunement: you lose all benefits but retain the damage to your stats. But that's OK, keep wearing it, it will attune again.
Watch bad decisions roll in

-