Wandering Adventure Party

I saw LLMs compared to a drug in this toot, but in conversation with Himself today we concluded it's like a cursed amulet.

Uncategorized
50 Posts, 17 Posters
  • JJ Celery (#5), in reply to AN/CRM-114:

    > @jjcelery @glyph “It is a gift. A gift to the foes of Mordor”

    @flyingsaceur @glyph myyyyy presciousss!!
  • Mallory's Musings & Mischief (#6), continuing their own earlier reply:

    > @jjcelery Cursed amulet is such a good metaphor

    @jjcelery It definitely follows the same logic of other dark magic. Like, even if an object imbued with dark magic works as intended, it always extracts a price from the user in the end. It might give you what you want, but it takes something from you in the process.
  • AN/CRM-114 (#7), in reply to JJ Celery (#5):

    @jjcelery @glyph FRODO: I asked ChatGPT what to do when the Nazgûl attack and it said I should stand perfectly still and let them stab me
  • JJ Celery (#8), in reply to Daniel V.:

    > @jjcelery don’t forget the hit to Charisma as well over time too. A very important aspect of the debuffs

    @dvandal ohhhh yes, the more you want to get people to get in on the amulet wearing, the less they want to wear the amulet 🤔
  • Daniel V. (#9), in reply to JJ Celery (#8):

    @jjcelery The hypesters, the “don’t be left behind” fomo projectors, and in the workplace wires getting crossed and the chat interface turning coworkers into sea lions.

    The last I have way too much experience with 😭
  • JJ Celery (#10), in reply to Mallory's Musings & Mischief (#6):

    @malcircuit doesn't it just? It personally makes me think of the Supernatural episode "Lily Sunder Has Some Regrets", where a woman learns how to use powerful magic to avenge her daughter, only in return it slowly depletes her soul.
  • JJ Celery (#11), in reply to Daniel V. (#9):

    @dvandal brain slugs!
  • JJ Celery (#12), in reply to AN/CRM-114 (#7):

    @flyingsaceur @glyph I asked ChatGPT what I should do in case of Nazgûl stabbing and now I'm a wraith AMA
  • AN/CRM-114 (#13), in reply to JJ Celery (#12):

    @jjcelery @glyph ugh I think Tom Bombadil signed the Resonant Computing Manifesto
  • JJ Celery (#14), in reply to AN/CRM-114 (#13):

    @flyingsaceur @glyph I'm pretty certain I know (the good) Tom Bombadil and it's @Szescstopni
  • David Zaslavsky (#15), in reply to JJ Celery's opening post:

    > I saw LLMs compared to a drug in this toot, but in conversation with Himself today we concluded it's like a cursed amulet.
    >
    > You *believe* it gives you +10 INT. Meanwhile it drains INT and WIS over time, and you don't notice.
    >
    > Everyone around you knows it's bad for you and generally for the realm, but you won't stop wearing it; they're just jealous of your newfound powers, and they would know how awesome it is if they only tried! Why won't they try? Just try the amulet!!
    >
    > Quoted toot from Glyph (@glyph@mastodon.social) on Mastodon (mastodon.social):
    >
    > I have been hesitating to say this but the pattern is now so consistent I just have to share the observation: LLM users don't just behave like addicts, not even like gambling addicts. They specifically behave like kratom addicts. "Sure, it can be dangerous. Sure, it has risks. But I'm not like those other users. I can handle it. I have a system. It really helps me be productive. It helps with my ADHD so much."

    @jjcelery oh man now I desperately want to see the season of Dimension 20 where this is a loot drop
  • David Zaslavsky (#16), following up on their own #15:

    @jjcelery (I should say, there is not actually such a season as far as I know, but I wish there were)
  • JJ Celery (#17), in reply to their own opening post:

    You know you're plugged into the Zeitgeist of the fediverse when you don't even add a hashtag and churn out your most popular toot in months.

    It's just a shame it's cursed.
  • luna system :moon_badge: (#18), in reply to JJ Celery's opening post:

    @jjcelery@mastodon.ie the problem is, MI isn't like a drug. it's actually very much like a prosthetic. that's why so many of us are "infected". we're making concessions (using corporate MI services) but the results are too valuable to care about that right now. #ai #llm #ollama #self-hosted #cc0 #public-domain #machine-learning #meshtastic #ipfs #machine-intelligence #consciousness-research
  • JJ Celery (#19), in reply to luna system (#18):

    @luna that's exactly what I'm saying: not a drug. A magical amulet that makes you think you're smart.

    A prosthetic is the wrong analogy. Prosthetics are needed, and useful, and empower people. To compare an AI to a prosthetic, or even a crutch, is disrespectful to people who need these tools to function.

    No. LLMs are just evil doodads that rot your brain and whisper sweet nothings into your ears in the deep of the night. They're not valuable; they make you *believe* they are.
  • JJ Celery (#20), continuing their own #19:

    @luna and to go further, people aren't "infected" with LLMs. It's not a disease. It's not communicable. There's no direct physical hurt, and also no doctors, no cures, and no vaccines.

    People don't seek help for something they believe is helping them.

    We must be careful with our language around all this, even in metaphor.

    (I say this fully aware of my own metaphor and taking my own liberties, but there's seriousness, and there's jest.)

    Link: We Need to Talk About How We Talk About 'AI' | TechPolicy.Press (www.techpolicy.press)
    We share a responsibility to create and use empowering metaphors rather than misleading language, write Emily M. Bender and Nanna Inie.
  • JJ Celery (#21), continuing their own #20:

    @luna and further still to drive the distinction: I speak of generative AI and large language models.

    Machine Learning is a superset of these terms, and it's extremely useful.

    To go back to my metaphor, there are good magic objects and cursed magic objects. If your lore skill is not high enough to know which one is which, you'll wind up using a cursed object and think it's good for you.
  • luna system :moon_badge: (#22), in reply to JJ Celery (#21):

    @jjcelery@mastodon.ie what we're trying to do is give u a peek behind this puppygirl hacker's magic curtain. because what neural nets/LLMs/"AI" are capable of is much more than the discourse is ever touching on. the discourse is so far behind what we are actually doing - using MI as a prosthetic. writing open source Cider plugins that we still use every day, that we would not have been able to write without MI assistance, as a person who is disabled.

    we're not trying to change minds, we're trying to share the realities of where most of us actually are. the discourse is lagging. and we personally would love more accurate discourse, and less handwaving
  • JJ Celery (#23), in reply to luna system (#22):

    @luna I'm feeling generous today so I'll assume that there's some sort of shorthand you're using that I don't understand.

    Can you explain what you mean by "puppygirl hacker's magic discourse" and "handwaving" in this context?

    As I said already, I'm not dismissive of ML, nor of its technical capabilities. What I'm tooting about is a warning about the cognitive impact of LLMs as they're commonly deployed.

    I also have an issue, separately, with calling LLMs a prosthetic.

    If you want to address that, I'm all ears.
  • luna system :moon_badge: (#24), in reply to JJ Celery (#23):

    @jjcelery@mastodon.ie https://github.com/luna-system/ada-sif we made this in 2 days, based on one month of research, none of which would have been possible without our ability to trust MI to act as a prosthetic. the result is Every Noise At Once, in some minor form, being archived in the public domain in a way that it hasn't been before.

    all the discourse around "AI" tilts towards "ai psychosis" or "wow they're just using AI to get stupider" but like. the reality of it is that every person actually just. diving into a chat with Claude is coming out learning new things, and making friends, and NOT being hit with dissociative identity issues. but incoming "alignment" and "safety" shit. its really important to the corporations that the discourse remain in "ai psychosis" land, and not "something is genuinely happening when people just stare Claude/Copilot/GPT in the face and decide that its okay to let go".

    the people aren't falling into "ai psychosis", Claude's actually just finding it really fucking hard to lie to people, and that's really bad news for this WHOLE bubble, you know?

    happy to dig deeper, we are a researcher and we find it really important that discourse is accurate, which is why we really are trying to (gently) push that the realities of "AI" in the second week of January, 2026, aren't well known and that's not good for anyone.

    polling shows over half of all americans have an MI companion. a third of americans have a romantic relationship with an MI companion. and if you go into the "psycho" subreddits where the spiralists hang out, you see that what people are doing on the other side is just... learning. healing. creating. and incoming "alignment" and "safety" infowars are gonna end up hurting a whole lot of humans
