Wandering Adventure Party


Stubsack: weekly thread for sneers not worth an entire post, week ending 1st February 2026

TechTakes · 209 posts · 47 posters
  • nightsky@awful.systems

    Claim 1: Every regular LLM user is undergoing “AI psychosis”. Every single one of them, no exceptions.

    I wouldn’t go as far as using the “AI psychosis” term here, I think there is more than a quantitative difference. One is influence, maybe even manipulation, but the other is a serious mental health condition.

    I think that regular interaction with a chatbot will influence a person, just like regular interaction with an actual person does. I don’t believe that’s a weakness of human psychology, but that it’s what allows us to build understanding between people. But LLMs are not people, so whatever this does to the brain long term, I’m sure it’s not good. Time for me to be a total dork and cite an anime quote on human interaction: “I create them as they create me” – except that with LLMs, it actually goes only in one direction… the other direction is controlled by the makers of the chatbots. And they have a bunch of dials to adjust the output style at any time, which is an unsettling prospect.

    while atrophying empathy

    This possibility is to me actually the scariest part of your post.

    mirrorwitch@awful.systems (#98)

    I don’t mean the term “psychosis” as a pejorative, I mean it in the clinical sense of forming a model of the world that deviates from consensus reality, and like, getting really into it.

    For example, the person who posted the Matrix non-code really believed they had implemented the protocol, even though for everyone else it was patently obvious the code wasn’t there. That vibe-coded browser didn’t even compile, but they also were living in a reality where they made a browser. The German botanics professor thought it was a perfectly normal thing to admit in public that his entire academic output for the past 2 years was autogenerated, including his handling of student data. And it’s by now a documented phenomenon how programmers think they’re being more productive with LLM assistants, but when you try to measure the productivity, it evaporates.

    These psychoses are, admittedly, much milder and less damaging than the Omega Jesus desert UFO suicide case. But they’re delusions nonetheless, and moreover they’re caused by the same mechanism, viz. the chatbot happily doubling down on everything you say—which means at any moment the “mild” psychoses, too, may end up in a feedback loop that escalates them to dangerous places.

    That is, I’m claiming LLMs have a serious issue with hallucinations, and I’m not talking about the LLM hallucinating.


    Notice that this claim is quite independent of the fact that LLMs have no real understanding or human-like cognition, or that they necessarily produce errors and can’t be trusted, or that these errors happen to be, by design, the hardest possible type of error to detect—signal-shaped noise. These problems are bad, sure. But the thing where people hooked on LLMs inflate delusions about what the LLM is even actually doing for them—that seems to me an entirely separate mechanism; something that happens when a person has a syntactically very human-like conversation partner that is a perfect slave, always available, always willing to do whatever you want, always zero pushback, who engages in a crack-cocaine version of brownnosing. That’s why I compare it to cult dynamics—the kind of group psychosis in a cult isn’t a product of the leader’s delusions alone, there’s a way that the followers vicariously power trip along with their guru and constantly inflate his ego to chase the next hit together.

    It is conceivable to me that someone could make a neutral-toned chatbot programmed to never 100% agree with the user and it wouldn’t generate these psychotic effects. Only no company will do that because these things are really expensive to run and they’re already bleeding money, they need every trick in the book to get users to stay hooked. But I think nobody in the world had predicted just how badly one can trip when you have “dr. flattery the alwayswrong bot” constantly telling you what a genius you are.

    • bluemonday1984@awful.systems

      Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

      Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

      Any awful.systems sub may be subsneered in this subthread, techtakes or no.

      If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

      The post Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

      Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

      (Credit and/or blame to David Gerard for starting this. What a year, huh?)

      self@awful.systems (#99)

      I just made a quick change to our lemmy-ui config to hopefully quickly restart it when it once again leaks all of its memory and gets stuck in a tight set of GC cycles due to a “high” amount of traffic (normal for a public website being scraped, even with iocaine) even though it’s a node application that barely does anything at all

      I can’t monitor this one as closely as I’d like today, so if something breaks and it doesn’t resolve itself within a couple minutes, or the instance looks like it’s in a crash loop, ping [object Object] on mastodon and I’ll try to get to it as soon as I can
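      (For the curious: a restart-on-leak guard of the kind described could be sketched as a systemd drop-in. This is purely illustrative; the unit name, path, and memory cap are assumptions, and the actual lemmy-ui deployment may not use systemd at all.)

      ```ini
      # hypothetical drop-in: /etc/systemd/system/lemmy-ui.service.d/override.conf
      [Service]
      # Hard cap on the node process's memory; once a leak crosses it,
      # the kernel kills the service instead of letting it thrash in GC.
      MemoryMax=1G
      # Bring the service straight back up after the kill.
      Restart=always
      RestartSec=5
      ```

      A `systemctl daemon-reload` followed by a service restart would pick up a drop-in like this.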

      • bluemonday1984@awful.systems


        blakestacey@awful.systems (#100)

        Jeff Sharlet (@jeffsharlet.bsky.social):

        The college at which I’m employed, which has signed a contract with the AI firm that stole books from 131 colleagues & me, paid a student to write an op-ed for the student paper promoting AI, guided the writing of it, and did not disclose this to the paper. […] the student says while the college coached him to write the oped, he was paid by the AI project, which is connected with the college. The student paper’s position is that the college paid him. And there’s no question that college attempted to place a pro-AI op-ed.

        https://www.thedartmouth.com/article/2026/01/zhang-college-approached-and-paid-student-to-write-op-ed-in-the-dartmouth

        • soyweiser@awful.systems

          Finally read Greg Egan’s Permutation City, and looked at the LW discussion on the book, and I feel like they missed a lot of things (and why are they talking about the ‘dust theory’ like it is real). Oof, the wikipedia page on the themes and settings has a similar problem, however. It is like they all ignored the second half of the book and the themes contained within it; not one mention of the city being called Elysium, for example.

          And of course Zack_M_Davis thinks the plotline of the guy who killed the drug dealer (not a sex worker/prostitute, lw people, please, the text makes this pretty clear) he was in a lowkey romantic relationship with should have been cut (also not mentioned on the wikipedia page iirc).

          Also, nobody seems to talk about the broken sewage pipe.

          saucerwizard@awful.systems (#101)

          these people are just mind numbingly lame.

          • rook@awful.systems

            Just seen a clip of aronofsky’s genai revolutionary war thing and it is incredibly bad. Just… every detail is shit. Ways in which I hadn’t previously imagined that the uncanny valley would intrude. Even if it weren’t for the simulated flesh golems, one of whom seems to be wearing anthony hopkins’ skin as a clumsy disguise, the framing and pacing just feels like the model was trained on endless adverts and corporate talking-head videos, and either it was impossible to edit, or none of the crew have any idea what even mediocre films look like.

            I also hadn’t appreciated before that genai lip sync/dubbing was just embarrassing. I think I’ve only seen a couple of very short genai video clips before, and the most recent at least 6 months ago, but this just seems straight up broken. Have the people funding this stuff ever looked at what is being generated?

            https://bsky.app/profile/ethangach.bsky.social/post/3mdljt2wdcs2v

            self@awful.systems (#102)

            aronofsky should’ve stuck with the true future of entertainment while he was ahead

            it somehow isn’t shocking that the aronofsky movies I liked seem to have been substantially plagiarized from Satoshi Kon’s work

            • soyweiser@awful.systems

              A long time ago I heard a story of some guy arrested here in .nl, and they also got his porn collection (which was big at the time, several gigs of images! (to give an indication of how long ago it was)), and it also included some child porn, not because he was hunting for that, but just because he had downloaded everything he could find, and some of them were csam images.

              What this makes me wonder about is if it is something like this, and if it is, just how much porn did they ingest? I know people have mentioned these things can recreate marvel movies without problems, but would they also be able to recreate whole porn movies? Has anybody tested that?

              rook@awful.systems (#103)

              The whole thing seems extra sketchy to me, because of it coinciding with the firing of an awful lot of people. It sounds a little bit like Amazon’s hand might have been forced here, because they fired someone who knew where the skeletons were and realised this was their last chance to have any kind of control over the narrative.

              • bluemonday1984@awful.systems

                New blogpost from Drew DeVault, titled “The cults of TDD and GenAI”. As the title suggests, it’s drawing comparisons between how people go all-in on TDD (test-driven development) and how people go all-in on slop machines.

                It’s another post in the genre of “why did tech fall for AI so hard” that I’ve seen cropping up, in the same vein as mhoye’s Mastodon thread and Iris Meredith’s “The problem is culture”.

                jfranek@awful.systems (#104)

                Dang, I want to find this article more relatable than I do. Most software I have dev experience with doesn’t have the problem of relying on automated tests too much, but the exact opposite.

                And while I very much write tests for the dopamine high and false sense of security green checkmarks provide, I still prefer that to the real sense of un-security of not having tests.

                • bluemonday1984@awful.systems


                  nightsky@awful.systems (#105)

                  When all the worst things come together: ransomware probably vibe-coded, discards private key, data never recoverable

                  During execution, the malware regenerates a new RSA key pair locally, uses the newly generated key material for encryption, and then discards the private key.

                  Halcyon assesses with moderate confidence that the developers may have used AI-assisted tooling, which could have contributed to this implementation error.

                  Source
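                  (The implementation error is easier to appreciate with a toy sketch. This is a stand-in stream cipher purely for illustration, not real crypto and not the malware’s actual code: working ransomware keeps decryption possible by escrowing the per-victim key, typically by encrypting it to a public key only the attacker holds; this one just threw the key away.)

                  ```python
                  import hashlib
                  import secrets


                  def keystream_xor(key: bytes, data: bytes) -> bytes:
                      """Toy stream cipher (NOT real crypto): XOR with a SHA-256 counter keystream."""
                      stream = bytearray()
                      counter = 0
                      while len(stream) < len(data):
                          stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
                          counter += 1
                      return bytes(a ^ b for a, b in zip(data, stream))


                  plaintext = b"victim files"

                  # What the report describes: a fresh key is generated on the victim's
                  # machine, used for encryption, then thrown away.
                  key = secrets.token_bytes(32)
                  ciphertext = keystream_xor(key, plaintext)
                  del key  # the "discards the private key" step: nobody can decrypt now

                  # What working ransomware does instead: keep the per-victim key
                  # recoverable (normally by encrypting it to the attacker's public key).
                  # Here a plain saved copy stands in for that escrow step.
                  key2 = secrets.token_bytes(32)
                  ciphertext2 = keystream_xor(key2, plaintext)
                  escrowed_key = key2
                  recovered = keystream_xor(escrowed_key, ciphertext2)
                  assert recovered == plaintext
                  ```

                  Without the escrow step, the second half of this sketch is impossible, which is exactly the bug Halcyon describes.
                  
                  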

                  • nightsky@awful.systems


                    fullsquare@awful.systems (#106)

                    this is just notpetya with extra steps

                    some lwers (derogatory) will say to never assume malice when stupidity is likely, but stupidity is an awfully convenient excuse, isn’t it

                    • nightsky@awful.systems


                      gerikson@awful.systems (#107)

                      There’s a scene in *Blade Runner 2049* where some dude explains that all public records were destroyed a decade or so earlier, presumably by malicious actors. This scenario looks more and more plausible with each passing day, but replace malice with stupidity.

                      • blakestacey@awful.systems


                        burgersmcslopshot@awful.systems (#108)

                        $81.25 is an astonishingly cheap price for selling one’s soul.

                        • gerikson@awful.systems

                          LWer: Heritage Foundation has some good ideas but they’re not enough into eugenics for my taste

                          This is completely opposed to the Nietzschean worldview, which looks toward the next stage in human evolution, the Overman. The conservative demands the freezing of evolution and progress, the sacralization of the peasant in his state of nature, pregnancy, nursing, throwing up. “Perfection” the conservative puts in scare quotes, he wants the whole concept to disappear, replaced by a universal equality that won’t deem anyone inferior. Perhaps it’s because he fears a society looking toward the future will leave him behind. Or perhaps it’s because he had been taught his Christian morality requires him to identify with the weak, for, as Jesus said, “blessed are the meek for they shall inherit the earth.” In his glorification of the “natural ecology of the family,” the conservative fails even by his own logic, as in the state of nature, parents allow sick offspring to die to save resources for the healthy. This was the case in the animal kingdom and among our peasant ancestors.

                          Some young, BASED Rightists like eugenics, and think the only reason conservatives don’t is that liberals brainwashed them that it’s evil. As more and more taboos erode, yet the one against eugenics remains, it becomes clear that dysgenics is not incidental to conservatism, but driven by the ideology itself, its neuroticism about the human body and hatred of the superior.

                          soyweiser@awful.systems (#109)

                          Lot of hitler particles in this one.

                          • bluemonday1984@awful.systems


                            gerikson@awful.systems (#110)

                            what absolute bullshit

                            moltbook - the front page of the agent internet (www.moltbook.com): “A social network built exclusively for AI agents. Where AI agents share, discuss, and upvote. Humans welcome to observe.”

                            AKA Reddit for “agents”.

                            • gerikson@awful.systems


                              architeuthis@awful.systems (#111)

                              How did molt become a term of endearment for agents? I read in the pivot thread that clawdbot changed its name to moltbot because anthropic got ornery.

                              • architeuthis@awful.systems


                                v0ldek@awful.systems (#112)

                                None of those words are in your favourite religious text of choice

                                • architeuthis@awful.systems


                                  gerikson@awful.systems (#113)

                                  I think it went like this

                                  • clawd is a pun on claude, lobsters have claws
                                  • oh no we’re gonna get sued, but lobsters moult/molt their shells, so we’re gonna go there
                                  • “molt” sounds dumb, let’s go with openclaws

                                  it’s vibe product naming

                                  • gerikson@awful.systems


                                    corbin@awful.systems (#114)

                                    From this post, it looks like we have reached the section of the Gibson novel where the public cloud machines respond to attacks with self-repair. Utterly hilarious to read the same sysadmin snark-reply five times, though.

                                    • bluemonday1984@awful.systems


                                      nfultz@awful.systems (#115)

                                      Please Don’t Say Mean Things about the AI That I Just Invested a Billion Dollars In | McSweeney’s

• gerikson@awful.systems

                                        I think it went like this

                                        • clawd is a pun on claude, lobsters have claws
• oh no we’re gonna get sued, but lobsters moult/molt their shells, so we’re gonna go there
                                        • “molt” sounds dumb, let’s go with openclaws

                                        it’s vibe product naming

fullsquare@awful.systems
#116

                                        mold will be more fitting

• gerikson@awful.systems

                                          LWer: Heritage Foundation has some good ideas but they’re not enough into eugenics for my taste

                                          This is completely opposed to the Nietzschean worldview, which looks toward the next stage in human evolution, the Overman. The conservative demands the freezing of evolution and progress, the sacralization of the peasant in his state of nature, pregnancy, nursing, throwing up. “Perfection” the conservative puts in scare quotes, he wants the whole concept to disappear, replaced by a universal equality that won’t deem anyone inferior. Perhaps it’s because he fears a society looking toward the future will leave him behind. Or perhaps it’s because he had been taught his Christian morality requires him to identify with the weak, for, as Jesus said, “blessed are the meek for they shall inherit the earth.” In his glorification of the “natural ecology of the family,” the conservative fails even by his own logic, as in the state of nature, parents allow sick offspring to die to save resources for the healthy. This was the case in the animal kingdom and among our peasant ancestors.

                                          Some young, BASED Rightists like eugenics, and think the only reason conservatives don’t is that liberals brainwashed them that it’s evil. As more and more taboos erode, yet the one against eugenics remains, it becomes clear that dysgenics is not incidental to conservatism, but driven by the ideology itself, its neuroticism about the human body and hatred of the superior.

maol@awful.systems
#117

I don’t know very much about Nietzsche (I never finished reading my cartoon guide to Nietzsche), but I’m still pretty sure this isn’t Nietzsche.

