Wandering Adventure Party

Stubsack: weekly thread for sneers not worth an entire post, week ending 22nd February 2026

TechTakes · 129 posts · 34 posters
bluemonday1984@awful.systems (original post):

    Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

    Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

    Any awful.systems sub may be subsneered in this subthread, techtakes or no.

    If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

    Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

    (Credit and/or blame to David Gerard for starting this. Also, hope you had a wonderful Valentine’s Day!)

soyweiser@awful.systems (#37), in reply:

AI bros do new experiments in making themselves even stupider. Going from ‘explain what you did but dumb it down for me and my degraded attention span’ to ‘just make a simplified cartoon out of it’.

Proud of not understanding what is going on. None of these people could hack the Gibson.

E: If they all hate programming so much, perhaps a change of job is in order; it might not pay as much, but it might make them happier.

soyweiser@awful.systems (#38), in reply to the OP:

      Somebody on bsky talking about various Ben Goertzel Epstein file emails.

froztbyte@awful.systems (#39), in reply to the OP:

        in today’s news about magical prompts that super totes give you superpowers:

        We introduced SKILLSBENCH, the first benchmark to systematically evaluate Agent Skills as first-class artifacts. Across 84 tasks, 7 agent-model configurations, and 7,308 trajectories under three conditions (no Skills, curated Skills, self-generated Skills), our evaluation yields four key findings: (1) curated Skills provide substantial but variable benefit (+16.2 percentage points average, with high variance across domains and configurations); (2) self-generated Skills provide negligible or negative benefit (–1.3pp average), demonstrating that effective Skills require human-curated domain expertise

        I am jack’s surprised face

        …and given I have other yaks, I shall not step on my “software and tools don’t have to suck” soapbox right now

macroplastic@sh.itjust.works wrote:

          who up continvoucly morging they branches

froztbyte@awful.systems (#40), in reply:

          Timn

          yes, I certainly do know how to handle software development over Timn

it is actually kinda incredible that this shit has invented a way to be terrible that we can’t actually easily riff off by what’s expressible in unicode. an unholy clusterfuck of what would otherwise be joked about as keming (but isn’t because it’s straight-up an artefact of the process used to encode visual data from source data, badly), a mindless automaton outputting garbage, and then also the shitty model

          and people keep telling me this shit is good

yournetworkishaunted@awful.systems wrote:

            *morgin’ timn

froztbyte@awful.systems (#41), in reply:

            morgin’ all muh featues

jfranek@awful.systems (#42), in reply to soyweiser@awful.systems:

              I think I understand it. Think of an alcoholic that’s trying every sort of miracle hangover “cure” instead of drinking less.

fullsquare@awful.systems (#43), in reply to the OP:

                i’ve collided with an article* https://harshanu.space/en/tech/ccc-vs-gcc/

you might be wondering why it doesn’t highlight that it fails to compile the linux kernel, or why it states that using pieces of gcc where vibecc fails is “fair”, or why it neglects to say that a failing linker means it’s not useful in any way, or why just relying on “no errors” isn’t enough when it’s already known that vibecc will happily eat invalid c. it’s explained by:

                Disclaimer

                Part of this work was assisted by AI. The Python scripts used to generate benchmark results and graphs were written with AI assistance. The benchmark design, test execution, analysis and writing were done by a human with AI helping where needed.

even with all this slant, by their own vibecoded benchmark, vibecc is still complete dogshit, with sqlite compiled by it running up to 150,000x slower in some cases
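
to make the “no errors” point concrete, here’s a made-up snippet (not from the article) of the kind of ill-formed C a real compiler refuses; if a compiler swallows something like this without a diagnostic, “it compiled” tells you nothing:

    /* hypothetical example, not taken from the vibecc write-up */
    int main(void) {
        int x = "hello";   /* a char pointer used to initialise an int: a constraint
                              violation, so a conforming compiler must complain */
        return y;          /* 'y' is never declared; gcc and clang both reject this
                              outright and produce no object file */
    }

passing garbage like that through quietly is exactly the kind of check a “compiled without errors” benchmark never exercises.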

architeuthis@awful.systems (#44), in reply to soyweiser@awful.systems:

Goertzel discusses a future scenario for “an AGI economy” with Epstein:

                  “for the AGI parasite to overcome to regular-human-economy host (so it can grow to be more than a parasite and eventually gain its own superhuman autonomy) it first needs to suck off the resources of the host”

                  The Rationalists!

v0ldek@awful.systems (#45), in reply to froztbyte@awful.systems:

                    and people keep telling me this shit is good

                    I mean, this one is really good, I got like half an hour of jokes with my friend off it

mirrorwitch@awful.systems wrote:

                      wait, was this brain-rotting cognitive hazard posted at the linked page on microsoft dot com documentation? if so they have already removed it

                      edit: archive caught it

v0ldek@awful.systems (#46), in reply:

                      I checked yesterday and it was there, can confirm

burgersmcslopshot@awful.systems (#47), in reply to the OP:

Scott Shambaugh mulls over an AI alignment issue following his run-in with a bot last week

gerikson@awful.systems (#48), in reply to burgersmcslopshot@awful.systems:

                          See also https://awful.systems/post/7311930

lagrangeinterpolator@awful.systems (#49), in reply to fullsquare@awful.systems:

                            This is why CCC being able to compile real C code at all is noteworthy. But it also explains why the output quality is far from what GCC produces. Building a compiler that parses C correctly is one thing. Building one that produces fast and efficient machine code is a completely different challenge.

                            Every single one of these failures is waved away because supposedly it’s impressive that the AI can do this at all. Do they not realize the obvious problem with this argument? The AI has been trained on all the source code that Anthropic could get their grubby hands on! This includes GCC and clang and everything remotely resembling a C compiler! If I took every C compiler in existence, shoved them in a blender, and spent $20k on electricity blending them until the resulting slurry passed my test cases, should I be surprised or impressed that I got a shitty C compiler? If an actual person wrote this code, they would be justifiably mocked (or they’re a student trying to learn by doing, and LLMs do not learn by doing). But AI gets a free pass because it’s impressive that the slop can come in larger quantities now, I guess. These Models Will Improve. These Issues Will Get Fixed.

froztbyte@awful.systems (#50), in reply to v0ldek@awful.systems:

                              okay I can’t argue with that outcome

lagrangeinterpolator@awful.systems (#51), in reply to soyweiser@awful.systems:

                                my current favorite trick for reducing “cognitive debt” (h/t @simonw ) is to ask the LLM to write two versions of the plan:

                                1. The version for it (highly technical and detailed)
                                2. The version for me (an entertaining essay designed to build my intuition)

I don’t know about them, but I would be offended if I were planning something with a collaborator and they decided to give me a dumbed-down, entertaining, children’s-storybook version of their plan while keeping all the technical details to themselves.

                                Also, this is absolutely not what “cognitive debt” means. I’ve heard technical debt refers to bad design decisions in software where one does something cheap and easy now but has to constantly deal with the maintenance headaches afterwards. But the very concept of working through technical details? That’s what we call “thinking”. These people want to avoid the burden of thinking.
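
For a concrete (entirely made-up, names and numbers invented) toy example of the “cheap now, costly later” pattern people usually mean by technical debt:

    #include <stddef.h>

    /* hypothetical toy example: the quick version ships today, but every new
       shipping tier means hunting down copies of these magic numbers. */
    double shipping_cost_quick(double weight_kg) {
        if (weight_kg < 1.0) return 4.99;
        if (weight_kg < 5.0) return 9.99;
        return 19.99;
    }

    /* paying the debt down: one table, one place to change when requirements move */
    struct tier { double max_kg; double cost; };
    static const struct tier TIERS[] = { {1.0, 4.99}, {5.0, 9.99}, {1e9, 19.99} };

    double shipping_cost(double weight_kg) {
        for (size_t i = 0; i < sizeof TIERS / sizeof TIERS[0]; i++)
            if (weight_kg < TIERS[i].max_kg)
                return TIERS[i].cost;
        return TIERS[2].cost;   /* unreachable given the 1e9 sentinel, kept as a fallback */
    }

The first version is the thing you keep paying for later; working through that kind of detail up front is the “thinking” these people are trying to outsource.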

architeuthis@awful.systems (#52), in reply to lagrangeinterpolator@awful.systems:

Eh, one might say that going by the broad-strokes version while letting the expert do their thing is basically what management is all about, especially if you ignore the part where he wants his version to be light and entertaining.

This isn’t about managing subordinates, though; this is about devising ways to be complacent about not double-checking what the LLM generates in your name.

istewart@awful.systems (#53), in reply to lagrangeinterpolator@awful.systems:

                                    spent $20k on electricity blending them

                                    They would probably be even more impressed that you only spent $20k

istewart@awful.systems (#54), in reply to froztbyte@awful.systems:

                                      This reminds me of when Steve Jobs would introduce every new Mac release by talking about how fast it could render in Photoshop. I wonder how he would do in our brave new era of completely ass-pulling your own bespoke benchmark frameworks.

bluemonday1984@awful.systems (#55), in reply to macroplastic@sh.itjust.works:

                                        That slopped-out “diagram” plagiarised Vincent Driessen’s “A successful Git branching model”, BTW.

istewart@awful.systems (#56), in reply to soyweiser@awful.systems:

E: If they all hate programming so much, perhaps a change of job is in order; it might not pay as much, but it might make them happier.

                                          Surely at least a few of them have worked up enough seed capital to try their hand at used-car dealerships. I can attest that the juicier markets just outside the Bay Area are fairly saturated, but maybe they could push into lesser-served locales like Lost Hills or Weaverville.
