Wandering Adventure Party


Stubsack: weekly thread for sneers not worth an entire post, week ending 22nd February 2026

techtakes
129 Posts 34 Posters 0 Views
  • M mirrorwitch@awful.systems

    what I’m thinking about is for how many years now they have been promising that just one more datacenter will fix the “hallucinations”, yet this mess is indistinguishable from nonsense output from three years ago. I see “AI” is going well

    macroplastic@sh.itjust.works
    #63

    you can count on microslop to always be behind the curve

    • B bluemonday1984@awful.systems

      Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

      Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

      Any awful.systems sub may be subsneered in this subthread, techtakes or no.

      If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

      The post Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

      Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

      (Credit and/or blame to David Gerard for starting this. Also, hope you had a wonderful Valentine’s Day!)

      blakestacey@awful.systems
      #64

      A longread on AI greenwashing begins thusly:

      The expansion of data centres - which is driven in large part by AI growth - is creating a shocking new demand for fossil fuels. The tech companies driving AI expansion try to downplay AI’s proven climate impacts by claiming that AI will eventually help solve climate change. Our analysis of these claims suggests that rather than relying on credible and substantiated data, these companies are writing themselves a blank cheque to pollute on the empty promise of future salvation. While the current negative effects of AI on the climate are clear, proven and growing, the promise of large-scale solutions is often based on wishful thinking, and almost always presented with scant evidence.

      (Via.)

      • B bluemonday1984@awful.systems

        blakestacey@awful.systems
        #65

        The phrase “ambient AI listening in our hospital” makes me hear the “Dies Irae” in my head.

        • L lurker@awful.systems

          holy fuck I actually just burst out laughing at this

          sinedpick@awful.systems
          #66

          can all of rationalism be reduced to logorrhea with load-bearing extreme handwaving (in this case, agentic self preservation arises through RL scaling)?

            • Jack Riddle[Any/All]

            OT: I tried switching away from my instance to quokk.au because it becomes harder and harder to justify being on a pro-ai instance, but it seems that your posts stopped federating recently. Which makes that a bit harder to do. Any clue why?

            Jack Riddle[Any/All]
            #67

            bluemonday1984@awful.systems update: quokka reached out to me and apparently you had been banned on another instance for report abuse and that ban had synchronised to quokk.au. You should be unbanned now which means the next stubsack should federate again.

            E: I do not know how tags work. E2: why does that format to a mailto link?

            • B bluemonday1984@awful.systems

              sc_griffith@awful.systems
              #68

              new episode of odium symposium. we look at rousseau’s program for using universal education to turn women into drones

              (www.patreon.com)

              • blakestacey@awful.systemsB blakestacey@awful.systems

                The phrase “ambient AI listening in our hospital” makes me hear the “Dies Irae” in my head.

                bluemonday1984@awful.systems
                #69

                The phrase “ambient AI listening in our hospital” makes me hear the “Dies Irae” in my head.

                I’m personally hearing “Morceaux” myself.

                • N nfultz@awful.systems

                  The Dangerous Economics of Walk-Away Wealth in the AI Talent War

                  Watch now | How firms are accidentally paying their best employees to become their biggest competitors.


                  (softcurrency.substack.com)

                  1. Anthropic (Medium Risk) Until mid-February of 2026, Anthropic appeared to be happy, talent-retaining. When an AI Safety Leader publicly resigns with a dramatic letter stating “the world is in peril,” the facade of stability cracks. Anthropic is a delayed fuse, just earlier on the vesting curve than OpenAI. The equity is massive ($300B+ valuation) but largely illiquid. As soon as a liquidity event occurs, the safety researchers will have the capital to fund their own, even safer labs.

                  WTF is “even safer” ??? how bout we like just don’t create the torment nexus.

                  Wonder if the 50% attrition prediction comes to pass though…

                  scruiser@awful.systems
                  #70

                  So they’ve highlighted an interesting pattern to compensation packages, but I find their entire framing of it gross and disgusting, in a capitalist techbro kinda way.

                  Like the way they describe Part III’s case study:

                  The uncapped payouts were so large that it fractured the relationship between Capital (Activision) and Labor (Infinity Ward).

                  Activision was trying to cheat its labor after they had made it massive profits! Describing it as a fractured relationship denies Activision’s agency in choosing to be greedy capitalist pigs.

                  The talent that left formed the core of the team that built Titanfall and Apex Legends, franchises that have since generated billions in revenue, competing directly in the same first-person shooter market as Call of Duty.

                  Activision could have paid them what they owed them, and kept paying them incentive based payouts, and come out billions of dollars ahead instead of engaging in short-sighted greedy behavior.

                  I would actually find this article interesting and tolerable if they framed it as “here are the perverse incentives capitalism encourages businesses to create” instead of “here is how to leverage the perverse incentives in your favor by paying your employees just enough, but not enough to actually reward them a fair share” (not that they were honest enough to use those words).

                  WTF is “even safer” ??? how bout we like just don’t create the torment nexus.

                  I think the writer isn’t even really evaluating that aspect, just thinking in terms of workers becoming capital owners and how companies should try to prevent that to maximize their profits. The idea that Anthropic employees might care on any level about AI safety (even hypocritically and ineffectually) doesn’t enter into the reasoning.

                  • S sinedpick@awful.systems

                    can all of rationalism be reduced to logorrhea with load-bearing extreme handwaving (in this case, agentic self preservation arises through RL scaling)?

                    fullsquare@awful.systems
                    #71

                    no there’s also racist twitter

                    • B bluemonday1984@awful.systems

                      sc_griffith@awful.systems
                      #72

                      need a word for the sort of tech ‘innovation’ that consists of inventing and monetizing new types of externalities which regulators aren’t willing to address. like how bird scooters aren’t a scam, but they profit off of littering sidewalk space so that ppl with disabilities can’t get around

                      EDIT: a similar, perhaps the same concept is innovation which functions by capturing or monopolizing resources that aren’t as yet understood to be resources. in the bird example, we don’t think of sidewalk space as a capturable resource, and yet

                      • sc_griffith@awful.systemsS sc_griffith@awful.systems

                        froztbyte@awful.systems
                        #73

                        parasitech?

                        • F froztbyte@awful.systems

                          parasitech?

                          froztbyte@awful.systems
                          #74

                          I guess that doesn’t emphasise the “innovation” aspect much

                          • o7___o7@awful.systemsO o7___o7@awful.systems

                            It’s funny that such a thing is rare enough in the corpus to come out so recognizably in the output.

                            froztbyte@awful.systems
                            #75

                            ime much like the SAFe diagrams, this diagram is all over a certain type of “this is how your corporation should be developing software” thotleader posts

                            (although I imagine that with the 2~3y lag, all those heads have pivoted to promptpraise)

                            • F froztbyte@awful.systems

                              parasitech?

                              sc_griffith@awful.systems
                              #76

                              maybe “parasitic innovation”?

                              • B bluemonday1984@awful.systems

                                bluemonday1984@awful.systems
                                #77

                                Baldur Bjarnason gives his thoughts on the software job market, predicting a collapse regardless of how AI shakes out:

                                If you model the impact of working LLM coding tools (big increase in productivity, little downside) where the bottlenecks are largely outside of coding, increases in coding automation mostly just reduce the need for labour. I.e. 10x increase means you need 10x fewer coders, collapsing the job market

                                If you model the impact of working LLM coding tools with no bottlenecks, then the increase in productivity massively increases the supply of undifferentiated software and the prices you can charge for any software drops through the floor, collapsing the job market

                                If the models increase output but are flawed, as in they produce too many defects or have major quality issues, Akerlof’s market for lemons kicks in, bad products drive out good, value of software in the market heads south, collapsing the job market

                                If the model impact is largely fictitious, meaning this is all a scam and the perceived benefit is just a clusterfuck of cognitive hazards, then the financial bubble pop will be devastating, tech as an industry will largely be destroyed, and trust in software will be zero, collapsing the job market

                                I can only think of a few major offsetting forces:

                                • If the EU invests in replacing US software, bolstering the EU job market.
                                • China might have substantial unfulfilled domestic demand for software, propping up their job market
                                • Companies might find that declining software quality harms their bottom-line, leading to a Y2K-style investment in fixing their software stacks

                                But those don’t seem likely to do more than partially offset the decline. Kind of hoping I’m missing something

                                • B bluemonday1984@awful.systems

                                  cinnasverses@awful.systems
                                  #78

                                  Do we have any idea why some of the Zizians ended up in Vermont? The only thing in their network that comes to mind is the Monastic Academy for the Preservation of Life on Earth (MAPLE, a Buddhist-flavoured CFAR offshoot with the usual Medium post accusing leaders of sexual and psychological abuse)

                                  Vermont and New Hampshire have clusters of generic Libertarians.

                                  • sc_griffith@awful.systemsS sc_griffith@awful.systems

                                    yournetworkishaunted@awful.systems
                                    #79

                                    In economic terms it’s less rent seeking and more rent creation. Like, taking advantage of public sidewalk space may not be a rent in the strictest sense given that the revenue model is still people paying for the service, but the ability to provide that service is absolutely predicated on taking over and monopolizing this public resource to the maximal degree possible.

                                    By historical allegory, harkening back to the original destruction of the Commons, we’re looking at Enclosure 2: Frisco Drift.

                                    Let’s also not lose sight of the fact that those sidewalks aren’t a natural formation, and that it’s the city government who ultimately takes on the burden of their construction and maintenance. This kind of neo-enclosure of public resources is then another kind of invisible subsidy.

                                    • B bluemonday1984@awful.systems

                                      lurker@awful.systems
                                      #80

                                      this article that tries to argue that we have managed to achieve AGI already is a real hoot

                                      • B bluemonday1984@awful.systems

                                        nfultz@awful.systems
                                        #81

                                        ai;dr https://futurism.com/artificial-intelligence/aidr-meaning

                                        • L lurker@awful.systems

                                          self@awful.systems
                                          #82

                                          In March 2025, the large language model (LLM) GPT-4.5, developed by OpenAI in San Francisco, California, was judged by humans in a Turing test to be human 73% of the time — more often than actual humans were. Moreover, readers even preferred literary texts generated by LLMs over those written by human experts.

                                          do you know how hard it is to write something that aged poorly months before it was written? it’s in the public consciousness that LLMs write like absolute shit in ways that are very easy to pick out once you’ve been forced to read a bunch of LLM-extruded text. inb4 some asshole with AI psychosis pulls out “technically ChatGPT’s more human than you are, look at the statistics” regarding the 73% figure I guess. but you know when statistics don’t count!

                                          A March 2025 survey by the Association for the Advancement of Artificial Intelligence in Washington DC found that 76% of leading researchers thought that scaling up current AI approaches would be ‘unlikely’ or ‘very unlikely’ to yield AGI

                                          […] What explains this disconnect? We suggest that the problem is part conceptual, because definitions of AGI are ambiguous and inconsistent; part emotional, because AGI raises fear of displacement and disruption; and part practical, as the term is entangled with commercial interests that can distort assessments.

                                          no you see it’s the leading researchers that are wrong. why are you being so emotional over AGI. we surveyed Some Assholes and they were pretty sure GPT was a human and you were a bot so… so there!

