Stubsack: weekly thread for sneers not worth an entire post, week ending 1st February 2026
-
I’d say even the part where the article tries to formally state the theorem is not written well. Even so, it’s very clear how narrow the formal statement is. You can say that two agents agree on any statement that is common knowledge, but you have to be careful about exactly how you’re defining “agent”, “statement”, and “common knowledge”. If I actually wanted to prove a point with Aumann’s agreement theorem, I’d have to make sure my scenario fits in the mathematical framework. What is my state space? What are the information partitions of the state space that define each agent? Etc.
The rats never seem to do the legwork that’s necessary to apply a mathematical theorem. I doubt most of them even understand the formal statement of Aumann’s theorem. Yud is all about “shut up and multiply,” but has anyone ever seen him apply Bayes’s theorem and multiply two actual probabilities? All they seem to do is pull numbers out of their ass and fit superexponential curves to 6 data points because the superintelligent AI is definitely coming in 2027.
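For the record, actually applying Bayes’s theorem and multiplying two probabilities is a one-liner. A toy sketch (every number here is invented purely for illustration, not taken from anyone’s actual argument):

```python
# Toy Bayes update: P(H | E) = P(E | H) * P(H) / P(E).
# All numbers are invented for illustration.
prior = 0.01            # P(H): prior probability of the hypothesis
likelihood = 0.9        # P(E | H): probability of the evidence if H is true
false_positive = 0.05   # P(E | not H)

# Total probability of the evidence, P(E)
evidence = likelihood * prior + false_positive * (1 - prior)
# Posterior, P(H | E)
posterior = likelihood * prior / evidence
print(round(posterior, 3))
```

That’s the whole multiplication; no superexponential curve fitting required.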
the get smart quick scheme in its full glory
-
I know what it says and it’s commonly misused. Aumann’s Agreement says that if two people disagree on a conclusion then either they disagree on the reasoning or the premises. It’s trivial in formal logic, but hard to prove in Bayesian game theory, so of course the Bayesians treat it as some grand insight rather than a basic fact. That said, I don’t know what that LW post is talking about and I don’t want to think about it, which means that I might disagree with people about the conclusion of that post~
if two people disagree on a conclusion then either they disagree on the reasoning or the premises.
I don’t think that’s an accurate summary. In Aumann’s agreement theorem, the different agents share a common prior distribution but are given access to different sources of information about the random quantity under examination. The surprising part is that they agree on the posterior probability provided that their conclusions (not their sources) are common knowledge.
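To make the framework concrete, here is a toy instance of the setup the theorem actually requires: a finite state space, a common prior, and a partition per agent describing which states that agent can distinguish. Everything below (states, prior, partitions, event) is invented for illustration; it shows the pre-agreement situation, where the two agents condition on different information and reach different posteriors for the same event.

```python
# Toy Aumann setup: common prior, different information partitions.
# All states, probabilities, and partitions are invented for illustration.
from fractions import Fraction

states = ["w1", "w2", "w3", "w4"]
prior = {w: Fraction(1, 4) for w in states}   # common prior, uniform
event = {"w1", "w2"}                          # the event both agents assess

# Each agent's partition: cells of states that agent cannot tell apart.
partition_a = [{"w1", "w2"}, {"w3", "w4"}]
partition_b = [{"w1", "w3"}, {"w2", "w4"}]

def posterior(partition, true_state):
    """Agent's posterior for `event`, conditioning on the cell containing true_state."""
    cell = next(c for c in partition if true_state in c)
    p_cell = sum(prior[w] for w in cell)
    return sum(prior[w] for w in cell & event) / p_cell

# At true state w1, the agents condition on different cells and disagree:
# agent A's cell is {w1, w2} (posterior 1), agent B's is {w1, w3} (posterior 1/2).
print(posterior(partition_a, "w1"), posterior(partition_b, "w1"))
```

The theorem’s content is about what happens after these posteriors become common knowledge, which requires exactly this kind of explicitly specified state space and partition structure, not a vibe.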
-
Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.
Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
The post Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this. What a year, huh?)
LWer: Heritage Foundation has some good ideas but they’re not enough into eugenics for my taste
This is completely opposed to the Nietzschean worldview, which looks toward the next stage in human evolution, the Overman. The conservative demands the freezing of evolution and progress, the sacralization of the peasant in his state of nature, pregnancy, nursing, throwing up. “Perfection” the conservative puts in scare quotes, he wants the whole concept to disappear, replaced by a universal equality that won’t deem anyone inferior. Perhaps it’s because he fears a society looking toward the future will leave him behind. Or perhaps it’s because he had been taught his Christian morality requires him to identify with the weak, for, as Jesus said, “blessed are the meek for they shall inherit the earth.” In his glorification of the “natural ecology of the family,” the conservative fails even by his own logic, as in the state of nature, parents allow sick offspring to die to save resources for the healthy. This was the case in the animal kingdom and among our peasant ancestors.
Some young, BASED Rightists like eugenics, and think the only reason conservatives don’t is that liberals brainwashed them that it’s evil. As more and more taboos erode, yet the one against eugenics remains, it becomes clear that dysgenics is not incidental to conservatism, but driven by the ideology itself, its neuroticism about the human body and hatred of the superior.
-
they prompted so hard and that’s all they get, so obviously there’s nothing better and they stop at that
they’ll do anything except actually learn shit or put in effort
All roads lead down from the local maximum
-
i actually find the content pretty amusing, since it amounts to “have you guys tried using words correctly every once in a while?”
the comments also do not get it at all. The guy going “don’t say you have 99% credence in something¹, for most people saying that it’s a virtual certainty is the same thing” these people cannot speak as a non-cultmember for even a second
¹yeah don’t, nobody says that
-
the overman¹
¹better known by its German name
-
the conservative… wants… a universal equality that won’t deem anyone inferior.
perhaps it’s because he had been taught his Christian morality requires him to identify with the weak
Which conservatives are these. This is just a libertarian fantasy, isn’t it.
-
the overman¹
¹better known by its German name
Technically superman is a more correct translation for that word (similarly to how superscript is the thing beyond the script)
-
Signaling in the Age of AI: Evidence from Cover Letters
Abstract We study the impact of generative AI on labor market signaling using the introduction of an AI-powered cover letter writing tool on a large online labor platform. Our data track both access to the tool and usage at the application level. Difference-in-differences estimates show that access to the tool increased textual alignment between cover letters and job posts and raised callback rates. Time spent editing AI-generated cover letter drafts is positively correlated with hiring success. After the tool’s introduction, the correlation between cover letters’ textual alignment and callbacks fell by 51%, consistent with what theory predicts if the AI technology reduces the signal content of cover letters. In response, employers shifted toward alternative signals, including workers’ prior work histories.
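For anyone unfamiliar with the method the abstract leans on: in its simplest 2×2 form, a difference-in-differences estimate is just a subtraction of group-mean changes. The numbers below are invented and bear no relation to the paper’s actual data:

```python
# Minimal 2x2 difference-in-differences on group-mean callback rates.
# All numbers are invented for illustration; the paper's estimation is far richer.
callback_rate = {
    ("treated", "before"): 0.10,  # workers with tool access, pre-launch
    ("treated", "after"):  0.14,
    ("control", "before"): 0.10,  # workers without access
    ("control", "after"):  0.11,
}

# (change in treated group) minus (change in control group)
did = ((callback_rate[("treated", "after")] - callback_rate[("treated", "before")])
       - (callback_rate[("control", "after")] - callback_rate[("control", "before")]))
print(round(did, 3))  # estimated treatment effect on callback rate
```

The control-group change nets out whatever was happening to callbacks anyway, under the usual parallel-trends assumption.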
-
Is Pee Stored in the Balls? Vibe Coding Science with OpenAI’s Prism
Carl T. Bergstrom (@carlbergstrom.com)
So initial experiments with OpenAI's vibe-coding science tool Prism are going about as well as expected.
“Pee-er review” :kelly:
-
Amazon’s latest round of 16k layoffs for AWS was called “Project Dawn” internally, and the public line is that the layoffs are because of increased AI use. AI has become useful, but as a way to conceal business failure. They’re not cutting jobs because their financials are in the shitter, oh no, it’s because they’re just too amazing at being efficient. So efficient they sent the corporate fake condolences email before informing the people they’re firing, referencing a blog post they hadn’t yet published.
It’s Schrödinger’s Success. You can neither prove nor disprove the effects of AI on the decision, or whether the layoffs indicate good management or fundamental mismanagement. And the media buys into it with headlines like “Amazon axes 16,000 jobs as it pushes AI and efficiency” that are distinctly ambiguous on how 16k people could possibly have been redundant at a tech company that’s supposed to be a beacon of automation.
They’re not cutting jobs because their financials are in the shitter
Their financials are not even in the shitter! except insofar as their increased AI capex isn’t delivering returns, so they need to massage the balance sheet by doing rolling layoffs to stop the feral hogs from clamoring and stampeding on the next quarterly earnings call.
-
the conservative… wants… a universal equality that won’t deem anyone inferior.
perhaps it’s because he had been taught his Christian morality requires him to identify with the weak
Which conservatives are these. This is just a libertarian fantasy, isn’t it.
I had to do a triple take on that “won’t deem anyone inferior” like what the fuck are you talking about. The core of conservatism is the belief in rigid hierarchies! Hierarchies have superiors and inferiors by definition!
-
They’re not cutting jobs because their financials are in the shitter
Their financials are not even in the shitter! except insofar as their increased AI capex isn’t delivering returns, so they need to massage the balance sheet by doing rolling layoffs to stop the feral hogs from clamoring and stampeding on the next quarterly earnings call.
In retrospect the word quarterlies is what I should have chosen for accuracy, but I’m glad I didn’t purely because I wouldn’t have then had your vivid hog simile.
-
I had to do a triple take on that “won’t deem anyone inferior” like what the fuck are you talking about. The core of conservatism is the belief in rigid hierarchies! Hierarchies have superiors and inferiors by definition!
That depends on if you consider the “inferior” to be human, if they’re even still alive after the eugenics part.
-
New blogpost from Drew DeVault, titled “The cults of TDD and GenAI”. As the title suggests, it’s drawing comparisons between how people go all-in on TDD (test-driven development) and how people go all-in on slop machines.
It’s another post in the genre of “why did tech fall for AI so hard” that I’ve seen cropping up, in the same vein as mhoye’s Mastodon thread and Iris Meredith’s “The problem is culture”.
-
OT: the respiratory illness I’ve had for five days just tested positive for Covid for the first time.
My symptoms are fairly mild, probably because I got a booster shot three months ago. But I’m trying to learn more about these recent “swallowing razors” variants and dang! the online situation is bad. Finding reliable medical information on the post-slop, post-Trump Internet is a nightmare.
-
Just seen a clip of aronofsky’s genai revolutionary war thing and it is incredibly bad. Just… every detail is shit. Ways in which I hadn’t previously imagined that the uncanny valley would intrude. Even if it weren’t for the simulated flesh golems, one of whom seems to be wearing anthony hopkins’ skin as a clumsy disguise, the framing and pacing just feels like the model was trained on endless adverts and corporate talking-head videos, and either it was impossible to edit, or none of the crew have any idea what even mediocre films look like.
I also hadn’t appreciated before that genai lip sync/dubbing was just embarrassing. I think I’ve only seen a couple of very short genai video clips before, and the most recent at least 6 months ago, but this just seems straight up broken. Have the people funding this stuff ever looked at what is being generated?
AmericanTruckSongs10 (@ethangach.bsky.social)
You don't have to be an avowed AI hater to be impressed by how lifeless and unwatchable this is.
-
Finally read Greg Egan - Permutation City, and looked at the LW discussion on the book, and I feel like they missed a lot of things (and why are they talking about the ‘dust theory’ like it is real). Oof, the wikipedia page on the themes and settings has a similar problem however. It is like they all ignored the second half of the book and the themes contained within it, not one mention of the city being called Elysium for example.
And of course Zack_M_Davis thinks the plotline of the guy who killed the drug dealer (not a sex worker/prostitute lw people, please the text makes this pretty clear) he was in a lowkey romantic relationship with should have been cut (also not mentioned on the wikipedia page iirc).
Also, nobody seems to talk about the broken sewage pipe.
-
Mateusz Fafinski (@calthalas.bsky.social):
Happy to see that there is no need to worry about the historical accuracy of new 1776 AI slop because it happens in the mystical land of Λamereedd.
-
Amazon Found ‘High Volume’ Of Child Sex Abuse Material in AI Training Data
The tech giant reported hundreds of thousands of cases of suspected child sexual abuse material, but won’t say where it came from
I’ll bet.