Larian's head writer has a simple answer for whether AI-generated text helps development: 'It doesn't,' thanks to its best output being 'a 3/10 at best,' worse than his worst drafts
-
I like the way Ted Chiang puts it:
Some might say that the output of large language models doesn’t look all that different from a human writer’s first draft, but, again, I think this is a superficial resemblance. Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly, and it is accompanied by your amorphous dissatisfaction, your awareness of the distance between what it says and what you want it to say. That’s what directs you during rewriting, and that’s one of the things lacking when you start with text generated by an A.I.
There’s nothing magical or mystical about writing, but it involves more than placing an existing document on an unreliable photocopier and pressing the Print button.
I think our materialist culture forgets that minds exist. The output from writing something is not just “the thing you wrote”, but also your thoughts about the thing you wrote.
Your first draft isn’t an unoriginal idea expressed clearly; it’s an original idea expressed poorly
I like this a lot. I’m going to thieve it.
-
Nah, can’t agree. I had postponed a few ideas for years, then was able to vibe them out in a week of evenings, and now I have something usable. About 70% of it was vibed; I just had to fix the stupid stuff, which was partly down to my own prompts.
That’s the difference between an amateur writer and a professional writer.
-
The dialog pushing AI media seems to start from this assumption that I consume media just to have colors and words and sounds enter my face holes. In fact, I consume art and media because I like hearing, seeing, and reading about how other humans experience the same world I do. It’s a form of communication. I like the product but also the process of people trying to capture the bonkers, ineffable experience we all seem to be sharing in ways I would never think of, but can instantly verify.
What’s funny is, due to the nature of media, it’s kind of impossible not to communicate something, even if the artwork itself is empty. When I see AI media I see the communication of a mind that doesn’t know or give a shit about any of this. So in their attempt to make filler, they are in fact making art about how inarticulate they are. It’s unintentional, corporate Dadaism.
Yes, this is it right here. The whole point of art is communication and connection with another human being.
-
It would probably just have been less dialogue
They already said less shit.
-
Larian's head writer has a simple answer for whether AI-generated text helps development: 'It doesn't,' thanks to its best output being 'a 3/10 at best,' worse than his worst drafts
My worst drafts are a 5/10 but I might have lower standards.
PC Gamer (www.pcgamer.com)
I can see how it could be useful, or even mandatory, in future RPGs. It can generate a framework for a real writer, with extremely large amounts of logical branching, a billion times faster. Then you go over the top of it and use the framework as concepts to adopt or revise. This streamlines the process, unifies the creative vision, and lets a game so large it would have taken a team 10 years, or never been made at all, get done in 2 without procedural generation.
-
Starfield is estimated to have sold 3 million copies; Baldur’s Gate 3, 15 million. Microsoft/Bethesda’s marketing budget makes a difference, but not being garbage makes a much bigger one.
-
I can see how it could be useful, or even mandatory, in future RPGs. It can generate a framework for a real writer, with extremely large amounts of logical branching, a billion times faster. Then you go over the top of it and use the framework as concepts to adopt or revise. This streamlines the process, unifies the creative vision, and lets a game so large it would have taken a team 10 years, or never been made at all, get done in 2 without procedural generation.
This is based on the assumption that the AI output is any good, but the actual game devs and writers are saying otherwise.
If the game is too big for writers to finish on their own, they’re not going to have time to read and fix everything wrong with the AI output either. This is how you get an empty, soulless game, not Baldur’s Gate 3.
-
I love how you idiots think this tech hasn’t already hit its ceiling. It’s been functionally stagnant for some time now.
Hey. I’m just one idiot. Who else are you talking about?
-
That means in about 6 months or so the AI content quality will be about an 8/10. The processors spread machine “learning” incredibly fast. Some might even say exponentially fast. Pretty soon it’ll be like that old song “If you wonder why your letters never get a reply, when you tell me that you love me, I want to see you write it”. “Letters” is an old version of one-on-one tweeting, but with no character limit.
What improvements have there been in the previous 6 months? From what I’ve seen, the AI is still spewing the same 3/10 slop it has since 2021, with maybe one or two improvements bringing it up from 2/10. I’ve heard several people say some newer/bigger models actually got worse at certain tasks, and the supply of clean training data has pretty much dried up, so there isn’t much left to train new models on.
I just don’t see any world where scaling up the compute and power usage is going to suddenly improve the quality by orders of magnitude. By design, LLMs output the most statistically likely response, which almost by definition will be the most average, bland response possible.
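A minimal Python sketch of that point, using a made-up toy next-token distribution (not any real model’s API): greedy decoding always returns the statistical mode, so the safe, average continuation wins every single time.

```python
# Toy illustration: a language model assigns a probability to each
# candidate next token; greedy decoding always picks the most probable one.
# The numbers and tokens below are invented for the example.

toy_next_token_probs = {
    "said": 0.40,      # the safe, average continuation
    "whispered": 0.25,
    "bellowed": 0.20,
    "ululated": 0.15,  # the interesting outlier almost never wins
}

def greedy_pick(probs):
    """Return the highest-probability token, i.e. the statistical mode."""
    return max(probs, key=probs.get)

# Greedy decoding is deterministic: the blandest option wins every time.
print(greedy_pick(toy_next_token_probs))  # -> said
```

Sampling with temperature can surface the outliers sometimes, but the objective being optimized is still "most plausible continuation", which is the commenter’s point.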
-
Hey. I’m just one idiot. Who else are you talking about?
You’re not just one, you’re one of many. All saying the same shit.
-
You’re not just one, you’re one of many. All saying the same shit.
Lots and lots of people have told me that.
-
Lots and lots of people have told me that.
That tracks.
-
This is based on the assumption that the AI output is any good, but the actual game devs and writers are saying otherwise.
If the game is too big for writers to finish on their own, they’re not going to have time to read and fix everything wrong with the AI output either. This is how you get an empty, soulless game, not Baldur’s Gate 3.
It’s assuming the AI output isn’t very good. It assumes it can create a framework that still needs the actual writers, but now they don’t have to come up with 100% of the framework; instead they can work on the actual content. Storyboarding and frameworking are a hodgepodge of nonsense with humans anyway. The goal is to achieve non-linear scaling, not to replace quality writers or have the final product be AI-written.
-
It’s assuming the AI output isn’t very good. It assumes it can create a framework that still needs the actual writers, but now they don’t have to come up with 100% of the framework; instead they can work on the actual content. Storyboarding and frameworking are a hodgepodge of nonsense with humans anyway. The goal is to achieve non-linear scaling, not to replace quality writers or have the final product be AI-written.
This sounds like it takes away a huge amount of creative freedom from the writers if the AI is specifying the framework. It’d be like letting the AI write the plot, but then having real writers fill in details along the way, which sounds like a good way to have the story go nowhere interesting.
I’m not a writer, but if I was to apply this strategy to programming, which I am familiar with, it’d be like letting the AI decide what all the features are, and then I’d have to go and build them. Considering more than half my job is stuff other than actually writing code, this seems overly reductive, and underestimates how much human experience matters in deciding a framework and direction.
-
Well yeah, I’m not going to argue that a well made game that respects the player isn’t going to do well. But that doesn’t matter to the publishers and their shareholders when they can pump out AI slop garbage year after year and still have people that drink it up. Just look at the yearly shooters and sports games, they sell enough.
Besides, what happens when this sort of slop has been normalised? Look at the mobile market, no one bats an eye at the intensely predatory microtransactions, and you’ll even find people defending things like gacha games.
There was a time when people scoffed at the notion of paying ~$2 for some shitty cosmetics, but now people don’t even blink at the idea. Hell, it’s downright cheap in some cases. The AAA industry just has to slop things up for long enough for people to stop caring, because they will stop caring, and then continue to shell out for the dubious privilege of guzzling their mediocre, uninspiring garbage.
-
Yes, this is it right here. The whole point of art is communication and connection with another human being.
Not even necessarily a human being! I’d appreciate the fuck out of art if any species made it. But there must be more than uncaring, unfeeling, probabilistic interpretation of input data.
-
Larian's head writer has a simple answer for whether AI-generated text helps development: 'It doesn't,' thanks to its best output being 'a 3/10 at best,' worse than his worst drafts
My worst drafts are a 5/10 but I might have lower standards.
PC Gamer (www.pcgamer.com)
I think the reason so many AI bros are conservative is that conservatives have historically had really bad taste in art/media, so they see the drivel AI creates and think, “oh wow, it looks just like what the artists make,” not realizing that they don’t have the eye to see what it’s missing.
-
Writer who has toyed with AI here: yeah, AI writing sucks. It is consensus-driven and bland, never ventures into unexpected territory, and completely fails to understand human nature.
So we’d better stop calling AI “intelligence”. It’s text-prediction machine learning on steroids, nothing more, and the fact that we’re still calling it “intelligence” shows how gullible we all are.
It’s just another speculative bubble from the tech bros, as crypto was, except this time the tech bros have come out as nazis.
I remember reading a longer post on Lemmy. The person was describing their slow realization that the political beliefs they were raised with were leading down a dark path. It was a process that took many years, and the story was full of little moments where cracks in their worldview widened and the seed of doubt grew.
And someone who was bored/overwhelmed by having to read a post longer than three sentences fed the story into AI to make a short summary. They then posted that summary as a “fixed your post, bro” moment. So basically all the humanity was removed. Reminds me of that famous “If the Gettysburg Address were a PowerPoint”: https://norvig.com/Gettysburg/
-
Sure, but human-written shit still had that human touch. It could be unintentionally funny, it could be a mixed bag that reaches unexpected heights at times. AI writing is just the bland kind of bad, not the interesting kind of bad.
All your base are belong to us.
-
All your base are belong to us.
I should have been the one to fill your dark soul with liiiiight!