Gamers Are Reportedly Skipping GPU Upgrades Due to Soaring Prices — Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090
-
When did it just become expected that everybody would upgrade GPUs every year, like that’s supposed to be normal? I never understood people upgrading phones every year either. Both are high-cost purchases with minimal gains between years. You really need 3+ years for any meaningful gains, especially over the last few years.
“When did it just become expected that everybody would upgrade GPUs every year, like that’s supposed to be normal?” - that’s a really good question, because I don’t think normal PC gamers have ever been like that, and they still aren’t. It’s basically part of the culture to stretch your GPU as far as it will go, so idk who you’re complaining about. Yeah, GPU prices are bullshit rn, but let’s not make stuff up.
-
And still have your house burn down because it’s just a 2080 with 9.8 jiggawatts pushed into it.
There isn’t a single reason to get any of the 5 series imo; they don’t offer anything. And I say that as a 3D artist for games.
Edit: never mind, I remember some idiots got roped into 4k gaming and are now paying the price, just like the marketing wanted.
What’s wrong with 4k gaming? Just curious
-
Colour me surprised
Resumes gaming with a 1000-series card
Still rocking an EVGA 980 here.
-
What’s wrong with 4k gaming? Just curious
Somehow 4k resolution got a bad rep in the computing world, with people opposing it for both play and productivity.
“You can’t see the difference at 50cm away!” or something like that. Must be bad eyesight I guess.
-
Yeah, my 2080 Ti can run everything sans ray-traced stuff perfectly, though I also haven’t had any issues with Indiana Jones or Doom: The Dark Ages.
Akschually, Doom DA requires raytracing enabled at all times, and your card is from the first Nvidia gen that has it. While the 10xx and 20xx series never showed much of a difference, and both are still okay for average gaming, this is the planned divide the card makers wanted. The “RTX ON” ad visuals were fancy at best (imho) while eating too many resources, and now there’s the first game that doesn’t function without it, pushing consumers to either upgrade their hardware or miss out on big hits. Not the first time it’s happened, but it gives a sense of why there was so much media noise around that technology in the beginning.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Fuck Nvidia anyways. #teamred
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Still on a 1060 over here.
Sure, I may have to limit FFXIV to 30fps in summer to stop it crashing, but it still runs.
-
unless they’re rich or actually need it for something important
Fucking youtubers and crypto miners.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.
(Lowest price I can find)
… That is a GPU with roughly the cost and power usage of an entire, quite high-end gaming PC from 5 years ago… or even just a reasonably high-end PC from right now.
…
The entire move to the realtime raytracing paradigm, which has let AAA game devs get very sloppy by not really bothering to optimize lighting or textures… which in turn has necessitated the invention of intelligent temporal frame upscaling and frame generation… the whole originally advertised point of all this was to make high-fidelity 4k gaming an affordable reality.
This reality is a farce.
…
Meanwhile, if you jump down to 1440p, well, I’ve got a future build plan sitting in a NewEgg wishlist right now.
RX 9070 (220 W) + Minisforum BD795i SE (mobo with a non-removable, high-end AMD laptop CPU whose performance is comparable to a 9900X at about half the wattage)… so far my pretax total for the whole build is under $1500, and, while I need to double- and triple-check this, I think the power math works out to a 650 W power supply being all you’d need… potentially with enough room to also add some extra internal HDDs, i.e. you’ve got leftover wattage headroom.
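A minimal back-of-envelope sketch of that power math, assuming typical spec-sheet figures (the CPU and platform-overhead numbers here are assumptions, not measurements):

```python
# Back-of-envelope PSU sizing for the build above.
# All inputs are assumed spec-sheet / rule-of-thumb values, not measurements.
gpu_w = 220    # RX 9070 total board power (the quoted figure)
cpu_w = 75     # assumed: the BD795i SE's mobile CPU at its top configurable TDP
rest_w = 60    # assumed: motherboard, RAM, SSD, fans, plus a couple of HDDs

peak_w = gpu_w + cpu_w + rest_w  # ~355 W

# Common rule of thumb: keep sustained load around 50-70% of the PSU
# rating for efficiency and transient headroom.
psu_w = 650
print(f"estimated peak draw: {peak_w} W ({peak_w / psu_w:.0%} of a {psu_w} W supply)")
```

By that rough estimate the build peaks around 355 W, so a 650 W unit sits near the 50-70% sweet spot with headroom left over for extra drives.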
If you want to go a bit over the $1500 mark, you could fit it all in a console-sized ITX case.
That is almost half the cost of the RTX 5090 alone, and it will get you over 90 fps at ultra settings in 1440p in almost all modern games, though you will have to futz around with intelligent upscaling and frame gen if you want realtime raytracing at similar framerates, and realistically you’ll probably have to wait another quarter or two for AMD driver support and FSR 4 to mature and get properly implemented in said games.
Or you could maybe swap in a 5070 (non-Ti; the Ti is $1000 more) Nvidia card, but seeing as I’m building a Linux gaming PC, you know, for the performance boost from not running Windows, AMD’s Mesa drivers are where you wanna be.
-
It doesn’t help that the gains have been smaller, and the prices higher.
I’ve got an RX 6800 I bought in 2020, and nothing short of the 5090 is a significant upgrade. I’m sure as fuck not paying that kind of money for a video card.
I’m in the same boat.
In general, there’s just no way I could ever justify buying an Nvidia card in terms of bang for the buck; it’s absolutely ridiculous.
I’ll fork over four digits for a gfx card when salaries go up by a digit as well.
-
When did it just become expected that everybody would upgrade GPUs every year, like that’s supposed to be normal? I never understood people upgrading phones every year either. Both are high-cost purchases with minimal gains between years. You really need 3+ years for any meaningful gains, especially over the last few years.
When did it just become expected that everybody would upgrade GPUs every year, like that’s supposed to be normal?
Somewhere around 1996 when the 3dfx Voodoo came out. Once a year was a relatively conservative upgrade schedule in the late 90s.
-
But then, the Nvidia xx90 series has never been for the average consumer, and I don’t know what gave you that idea.
-
In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.
(Lowest price I can find)
… That is a GPU with roughly the cost and power usage of an entire, quite high-end gaming PC from 5 years ago… or even just a reasonably high-end PC from right now.
…
The entire move to the realtime raytracing paradigm, which has let AAA game devs get very sloppy by not really bothering to optimize lighting or textures… which in turn has necessitated the invention of intelligent temporal frame upscaling and frame generation… the whole originally advertised point of all this was to make high-fidelity 4k gaming an affordable reality.
This reality is a farce.
…
Meanwhile, if you jump down to 1440p, well, I’ve got a future build plan sitting in a NewEgg wishlist right now.
RX 9070 (220 W) + Minisforum BD795i SE (mobo with a non-removable, high-end AMD laptop CPU whose performance is comparable to a 9900X at about half the wattage)… so far my pretax total for the whole build is under $1500, and, while I need to double- and triple-check this, I think the power math works out to a 650 W power supply being all you’d need… potentially with enough room to also add some extra internal HDDs, i.e. you’ve got leftover wattage headroom.
If you want to go a bit over the $1500 mark, you could fit it all in a console-sized ITX case.
That is almost half the cost of the RTX 5090 alone, and it will get you over 90 fps at ultra settings in 1440p in almost all modern games, though you will have to futz around with intelligent upscaling and frame gen if you want realtime raytracing at similar framerates, and realistically you’ll probably have to wait another quarter or two for AMD driver support and FSR 4 to mature and get properly implemented in said games.
Or you could maybe swap in a 5070 (non-Ti; the Ti is $1000 more) Nvidia card, but seeing as I’m building a Linux gaming PC, you know, for the performance boost from not running Windows, AMD’s Mesa drivers are where you wanna be.
Saved up for a couple of years and built the best (consumer-grade) non-Nvidia PC I could: 9070 XT, 9950X3D, 64 GB of RAM. Pretty much top-end everything, short of going Nvidia or just spamming redundant RAM for no reason. The whole thing still cost less than a single RTX 5090, and on average it draws less power too.
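A quick sanity check on that power comparison, as a sketch using commonly cited spec-sheet peaks (assumed figures, not measured draw):

```python
# Whole AMD build vs. a single RTX 5090, using assumed spec-sheet peaks.
build_w = {
    "RX 9070 XT (total board power)": 304,
    "Ryzen 9 9950X3D (TDP)": 170,
    "rest of system - board, RAM, drives, fans (assumed)": 75,
}
rtx_5090_w = 575  # Nvidia's stated total board power for the card alone

total_w = sum(build_w.values())
print(f"whole build ~{total_w} W peak vs. RTX 5090 alone at {rtx_5090_w} W")
```

Even charging the whole system’s worst case against it, the build lands around 549 W, under the 575 W the 5090 can draw by itself.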
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Still rocking my 3060 Ti since launch. Thinking of getting a 9060 XT.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
The PC industry has turned into a scuzzy hellscape for average Joes who just want decent options at realistic prices. They don’t even care about gaming anymore; it’s all about YouTube and BitcoinBruhzz now.
I’ve still got a pretty decent setup (5800X3D, 4070 Ti), but it’s the last stand for this guy, I’m afraid. Looking over the past decade or so, I’ve honestly had better gaming experiences on consoles for a fraction of the price of a PC build, mods and PC-master-race nonsense aside. Sure, you don’t need a subscription for online play on PC (I rarely play online), but you can barely get a processor for what a PS5 costs anymore, let alone a video card, which is upwards of a lot of people’s monthly take-home pay, the way things are going.
-
Fuck Nvidia anyways. #teamred
Fuck those guys too honestly. AMD is fueling this bullshit just as much as Nvidia.
-
When did it just become expected that everybody would upgrade GPUs every year, like that’s supposed to be normal?
Somewhere around 1996 when the 3dfx Voodoo came out. Once a year was a relatively conservative upgrade schedule in the late 90s.
Those cards were like what though, $199?
-
It doesn’t help that the gains have been smaller, and the prices higher.
I’ve got an RX 6800 I bought in 2020, and nothing short of the 5090 is a significant upgrade. I’m sure as fuck not paying that kind of money for a video card.
Not to mention the cards have gotten huge and you just about need a nuclear reactor to power them. Melting cables and all.
-
The good games don’t need a high-end GPU.
Terraria minimum specs: “don’t worry bro”
-
Somehow 4k resolution got a bad rep in the computing world, with people opposing it for both play and productivity.
“You can’t see the difference at 50cm away!” or something like that. Must be bad eyesight I guess.
It’s just kind of unnecessary. Gaming at 1440p on something the size of your average computer monitor, hell, even just good ol’ 1080p, is more than sufficient. I mean, from 1080p to 4k, sure, there’s a difference, but from 1440p it’s a lot harder to tell. Nobody cares about your mud-puddle reflections cranking along at 120 fps. At least not the normies.
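For what it’s worth, the standard arithmetic behind that debate, as a sketch; the 27-inch screen, the 60 cm viewing distance, and the ~60 pixels-per-degree acuity rule of thumb are all assumptions:

```python
import math

# Pixel counts: 1080p -> 4k is 4x the pixels, but 1440p -> 4k is only ~2.25x.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} megapixels")

# Pixels per degree on an assumed 27-inch 16:9 monitor viewed from 60 cm;
# ~60 PPD is a common rule-of-thumb limit for 20/20 vision.
diag_cm, dist_cm = 27 * 2.54, 60
for name, (w, h) in resolutions.items():
    width_cm = diag_cm * w / math.hypot(w, h)     # panel width from the diagonal
    fov_deg = 2 * math.degrees(math.atan(width_cm / (2 * dist_cm)))
    print(f"{name}: ~{w / fov_deg:.0f} pixels per degree")
```

Under those assumptions 1440p lands around 48 PPD and 4k around 73, straddling the ~60 PPD threshold, which is why how visible the jump is depends so heavily on screen size and viewing distance.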
Putting on my dinosaur hat for a second: I spent the first decade of my life gaming in 8/16-bit and 4-color CGA, and I’ve probably spent the last thirty years and god only knows how much money trying to replicate those experiences.