Gamers Are Reportedly Skipping GPU Upgrades Due to Soaring Prices — Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090
-
The good games don’t need a high-end GPU.
Problem is, preordering has been normalized, as has releasing games in a pre-alpha state.
-
I mean, as written the headline statement is always true.
I am horrified by some of the other takeaways, though:
Nearly 3 in 4 gamers (73%) would choose NVIDIA if all GPU brands performed equally. 57% of gamers have been blocked from buying a GPU due to price hikes or scalping, and 43% have delayed or canceled purchases due to other life expenses like rent and bills. Over 1 in 4 gamers (25%) say $500 is their maximum budget for a GPU today. Nearly 2 in 3 gamers (62%) would switch to cloud gaming full-time if latency were eliminated, and 42% would skip future GPU upgrades entirely if AI upscaling or cloud services met their performance needs.
That last one is especially horrifying. You don’t own games when you cloud game; you simply lease them. We all know what that’s done for the preservation of games. Not to mention it encourages the massive amounts of shovelware we get flooded with.
-
Still on a 1060 over here.
Sure, I may have to limit FFXIV to 30fps in summer to stop it crashing, but it still runs.
Hey, I’m also on a 1060 still! Admittedly I hardly game anymore, although I am considering another Skyrim playthrough.
-
You don’t own games when you cloud game, you simply lease them.
That’s also how it is with a game you purchased to play on your own PC, though. Unless you have it on physical media, your access could be revoked at any time.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090
Yeah no shit, what a weird fucking take
-
Fuck those guys too, honestly. AMD is fueling this bullshit just as much as Nvidia.
Nvidia is one of the most evil companies out there, responsible for killing nearly all other GPU producers and destroying the market.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I’m on a 2080 or 2090 (I forget which). I thought I’d upgrade to the 40xx now that 5090s are out. I looked at the prices and absolutely not. The 5090s are around 500k JPY, and ordering from the US would work out to about the same after exchange, tax, and whatever tariff exists this week. Salaries here are also much lower on average than in the West, even for those of us in software.
4070s are still around 100k, which is cheaper than the 250k-ish they were last time I looked.
Price aggregator site in Japan if you want to play around: https://kakaku.com/pc/videocard/itemlist.aspx?pdf_Spec103=500 On the left you can select the cards, and the prices are obvious on screen.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I downgraded from a GTX 1060 to a Ryzen 5000G iGPU; Terraria and Factorio don’t need much.
-
Those cards were like what though, $199?
That’s still not cheap when you account for inflation. Of course there’s a world of difference between “not cheap” and what they charge these days.
-
Saved up for a couple of years and built the best (consumer-grade) non-Nvidia PC I could: 9070 XT, 9950X3D, 64 GB of RAM. Pretty much top-end everything that isn’t Nvidia or just spamming redundant RAM for no reason. The whole thing still costs less than a single RTX 5090 and on average draws less power too.
Yep, that’s gonna be significantly more powerful than my planned build… and likely somewhere between $500 and $1000 more expensive… but yep, that is how absurd this is: all of that is still less expensive than an RTX 5090.
I’m guessing you could get all of that to work with a 750 W PSU, or 850 W if you also want a bunch of storage drives or a lot of cooling, but yeah, you’d only need that much for running ray tracing at 4K.
Does that sound about right?
Either way… yeah… imagine an alternate timeline where marketing and industry direction aren’t bullshit, where people actually admit things like:
Consoles cannot really do what they claim to do at 4K… at actual 4K.
They use checkerboard upscaling, so they’re basically running at 2K and scaling up. It’s actually less than 2K in demanding raytraced games, because they’re also using FSR or DLSS on top. Oh, and the base graphics settings are a mix of what PC gamers would call medium and high, but consoles don’t show real graphics settings menus, so console gamers don’t know that.
Maybe, just maybe, we could have focused on perfecting frames-per-watt and frames-per-dollar efficiency at 2K, instead of being baffled with marketing bs claiming we can just leapfrog to 4K, and more recently that 8K displays make any goddamned sense at all, when in 95% of home setups, of any kind, they offer no physically perceptible gains.
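For anyone curious about the pixel math behind the checkerboard claim, here’s a rough sketch. The internal resolutions are illustrative assumptions, not measurements of any specific console or game:

```python
# Rough sketch: fraction of native-4K pixels actually shaded per frame
# under different rendering schemes. Internal resolutions here are
# illustrative assumptions, not measured figures from any real title.

NATIVE_4K = 3840 * 2160  # ~8.3 million pixels

def rendered_fraction(width, height, checkerboard=False):
    """Fraction of native-4K pixels shaded per frame at a given internal
    resolution; checkerboard rendering shades only half of each frame."""
    pixels = width * height
    if checkerboard:
        pixels //= 2
    return pixels / NATIVE_4K

# Checkerboarded "4K": half the native pixel count per frame.
print(f"4K checkerboard: {rendered_fraction(3840, 2160, checkerboard=True):.0%}")
# A 1440p ("2K-ish") internal render scaled up:
print(f"1440p upscale:   {rendered_fraction(2560, 1440):.0%}")
# A demanding ray-traced game dropping to ~1080p internally:
print(f"1080p upscale:   {rendered_fraction(1920, 1080):.0%}")
```

So a checkerboarded “4K” frame shades about half the pixels of real 4K, and a 1080p internal render shades only a quarter, which is the gap the marketing glosses over.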
-
I’m guessing you could get all of that to work with a 750 W PSU, or 850 W if you also want a bunch of storage drives or a lot of cooling, but yeah, you’d only need that much for running ray tracing at 4K.
Does that sound about right?
1000 W PSU, for the theoretical maximum draw of all components at once with a good safety margin. But even when running a render, I’ve never seen it break 500 W.
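The back-of-the-envelope sizing for a build like that looks something like this. The component wattages are rough assumptions based on typical spec-sheet figures, not measurements of this particular machine:

```python
# Back-of-the-envelope PSU sizing for a 9070 XT + 9950X3D build.
# Component wattages below are rough assumptions from typical
# spec-sheet figures, not measurements of any specific system.

components_w = {
    "GPU (RX 9070 XT, board power)": 304,
    "CPU (9950X3D, peak package power)": 200,
    "Motherboard, RAM, fans": 60,
    "Storage drives": 20,
}

# Worst-case simultaneous draw of every component at once.
peak_draw = sum(components_w.values())

# Keep that peak under ~60% of the PSU's rating for efficiency headroom
# and transient power spikes.
recommended_psu = peak_draw / 0.6

print(f"Theoretical peak draw: {peak_draw} W")
print(f"PSU with headroom:     ~{recommended_psu:.0f} W")
```

With those numbers the peak lands under 600 W and the headroom rule points at roughly a 1000 W unit, which lines up with the build above; typical gaming loads sit well below the theoretical peak, hence never seeing 500 W in practice.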
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Rimworld doesn’t need a new gpu
-
“When did it just become expected that everybody would upgrade GPUs every year, and that’s supposed to be normal?” - that’s a really good question, because I don’t think normal PC gamers have ever been like that, and they still aren’t. It’s basically part of the culture to stretch your GPU as long as it lasts, so idk who you’re complaining about. Yeah, GPU prices are bullshit rn, but let’s not make stuff up.
Nah, there was a time when you’d get a new card every two years and it’d be twice as fast for the same price.
Nowadays the new cards are 10% faster for 15% more money.
I bought a new card last year after running a Vega 64 for ages and I honestly think it might last me ten years because things are only getting worse.
-
Rimworld doesn’t need a new gpu
What it needs is a new multi-threaded engine so I can actually use all these extra cores. XD
-
Fuck Nvidia anyways. #teamred
#teamred
Temu Nvidia is so much better, true. Please support the “underdog” billion dollar company.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
My new gpu was a steam deck.
-
I don’t think they’re actually expecting anyone to upgrade annually. But there’s always someone due for an upgrade, however long it’s been for them. You can compare what percentage of users upgraded this year to previous years.
I just finally upgraded from a 1080 Ti to a 5070 Ti. At high-refresh-rate 1440p, the 1080 Ti was definitely showing its age, and certain games would crash (even with no GPU overclock). Fortunately I was able to get a PNY 5070 Ti for only ~$60 over MSRP at the local Microcenter.
The 5000 series is a pretty shitty value across the board, but I got a new job (and a pay increase), so it was the right time for me to upgrade after 8 years.
-
#teamred
Temu Nvidia is so much better, true. Please support the “underdog” billion dollar company.
I support the lesser evil option, yes. It’s not like I have many other choices now, do I? Thanks to fucking Nvidia.
-
It doesn’t help that the gains have been smaller, and the prices higher.
I’ve got an RX 6800 I bought in 2020, and nothing but the 5090 is a significant upgrade, and I’m sure as fuck not paying that kind of money for a video card.
I just picked up a used RX 6800 XT after doing some research and comparing prices.
The fact that a GPU this old can match or outperform most newer cards at a fraction of the price is insane, but I’m very happy with my purchase. Solid upgrade from my 1070 Ti.
-
I’m on a 2080 or 2090 (I forget which).
Don’t think they made a 2090. 2080 or 2080 Ti, I guess.