Gamers Are Reportedly Skipping GPU Upgrades Due to Soaring Prices — Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090
-
Not surprised. Many of these high-end GPUs are bought not for gaming but for bitcoin mining, and demand has driven prices beyond MSRP in some cases. Stupidly power hungry and overpriced.
My GPU, an RTX 2060, is getting a little long in the tooth. I’ll hand it off to one of the kids for their PC, but I need to find something that’s a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of that 5060, so I might look at AMD or Intel next time around. Probably need to replace my PSU while I’m at it.
My kid got the 2060, and I bought an RX 6400; I don’t need the hairy arms any more.
Then again, I have become old and grumpy, playing old games.
-
Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…
It’s funny, because often they aren’t prettier. Well-optimized, well-made games from 5 or even 10 years ago often look on par with or better than the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and a few others), and the disk size is still 10x what it was. They are just unrefined and unoptimized, and they try to use computationally expensive filters, lighting, sharpening, and antialiasing to make up for the mediocre quality.
-
Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…
The majority, sure, but there are some gems.
Baldur’s Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, … for example.
You can always wait a couple of years before playing them, but saying they didn’t miss anything is a gross understatement.
-
Yes, it’s pointless; the most noticeable thing is the low frame rate and the lowered graphics needed to make the game playable. High FPS is more noticeable and useful. Blind tests confirmed that, even the one LTT did.
You could argue 2K is solid, but even then the PPI is already so dense that it doesn’t really matter.
Edit: then again, there is some research showing people perceive FPS and PPI differently, so it may be that 4K makes sense for some, while for others it’s really an overpriced 2K that no PC can run.
Not arguing FPS here lol. Arguing 4K, which you can run at 144 Hz in a lot of games even without a 5090. You never mentioned whether you had tried 4K, which I assume you haven’t, based on the switch to FPS instead of resolution.
-
GTX 1060 6GB still going strong!
Runs FFXIV at 1440p.
Runs HL Alyx on my Rift.
Runs everything prior to this gen.
If I need to run a more modern game, I’ll use my PS5.
-
So is AMD, with their availability of literally three video cards in stock for all of North America at launch. Which in turn just fuels the scalpers. Downvote this all you want, guys; AMD is just as complicit in all of this, and they’ve fuelled this bullshit just as much.
Nvidia is singlehandedly responsible for killing all competition but AMD. They destroyed every other GPU company with the nastiest tactics to dominate the market; only AMD has been able to survive. You can’t blame AMD for chip shortages; that’s the aftershock of the COVID pandemic. Never has there been higher demand for chips, especially thanks to the rising EV market.
You can’t say AMD is as bad as Nvidia, since Nvidia is the sole reason the market got ruined in the first place. They are the worst of the worst.
And don’t forget diaper Donny, who destroyed international trade with his fucking tariff wars.
-
I don’t mean to embarrass you, but you were also supposed to say “AI!”
Points with a finger and laughs
Look at that loser not using AI
-
But with our new system we can make up 10x as many fake frames to cram between your real ones, giving you 2500 FPS! Isn’t that awesome???
Bullshitted pixels per second seem to be the new currency.
It may look smooth in videos, but 30 fps upframed(?) to 120 fps will still feel like a 30 fps game.
Modern TVs do the same shit, and it both looks and feels like ass. And not good ass.
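Rough back-of-the-envelope numbers on why (a minimal sketch with illustrative figures, assuming the game only samples input on real frames, which is generally how frame generation works):

```python
# Why interpolated 120 fps can still feel like 30 fps (illustrative numbers).
# Assumption: input only affects the game state when a *real* frame is
# rendered; generated frames are display-only filler.

real_fps = 30          # frames the game actually renders
displayed_fps = 120    # frames shown after interpolation (3 generated per real frame)

real_frame_ms = 1000 / real_fps          # ~33.3 ms between real frames
display_frame_ms = 1000 / displayed_fps  # ~8.3 ms between displayed frames

# Motion looks smoother: the screen updates every ~8.3 ms...
print(f"display cadence: {display_frame_ms:.1f} ms")

# ...but your input is still only reflected every ~33.3 ms (or a bit later,
# since the interpolator has to buffer at least one real frame to blend with),
# so responsiveness stays in 30 fps territory.
print(f"input-to-photon cadence: {real_frame_ms:.1f} ms or worse")
```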
-
My kid got the 2060, and I bought an RX 6400; I don’t need the hairy arms any more.
Then again, I have become old and grumpy, playing old games.
Hell, I’m still rocking a GTX 950. It runs Left 4 Dead 2 and Team Fortress 2; what more do I need?
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I am still on my GTX 1060 3 GB, probably worth about $50 at this point lol
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I remember when high-end GPUs were around €500.
-
unless they’re rich or actually need it for something important
Fucking youtubers and crypto miners.
Crypto mining with GPUs is dead; the only relevant mining uses ASICs now, so it would be more accurate to say:
Fucking youtubers and AI.
-
I am still on my GTX 1060 3 GB, probably worth about $50 at this point lol
I ran VR on one of those. Not well, but well enough.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few % better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and cables stop melting.
-
Uhhh, I went from a Radeon 1090 (or whatever they’re called, it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It’s normal not to buy a GPU every year.
As long as you make an upgrade that’s equivalent or better than the current console generation, you’re then basically good-to-go until the next generation of consoles comes.
-
As long as you make an upgrade that’s equivalent or better than the current console generation, you’re then basically good-to-go until the next generation of consoles comes.
I don’t really care whether my current graphics are better or worse than the current console generation; it was just an illustration comparing PC gaming to console gaming.
-
I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few % better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and cables stop melting.
I don’t think the 5090 has averaged $4K in months in terms of sale price. $4K was basically March. $3K is pretty common now as a listed scalp price, and completed sales on fleabay seem to be $2,600–2,800 now.
The problem is that $2K was too much to begin with, though. It should be cheaper, but they are selling ML cards at such a markup, with truly endless demand currently, that there’s zero reason to put any focus on the gaming segment beyond a token offering that raises their margins, so business-wise they are doing great, I guess?
As a 9070 XT and 6800 XT owner, it feels like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia’s artificial monopoly on software compatibility is a stranglehold that makes the hardware itself almost irrelevant.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I just looked up the price, and all I could say was “Yikes!”. You can get a PS5 Pro plus the optional Blu-ray drive, a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.
-
I play at 1080p so I can’t comment on 4K, but I can confirm FPS doesn’t seem to affect me past 30 fps. I don’t perceive a noticeable difference between 30, 60, and 120 fps. Haven’t played higher than that. I suspect 4K would probably look better to me than higher FPS, though. But I’m happy with 30–60 fps and 1080p, so…
I went from 1080p 144 Hz to a 2K 100 Hz ultrawide. I stopped noticing the increased framerate pretty quickly, as the “mouse so smooth” effect wears off fast. But the huge ultrawide FOV is a massive plus. I don’t notice the resolution increase at all beyond lower frame rates and more text on screen in docs.
On laptops, 4K is just 1080p with extra battery drain and worse performance.
-
I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few % better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and cables stop melting.
fake frames
And that’s my main problem with what the industry has become. Nvidia always had sizable generation-to-generation jumps in raw performance. They STILL get better raw performance, but now it’s nowhere near impressive enough, and they have to pad their graphs with their fake-frame technologies. Don’t get me wrong, they always had questionable marketing tactics, but now it’s getting even worse.
No idea when I’m replacing my 3060 Ti, but it won’t be Nvidia.