Gamers Are Reportedly Skipping GPU Upgrades Due to Soaring Prices — Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090
-
I play at 1080p so I can’t comment on 4K, but I can confirm fps doesn’t seem to affect me past 30. I don’t perceive a noticeable difference between 30, 60, and 120 fps; I haven’t played higher than that. I suspect 4K would probably look better to me than higher fps, though. But I’m happy with 30-60 fps at 1080p, so…
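For reference, the frame-time gap those numbers represent is easy to put in milliseconds (a quick back-of-the-envelope sketch in Python):

for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms

Going from 30 to 60 fps shaves roughly 17 ms off each frame, while 60 to 120 only shaves about 8 ms more, which is part of why the higher steps are harder to notice.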
I went from 1080p 144Hz to a 2K 100Hz ultrawide. I stopped noticing the increased framerate pretty quickly; the “mouse so smooth” effect wears off fast. But the ultrawide’s huge FOV is a massive plus. I don’t notice the resolution increase at all beyond lower framerates and more text on screen in docs.
On laptops, 4K is just 1080p with extra battery drain and worse performance.
-
I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few percent better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and the cables stop melting.
fake frames
And that’s my main problem with what the industry has become. Nvidia always had sizable generation-to-generation jumps in raw performance. They STILL get better raw performance, but now it’s nowhere near impressive enough, so they have to pad their graphs with their fake-frame technologies. Don’t get me wrong, they’ve always had questionable marketing tactics, but now it’s getting even worse.
No idea when I’m replacing my 3060 Ti, but it won’t be with Nvidia.
-
I don’t think the 5090 has averaged $4K in months. $4K was basically March. $3K is a common listed scalper price now, and completed sales on fleabay seem to land around $2,600-2,800.
The problem is that $2K was too much to begin with. It should be cheaper, but they’re selling ML cards at such a markup, with effectively endless demand, that there’s zero reason to put any focus on the gaming segment beyond a token offering that props up their margins. Business-wise they’re doing great, I guess?
As a 9070 XT and 6800 XT owner, it feels like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia’s artificial stranglehold on code compatibility makes the hardware itself almost irrelevant.
One issue is that everyone is supply-constrained by TSMC. Even Arc Battlemage is out of stock at MSRP.
I bet Intel is kicking themselves for using TSMC. It kinda made sense when they decided that years ago, but holy heck, they’d be swimming in market share if they’d used their own fabs instead (and kept the bigger die).
I feel like another issue is… marketing?
Like, many buyers just impulse-buy, or go with whatever some shill recommended in their feed. It doesn’t matter how competitive anything actually is anymore.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I’m ngl, finances had no impact on my decision to stay on my 3080; performance and support did. Everything I want to play runs at 60 to 180 fps with my current loadout. I’m also afraid that once Windows 10 LTSC dies I won’t be able to use a high-end GPU with Linux anyway.
-
I don’t think the 5090 has averaged $4K in months. $4K was basically March. $3K is a common listed scalper price now, and completed sales on fleabay seem to land around $2,600-2,800.
The problem is that $2K was too much to begin with. It should be cheaper, but they’re selling ML cards at such a markup, with effectively endless demand, that there’s zero reason to put any focus on the gaming segment beyond a token offering that props up their margins. Business-wise they’re doing great, I guess?
As a 9070 XT and 6800 XT owner, it feels like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia’s artificial stranglehold on code compatibility makes the hardware itself almost irrelevant.
Technically, Intel is also releasing some cheap GPUs with capability similar to Nvidia’s, but they’re all made by the same fabs anyway.
-
Not surprised. Many of these high-end GPUs are bought not for gaming but for bitcoin mining, and demand has driven prices beyond MSRP in some cases. Stupidly power-hungry and overpriced.
My GPU, an RTX 2060, is getting a little long in the tooth, and I’ll hand it off to one of the kids for their PC, but I need to find something that’s a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of the 5060, so I might look at AMD or Intel next time around. I’ll probably need to replace my PSU while I’m at it.
bitcoin mining
That’s a thing of the past; it isn’t profitable anymore unless you use ASIC miners. Some people still GPU-mine niche coins, but it’s nowhere near the scale it was during the Bitcoin and Ethereum craze a few years ago.
AI is driving up prices, or rather, it’s reducing availability, which then translates into higher prices.
Another thing is that board manufacturers, distributors, and retailers have figured out that they can jack up GPU prices above MSRP and enough suckers will still buy them. They’ll sell lower volume, but they’ll make more profit per unit.
-
I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it’ll be my rig for at least 10 years.
But it was expensive.
It seemed like horrible value at the time, but in hindsight a 4090 was not the worst investment, hah.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Food, not ROBLOX.
-
Unfortunately, gamers aren’t the real target audience for new GPUs; it’s AI bros. Even if nobody bought a 4090/5090 for gaming, they’d still always be out of stock, since LLM enthusiasts and small companies use them for AI.
The 5090 is kinda terrible for AI, actually. It’s too expensive. It only just got support in PyTorch, and if you look at “normie” AI bros trying to use them online, shit doesn’t work.
The 4090 is… mediocre, because it’s expensive for 24GB. The 3090 is basically the best AI card Nvidia ever made, and tinkerers just opt for banks of them.
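If you tinker locally, a quick way to see what your card offers and whether your installed PyTorch build even lists its architecture is something like this (a minimal sketch; assumes a CUDA build of torch is installed):

import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    vram_gib = props.total_memory / 1024**3    # 3090/4090 have ~24 GiB, 5090 has ~32 GiB
    arch = f"sm_{props.major}{props.minor}"    # compute capability, e.g. sm_86 for a 3090
    print(props.name, f"{vram_gib:.0f} GiB", arch)
    # get_arch_list() shows which architectures this torch build was compiled for;
    # if your card's arch isn't in it, kernels may fall back to PTX or fail outright.
    print("arch in this build:", arch in torch.cuda.get_arch_list())
else:
    print("No CUDA device visible to this PyTorch build.")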
Businesses tend to buy RTX Pro cards, rent cloud A100s/H100s or just use APIs.
The server cards DO eat up TSMC capacity, but insane 4090/5090 prices are mostly Nvidia’s (and AMD’s) fault for being anticompetitive.
-
The progress is just not there.
I got an RX 6800 XT for €400 in May 2023, when it was already almost a three-year-old card. Fast-forward to today: the RX 9060 XT 16GB costs more and is still slower in raster. The only things going for it are FSR4, a better encoder, and a bit better RT performance, which I couldn’t care less about.
bit better RT performance, which I couldn’t care less about.
Yeah, ray tracing is not really relevant on these cards; the performance hit is just too great.
The RX 9070 XT is the first AMD GPU where you can even consider turning it on.
-
I’m still surviving on my RX 580 4GB. It’s limping along these days, but there’s no way I can justify the price of a new GPU.
What about the used market? An Nvidia 1080/1080 Ti or an AMD 6000/7000-series card isn’t too bad.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I was gifted a 2080 Ti about a year or so ago and I have no intention of upgrading anytime soon. The former owner of the card is a friend who ran it in his primary gaming rig; back when SLI wasn’t dead, he had two of them.
So when he built a new main rig with a single 4090 a few years back, he gifted me one of them and left the other in his old system, which he now uses as a spare/guest computer for impromptu LANs. It’s still a decent system, so I don’t blame him.
In any case, that upgraded my primary computer from a 1060 3GB… so it was a welcome change to have sufficient video memory again.
The cards keep getting more and more power-hungry and I don’t see any benefit in upgrading… not that I can afford it… I haven’t been in school for a long time, and lately I barely have time to enjoy YouTube videos, never mind a full-assed game. I literally have to walk away from a game for so long between sessions that I forget the controls. So either I can beat the game in one sitting, or the controls have to be close enough to the defaults I’m used to (left click to fire, right click to ADS, WASD for movement, Ctrl or C to crouch, space to jump, E to interact, F for flashlight, etc.) that I don’t really need to relearn anything.
This is a big reason why I haven’t finished some titles I really wanted to, like TLoU or Doom Eternal… too many buttons to remember. It’s especially bad with Doom, since if you don’t remember how and when to use your specials, you’ll run out of life, armor, ammo, etc. pretty fast. Remembering which special gives what and how to trigger it… Uhhh… is it this button? Gets slaughtered by an imp… Okay, not that button. Reload, let’s try this… Killed by the same imp… not that either… Hmmm. Goes and looks at the key mapping… ohhhhhh. Okay. Reload, I’ve got it this time… Dies anyway, for other reasons.
Whelp. Quit maybe later.
-
I’m ngl, finances had no impact on my decision to stay on my 3080; performance and support did. Everything I want to play runs at 60 to 180 fps with my current loadout. I’m also afraid that once Windows 10 LTSC dies I won’t be able to use a high-end GPU with Linux anyway.
You can always side-grade to AMD. I was using a 3070 when I ditched Windows for Kubuntu, and while it was very usable, I’d get the slightest input lag and had to make sure the compositor (desktop effects) was turned off when playing a game.
After some research I decided to side-grade to a 6800, and it’s a night-and-day difference. Buttery-smooth gaming. It performs better with the compositor on than Nvidia did with it off. I know the 6800 isn’t high-end, but it’s no slouch either. AMD is king on Linux.
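If you’re stuck on the Nvidia + KWin combo for now, the compositor can be toggled per gaming session instead of left off permanently. A minimal sketch for X11 Plasma (the DBus path and method names can differ between Plasma versions, so treat this as a starting point):

qdbus org.kde.KWin /Compositor suspend   # turn desktop effects off before launching the game
qdbus org.kde.KWin /Compositor resume    # turn them back on afterwards

The default Alt+Shift+F12 shortcut does the same toggle without the terminal.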
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Ah capitalism…
Endless infinite growth forever on a fragile and very much finite planet where wages are suppressed and most money is intentionally funneled into the coffers of a small handful of people who are already so wealthy that their descendants 5 generations down the line will still be some of the richest people on the planet.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
When a new GPU was 500-900 USD, it was fine.
But yeah, the RTX 2070 keeps chugging on.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
The most I use my GPU for at this point is Minecraft shaders; I don’t plan on upgrading for 10+ years.
-
bit better RT performance, which I couldn’t care less about.
Yeah, ray tracing is not really relevant on these cards; the performance hit is just too great.
The RX 9070 XT is the first AMD GPU where you can even consider turning it on.
But I wouldn’t turn it on and actually play with it even if I could, because I’ll always take the better performance.
I’ve actually tried path tracing in CP2077 at max settings, running at native Steam Deck resolution and streamed from my PC to my Steam Deck OLED, and it held a locked 30 FPS fairly well (with a 20% overclock, though). But the game looks absolutely horrible in motion with its terrible LOD, so no amount of RT or PT can save it. It looks dope in screenshots, though. And that’s PT; plain RT is basically almost indistinguishable. PT is many, many years away from being viable for the majority of people.
(The game reports Windows 10, but it was actually Fedora 42.)
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I’m just using my GTX 1650 with 4GB of VRAM (GDDR6) and it works fine for most of the things I do. On Linux I can use the FSR hack to squeeze extra framerate out of games that perform poorly,
and it runs SCP:SL and TF2 fine.
For SCP:SL I’m using the FSR hack to squeeze out more framerate until Nvidia fixes VKD3D.
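In case anyone wants to try the same kind of thing, the usual trick is to render the game at a lower internal resolution and let FSR upscale it to the display. A rough sketch of a Steam launch option using gamescope (one common way to do it; flag names can vary between gamescope versions):

gamescope -w 1280 -h 720 -W 1920 -H 1080 -F fsr -- %command%
# -w/-h: resolution the game actually renders at
# -W/-H: resolution gamescope upscales to for the display
# -F fsr: use FSR as the upscaling filter

With Proton-GE there’s also the WINE_FULLSCREEN_FSR=1 environment variable route, which applies FSR when a Wine/Proton game runs fullscreen at a lower resolution.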
Maybe my next card will be an RX 6650 XT or something else from AMD, but I still might stick with my GTX 1650.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers; it’s an order of magnitude more lucrative than serving the consumer and enthusiast market.
So my next card is probably gonna be an RX 9070XT.
-
The good games don’t need a high-end GPU.
Clair Obscur runs like shit on my 3090 at 4K.