Gamers Are Reportedly Skipping GPU Upgrades Due to Soaring Prices — Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090
-
The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize lighting or textures
You clearly don’t know what you’re talking about here. Ray tracing has nothing to do with textures, and very few games force you to use RT. What is “allowing” devs to skimp on optimization (which is also questionable; older games weren’t perfect either) is DLSS and other dynamic resolution + upscaling tech.
I meant they also just don’t bother to optimize texture sizes; I didn’t mean to imply that’s directly related to ray tracing issues.
Also… more and more games are clearly being designed, and marketed, with ray tracing in mind.
Sure, it’s not absolutely forced on in too many games… but TAA often is forced on, because no one can run ray tracing without intelligent temporal upscaling and frame gen…
…and a lot of games just feed the pixel motion vectors from their older TAA implementations into the DLSS / FSR implementations, and don’t bother to refactor the TAA into an optional path that just outputs the motion vectors without actually doing any AA…
… and they often don’t do that because they designed their entire render pipeline to only work with TAA on, and half the game’s post-processing effects would have to be recoded to work without TAA.
So if you summarize all that: the ‘design for raytracing support’ standard is why many games do not let you turn off TAA.
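(To make the motion-vector point concrete, here’s a rough sketch, with made-up names and nothing engine-specific: the velocity data DLSS/FSR consume is just a reprojection delta between the current and previous frame’s positions, and computing it doesn’t require running the TAA blend at all.)

```python
# Sketch only: per-point motion vector, independent of any TAA resolve.
# Real engines do this per pixel in a shader; all names are illustrative.
import numpy as np

def ndc(clip):
    """Perspective divide: clip space -> normalized device coordinates."""
    return clip[:3] / clip[3]

def motion_vector(world_pos, curr_view_proj, prev_view_proj):
    """Screen-space motion of a point between the previous and current frame.
    This is the data upscalers ask for -- no AA filtering involved.
    (Sign and scale conventions vary per API; this is just the idea.)"""
    p = np.append(world_pos, 1.0)        # homogeneous coordinates
    curr = ndc(curr_view_proj @ p)
    prev = ndc(prev_view_proj @ p)
    return (curr - prev)[:2]             # x/y delta in NDC
```

The TAA resolve itself (blending the history buffer) is a separate step; a pipeline that can only emit these vectors as a side effect of that resolve is exactly the design problem described above.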
…
That being said: ray tracing really only makes a significant visual difference in many (not all, but many) situations… if you have very high-res textures.
If you don’t, older light rendering methods work almost as well, and run much, much faster.
Ray tracing involves… you know, light rays, bouncing off of models, with textures on them.
Like… if you have a car with a glossy finish that reflects the entire scene around it in its paint… well, if the reflection map being layered onto the base car texture is very low res, because it’s generated from a world of low-res textures… you might as well just use the old cube map method (or other methods) and not bother turning every reflective surface into a ray-traced mirror.
Or, if you’re doing accumulated lighting in a scene with different colors of lights… that effect is going to be more dramatic, more detailed, more noticeable in a scene where everything being lit has higher-res textures.
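(A rough illustration of why both paths are texture-bound, again with illustrative names and values only: cube maps and ray tracing start from the same mirror-reflection direction; RT just traces it to actual geometry, so what it finds there is capped by that surface’s texture resolution.)

```python
# Sketch only: the reflection direction shared by cube maps and ray tracing.
import numpy as np

def reflect(view_dir, normal):
    """Mirror reflection: r = v - 2 (v . n) n"""
    return view_dir - 2.0 * np.dot(view_dir, normal) * normal

v = np.array([0.0, -1.0, 1.0]) / np.sqrt(2.0)  # incoming view direction
n = np.array([0.0, 1.0, 0.0])                  # surface normal (glossy car roof)
r = reflect(v, n)                              # direction both methods sample

# Cube map: look r up in a pre-rendered environment texture -- cheap.
# Ray tracing: trace r until it hits real geometry, then shade whatever
# texture is there. If that texture is low res, the "accurate" reflection
# is just a sharp mirror image of mush.
print(r)  # [0.  0.70710678  0.70710678]
```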
…
I could write a 60 page report on this topic, but no one is paying me to, so I’m not going to bother.
-
I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it’ll be my rig for at least 10 years.
But it was expensive.
Also built a dream machine in 2022. I have a 4090, a 7700X, 32GB of DDR5 6000, and 8TB of NVMe storage. It’s got plenty of power for my needs; as long as I keep getting 90+ FPS @ 4K and programs keep opening instantly, I’m happy. And since I bought into the AM5 platform right at the beginning of it, I can still upgrade my CPU in a few years and have a brand new, high end PC again for just a few hundred bucks.
-
I have a 3080 and am surviving lol. never had an issue
I have a 3080 also. It’s only just starting to show its age with some of these new UE5 games. A couple weeks ago I discovered dlssg-to-fsr3, and honestly I’ll take the little bit of latency for some smoother gameplay.
-
hahahahahahahaha.
rx580
RX580 remains a power efficient champ. The old hot hatch of the GPU world.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
The progress is just not there.
I got an RX 6800 XT for €400 in May 2023, which was at that point an almost 3-year-old card. Fast-forward to today: the RX 9060 XT 16GB costs more and is still slower in raster. The only things going for it are FSR4, a better encoder, and a bit better RT performance, which I couldn’t care less about.
-
The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize lighting or textures
You clearly don’t know what you’re talking about here. Ray tracing has nothing to do with textures, and very few games force you to use RT. What is “allowing” devs to skimp on optimization (which is also questionable; older games weren’t perfect either) is DLSS and other dynamic resolution + upscaling tech.
Doom: The Dark Ages is possibly what they’re referring to. id skipped traditional lighting in favour of ray tracing doing it.
Bethesda Studios also has a tendency to use HD textures on features like grass and terrain, which could safely be low res.
There is a fair bit of inefficient code floating around because optimisation is considered more expensive than throwing more hardware at a problem, and not just in games. (Bonus points if you outsource the optimisation to someone else’s hardware or to the modding community.)
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Uhhh, I went from a Radeon 1090 (or whatever they’re called, it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It’s normal to not buy a GPU every year.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I’m still using my GTX 1070. There just aren’t enough new high-spec games that I’m interested in to justify paying the outrageous prices that NVIDIA is demanding and that AMD follows too closely behind on. Even if there were enough games, I’d refuse to upgrade on principle; I will not reward price gouging. There are so many older/lower-spec games I haven’t played yet that run perfectly, so I don’t care. So many games, in fact, that I couldn’t get through all of them in my lifetime.
-
Have you tried 4k? The difference is definitely noticeable unless you play on like a 20" screen
Yes, it’s pointless; what’s most noticeable is the low frame rate and the lowered graphics settings needed to make the game playable. High FPS is more noticeable and more useful. Blind tests confirmed that, even the one LTT did.
It could be argued 2K is solid, but even then the PPI is already so dense that it doesn’t really matter.
Edit: then again, there is some research showing people perceive FPS and PPI differently, so it may be that 4K makes sense for some, while for others it’s really just an overpriced 2K that no PC can run.
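(For reference, the pixel-density numbers behind that are easy to check; the 27" size below is just a common example, not anything from the blind tests.)

```python
# Sketch: pixels per inch = diagonal resolution / diagonal size.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h in [("1080p", 1920, 1080), ("1440p ('2K')", 2560, 1440), ("4K", 3840, 2160)]:
    print(f'27" {name}: {ppi(w, h, 27):.0f} PPI')
# 27" 1080p: ~82 PPI, 1440p: ~109 PPI, 4K: ~163 PPI
```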
-
Unfortunately gamers aren’t the real target audience for new GPUs, it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock as LLM enthusiasts and small companies use them for AI.
Ex-fucking-actly!
Hahaha, gamers are skipping. Yeah, they are. And yet the 5090 is still somehow out of stock, no matter the price or the state of gaming. We all know big tech went all-in on AI, disregarding whether the average Joe wants it or not. The prices are not for gamers. The prices are for whales, AI companies, and enthusiasts.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Oh totes. NVIDIA lying ever more blatantly to their faces, drivers bricking cards on update, missing ROPs eating performance, even more burn problems with a connector they knew was problematic and lied about, them and their retail partners releasing very limited inventory and then serving internal scalping while being increasingly hostile to the rest of their customers, ray tracing performance improvements they have to exclusively push into certain games and that require the newest, most expensive hardware to actually show any benefit, false MSRP pricing with no recourse for long-time loyal customers except a lottery in the US while the rest of the regions get screwed. Totes, it’s just that it’s “too expensive”, because when have gamers ever splurged on their hobby?
-
I have a 3080 and am surviving lol. never had an issue
Pretty wise, that’s the generation before the 12VHPWR connectors started burning up.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
All I want is more VRAM; my card can already play all the games I want.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I’m sitting on a 3060 Ti and waiting for the 40-series prices to drop further. Ain’t no universe where I would pay full price for the newest gens. I don’t need to render anything for work with my PC, so a 2-3 year old GPU will do just fine.
-
Pretty wise, that’s the generation before the 12HVPWR connectors started burning up.
Afaik the 2080 was the last FE with a regular PCIe power connector.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Don’t think I’ll be moving on from my 7900XTX for a long while. Quite pleased with it.
-
I’m sitting on a 3060 Ti and waiting for the 40-series prices to drop further. Ain’t no universe where I would pay full price for the newest gens. I don’t need to render anything for work with my PC, so a 2-3 year old GPU will do just fine.
Exact same here, and I’m not upgrading any time soon. MHWilds runs like ass no matter the card, and I’m back to playing HoI4 in the meantime.
-
All I want is more VRAM; my card can already play all the games I want.
But with our new system we can make up 10x as many fake frames to cram between your real ones, giving you 2500 FPS! Isn’t that awesome???
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
Why not just buy a cheaper one? The X060 or X070 series is usually fine in price and runs everything at high-enough settings. The flagship is for maxed-out everything at 4K+ resolutions. And in those cases, everything else is larger and more expensive as well: the monitor needs to be 4K, a huge-ass PSU, a large case to fit the PSU and card in, even the power draw and energy… the costs just start growing exponentially.
-
Why not just buy a cheaper one? The X060 or X070 series is usually fine in price and runs everything at high-enough settings. The flagship is for maxed-out everything at 4K+ resolutions. And in those cases, everything else is larger and more expensive as well: the monitor needs to be 4K, a huge-ass PSU, a large case to fit the PSU and card in, even the power draw and energy… the costs just start growing exponentially.
For me, already on a 2080, I would have to spend much more than what my GPU was worth to get any significant upgrade. It’s just not worth it.