Gamers Are Reportedly Skipping GPU Upgrades Due to Soaring Prices — Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090
-
Yes, it’s pointless. What’s most noticeable is the low frame rate and the lowered graphics settings needed to make the game playable. High FPS is more noticeable and more useful. Blind tests confirmed that, even the one LTT did.
2K could be argued to be solid, but even then the PPI is already so dense that it doesn’t really matter.
Edit: then again, there is some research showing people perceive FPS and PPI differently, so 4K may make sense for some, while for others it’s really an overpriced 2K that no PC can run.
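For reference, the standard pixel-density arithmetic, as a minimal sketch (the 27-inch screen size is just my assumed example):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display with the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assuming a 27-inch monitor as the example size:
print(round(ppi(2560, 1440, 27)))  # 2K/1440p: ~109 PPI
print(round(ppi(3840, 2160, 27)))  # 4K:       ~163 PPI
```

At typical desktop viewing distances both are already fairly dense, which is the point being argued above.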
I play at 1080p so I can’t comment on 4K, but I can confirm FPS doesn’t seem to affect me above 30 FPS. I don’t perceive a noticeable difference between 30, 60, and 120 FPS. Haven’t played higher than that. I suspect 4K would probably look better to me than higher FPS, though. But I’m happy with 30-60 FPS at 1080p, so…
-
But with our new system we can make up 10x as many fake frames to cram between your real ones, giving you 2500 FPS! Isn’t that awesome???
I don’t mean to embarrass you, but you were also supposed to say “AI!”
-
Uhhh, I went from a Radeon 1090 (or whatever they’re called; it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Consoles effectively do that as well. It’s normal to not buy a GPU every year.
Ain’t nobody got time (money) for that!
-
Doom: The Dark Ages is possibly what they’re referring to. id skipped traditional baked lighting in favour of ray tracing handling it.
Bethesda also has a tendency to use HD textures on features like grass and terrain, which could safely be low-res.
There is a fair bit of inefficient code floating around because optimisation is considered more expensive than throwing more hardware at a problem, and not just in games. (Bonus points if you outsource the optimisation to someone else’s hardware or the modding community.)
That is a prominent example of forced RT… basically, as I described with the TAA example in my other reply…
idTech 8 seems to be the first engine that literally requires RT for its entire render pipeline to work.
They could theoretically build another version of it on a Vulkan base that lets you turn RT off… but that would likely be a massive amount of work.
On the bright side… at least the idTech engines are well coded, and they put a lot of time into making the engine genuinely good.
I didn’t follow the marketing for Doom: The Dark Ages, but it would have been really shitty if they didn’t include ‘you need a GPU with RT cores’ in the requirements.
…
On the other end of the engine spectrum:
Bethesda… yeah, they have entirely lost control of their engine; it’s a mangled mess of nonsense. The latest Oblivion remaster just uses UE for rendering, slapped on top of Gamebryo, because no one at Bethesda can actually code worth a damn.
Compare that to oh I dunno, the Source engine.
Go play Titanfall 2. It’s a 10-year-old game now, built on a modified version of the Portal 2 Source engine.
Still looks great, runs very efficiently, can scale down to older hardware.
OK, now go play HL Alyx. If you don’t have VR, there are mods that do a decent job of converting it to M+K.
Looks great, runs efficiently.
Neither of them uses RT.
Because you don’t need it, if you take the time to actually optimize both your engine and your game design.
-
Not surprised. Many of these high-end GPUs are bought not for gaming but for bitcoin mining, and demand has driven prices beyond MSRP in some cases. Stupidly power-hungry and overpriced.
My GPU, an RTX 2060, is getting a little long in the tooth. I’ll hand it off to one of the kids for their PC, but I need to find something that’s a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of that 5060, so I might look at AMD or Intel next time around. I’ll probably need to replace my PSU while I’m at it.
My kid got the 2060, and I bought an RX 6400; I don’t need the hairy arms any more.
Then again, I have become old and grumpy, playing old games.
-
Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…
It’s funny, because often they aren’t prettier. Well-optimized, well-made games from 5 or even 10 years ago often look on par with, or better than, the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and a few others), and the disk size is still 10x what it was. They’re just unrefined and unoptimized, and they try to use computationally expensive filters, lighting, sharpening, and anti-aliasing to make up for the mediocre quality.
-
Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…
The majority, sure, but there are some gems.
Baldur’s Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, … for example.
You can always wait a couple of years before playing them, but saying they haven’t missed anything is selling those games well short.
-
Yes, it’s pointless. What’s most noticeable is the low frame rate and the lowered graphics settings needed to make the game playable. High FPS is more noticeable and more useful. Blind tests confirmed that, even the one LTT did.
2K could be argued to be solid, but even then the PPI is already so dense that it doesn’t really matter.
Edit: then again, there is some research showing people perceive FPS and PPI differently, so 4K may make sense for some, while for others it’s really an overpriced 2K that no PC can run.
Not arguing FPS here lol. Arguing 4K, which you can run at 144 Hz in a lot of games even without a 5090. You failed to mention whether you had tried 4K, which I assume you haven’t, based on the switch to FPS instead of resolution.
-
GTX 1060 6GB still going strong!
Runs FFXIV at 1440p.
Runs HL Alyx on my Rift.
Runs everything prior to this gen.
If I need to run a more modern game, I’ll use my PS5.
-
So is AMD, with their availability of literally three video cards in stock for all of North America at launch, which in turn just fuels the scalpers. Downvote this all you want, guys; AMD is just as complicit in all of this, they’ve fuelled this bullshit just as much.
Nvidia is single-handedly responsible for killing all competition but AMD. They destroyed all other GPU companies with the nastiest tactics to dominate the market; only AMD has been able to survive. You can’t blame AMD for chip shortages; that’s the aftershock of the COVID pandemic. Never has demand for chips been higher, especially thanks to the rising EV market.
You can’t say AMD is as bad as Nvidia, as Nvidia is the sole reason the market got ruined in the first place. They are the worst of the worst.
And don’t forget diaper Donny, who destroyed international trade with his fucking tariff wars.
-
I don’t mean to embarrass you, but you were also supposed to say “AI!”
Points with a finger and laughs
Look at that loser not using AI
-
But with our new system we can make up 10x as many fake frames to cram between your real ones, giving you 2500 FPS! Isn’t that awesome???
Bullshitted pixels per second seem to be the new currency.
It may look smooth in videos, but 30 FPS interpolated up to 120 FPS will still feel like a 30 FPS game, because your input is still only sampled at the real frame rate.
Modern TVs do the same shit, and it both looks and feels like ass. And not good ass.
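To put rough numbers on that, here’s a toy latency model; a minimal sketch under my own assumptions (input sampled once per real frame, interpolation buffering one real frame), not how any specific vendor’s frame generation actually works:

```python
def worst_case_input_lag_ms(real_fps: float, interpolated: bool) -> float:
    """Toy model of worst-case input-to-photon lag.

    Assumes input is sampled once per *real* frame, and that frame
    interpolation must buffer one real frame, since it needs the next
    real frame before it can generate the in-between ones.
    """
    real_frame_ms = 1000.0 / real_fps
    lag = real_frame_ms              # wait for the next input sample
    if interpolated:
        lag += real_frame_ms         # one real frame held back for interpolation
    return lag

print(worst_case_input_lag_ms(120, interpolated=False))  # native 120 FPS: ~8.3 ms
print(worst_case_input_lag_ms(30, interpolated=True))    # 30 -> "120" FPS: ~66.7 ms
```

The display shows 120 frames per second either way, but in the interpolated case your clicks still land on a 30 FPS cadence, which is exactly why it feels like a 30 FPS game.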
-
My kid got the 2060, and I bought an RX 6400; I don’t need the hairy arms any more.
Then again, I have become old and grumpy, playing old games.
Hell, I’m still rocking a GTX 950. It runs Left 4 Dead 2 and Team Fortress 2; what more do I need?
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I am still on my GTX 1060 3 GB, probably worth about $50 at this point lol
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I remember when high-end GPUs were around €500.
-
unless they’re rich or actually need it for something important
Fucking YouTubers and crypto miners.
Crypto mining with GPUs is dead; the only relevant mining uses ASICs now, so it would be more accurate to say:
Fucking YouTubers and AI.
-
I am still on my GTX 1060 3 GB, probably worth about $50 at this point lol
I ran VR on one of those. Not well, but well enough.
-
Well I am shocked, SHOCKED I say! Well, not that shocked.
I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few percent better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and cables stop melting.
-
Uhhh, I went from a Radeon 1090 (or whatever they’re called; it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Consoles effectively do that as well. It’s normal to not buy a GPU every year.
As long as you make an upgrade that’s equivalent to or better than the current console generation, you’re basically good to go until the next generation of consoles arrives.
-
As long as you make an upgrade that’s equivalent to or better than the current console generation, you’re basically good to go until the next generation of consoles arrives.
I don’t really care whether my current graphics are better or worse than the current console generation; it was just an illustration comparing PC gaming to console gaming.