AMD Says You Don't Need More VRAM
-
He said most people are playing at 1080p, and last month’s Steam survey had 55% of users with that as their primary display resolution, so he’s right about that. Setting aside what’s needed for the 4K monitors only 4.5% of users have as their primary display: is 8GB of VRAM really a problem at 1080p?
8GB VRAM is definitely a problem even at 1080p. There are already benchmarks from various outlets showing this. Not in every game, of course, and it hurts AAA titles more than others, but it will get worse with time. People buy GPUs to last several years, so a card shouldn’t already have a major issue on the day you buy it!
-
- Why would they be buying a new card to keep playing the way they’re already playing?
- What does the long-term trend line look like?
You can confidently say that this is fine for most consumers today. There really isn’t a great argument that this will serve most consumers well for the next 3 to 5 years.
It’s OK if well-informed consumers are fine with a compromise for their use case.
Misrepresenting the product category and misleading less-informed consumers into believing it’s not a second-rate product in the current generation is deeply anti-consumer.
You can confidently say that this is fine for most consumers today. There really isn’t a great argument that this will serve most consumers well for the next 3 to 5 years.
People have been saying that for years, and my 8GB card is chugging along just fine. The race to VRAM that people were expecting just hasn’t happened. There’s little reason to move on from 1080p, and the 75+ million PS5s aren’t going anywhere anytime soon.
-
I’d agree if that meant those options were cheaper. You can be a very active gamer and not need more, so why pay more?
-
Sorry, but I don’t understand why this is a controversy. They have a 16GB model; if you need more than 8GB, then go and buy that? They aren’t forcing your hand or limiting your options.
-
Sorry, but I don’t understand why this is a controversy. They have a 16GB model; if you need more than 8GB, then go and buy that? They aren’t forcing your hand or limiting your options.
The big deal is that the vast majority of gamers aren’t techies. They don’t know to check VRAM. 8 GB is insufficient nowadays, and any company that sells an 8 GB card and doesn’t make it obvious that it’s low-end is exploiting consumers’ lack of knowledge.
-
The big deal is that the vast majority of gamers aren’t techies. They don’t know to check VRAM. 8 GB is insufficient nowadays, and any company that sells an 8 GB card and doesn’t make it obvious that it’s low-end is exploiting consumers’ lack of knowledge.
I run most games just fine with my 8GB 3070. While I would’ve preferred more VRAM when I bought it, it’s held up just fine.
While the 9060 XT isn’t released yet, everything I’ve seen so far has made the difference between the two variants pretty clear. I have no problem with offering a lesser SKU if the difference is clear. Not like Nvidia with the 1060 3GB and 6GB, where they also cut the cores and memory bandwidth. If things differ on release, my stance will change.
Also, gaming isn’t the only reason to have a GPU. I still use my 1060 in my server for transcoding, and it works just fine. If I needed something to replace it, or if I was building a new one from scratch, a 9060 XT 8GB or an Arc would be a fine choice.
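(For anyone curious what that looks like in practice: a minimal sketch of GPU transcoding through ffmpeg’s NVENC encoder, assuming an NVIDIA card like the 1060 and an ffmpeg build with CUDA support. The file names are placeholders.)
```python
import subprocess

# Sketch: GPU transcode via ffmpeg's NVENC encoder. Assumes an NVIDIA GPU
# with NVENC (a GTX 1060 qualifies) and an ffmpeg build with CUDA support.
# "input.mkv" / "output.mp4" are placeholder file names.
subprocess.run(
    [
        "ffmpeg",
        "-hwaccel", "cuda",    # decode on the GPU where possible
        "-i", "input.mkv",
        "-c:v", "h264_nvenc",  # encode on the GPU's NVENC block
        "-c:a", "copy",        # pass the audio through untouched
        "output.mp4",
    ],
    check=True,
)
```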
-
Perhaps AMD could convince game developers/publishers to spend some time learning optimisation. Many of the popular games I’ve seen in the past 5 years have been embarrassingly bad at working with RAM and storage specs that were considered massive until relatively recently. The techniques for doing this well have existed for generations.
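(To make “techniques that have existed for generations” concrete, here’s a toy sketch of the classic one: budget texture memory by dropping mip levels until the working set fits. The texture names, sizes, and budget below are invented, and it assumes uncompressed RGBA8; real engines stream block-compressed mips, but the budgeting logic is the same.)
```python
# Toy sketch of mip-level budgeting: repeatedly drop the most expensive
# texture to its next mip level until the whole set fits the VRAM budget.
# Texture names, sizes, and the budget below are invented for illustration.

def texture_bytes(side, bytes_per_texel=4):
    """Bytes for one square RGBA8 mip level (4 bytes per texel)."""
    return side * side * bytes_per_texel

def fit_to_budget(textures, budget_bytes):
    chosen = dict(textures)  # start every texture at full resolution
    while sum(texture_bytes(s) for s in chosen.values()) > budget_bytes:
        name = max(chosen, key=chosen.get)  # the most expensive texture
        if chosen[name] <= 64:              # floor: stop degrading further
            break
        chosen[name] //= 2                  # next mip level: 1/4 the memory
    return chosen

textures = [("albedo", 4096), ("normal", 4096), ("detail", 2048)]
chosen = fit_to_budget(textures, budget_bytes=64 * 1024 * 1024)
used = sum(texture_bytes(s) for s in chosen.values())
print(chosen, f"{used / 2**20:.1f} MiB")  # -> all at 2048, 48.0 MiB
```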
-
8GB VRAM is definitely a problem even at 1080p. There are already benchmarks from various outlets showing this. Not in every game, of course, and it hurts AAA titles more than others, but it will get worse with time. People buy GPUs to last several years, so a card shouldn’t already have a major issue on the day you buy it!
Good thing there are only trash AAA games, and the gems are indie games that would run on GLaDOS.
-
The big deal is that the vast majority of gamers aren’t techies. They don’t know to check VRAM. 8 GB is insufficient nowadays, and any company that sells an 8 GB card and doesn’t make it obvious that it’s low-end is exploiting consumers’ lack of knowledge.
What a load of crap… Only the first sentence is true. VRAM is usually printed in big letters on ad pages. Then, using alternative facts, you invent imaginary crimes…
-
I can agree that the tweet was completely unnecessary, and the naming is extremely unfair given both variants have the exact same brand name. Even their direct predecessor does not do this.
The statement that AMD could easily sell the 16 GiB variant for $50 less and that $300 gives “plenty of room” is wildly misleading, and from that I can tell they’ve not factored in the BOM (bill of materials) at all.
They flatly state that GDDR6 is cheap, and I’m not sure how they figure that.
-
Good thing there are only trash AAA games, and the gems are indie games that would run on GLaDOS.
Not all indie games have low graphics requirements.
-
What’s a trend without outliers?
-
AMD doesn’t know about Unreal slop
-
I’ve been playing Jedi Survivor off Game Pass, and the 12GB I have is getting maxed out. Wish I’d had the extra money to get one of the 16GB GPUs.
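(If you want to confirm it’s genuine VRAM exhaustion rather than a game just caching aggressively, you can poll usage while playing. A sketch with pynvml, which assumes an NVIDIA card; on AMD under Linux the amdgpu driver exposes similar counters in sysfs.)
```python
import time
import pynvml  # pip install nvidia-ml-py

# Sketch: poll total VRAM usage once a second while the game runs.
# NVIDIA-only; device index 0 assumes the game runs on the first GPU.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```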
-
At least we know Nvidia aren’t the only ones being shitty; they just lead the market in it.
-
Sorry, but I don’t understand why this is a controversy. They have a 16GB model; if you need more than 8GB, then go and buy that? They aren’t forcing your hand or limiting your options.
The problem is that games, especially the big new titles, are often badly optimised and quickly burn through your VRAM. So even if you could in theory play a game with 8GB, you can’t, because the game is unoptimised.
-
The tweet was specifically talking about their $299 card, which also has a 16GB version. OP is shitstirring for clicks.
-
You can confidently say that this is fine for most consumers today. There really isn’t a great argument that this will serve most consumers well for the next 3 to 5 years.
People have been saying that for years, and my 8GB card is chugging along just fine. The race to VRAM that people were expecting just hasn’t happened. There’s little reason to move on from 1080p, and the 75+ million PS5s aren’t going anywhere anytime soon.
There’s no NEED to move on from 1080p, but there are a lot of reasons.
I wouldn’t even object to his position on 1080p if he said “the RX 5700 is fine for most users, don’t waste your money, and reduce e-waste”.
He’s telling consumers that they should expect to pay a premium price to play a slightly-better-than-2019 experience until 2028 or 2030. There are gullible people who won’t understand that he’s selling snake oil.
-
There’s no NEED to move on from 1080p, but there are a lot of reasons.
I wouldn’t even object to his position on 1080p if he said “the RX 5700 is fine for most users, don’t waste your money, and reduce e-waste”.
He’s telling consumers that they should expect to pay a premium price to play a slightly-better-than-2019 experience until 2028 or 2030. There are gullible people who won’t understand that he’s selling snake oil.
He’s telling consumers that they should expect to pay a premium price
What exactly do you expect him to say? Get the same GPU, but with more VRAM, at no extra cost?
If you want more performance, you pay more; that’s the way it’s always worked. And he’s correct that 8GB is good enough for most people.