NVIDIA GPU market domination hits almost 100%, AMD dwindles, Intel non-existent
-
I love my high-end 1440p 165 Hz IPS G-Sync monitor.
What does that have to do with anything? Pretty much all monitors also support FreeSync, which works just as well.
-
This post did not contain any content.
"The market" is not a good measure. Hell, it's not even a measure at all. No consumer can pull any useful info from this article.
It's the equivalent of "how much money has been spent on products by company XY", completely disregarding whether the products sold are even competing with each other, let alone whether one company is even trying to sell at that scale.
Now, regarding the article: they are not differentiating between enterprise- and consumer-grade products. Of course Intel is non-existent in enterprise GPU sales, because they don't even sell fucking enterprise GPUs. Same with AMD.
This is like comparing a local steelworking company with Weckerle Machines, who mostly make industrial make-up equipment (out of steel), and saying that Weckerle dominates the market.
Or like saying "Gamers beware: pre-built PCs are dominating the market", then showing a study about "computing devices" where the two main sources are enterprises buying in bulk and NUCs, both of which have nothing to do with what the article is implying in the first place, since, and say this with me:
- Enterprise devices are completely different from consumer devices, both in price and in volume, and if compared directly (in the middle of an economic crisis), of course an enterprise is going to spend way more money in one category.
-
"The market" is not a good measure. Hell, it's not even a measure at all. No consumer can pull any useful info from this article.
It's the equivalent of "how much money has been spent on products by company XY", completely disregarding whether the products sold are even competing with each other, let alone whether one company is even trying to sell at that scale.
Now, regarding the article: they are not differentiating between enterprise- and consumer-grade products. Of course Intel is non-existent in enterprise GPU sales, because they don't even sell fucking enterprise GPUs. Same with AMD.
This is like comparing a local steelworking company with Weckerle Machines, who mostly make industrial make-up equipment (out of steel), and saying that Weckerle dominates the market.
Or like saying "Gamers beware: pre-built PCs are dominating the market", then showing a study about "computing devices" where the two main sources are enterprises buying in bulk and NUCs, both of which have nothing to do with what the article is implying in the first place, since, and say this with me:
- Enterprise devices are completely different from consumer devices, both in price and in volume, and if compared directly (in the middle of an economic crisis), of course an enterprise is going to spend way more money in one category.
Y'all have pre-built phones? And even laptops? Or car computers??
Weirdos. /s
But for real, this type of info is at best for investors (and even then it's just unstructured info about market share), not consumers.
-
It falls between the 5070 Ti and 5080 while still being cheaper than the 5070 Ti.
The 5070 Ti has come down to its normal price; you can find listings for $750. The 9070 XT is better than the 5070 Ti in some games in terms of raw raster. When it comes to upscaling, efficiency, and ray tracing, the 5070 Ti is better.
-
This post did not contain any content.
TIL there are a lot of people in here who don't know what a dGPU is.
-
What I mean is they need to sell reasonably priced high-VRAM cards that aren't an MI325X, heh.
There’s not really a motivation to target them over a 3090 or 4090 or whatever, but that would change with bigger VRAM pools.
The 7900 XTX is 24 GB, and the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes. There's no point in shoving that much VRAM into it if support is painful and makes it hard to develop. I'm probably biased due to my 6800 XT, one of the earliest cards that's still supported by ROCm, so there's a bunch of stuff my GPU can't do. ZLUDA is painful to get working (and I have it easier due to my 6800 XT), ROCm mostly works but VRAM utilization is very inefficient for some reason and it's Linux-only, which is fine but I'd like more cross-platform options. Vulkan compute is deprecated within PyTorch. AMD HIP is annoying as well, but I don't know how much of that was just my experience with ZLUDA.
Intel actually has better cross-platform support with IPEX, but that's just PyTorch. Again, fine.
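A quick way to see which of these backends a given PyTorch build actually exposes is a probe along these lines. This is a minimal sketch, assuming a reasonably recent PyTorch; the `torch.xpu` check in particular only applies to newer builds or ones with IPEX installed:

```python
# Minimal sketch: report which accelerator backends this PyTorch build sees.
import torch

def report_backends() -> None:
    found = False
    if torch.cuda.is_available():
        # ROCm builds of PyTorch reuse the torch.cuda namespace (HIP
        # masquerades as CUDA); torch.version.hip tells the two apart.
        kind = "ROCm/HIP" if torch.version.hip else "CUDA"
        print(f"{kind} device: {torch.cuda.get_device_name(0)}")
        found = True
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        # torch.xpu only exists on newer builds with Intel GPU support
        # (historically supplied via intel-extension-for-pytorch / IPEX).
        print(f"XPU device: {torch.xpu.get_device_name(0)}")
        found = True
    if not found:
        print("No GPU backend visible; CPU only.")

report_backends()
```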
-
What does that have to do with anything? Pretty much all monitors also support FreeSync, which works just as well.
I have a G-Sync monitor. It supports G-Sync. It is very nice, but unfortunately it does NOT support FreeSync.
-
But in desktops, everyone seems to complain about Nvidia pricing, yet no one is touching Battlemage or the 9000 series? Why?
It's always been this way: they want AMD and Intel to compete so Nvidia gets cheaper, not that they will ever buy AMD or Intel. Gamers seem to be the laziest, most easily influenced consumer sector ever.
People who say "buy Intel and AMD" probably either did or will when they upgrade, which is probably not anytime soon with the way everything seems to be going.
-
Don’t forget the crypto scammers.
GPUs haven't been profitable to mine with for many years now.
People have kept parroting anti-crypto talking points for years without actually knowing what's going on.
To be clear, 99% of the crypto space is a scam. But blaming it for GPU shortages and high prices is just misinformation.
-
I only upgraded from my 380 this year
Still plenty of fun to be had with new GOG mods, etc.
-
The 7900 XTX is 24 GB, and the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes. There's no point in shoving that much VRAM into it if support is painful and makes it hard to develop. I'm probably biased due to my 6800 XT, one of the earliest cards that's still supported by ROCm, so there's a bunch of stuff my GPU can't do. ZLUDA is painful to get working (and I have it easier due to my 6800 XT), ROCm mostly works but VRAM utilization is very inefficient for some reason and it's Linux-only, which is fine but I'd like more cross-platform options. Vulkan compute is deprecated within PyTorch. AMD HIP is annoying as well, but I don't know how much of that was just my experience with ZLUDA.
Intel actually has better cross-platform support with IPEX, but that's just PyTorch. Again, fine.
The 7900 XTX is 24 GB, and the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes.
The AI Pro isn't even available! And 32 GB is not enough anyway.
I think you underestimate how desperate ML (particularly LLM) tinkerers are for VRAM; they're working with ancient MI50s and weird stuff like that. If AMD had sold the 7900 with 48 GB for a small markup (instead of $4000), AMD would have grassroots support everywhere, because that's what devs would spend their time making work. And these are the same projects that trickle up to the MI325X and newer.
I was in this situation: I desperately wanted a non-Nvidia ML card a while back. I contribute little bugfixes and tweaks to backends all the time, but I ended up with a used 3090 because the 7900 XTX was just too expensive for "only" 24 GB plus all the fuss.
There are lingering bits of AMD support everywhere: Vulkan backends to popular projects, unfixed ROCm bugs in projects, stuff that works with tweaks but isn't optimized yet; the problem is AMD isn't making it worth anyone's while to maintain them when devs can (and do) just use 3090s or whatever.
They kind of took a baby step in this direction with the AI 395 (effectively a 110 GB VRAM APU, albeit very compute-light compared to a 7900/9700), but it's still $2K, effectively mini-PC-only, and kind of too little, too late.
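To put numbers on the VRAM desperation: the back-of-the-envelope math tinkerers run is just parameter count times bits per weight. A sketch with illustrative model sizes (ignoring KV cache and runtime overhead, which only make it worse):

```python
# Back-of-the-envelope VRAM needed for LLM weights alone.
# Illustrative assumptions: weights dominate; KV cache, activations,
# and runtime overhead are ignored; 1 GB = 2**30 bytes.
def weights_gb(params_billions: float, bits_per_weight: float) -> float:
    return params_billions * 1e9 * bits_per_weight / 8 / 2**30

for params, bits in [(13, 16), (33, 4), (70, 4), (70, 8)]:
    print(f"{params}B model @ {bits}-bit ~= {weights_gb(params, bits):.0f} GB")

# 70B @ 4-bit ~= 33 GB: already over a 32 GB card before any KV cache,
# but comfortable on a hypothetical 48 GB 7900 -- hence the appeal.
```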
-
GPUs haven't been profitable to mine with for many years now.
People have kept parroting anti-crypto talking points for years without actually knowing what's going on.
To be clear, 99% of the crypto space is a scam. But blaming it for GPU shortages and high prices is just misinformation.
Profitability of Bitcoin mining is dependent on the value of Bitcoin, which has more than doubled in the last 12 months. It's true that large-scale miners have moved on from GPUs to purpose-designed hardware, but GPUs and mining hardware are mutually dependent on a lot of the same limited resources, including fabs.
You are right that crypto doesn't drive the GPU market like it used to in the crypto boom, but I think you are underestimating the lingering impact. I would also not rule out a massive Bitcoin spike driven by actions of the Trump administration.
-
Who the hell keeps buying Nvidia? Stop it.
I will never get another AMD card after my first one just sucked ass and didn’t ever work right.
I wanted to try an Intel card, but I wasn't even sure if I could find Linux drivers for it, because they weren't on the site for download and I couldn't find anything specifying whether their newer cards even worked on Linux.
So yeah, Nvidia is the only viable company for me to buy a graphics card from.
-
Profitability of Bitcoin mining is dependent on the value of Bitcoin, which has more than doubled in the last 12 months. It's true that large-scale miners have moved on from GPUs to purpose-designed hardware, but GPUs and mining hardware are mutually dependent on a lot of the same limited resources, including fabs.
You are right that crypto doesn't drive the GPU market like it used to in the crypto boom, but I think you are underestimating the lingering impact. I would also not rule out a massive Bitcoin spike driven by actions of the Trump administration.
Profitability of Bitcoin mining is dependent on the value of Bitcoin
No it isn't. It's driven by the supply of miners and the demand for transactions. The value of Bitcoin is almost entirely independent.
ASICs, which are used to mine Bitcoin, use very different chips than modern GPUs. Ethereum is the one that affected the GPU market, and mining is no longer a thing for Ethereum.
A massive Bitcoin spike would not affect the GPU market in any appreciable way.
Crypto mining is pretty dumb, but misinformation helps no one.
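For what it's worth, both halves of this disagreement fall out of the standard back-of-the-envelope mining-revenue formula: coin price sits in the numerator, but network hashrate (the supply of miners, which itself chases price) sits in the denominator, and a GPU's share of an ASIC-dominated SHA-256 network rounds to zero. A rough sketch with purely made-up numbers:

```python
# Rough daily Bitcoin-mining P&L. Every number below is an illustrative
# assumption, not a current network statistic.
def daily_profit_usd(my_hashrate: float, network_hashrate: float,
                     block_reward_btc: float, btc_price_usd: float,
                     power_watts: float, usd_per_kwh: float) -> float:
    blocks_per_day = 144                    # ~one block per 10 minutes
    share = my_hashrate / network_hashrate  # expected fraction of blocks won
    revenue = share * blocks_per_day * block_reward_btc * btc_price_usd
    power_cost = power_watts / 1000 * 24 * usd_per_kwh
    return revenue - power_cost

# A GPU does SHA-256 at ~GH/s against an EH/s-scale ASIC network, so its
# expected revenue rounds to zero no matter how high the price goes.
print(daily_profit_usd(my_hashrate=1e9,          # 1 GH/s, GPU-class (assumed)
                       network_hashrate=600e18,  # 600 EH/s (assumed)
                       block_reward_btc=3.125,
                       btc_price_usd=100_000,
                       power_watts=300,
                       usd_per_kwh=0.15))        # prints ~ -1.08
```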
-
GPUs haven't been profitable to mine with for many years now.
People have kept parroting anti-crypto talking points for years without actually knowing what's going on.
To be clear, 99% of the crypto space is a scam. But blaming it for GPU shortages and high prices is just misinformation.
Most people are buying Nvidia because that's what's commonly recommended in reviews. "Want to use AI? Buy Nvidia! Want the latest DX12+ support? Buy Nvidia! Want to develop video games or encode video? Buy Nvidia! Want to upgrade to Windows 11? Buy Nvidia!" Nonstop Nvidia adverts everywhere, with tampered benchmarks and whatnot. Other brands' selling points aren't well known, and the general notion is that if it's not Nvidia, it sucks.
-
What does that have to do with anything? Pretty much all monitors also support FreeSync, which works just as well.
From what I can find, even though a lot of FreeSync monitors at least partially support G-Sync, the opposite seems rather rare, since G-Sync is fully proprietary and hardware-based. I've found a couple of newer monitors that officially support both, but they seem to be the exception rather than the norm.
-
I will never get another AMD card after my first one just sucked ass and didn’t ever work right.
I wanted to try an Intel card, but I wasn't even sure if I could find Linux drivers for it, because they weren't on the site for download and I couldn't find anything specifying whether their newer cards even worked on Linux.
So yeah, Nvidia is the only viable company for me to buy a graphics card from.
That kind of comment always feels a bit weird to me; are you basing AMD's worth as a GPU manufacturer on that one bad experience? It could just as well have been the same on an Nvidia chip; would you be pro-AMD in that case?
On the Intel front, I'm not up to date, but historically Intel has been very good about developing drivers for Linux, and most of the time they are actually included in the kernel (hence no download necessary).
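If you want to verify the in-kernel part on your own machine, sysfs shows which driver is bound to each GPU. A minimal sketch (Linux-only; the exact sysfs layout can vary by kernel):

```python
# Linux-only sketch: show which kernel driver is bound to each GPU.
# /sys/class/drm/cardN/device/driver is a symlink to the driver in use
# (e.g. amdgpu, i915, xe, nouveau); exact layout can vary by kernel.
from pathlib import Path

for card in sorted(Path("/sys/class/drm").glob("card*")):
    if "-" in card.name:
        continue  # skip connector entries like card0-DP-1
    driver = card / "device" / "driver"
    if driver.exists():
        print(f"{card.name}: {driver.resolve().name}")
```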
-
It's fear of failure, not success, because success isn't an option.
'Cause if they start to "succeed", then they actually fail, since they will be crushed by Nvidia.
Their options are to either hold the status quo or lose more because they angered the green hulk in the room.
Wait wait wait… If I push your theory a bit, it then means that Nvidia could crush AMD at any time, becoming a full-fledged monopoly (and being able to rake in much more profit), but they are… deciding not to? Out of the goodness of their hearts, maybe?
-
That kind of comment always feels a bit weird to me; are you basing AMD's worth as a GPU manufacturer on that one bad experience? It could just as well have been the same on an Nvidia chip; would you be pro-AMD in that case?
On the Intel front, I'm not up to date, but historically Intel has been very good about developing drivers for Linux, and most of the time they are actually included in the kernel (hence no download necessary).
That kind of comment always feels a bit weird to me; are you basing AMD’s worth as a GPU manufacturer on that one bad experience?
Absolutely. If a company I'm trying for the first time gives me a bad experience, I will not go back. That was me giving them a chance, and AMD fucked up that chance, and I couldn't even get a refund for a roughly $200 card. Choosing to try a different option resulted in me wasting time and money, and it set my rig back half a year until I could afford a working card again, which really pissed me off.
I didn't know that about Intel cards; I'll have to try one for my next upgrade if I can confirm on their site that they're supported.
-
That one stung XD. I went with an AMD GPU in 2023 after only owning Nvidia for decades. I went with AMD because I was not satisfied with the amount of VRAM Nvidia offers, and I did not want burning power connectors. Overall it's stable and works great. There are some bugs here and there, but zero regrets.
No shame in that; AMD and Nvidia have traded the "optimal buy" spot forever. There were times when buying AMD was not the best idea, like when the Nvidia 900/1000 series was amazing while AMD Vega was very expensive.
Other times, it wasn't obvious in the moment. The old AMD 7000 series was pricey at launch, for instance, but aged ridiculously well; a 7950 would still function all right these days.
This market's such a caricature now, though. AMD/Intel are offering obviously great value, yet being overlooked through pure ignorance. I can't remember things ever being like this, not all the way back to Nvidia Fermi at least.