NVIDIA GPU market domination hits almost 100%, AMD dwindles, Intel non-existent
-
That one stung XD. I went with an AMD GPU in 2023 after only owning Nvidia for decades, because I wasn’t satisfied with the amount of VRAM Nvidia offers and I didn’t want burning power connectors. Overall it’s stable and works great. There are some bugs here and there, but zero regrets.
No shame in that; AMD and Nvidia have traded the ‘optimal buy’ spot forever. There were times when buying AMD was not the best idea, like when the Nvidia 900/1000 series was amazing while AMD Vega was very expensive.
Other times, the right call wasn’t obvious at the time. The old AMD 7000 series was pricey at launch, for instance, but aged ridiculously well. A 7950 would still function alright these days.
This market’s such a caricature now though. AMD/Intel are offering obviously great value, yet getting passed over out of pure ignorance; I can’t remember things ever being like this, not all the way back to Nvidia Fermi at least.
-
The 7900 XTX is 24 GB; the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes.
The AI Pro isn’t even available! And 32 GB is not enough anyway.
I think you underestimate how desperate ML (particularly LLM) tinkerers are for VRAM; they’re working with ancient MI50s and weird stuff like that. If AMD had sold the 7900 with 48GB for a small markup (instead of $4000), AMD would have grassroots support everywhere, because that’s what devs would spend their time making work. And these are the same projects that trickle up to the MI325X and newer.
I was in this situation: I desperately wanted a non-Nvidia ML card a while back. I contribute little bugfixes and tweaks to backends all the time, but I ended up with a used 3090 because the 7900 XTX was just too expensive for ‘only’ 24GB + all the fuss.
There are lingering bits of AMD support everywhere: Vulkan backends in popular projects, unfixed ROCm bugs, stuff that works with tweaks but isn’t optimized yet; the problem is AMD isn’t making it worth anyone’s while to maintain them when devs can (and do) just use 3090s or whatever.
They kind of took a baby step in this direction with the AI 395 (effectively a 110GB-VRAM APU, albeit very compute-light compared to a 7900/9700), but it’s still $2K, effectively mini-PC only, and kinda too little, too late.
I’m well aware; I’m one such tinkerer. It’s a catch-22: no good software support means no one really wants to use it, and since no one really wants to use it, AMD doesn’t make stuff. Also, AMD is using much denser memory chips, so an easy doubling of VRAM capacity isn’t as possible.
It took them a few months, IIRC, to get proper support for the 9070 in ROCm.
-
AMD doesn’t support CUDA
At the end of the day, I think it is this simple: CUDA works and developers use it, so users get a tangible benefit.
If AMD comes up with a better version of CUDA, then you have the disruption needed to compete.
-
That kind of comment always feels a bit weird to me; are you basing AMD’s worth as a GPU manufacturer on that one bad experience? It could just as well have happened on an Nvidia chip; would you be pro-AMD in that case?
As for Intel, I’m not up to date, but historically Intel has been very good about developing drivers for Linux, and most of the time they are actually included in the kernel (hence no download necessary).
What else would a consumer base things on except their own experiences? Not like it’s a rare story either.
-
Honestly, stuff like Unreal’s Lumen or Crytek’s SVOGI has made RTX obsolete. It looks freaking incredible, runs fast, and you can put the rendering budget toward literally anything else; who in their right mind would develop for RTX over that?
Hardware Lumen looks way better than software-only.
-
Hardware Lumen looks way better than software-only.
It’s still infinitely faster than ‘raytracing’ though.
-
Profitability of Bitcoin mining is dependent on the value of Bitcoin
No it isn’t. It’s driven by the supply of miners and the demand for transactions; the value of Bitcoin is almost entirely independent.
ASICs, which are used to mine Bitcoin, use very different chips than modern GPUs. Ethereum is the one that affected the GPU market, and mining is no longer a thing for Ethereum.
A massive Bitcoin spike would not affect the GPU market in any appreciable way.
Crypto mining is pretty dumb, but misinformation helps no one.
ASICs and GPUs do share significant dependencies in the semiconductor supply chain. Building fabs fast enough to keep up with demand is difficult and resource-constrained, both by expertise and by high-quality materials.
You are wrong about the impact of Bitcoin’s market value on the profitability of Bitcoin mining.
See “Is Bitcoin Mining Profitable?” on Investopedia (www.investopedia.com): changes in technology and the creation of professional mining centers have affected profitability for individual Bitcoin miners.
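To make that dependence concrete, here’s a back-of-envelope sketch; every number in it is hypothetical, picked only for illustration and not taken from the article. Block rewards are paid in BTC and sold at the market price, while electricity is paid in fiat, so the same rig can flip from loss to profit purely on the BTC price:

```python
# Rough mining-profitability sketch. All figures are hypothetical placeholders.
def daily_profit_usd(miner_ths, network_ths, btc_price_usd,
                     block_reward_btc=3.125, blocks_per_day=144,
                     rig_power_kw=3.5, usd_per_kwh=0.10):
    share = miner_ths / network_ths                       # expected fraction of blocks won
    revenue = share * block_reward_btc * blocks_per_day * btc_price_usd
    power_cost = rig_power_kw * 24 * usd_per_kwh          # electricity cost is independent of BTC price
    return revenue - power_cost

# Same rig, same network hashrate; only the BTC price changes.
print(daily_profit_usd(200, 700_000_000, 30_000))    # loses money every day
print(daily_profit_usd(200, 700_000_000, 100_000))   # turns a profit
```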
Another thing to consider is that many coins still use proof of work, and an ASIC designed for one might not work for the others. Some miners (especially the scammiest ones) want the flexibility to switch coins at will. That doesn’t change the fact that ASICs now dominate, but GPUs do still have a share, especially for some of the newer scam coins.
-
This post did not contain any content.
I know it’s not indicative of the industry as a whole, but the Steam hardware survey has Nvidia at 75%. So while they’re still selling strong, as others have indicated, I’m not confident these new cards are getting used for gaming.
-
What else would a consumer base things on except their own experiences? Not like it’s a rare story either.
I don’t know, real-world data maybe? Your one, or two, or even ten experiences are statistically insignificant. And of course it’s not a rare story: people who talk about a product online are usually the ones with a bad experience, complaining about it, which introduces a bias you have to ignore. So you go for things like failure rates, which you can find online.
By the way, it’s almost never actually a fault of AMD or Nvidia, but of the actual manufacturer of the card.
Edit: Not that I care about Internet points, but downvoting without a rebuttal is… not very convincing.
-
The 5070 Ti has come down to its normal price; you can find listings for $750. The 9070 XT is better than the 5070 Ti in some games in terms of raw raster. When it comes to upscaling, efficiency, and ray tracing, the 5070 Ti is better.
In my market the 5070 Ti is still 100-200 USD more expensive than the 9070 XT.
-
It’s the fucking AI tech bros
Microsoft.
Microsoft is buying them for AI.
From what I understand, ChatGPT is running on Azure servers.
-
Not as many as you’d think. The 5000 series is not great for AI because they have like no VRAM relative to their price.
4x3090 or 3060 homelabs are the standard, heh.
Who the fuck buys a consumer GPU for AI?
If you’re not doing it in a home lab, you’ll need more juice than anything an RTX 3000/4000/5000/whatever000 series could have.
-
Who the fuck buys a consumer GPU for AI?
If you’re not doing it in a home lab, you’ll need more juice than anything an RTX 3000/4000/5000/whatever000 series could have.
Who the fuck buys a consumer GPU for AI?
Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.
I can (just barely) run GLM-4.5 on a single 3090 desktop.
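For anyone curious what that looks like in practice, here’s a minimal sketch using llama-cpp-python; the GGUF filename and layer split are made-up placeholders (the poster may well use plain llama.cpp instead), and the right n_gpu_layers depends on the quant and how much VRAM you have:

```python
# Minimal consumer-GPU + CPU offloading sketch with llama-cpp-python.
# Model filename and layer count are hypothetical; tune n_gpu_layers until the
# model fits in the 3090's 24 GB and let the remaining layers run on the CPU.
from llama_cpp import Llama

llm = Llama(
    model_path="glm-4.5-q4_k_m.gguf",  # hypothetical quantized GGUF file
    n_gpu_layers=30,                   # layers kept on the GPU; the rest stay in system RAM
    n_ctx=8192,                        # context window
)

out = llm("Why do MoE models tolerate CPU offloading so well?", max_tokens=128)
print(out["choices"][0]["text"])
```

MoE models tolerate this split well because only a few experts are active per token, so the CPU side never has to chew through the whole model on any single step.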
-
Thanks for explaining
-
Who the hell keeps buying Nvidia? Stop it.
The vast majority of consumers do not watch or read reviews. They walk into a Best Buy or whatever retailer and grab the box that says GeForce with the biggest number within their budget. LTT even did a breakdown at some point showing that even their most-watched reviews have little to no impact on sales numbers. Nvidia has the mindshare; in a lot of people’s minds, GeForce = graphics. And I say all that as someone who is currently on a Radeon 7900 XTX. I’d be sad to see AMD and Intel quit the dGPU space, but I wouldn’t be surprised.
-
To be fair, AMD is trying as hard as they can to not be appealing there. They inexplicably participate in the VRAM cartel when… they have no incentive to.
What’s the VRAM cartel story? Think I missed that.
-
This post did not contain any content.
I think Nvidia has better marketing. I never really hear anything about AMD cards, whereas I hear about Nvidia all the time.
-
This post did not contain any content.
GTX 1080 Ti going strong here lol
-
I don’t know, real-world data maybe? Your one, or two, or even ten experiences are statistically insignificant. And of course it’s not a rare story: people who talk about a product online are usually the ones with a bad experience, complaining about it, which introduces a bias you have to ignore. So you go for things like failure rates, which you can find online.
By the way, it’s almost never actually a fault of AMD or Nvidia, but of the actual manufacturer of the card.
Edit: Not that I care about Internet points, but downvoting without a rebuttal is… not very convincing.
A person’s actual experience with a product isn’t real-world data? Fanboys for huge companies are so weird.
-
Who the fuck buys a consumer GPU for AI?
Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.
I can (just barely) run GLM-4.5 on a single 3090 desktop.
… Yeah, for yourself.
I’m referring to anyone running an LLM for commercial purposes.
Y’know, 80% of Nvidia’s business?