NVIDIA GPU market domination hits almost 100%, AMD dwindles, Intel non-existent
-
That kind of comment always feels a bit weird to me; are you basing AMD’s worth as a GPU manufacturer on that one bad experience? It could just as well have happened on an Nvidia chip; would you be pro-AMD in that case?
On the Intel part, I’m not up to date but historically Intel has been very good about developing drivers for Linux, and most of the time they are actually included in the kernel (hence no download necessary).
What else would a consumer base things on except their own experiences? Not like it’s a rare story either.
-
Honestly, stuff like Unreal’s Lumen or Crytek’s SVOGI has made RTX obsolete. It looks freaking incredible, and runs fast, and you can put the rendering budget toward literally anything else; who in their right mind would develop for RTX over that?
Hardware Lumen looks way better than software-only Lumen.
-
Hardware Lumen looks way better than software-only Lumen.
It’s still infinitely faster than ‘raytracing’ though.
-
Profitability of Bitcoin mining is dependent on the value of Bitcoin
No it isn’t. It’s driven by the supply of miners and the demand for transactions. The value of Bitcoin is almost entirely independent.
ASICs, which are used to mine Bitcoin, use very different chips than modern GPUs. Ethereum is the one that affected the GPU market, and mining is no longer a thing for Ethereum.
A massive Bitcoin spike would not affect the GPU market in any appreciable way.
Crypto mining is pretty dumb, but misinformation helps no one.
ASICs and GPUs do share significant dependencies in the semiconductor supply chain. Building fabs fast enough to keep up with demand is difficult and resource-constrained, both in expertise and in high-quality materials.
You are wrong about the impact of Bitcoin’s market value on the profitability of Bitcoin mining.
Is Bitcoin Mining Profitable?
Changes in technology and the creation of professional mining centers have affected profitability for individual Bitcoin miners.
Investopedia (www.investopedia.com)
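To make the dependence concrete, here’s a rough back-of-the-envelope sketch (all numbers are made up for illustration): mining revenue is earned in BTC and converted at the market price, while electricity is paid in dollars, so the coin’s price flows straight into the margin.

```python
# Sketch: simplified daily mining profit for a single ASIC (hypothetical numbers).
# Revenue scales linearly with the BTC price; the power bill does not.
def daily_profit(btc_price_usd, hashrate_ths, network_ths, power_kw, usd_per_kwh):
    blocks_per_day = 144          # ~one block every 10 minutes
    block_reward_btc = 3.125      # post-2024-halving subsidy, fees ignored
    btc_mined = blocks_per_day * block_reward_btc * (hashrate_ths / network_ths)
    revenue = btc_mined * btc_price_usd
    power_cost = power_kw * 24 * usd_per_kwh
    return revenue - power_cost

# The same rig flips from losing money to comfortably profitable purely on price.
for price in (30_000, 60_000, 120_000):
    print(price, round(daily_profit(price, 200, 600_000_000, 3.5, 0.08), 2))
```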
Another thing to consider is that many coins still use proof of work, and an ASIC designed for one might not work for others. Some miners (especially the most scammy ones) prefer GPUs for the flexibility to switch coins at will. That doesn’t change the fact that ASICs now dominate, but GPUs do still have a share, especially for some of the newer scam coins.
-
This post did not contain any content.
I know it’s not indicative of the industry as a whole, but the Steam hardware survey has Nvidia at 75%. So while they’re still selling strong, as others have indicated, I’m not convinced those cards are actually being used for gaming.
-
What else would a consumer base things on except their own experiences? Not like it’s a rare story either.
I don’t know, real-world data maybe? Your one, or two, or even ten experiences are statistically insignificant. And of course it’s not a rare story: people who talk about a product online are usually the ones who had a bad experience and are complaining about it, which introduces a bias you have to correct for. So you go for things like failure rates, which you can find online.
By the way, it’s almost never actually the fault of AMD or Nvidia, but of the actual manufacturer of the card.
Edit: Not that I care about Internet points, but downvoting without a rebuttal is… not very convincing.
-
The 5070 Ti has come down to its normal price. You can find listings for $750. The 9070 XT is better than the 5070 Ti in raw raster in some games. When it comes to upscaling, efficiency, and ray tracing, the 5070 Ti is better.
In my market the 5070 Ti is still 100-200 USD more expensive than the 9070 XT.
-
It’s the fucking AI tech bros
Microsoft.
Microsoft is buying them for AI.
From what I understand, ChatGPT is running on Azure servers.
-
Not as many as you’d think. The 5000 series is not great for AI because they have like no VRAM for the price.
4x3090 or 3060 homelabs are the standard, heh.
Who the fuck buys a consumer GPU for AI?
If you’re not doing it in a home lab, you’ll need more juice than anything an RTX 3000/4000/5000/whatever000 series could have.
-
Who the fuck buys a consumer GPU for AI?
If you’re not doing it in a home lab, you’ll need more juice than anything an RTX 3000/4000/5000/whatever000 series could have.
Who the fuck buys a consumer GPU for AI?
Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run DeepSeek in CUDA instead of hitting an API or something.
I can (just barely) run GLM-4.5 on a single 3090 desktop.
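For anyone wondering what that setup looks like in practice, here’s a rough sketch using llama-cpp-python (the file name and layer count are placeholders; the quant you pick and how many layers actually fit in 24 GB will vary):

```python
# Sketch: partial GPU offload of a large quantized MoE model on a single 24 GB card.
# Layers that don't fit in VRAM stay in system RAM and run on the CPU.
from llama_cpp import Llama

llm = Llama(
    model_path="glm-4.5-q4_k_m.gguf",  # hypothetical GGUF quant
    n_gpu_layers=30,   # offload what fits in VRAM; the rest runs on the CPU
    n_ctx=4096,
)

out = llm("Explain MoE offloading in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```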
-
Thanks for explaining
-
Who the hell keeps buying Nvidia? Stop it.
The vast majority of consumers do not watch or read reviews. They walk into a Best Buy or whatever retailer and grab the box that says GeForce with the biggest number within their budget. LTT even did a breakdown at some point showing that even their most-watched reviews have little to no impact on sales numbers. Nvidia has the mindshare. In a lot of people’s minds, GeForce = graphics. And I say all that as someone who is currently on a Radeon 7900 XTX. I’d be sad to see AMD and Intel quit the dGPU space, but I wouldn’t be surprised.
-
To be fair, AMD is trying as hard as they can to not be appealing there. They inexplicably participate in the VRAM cartel when… they have no incentive to.
What’s the VRAM cartel story? Think I missed that.
-
This post did not contain any content.
I think Nvidia has better marketing. I never really hear anything about AMD cards, whereas I do hear about Nvidia all the time.
-
This post did not contain any content.
GTX 1080 Ti still going strong here lol
-
I don’t know, real-world data maybe? Your one, or two, or even ten experiences are statistically insignificant. And of course it’s not a rare story: people who talk about a product online are usually the ones who had a bad experience and are complaining about it, which introduces a bias you have to correct for. So you go for things like failure rates, which you can find online.
By the way, it’s almost never actually the fault of AMD or Nvidia, but of the actual manufacturer of the card.
Edit: Not that I care about Internet points, but downvoting without a rebuttal is… not very convincing.
A person’s actual experience with a product isn’t real-world data? Fanboys for huge companies are so weird.
-
Who the fuck buys a consumer GPU for AI?
Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run DeepSeek in CUDA instead of hitting an API or something.
I can (just barely) run GLM-4.5 on a single 3090 desktop.
… Yeah, for yourself.
I’m referring to anyone running an LLM for commercial purposes.
Y’know, 80% of Nvidia’s business?
-
At the end of the day I think it is this simple: CUDA works and developers use it, so users get a tangible benefit.
AMD comes up with a better version of CUDA and you have the disruption needed to compete.
I’m not sure that would even help that much, since tools out there already support CUDA, and even if AMD had a better version it would still require everyone to update apps to support it.
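As a rough illustration of how deep that lock-in goes (PyTorch here is just my example, not something mentioned above): typical application code hard-codes the CUDA device name, which is why AMD’s ROCm builds of PyTorch go as far as reusing the “cuda” name rather than asking every app to change.

```python
# Sketch: the kind of device selection found in countless PyTorch apps.
# The string "cuda" is baked in everywhere, so a competing backend either has to
# impersonate it (as ROCm's PyTorch builds do) or wait for every app to update.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = torch.nn.Linear(128, 10).to(device)
x = torch.randn(32, 128, device=device)
print(model(x).shape)  # torch.Size([32, 10])
```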
-
This post did not contain any content.
AMD needs to fix their software.
I had an AMD GPU last year for a couple of weeks, but their software barely works. The overlay didn’t scale properly on a 4K screen and cut off half the info, and wouldn’t even show up at all most of the time. ‘ReLive’ with instant replay enabled caused a performance hit with stuttering in high-FPS games…
Maybe they have it now, but I also couldn’t find a way to enable HDR in older games the way Nvidia does.
-
A person’s actual experience with a product isn’t real-world data? Fanboys for huge companies are so weird.
Please read my entire comment; I also said your experience as one person is statistically insignificant. As in, you cannot rely on one bad experience considering the volume of GPUs sold. Anybody can be unlucky with a purchase and get a defective product, no matter how good the manufacturer is.
Also, please point out where I engaged in any fanboyism. I did not take any side in my comments. Bad-faith arguments are so weird.