Wandering Adventure Party


NVIDIA GPU market domination hits almost 100%, AMD dwindles, Intel non-existent

PC Gaming
100 Posts 53 Posters 588 Views
brucethemoose@lemmy.world wrote:

    Not as many as you’d think. The 5000 series is not great for AI because they have like no VRAM, with respect to their price.

    4x3090 or 3060 homelabs are the standard, heh.

    mystikincarnate@lemmy.ca
    #78

    Who the fuck buys a consumer GPU for AI?

If you’re not doing it in a home lab, you’ll need more juice than anything an RTX 3000/4000/5000/whatever-000 series could have.


      brucethemoose@lemmy.world
      #79

      Who the fuck buys a consumer GPU for AI?

      Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.

      I can (just barely) run GLM-4.5 on a single 3090 desktop.
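For a rough sense of why CPU offloading is needed here, a back-of-the-envelope sketch (the parameter count and quantization width below are illustrative assumptions, not measured figures):

```python
# Back-of-the-envelope: how much of a quantized large MoE fits in 24 GB of VRAM.
GIB = 1024 ** 3

total_params = 355e9     # assumed parameter count for a GLM-4.5-class MoE
bits_per_weight = 4.5    # assumed average width for a ~4-bit quantization
model_bytes = total_params * bits_per_weight / 8

vram_bytes = 24 * GIB    # a single RTX 3090-class card
on_gpu = vram_bytes / model_bytes

print(f"Quantized model: ~{model_bytes / GIB:.0f} GiB")
print(f"Share resident on one 24 GB card: ~{on_gpu:.0%}; the rest sits in system RAM")
```

Because only a few experts are active per token in an MoE, keeping the hot layers on the GPU and the rest in system RAM can stay usable even at that ratio.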

Bakkoda wrote:

        TIL there’s a lot of people who don’t know what a dGPU is in here

        flemblefabber@sh.itjust.works
        #80

        Thanks for explaining

9tr6gyp3@lemmy.world wrote:

          Who the hell keeps buying nvidia? Stop it.

          gamechld@lemmy.world
          #81

The vast majority of consumers do not watch or read reviews. They walk into a Best Buy or whatever retailer and grab the box that says GeForce with the biggest number within their budget. LTT even did a breakdown at some point showing that even their most-watched reviews have little to no impact on sales numbers.

Nvidia has the mind share. In a lot of people’s minds, GeForce = graphics. And I say all that as someone who is currently on a Radeon 7900 XTX. I’d be sad to see AMD and Intel quit the dGPU space, but I wouldn’t be surprised.

brucethemoose@lemmy.world wrote:

            To be fair, AMD is trying as hard as they can to not be appealing there. They inexplicably participate in the VRAM cartel when… they have no incentive to.

            gamechld@lemmy.world
            #82

            What’s the VRAM cartel story? Think I missed that.

inclementimmigrant@lemmy.world wrote:
              This post did not contain any content.
              ssillyssadass
              #83

I think Nvidia has better marketing. I never really hear anything about AMD cards, whereas I do hear about Nvidia.

inclementimmigrant@lemmy.world wrote:
                This post did not contain any content.
                abettertomorrow@sh.itjust.works
                #84

                GTX 1080 Ti strong here lol

ganryuu@lemmy.ca wrote:

                  I don’t know, real world data maybe? Your one, or 2, or even 10 experiences are very insignificant statistically speaking. And of course it’s not a rare story, people who talk online about a product are most usually people with a bad experience, complaining about it, it kinda introduces a bias that you have to ignore. So you go for things like failure rates, which you can find online.

                  By the way, it’s almost never actually a fault from AMD or Nvidia, but the actual manufacturer of the card.

                  Edit: Not that I care about Internet points, but downvoting without a rebuttal is… Not very convincing

                  njm1314@lemmy.world
                  #85

A person’s actual experience with a product isn’t real world data? Fanboys for huge companies are so weird.

brucethemoose@lemmy.world wrote:

                    Who the fuck buys a consumer GPU for AI?

                    Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.

                    I can (just barely) run GLM-4.5 on a single 3090 desktop.

                    mystikincarnate@lemmy.ca
                    #86

                    … Yeah, for yourself.

                    I’m referring to anyone running an LLM for commercial purposes.

                    Y’know, 80% of Nvidia’s business?

marthirial@lemmy.world wrote:

                      At the end of the day I think it is this simple. CUDA works and developers use it so users get a tangible benefit.

                      AMD comes up with a better version of CUDA and you have the disruption needed to compete.

                      mangopenguin@lemmy.blahaj.zone
                      #87

                      I’m not sure that would even help that much, since tools out there already support CUDA, and even if AMD had a better version it would still require everyone to update apps to support it.

inclementimmigrant@lemmy.world wrote:
                        This post did not contain any content.
                        mangopenguin@lemmy.blahaj.zone
                        #88

                        AMD needs to fix their software.

I had an AMD GPU last year for a couple of weeks, but their software barely works. The overlay didn’t scale properly on a 4K screen, cut off half the info, and wouldn’t even show up at all most of the time. ‘ReLive’ with instant replay enabled caused a performance hit and stuttering in high-FPS games…

                        Maybe they have it now, but I also couldn’t find a way to enable HDR on older games like Nvidia has.

njm1314@lemmy.world wrote:

A person’s actual experience with a product isn’t real world data? Fanboys for huge companies are so weird.

                          ganryuu@lemmy.ca
                          #89

                          Please read my entire comment, I also said your experience as one person is statistically insignificant. As in, you cannot rely on 1 bad experience considering the volume of GPUs sold. Anybody can be unlucky with a purchase and get a defective product, no matter how good the manufacturer is.

                          Also, please point out where I did any fanboyism. I did not take any side in my comments. Bad faith arguments are so weird.


                            njm1314@lemmy.world
                            #90

                            Sure buddy, we’re all idiots for not liking the product you simp for. Got it.

mystikincarnate@lemmy.ca wrote:

                              … Yeah, for yourself.

                              I’m referring to anyone running an LLM for commercial purposes.

                              Y’know, 80% of Nvidia’s business?

                              brucethemoose@lemmy.world
                              #91

                              I’ve kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

I guess my original point was agreement: the 5000 series is not great for ‘AI’, not like everyone makes it out to be, to the point where folks who can’t drop $10K for a GPU are picking up older cards instead. But if you look at download stats for these models, there is real interest in running stuff locally instead of ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.


                                ganryuu@lemmy.ca
                                #92

Nice. You didn’t answer anything, and didn’t point out where I’m simping or being a fanboy. I’m not pro-Nvidia, nor pro-AMD, nor pro-anything (if anything I’m pretty anti-consumerism, not that you care).

                                You’re being extremely transparent in your bad faith.

gamechld@lemmy.world wrote:

                                  What’s the VRAM cartel story? Think I missed that.

                                  brucethemoose@lemmy.world
                                  #93

                                  Basically, consumer VRAM is dirt cheap, not too far from DDR5 in $/gigabyte. And high VRAM (especially 48GB+) cards are in high demand.

                                  But Nvidia charges through the nose for the privilege of adding more VRAM to cards. See this, which is almost the same silicon as the 5090: https://www.amazon.com/Blackwell-Professional-Workstation-Simulation-Engineering/dp/B0F7Y644FQ

When the bill of materials is really only like $100-$200 more, at most. Nvidia can get away with this because everyone is clamoring for their top-end cards.


                                  AMD, meanwhile, is kind of a laughing stock in the prosumer GPU space. No one’s buying them for CAD. No one’s buying them for compute, for sure… And yet they do the same thing as Nvidia: https://www.amazon.com/AMD-Professional-Workstation-Rendering-DisplaPortTM/dp/B0C5DK4R3G/

In other words, with a phone call to their OEMs like Asus and such, Lisa Su could lift the VRAM restrictions from their cards and say “you’re allowed to sell as much VRAM on a 7900 or 9000 series as you can make fit.” They could pull the rug out from under Nvidia and charge a $100-$200 markup instead of a $3000-$7000 one.

                                  …Yet they don’t.

                                  It makes no sense. They’re maintaining an anticompetitive VRAM ‘cartel’ with Nvidia instead of trying to compete.

                                  Intel has more of an excuse here, as they literally don’t manufacture a GPU that can take more than 24GB VRAM, but AMD literally has none I can think of.
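To make the markup arithmetic above concrete (all dollar figures below are assumptions for illustration, roughly matching the ranges mentioned, not quoted prices):

```python
# Rough comparison: BOM cost of extra VRAM vs. the workstation-card markup.
extra_vram_gb = 24    # assumed extra VRAM on a workstation SKU vs. its gaming sibling
bom_cost = 200        # assumed ~$200 bill-of-materials cost for that memory
markup = 4000         # assumed midpoint of the $3000-$7000 markup range above

bom_per_gb = bom_cost / extra_vram_gb
charged_per_gb = markup / extra_vram_gb

print(f"BOM: ~${bom_per_gb:.0f}/GB, charged: ~${charged_per_gb:.0f}/GB "
      f"(~{markup / bom_cost:.0f}x the part cost)")
```

The exact numbers don’t matter much; the point is that a ~$100-$200 part is being priced like a multi-thousand-dollar feature.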

omega_jimes@lemmy.ca wrote:

                                    I know it’s not indicative of the industry as a whole, but the Steam hardware survey has Nvidia at 75%. So while they’re still selling strong, as others have indicated, I’m not confident they’re getting used for gaming.

                                    wolflink@sh.itjust.works
                                    #94

Everyone and their manager wants to play with LLMs, and AMD and Intel still don’t have a real alternative to CUDA, so they are much less popular for compute applications.

fedegenerate@lemmynsfw.com wrote:

                                      If Intel/AMD provides an LLM capable card, by the time I get around to making a big boy server, then that’s what I’ll get. Ideally Intel for the sweet AV1 encoding and Quick sync. Then again, if the card runs LLMs, then transcodes won’t be a problem. Cuda looks nice, but fuck Nvidia, I’ll go without.

                                      Don’t look at me, I’m an Android gamer these days, and my home lab runs on integrated Intel graphics.

                                      Maybe one day I’ll build a gaming rig, but mine’s so old now (gtx970) that I’m pretty much starting from scratch. I can’t justify the expense to make a rig from nothing when my Retroid pocket is >£300 and I can play most everything PS2/Gamecube and before.

                                      swelter_spark@reddthat.com
                                      #95

AMD GPUs are already LLM-capable.

brucethemoose@lemmy.world wrote:

                                        I’ve kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

                                        I guess my original point was agreement: the 5000 series is not great for ‘AI’, not like everyone makes it out to be, to the point where folks who can’t drop $10K for a GPU are picking up older cards instead. But if you look at download stats for these models, there is interest in running stuff locally instead of ChatGPT, just like people are interested in internet free games, or Lemmy instead of Reddit.

                                        mystikincarnate@lemmy.ca
                                        #96

                                        The original post is about Nvidia’s domination of discrete GPUs, not consumer GPUs.

                                        So I’m not limiting myself to people running an LLM on their personal desktop.

                                        That’s what I was trying to get across.

                                        And it’s right on point for the original material.


                                          brucethemoose@lemmy.world
                                          #97

                                          I’m not sure the bulk of datacenter cards count as ‘discrete GPUs’ anymore, and they aren’t counted in that survey. They’re generally sold socketed into 8P servers with crazy interconnects, hyper specialized to what they do. Nvidia does sell some repurposed gaming silicon as a ‘low end’ PCIe server card, but these don’t get a ton of use compared to the big silicon sales.

