Nvidia Takes 88% of the GPU Market Share (xda-developers.com)

As reported by Jon Peddie Research, Nvidia now holds 88% of the GPU market after its market share jumped 8% in its most recent quarter. "This jump shaves 7% off of AMD's share, putting it down to 19% total," reports XDA Developers. "And if you're wondering where that extra 1% went, it came from all of Intel's market share, squashing it down to 0%." From the report: Dr. Jon Peddie, president of Jon Peddie Research, notes that the GPU market hasn't really looked "normal" since the 2007 recession; ever since then, everything from the crypto boom to COVID has disrupted the usual patterns. The first quarter of a year usually shows a bit of a dip in GPU sales, but because of AI's influence, that previous norm may be gone forever: "Therefore, one would expect Q2'24, a traditional quarter, to also be down. But, all the vendors are predicting a growth quarter, mostly driven by AI training systems in hyperscalers. Whereas AI trainers use a GPU, the demand for them can steal parts from the gaming segment. So, for Q2, we expect to see a flat to low gaming AIB result and another increase in AI trainer GPU shipments. The new normality is no normality."


  • The math doesn't math.

    • by gavron ( 1300111 )

      88% for the Nvidia Kings in their Hall of Thrones
      19% for the AMD men doomed to die
      ---
      107% market share is correct so that the Dark Lord can take his 7% off the top and rule it all.

      One NPU to rule them all, one NPU to find them.
      One NPU to bring them all, and pretend AI is a real thing instead of mishmashed upscaled auto-correct.
      In the land of hedge-fund market-speak, where the humans lie.

    • The math doesn't math.

      The math is fine. An AMD GPU gets replaced by, or is supplemented with, an NVIDIA GPU.

      • by rossdee ( 243626 )

        The overall market may have increased due to multiple GPUs, but that still doesn't make it 107%

        • by drnb ( 2434720 )

          The overall market may have increased due to multiple GPUs, but that still doesn't make it 107%

          Nvidia has 88% share, AMD has 19% share. Yes, 100% is the entire market. 107% is not the market size but Nvidia share + AMD share; all this shows is that there is overlap: individual machines counted twice, once for each vendor.
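
          If you read it as a Venn diagram (just one way to make the quoted numbers reconcile; the report doesn't actually say this), a 7% overlap would balance the books:

          $$|N \cup A| = |N| + |A| - |N \cap A| = 88\% + 19\% - 7\% = 100\%$$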

    • I puzzled over this too, so I went to the article. The quote was a misspeak. If you look at the bar graph, AMD dropped from 19% to 12%. So the quote should have been "... putting it down to 12% total." Then, the math works.
    • by hawk ( 1151 )

      >The math doesn't math.

      Maybe they attached a modern GPU to a leftover pentium . . .

  • I was an AMD fan boy (Score:5, Interesting)

    by iAmWaySmarterThanYou ( 10095012 ) on Friday June 07, 2024 @08:58PM (#64532011)

    I felt edgy and cool and hip using the "other" gpu.

    Then I got an Nvidia for my next build because AMD didn't have a next gen part that year while Nvidia did.

    And then I learned it's not always about the hardware or price or random benchmarks. The Nvidia software was so vastly superior to what AMD provided, and such a better experience at every step (initial card install, drivers, software config and setup, then actually playing games), that I never looked back.

    And today, finding myself looking at a new build, I can get an AMD 7900 XTX, or an Nvidia 4080 variant for only a bit more. It's a no-brainer. I've got a 4080 in my cart waiting for a big price drop when the 5000 series comes out, hopefully later this year.

    Nvidia earned the #1 position.

    • by bill_mcgonigle ( 4333 ) * on Friday June 07, 2024 @09:15PM (#64532055) Homepage Journal

      AMD promised great open source drivers a decade ago and never delivered, and made their ROCm package a nightmare to get compiled and installed while CUDA was easy. The AI and video production boats all sailed while they had maybe ten people on an understaffed team (as I heard it) dicking around.

      Their Ryzen APUs are fine for light work (e.g. NAS transcoding) but most professional stuff is all nVidia now because they delivered.

      I have AMD CPUs paired with nVidia GPUs of course.

      • Yeah, the CPU in my cart is AMD. There's no way to justify an Intel in a current build today.

        I haven't seen any info on their new CPU, but I'm not holding my breath.

      • by slack_justyb ( 862874 ) on Friday June 07, 2024 @10:12PM (#64532157)

        made their ROCm package a nightmare to get compiled and installed

        I'll just say that ROCm 6.0 is baked into Fedora 40 and InvokeAI works out of the box on F40. ComfyUI and A1111 are as simple as doing a CUDA install.

        My daily driver is PopOS, and getting ROCm installed was a slight pain, but nothing too horrible and nothing I haven't done before.

        I know the pain you speak of, but AMD has made strides moving past it. I've got a box here with Fedora 40 and a 7900 XT and it's been smooth as butter so far. Might change tomorrow for all I know.
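
        If anyone wants a quick sanity check that the install actually worked, this is roughly what I run (a minimal sketch, assuming a ROCm build of PyTorch; ROCm builds reuse the torch.cuda API, so no AMD-specific calls are needed):

        import torch

        # True once the ROCm runtime can see the card
        print(torch.cuda.is_available())
        if torch.cuda.is_available():
            # e.g. "AMD Radeon RX 7900 XT"
            print(torch.cuda.get_device_name(0))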

        Just my two cents there.

        • I'll just say that ROCm 6.0 is baked into Fedora 40

          Fedora? AKA Redhat, AKA Microsoft. And whomever that SystemD fuckhead is.

          LOL, no thanks bro. Keep your tightly integrated ROCm, I will compile it manually.

          • Fedora? AKA Redhat, AKA Microsoft.

            Who is owned by IBM? What the fuck kind of stringing things together is this obvious flame bait bullshit? You can't even get the owner of shit correct.

            And whomever that SystemD fuckhead is

            Holy crap man. If you're going to insult someone at least spend two seconds on Google to look up who the fuck it is that you want to insult. This is basement-dweller-level insult-hurling at Lennart Poettering. It actually makes him look good that mouth breathers like yourself cannot even be bothered to expend the mental energy to look up his name.

            • Who is owned by IBM? What the fuck kind of stringing things together is this obvious flame bait bullshit? You can't even get the owner of shit correct.

              Dumbass. They sold their now-worthless monetary business to IBM, but they sold their soul to Microsoft. Where does that little dickhead work at now?

              Holy crap man. If you're going to insult someone at least spend two seconds on Google to look up who the fuck it is that you want to insult.

              Absolutely not. I could have spent a trivial amount of effort recalling that person's name, but they are not worth that effort, much less the effort required to actually type a query into Google.

              • but they are not worth that effort

                Proceeds to be two comments deep in an almost week old story still talking about him.

                Tell you the truth dude. It sounds like you get an erection from the guy.

                I'm not one to judge, but I mean, let's be honest with ourselves here.

      • by antdude ( 79039 )

        It's frustrating. Come on, AMD. Do better please. Same for Intel and others.

    • On Linux, I never got even tear-free scrolling on an Nvidia 970M without hacks that broke every now and then. You would think double buffering is not that hard a job for a GPU, but Nvidia drivers got that messed up. Nvidia is also the one that started the price scalping; AMD almost always has better perf/$, though they have also upped their prices a lot. You used to get a great GPU for $200; now you barely get any GPU for that. Imagine if there was no competition at all: the performance would be stagnant.
      • The 970M is a 2014 GPU.

        I suspect Nvidia was built on Windows PC gaming, and more recently on being the raw compute engine for crypto and now AI. Pretty sure Linux display is not a major part of Nvidia's, AMD's, or anyone else's bottom line. Sorry, but you're not the target demographic, so it's no surprise no real work was put into making it work well or at all in your situation. That's not how they became a trillion dollar company.

      • You need triple buffering to completely stop the tearing. I was doing it on my 1070 and now I'm still doing it on a 4060. It is effective. Yes, it uses more memory, but I have 16GB of VRAM now so I can afford it.
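
        On the proprietary driver under X11 it's a one-line option (assuming the NVIDIA binary driver; this goes in the Device section of xorg.conf):

        Section "Device"
            Identifier "NvidiaGPU"
            Driver     "nvidia"
            Option     "TripleBuffer" "True"
        EndSection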

    • by MBGMorden ( 803437 ) on Saturday June 08, 2024 @11:17AM (#64533217)

      I'm not a fanboy of either. I started out with 3dfx Voodoo cards and have flipped between Nvidia and ATI (eventually AMD) back and forth depending on whichever one suited me.

      At the moment I'm running AMD (an RX 6650 XT) because for midrange cards they seem to be a better value. The card it replaced was Nvidia (GTX 1660 Super).

      I will say though: we really need AMD and/or Intel to stay in the market and competitive, because without a direct competitor Nvidia's product costs will go through the roof. They largely already have. Almost every other PC component has dropped in cost (adjusted for inflation). The relative cost for what would be a "top of the line" CPU has halved in the last 10-15 years. The cost for a "top of the line" GPU has basically tripled.

      • The cost for a "top of the line" GPU has basically tripled.

        I'm more concerned about the doubling of the cost for a low-mid range GPU. I used to draw the line at $200, and I could run pretty much anything at decent quality settings. I spent more than twice that so I could get the large-memory version of the cheap card this time.

        • They've gone up in price at every tier. Not my field, so I can only hazard a guess: earlier GPUs were much more primitive and easier to build, with relatively low production requirements, while these days they're more complex than general purpose CPUs, which already had time for production methods to be perfected and prices to come down.

          Or it could just be that Nvidia can get away with charging these sorts of prices because there really is demand between gaming, crypto, and AI, and AMD is just price matching.

          • There wasn't a downward trend but they did hold for a while, generations really. They barely even went up with price inflation.

            I suspect the answer is some of each, they are now very complex AND we all want them.

            • True, prices were pretty flat until, IIRC, around the time of the 1000-series Nvidia cards; then it was sky's the limit.

              My friend bought her young teen kid a shiny new 1080 card. She can afford it (as the only female breast cancer doctor for 500 miles) but the price was shocking to me at the time.

              • That tracks, the last card that I bought that followed the old trend was a 970. My next card was a 1070 and I bought it used because the prices were so much higher.

                Now I have a 4060 16GB, ugh. That price was a big punch in the nuts. But for someone who games at 1080p and wants to play with LLMs it was a reasonable choice given the cost of the more powerful cards.

  • by Kevin108 ( 760520 ) on Friday June 07, 2024 @09:09PM (#64532041) Homepage

    As it should. The competitors are more affordable, but lack performance and/or stable drivers.

    • You’d think Intel would have the means to build a competitive GPU but the Arc series seems pretty dead.

      • You'd think they would learn from AMD's lack of success that you have to compete with Nvidia's software. There was a time when Intel was a master of that: their compilers were second to none for their architecture for decades. But they failed somewhat spectacularly with IA64, where they promised performance based on a magic compiler which they never delivered, and they have never really been the same since. For some reason I'm reminded of some Intel motherboards for the Pentium MMX that had Mach32CT graphics.

        • by fintux ( 798480 ) on Saturday June 08, 2024 @02:30AM (#64532501)
          It wasn't the compilers but illegal business practices that made Intel the number one. AMD got the definitive clue on this when they tried to offer their CPUs to OEMs even for free and the OEMs refused to take them (Intel had coerced them into exclusively using Intel CPUs). Intel had some false advertising with benchmarks, too, and they still have a requirement to include a specific disclaimer with *every* CPU benchmark they use in marketing material. AMD went nearly bankrupt because they couldn't sell their CPUs through OEMs for such a long time. They finally won the lawsuit, but at that point they had fallen behind in R&D, as they had no money to invest in it, and Zen was their last shot. Fortunately it turned out to be a huge success, and we still have actual competition in the CPU market.
          • I'm familiar with the history, and you're right, but don't try to take anything away from Intel's x86 compilers — they're not relevant today anyway. They tended to produce substantially faster results than anyone else's until fairly recently [archive.org].

          • AMD lost their fabs during this. Of course Intel and Nvidia are going to have the "better" product; they got it through illegal behavior. Well, to be honest, I do not know if Nvidia did anything illegal; the comment was more about Intel, but that affects the competition between Nvidia and ATI (now owned by AMD).

      • Intel has tried for DECADES building GPUs. Arc is the 12th generation! [wikipedia.org]

        Nvidia even poked fun at Intel's GPUs [vizworld.com] 15 years ago.

    • Both of them have performance. Intel's problem isn't stability; it's that anything that isn't a brand-new game runs like s***. But to be honest, for many gamers that doesn't matter, because they really only play one or two games. Still, Intel has had problems at launch with some major titles.

      AMD's problem isn't the stability of their drivers per se, but the fact that their hardware can't handle less-than-perfect conditions. If your memory has the slightest problem or your power supply isn't perfect, you're going to have trouble.
      • Sorry but you are talking out of your ass here. Their last halo card (7900XTX) had two problems: lower than expected clockspeeds for RDNA3 which was AMD's fault and a faulty run of vapor chamber coolers on stock cards that can mostly be pinned on Sapphire (and/or one of their suppliers). None of what you said makes any damn sense.

    • AMD is too busy spending their R&D on enterprise hardware to worry about the consumer dGPU market.

  • by ArchieBunker ( 132337 ) on Friday June 07, 2024 @09:41PM (#64532105)

    Price hikes across the board.

    • Price hikes across the board.

      Hardly. They only just got to 88% because of the "price drop". I put that in quotes because rather than drop the price, they released their Super series cards, which were virtually identical to the non-Super cards but $200 cheaper.

  • by zenlessyank ( 748553 ) on Friday June 07, 2024 @10:10PM (#64532155)

    Sounds like some shills are present. I've had my card for over 6 months and haven't had a hitch with any game or driver. I used to be an NVIDIA user, but when I wanted to upgrade my GTX 660 SLI setup, I found out they discontinued SLI on lower end stuff, so you could no longer buy 2 cheap cards and get the performance of the high end GTX 680, which cost $150 more than the 2 cheapies.

    The money I saved on the AMD 7900 XTX was enough to buy a 12 core AMD chip, a 4TB M.2 SSD & 32 GB of RAM.

    My only complaint is that I am kinda old and the damn card makes the screen move so fast now it is harder to be accurate.

    • by serviscope_minor ( 664417 ) on Saturday June 08, 2024 @04:33AM (#64532599) Journal

      Not a shill, but try doing deep learning on AMD. It's possible, for sure, but a pain. It's a mess of configuration, missing documentation and compatibility matrices. And of course suboptimal software that doesn't get the whole performance of the card.

      My gosh I wish AMD would offer NVidia some serious competition. £1000 cheaper for a 24GB card? Sign me up. But I don't want to spend 2/3 of the cost on a workstation to then have to do a ton of fucking around.
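
      A taste of the fucking around, for the uninitiated: cards outside the official support matrix typically need an ISA spoof before the framework initialises. A sketch; the version value is a card-specific assumption, so check rocminfo for your part first:

      # Spoof the ISA so the ROCm runtime treats the card as a supported
      # gfx1100 part. The value below is an assumption; check `rocminfo`
      # for your card's real ISA name before trying this.
      import os
      os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "11.0.0")

      import torch  # import after the override so the runtime sees it
      print(torch.cuda.is_available())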

      • Another happy 7900 XTX owner here. Sure, but deep learning is not something most people do enough to pay the Nvidia tax for. Looks like Nvidia drivers improved a lot in the last year, the stupid mandatory registration is gone, and AMD is rumoured to not be releasing high-end cards... Guess the next card might be Nvidia at this rate.
        • Registration? On Linux, that doesn't seem to be a thing.

          I mean it's true most people aren't doing DL. Well, kinda; there are more and more related tasks running in the background which do use it, like DLSS, various image and video editors (pseudo green screen and the like), and so on.

          Also, NVidia are making money hand over fist, and AMD are 95% of the way there. But that's like having a bridge that's 95% completed.

          • ISTR having to sign up for an nvidia account in order to download the CUDA driver package. And I don't remember logging in the last time, but I could just be still logged in for all I know.

            I am using the runfile for Debian 12 on Devuan 5, which works great.

    • by thegarbz ( 1787294 ) on Saturday June 08, 2024 @12:12PM (#64533343)

      Sounds like some shills are present.

      Someone disagreeing with you or not sharing your experience doesn't make them a shill. It works for you? Great! You're precisely the success story everyone needs. Unfortunately your experience isn't universal. I had enough problems with my previous AMD card (shitty drivers causing issues with Freesync under Windows, problems with graphics and display detection under Linux, crashes in games) that I jumped straight to NVIDIA with the next upgrade.

      But I didn't need to even consider this, because AMD isn't even playing in the same league anymore. There's literally no competition at the high end. Your 7900XTX is outperformed by the equally priced 4080S, in some cases by 60%+ in games where raytracing is enabled, and merely equalled in performance in games based on pure rasterization.

      What you spent at the time doesn't even make sense right now. In 2023 buying a 7900XTX may have made sense. Today it would be a stupid move purely on dollars/performance not considering anything else. That doesn't make everyone else a shill, it just means other people are paying attention to the market which you are currently ignoring (presumably not being interested in upgrading your card).

      Also SLI makes no sense anymore. There are no cheap cards to stack for the gains from either company.

      This is all before you consider better driver support (AMD lags NVIDIA greatly with game-ready driver releases, which could be a deal breaker if you like to play things on release day) and far better programming tools (ROCm is a joke compared to CUDA). Or even just modern-day support in desktop apps (Topaz AI performs far better on the 4080S than the 7900XTX because the latter simply doesn't have the hardware designed for that specific use case).

      • Your shillness is noted. I purchased the video card to play games and control multiple monitors. I could care less about AI or programming tools etc. If I wanted those kinds of features I would have purchased the hardware that performs that. NVIDIA putting that crap in a consumer card is wasteful and a form of wishful thinking. They have professional cards that perform those functions, hence the option is there to purchase those tools if you want them.

        • Just wondering if it's opposite day? Let's go through things:

          Your shillness is noted.

          Lack of shillness I suspect you mean. Stating facts doesn't make one a shill. I wish I were a shill. Getting paid just to state facts sounds like an easy job.

          I could care less about AI or programming tools etc.

          So you do care? Or you don't? Between your post and ignoring what I'm saying, I'm confused whether you legit want AI or actually "couldn't care less". English words have meanings; use them correctly. "Could care less" means you do care, at least a little.

          If I wanted those kinds of features I would have purchased the hardware that performs that.

          So you're a shill? Because that's what you called me.

          • Now you are a name calling shill. Great work!

            • Now you are a name calling shill. Great work!

              Of course I am. I treat people with the respect they give to others. Actually, not quite: I'm far nicer than you. You only made it 4 words into your first comment before name calling; I delayed an entire post. I gave you a chance, but you showed us all the type of person you are.

              But in any case I can see you didn't refute anything I said so we can all see where you stand.

  • AMD's CPUs have been killing it for some time, but their GPUs are just lackluster. They have decent raster performance, but usually lag one or two generations behind in everything else. Raytracing performance, temporal super resolution quality, frame gen quality: all are far behind. Some new features like ray reconstruction are simply missing, and probably will be until another one or two generations have gone by. On top of all that, AMD's pricing is generally way too close to nVidia's to forgive the massive gap.

    • So your point is, AMD should just lie down and quietly disappear because the only reason they exist is so you can get cheaper Nvidia cards?

      Right.

      • by Guspaz ( 556486 )

        No, my point is that AMD should add more discrete hardware for raytracing and deep learning and put more resources into software. I don't want AMD to fail, I want three viable competitors keeping each other in check. AMD's poor execution is why nVidia was able to raise prices so high.

    • Most of the things you've listed I don't really care about, nor do people who play games. Pixel peepers are usually less into the games.

      Raster performance matters; frame generation increases input latency for a given framerate, since in reality the game is still running at the lower rate.
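
      Quick arithmetic (nothing vendor-specific): if frame generation doubles a 60 fps render rate to 120 fps on screen, input is still only sampled once per rendered frame:

      $$f_{\text{display}} = 2\,f_{\text{render}}, \qquad \text{input interval} \approx \frac{1}{f_{\text{render}}} = \frac{1}{60\ \text{Hz}} \approx 16.7\ \text{ms} \quad (\text{not } 8.3\ \text{ms})$$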

      Not needed in non-twitch games of course, but in those games you don't need decent framerates either really.

      Why should people prefer predicted potentially wrong images over actually rendered ones?

      Nvidia have really leaned in.

  • Don't gloat. (Score:5, Insightful)

    by sg_oneill ( 159032 ) on Friday June 07, 2024 @10:38PM (#64532183)

    As much as we might enjoy Nvidia's stellar cards, monopolies are *always* a bad thing. Remember what happened when Microsoft thought they had the browser market dominated beyond repair? They announced they were not going to do much more updating on IE6. They only really started up again when Firefox started eating away at their market share.

    Having strong competition forces Nvidia to innovate, and puts downward pressure on prices. That's why, even though I know they'll never catch up, I was pleased to see Intel take a swing at it with ARC, and why I was pleased to see AMD GPUs performing strongly against NVIDIA for a while. Let's hope AMD (or Intel, but I doubt it) finds their mojo again.

    • >monopolies are *always* a bad thing.

      Not quite. *Market power* is the bad thing, and monopolies usually provide it.

      This, however, is--at least for the moment--a "contestable monopoly". Someone is on top but either there aren't significant barriers to entry, or there are already other competitors.

      Getting a monopoly by building better is legal and not problematic. *Using* the market power (if any) from the resultant monopoly to stifle the others *is* a problem, and needs to be swiftly stomped.


  • To this day they have never fully solved the black screen crashes. Nvidia has them too, but there it's always just a bad power supply. With AMD, the slightest problem with your RAM is enough to trigger them, and board partners often ship silicon that doesn't have good enough support electronics to run at the speeds they're trying to push, so you have to undervolt. Never mind the fact that they just spent over a year wrestling with default Microsoft drivers forcibly installing themselves over the AMD drivers.
    • and stop chasing Nvidia on price.
    • Again with the bullshit, what card are you even talking about?!?

      • And you will find tons of posts about black screen crashes. I gave up on my RX 580 and switched to a used 1080 instead of a used 5700 XT because, even though the 5700 XT was a better value, my 580 was giving me trouble in my brand new AMD motherboard with a Ryzen 5600. I did that because, after 3 years of use out of the card, I started getting black screen crashes. The card is running fine in my old Intel i5 4550 hooked up to my TV, but that's not where I wanted it...

        I don't think these problems are the fault
    • I've never seen anyone make such claims about AMD's GPUs before. If the problem were widespread, like the Nvidia melting power connectors, I would expect channels like Gamers Nexus or Hardware Unboxed to report on it.
    • by Luckyo ( 1726890 )

      This information is a decade out of date. Currently the main problem is Intel's instability at the high end, likely due to Intel pushing those CPUs too far, so they degrade in a matter of months and then start throwing out crashes under load. AMD doesn't have this problem, because they run their CPUs at far less power, as they have a superior architecture right now.

      In GPUs, pretty much the only vendor that's unstable is Intel. There's no meaningful difference between Nvidia and AMD in stability.


  • Am I the only one that doesn't see the math making sense here? Or did they mean AMD lost 7% from a previous 19%?

    "This jump shaves 7% off of AMD's share, putting it down to 19% total,"

  • by DrMrLordX ( 559371 ) on Saturday June 08, 2024 @12:48AM (#64532365)

    Melting 4090s comin in hot!

    Seriously, the 12VHPWR connector debacle should have been enough for people to never want to buy NV again. But hey, if you wanna spend $2k+ on a card with a faulty power connector and then be blamed for not plugging it in correctly when it burns out, then be my guest.

    • by Luckyo ( 1726890 )

      The funny part is that people did plug it in incorrectly. As in, they didn't actually seat it all the way, so the contact surface was insufficient and the connector heated up and melted.

      Plug design did have inherent problems though; that was part of the problem. It was actually hard to seat it correctly in tight spaces, and you needed to make sure it was all the way down, which was hard because the connector didn't give a sufficient indication that it wasn't seated all the way in, like PCI-E power connectors do.

      • That connector literally doesn't even promise that it can handle any particular current level per pin. The spec says you have to do your own testing. Molex makes better connectors that are designed for high current, and if they'd use ultra fine stranded wire with silicone jacketing then it would turn corners just about as tightly as what they're doing now with only two conductors.
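
        Back-of-the-envelope with the nominal ratings (600 W across 12VHPWR's six 12 V pins, versus 150 W across an 8-pin's three):

        $$\frac{600\ \text{W}}{12\ \text{V} \times 6\ \text{pins}} \approx 8.3\ \text{A/pin} \qquad \text{vs.} \qquad \frac{150\ \text{W}}{12\ \text{V} \times 3\ \text{pins}} \approx 4.2\ \text{A/pin}$$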

        They're going to have to come up with some better way to handle delivering high power to GPUs than these systems with umpty-ump conductors.

        • Standard PCIe connectors get the job done. OEMs and board partners didn't like sacrificing PCB space to clusters of 3-4 connectors, so NV pushed 12VHPWR as an alternative. The only advantage it offers is a few bucks per card, which is nothing when you're charging more than $1k per unit.

          Shame on PCI SIG for going along with it.

          This reason is one of many not to buy Nvidia products!

          • Standard PCIe connectors get the job done.

            Nothing about the existing power connection standard is adequate. The standard PCI-E power connectors, used when the 75W provided by the x16 slot is inadequate, are the same crap Molex Mini-Fit Jr connectors that have proven themselves problematic even for supplying power to motherboards in the ATX standards. Molex literally does not provide a current rating for those connectors, and the spec tells you to do your own testing. The same number of conductors is used for power on both the 8-pin and 6-pin versions.

            • What EXACT problem have you had with 8-pin PCIe connectors? Be specific. Because from what you're saying already, it tells me you don't know much about them...

              • What EXACT problem have you had with 8-pin PCIe connectors? Be specific. Because from what you're saying already, it tells me you don't know much about them...

                From what you're telling me now, exactly one of us went looking for these spec sheets to find out what the specs were, and it's not you. Go to the Molex site, look up the spec sheets for the male and female Mini-Fit Jr connectors, and see what they specifically say about the per-pin current carrying capacity of the specific connectors used in the ATX spec; you'll see exactly what I mean. TL;DR: they don't promise that those connectors can carry the current specified by PCIe, especially the 8-pin version.

                • I'm not asking about spec sheets. When have 8-pin PCIe cables failed you? How did they fail? What do you personally dislike about them?

                  I already read the specs. The 8-pins are under-specced for constant power delivery: the cables and connectors can easily carry more than 150W, enabling them to handle transient spikes quite well. 12VHPWR, not so much. People generally don't have problems with 8-pins in practical use. They handle bends well. They aren't hard to seat properly. They don't tilt in the socket.

      • Plug design was the ENTIRE problem. Don't believe the lies and excuses. Multiple companies designed 90-degree connectors that were easier to seat, and most of those products have failed due to the faulty spec. CableMod had the best cable on the market, and they still had enough failures that they had to recall the entire product lineup rather than face angry customers. There's no way to make this spec work.

        If the average PC builder has been using PCIe connectors for years with relatively few incidents, but these new connectors keep failing, the problem is the spec.

        • by Luckyo ( 1726890 )

          I described all those relevant details above. Average PC builders didn't have problems with them. A small percentage of outliers (usually ones with a really tight fit in the case at the connector's location) did.

          • No, you did not. EVEN PROPERLY SEATED 12VHPWR CABLES CAN AND DO FAIL. There are far fewer "outliers" when it comes to PCIe connector failures.

            People are covering for NV and it's disgusting.

            • by Luckyo ( 1726890 )

              I'm sure you can find some actually faulty cables that failed.

              But the failure that was talked about was a very specific failure where the cable was not seated all the way in.

  • So how does it look for non-gamer users? I just built (upgraded) a system with a Ryzen 5 7600 (no APU; the non-Pro Ryzen APUs don't do ECC) and it's perfectly fine for video decoding and encoding. So many systems don't need discrete graphics; what's the market share of those?

    Also, ever since Nvidia started crapping on Linux (about 20 years ago), I've stayed away, and have rewarded AMD for their open source driver work, and my systems are rock solid.

    Aside from that, I fail to see why so many Nvidia supporters

  • Neither AMD nor nVidia supports HEVC 10-bit video with 4:2:2 chroma subsampling in hardware on anything they make. Guess what absolutely every contemporary mirrorless camera body outputs if you ask it to use its Log output format?

    Intel actually supports the files that come out of my cameras. I don't have to make any compromises to color grade my videos. I don't have to screw around with dummy files or external video recorders.

    Add to that the fact that gamers think Intel Arc is bad hardware and understand that it'

  • 88% of all GPU sales or 88% of cards destined for desktop/laptops?

    I have a pair of 2070s in my desktop, but that's because I train models with my desktop. Most individual consumers are just looking for gaming, so an AMD card is just as capable by that metric.

    Just how much of Nvidia's sales are driven by Crypto and AI datacentres?

  • It would be interesting if auxiliary hardware optimized for AI processing becomes physically discrete from auxiliary hardware optimized for graphics processing. I can imagine games where NPCs shut down the interestingness of their speech a little during scenes that require a lot of AI assistance with the rendering, kind of like how people do that same thing.
  • If you click through to the article referenced by the, uhm, article, it's clear they are specifically talking about dedicated desktop GPUs ("add-in boards", as they refer to them).

    That is not even sort of the whole of "GPU market share."

    The PS5, Steam Deck, Xbox, laptops . . . so many devices with GPUs, supplied by AMD and Nvidia, aren't being counted at all.

"May your future be limited only by your dreams." -- Christa McAuliffe

Working...