Microsoft Bug Operating Systems Software Windows

NVIDIA's Drivers Caused 28.8% Of Vista Crashes In 2007

PaisteUser tips us to an Ars Technica report discussing how 28.8% of Vista's crashes over a period in 2007 were due to faulty NVIDIA drivers. The information comes out of the 158 pages of Microsoft emails that were handed over at the request of a judge in the Vista-capable lawsuit. NVIDIA has already faced a class-action lawsuit over the drivers. From Ars Technica: "NVIDIA had significant problems when it came time to transition its shiny, new G80 architecture from Windows XP to Windows Vista. The company's first G80-compatible Vista driver ended up being delayed from December to the end of January, and even then was available only as a beta download. In this case, full compatibility and stability did not come quickly, and the Internet is scattered with reports detailing graphics driver issues when using G80 processors for the entirety of 2007. There was always a question, however, of whether or not the problems were really that bad, or if reporting bias was painting a more negative picture of the current situation than what was actually occurring."
  • Not surprised (Score:2, Insightful)

    by JamesRose ( 1062530 ) on Friday March 28, 2008 @08:18AM (#22892262)
    The Linux drivers for NVIDIA suck too; NVIDIA clearly takes a long time to get up to speed on new operating systems, and it's one reason I no longer use them. Having said that, they're pretty damn solid, so it's most likely because Vista is so mucked up when it comes to drivers.
  • The ow starts now (Score:4, Insightful)

    by Goffee71 ( 628501 ) on Friday March 28, 2008 @08:22AM (#22892290) Homepage
    This is descending below lawsuit territory; I'm starting to think that the whole PC hardware industry should be taken out back and shot. They supported MS in releasing an OS on crap, under-powered hardware, with smiles and big adverts, in full knowledge that these systems would never work or just were not ready for Vista.

    "The Wow Starts, oh, around 2009, if you'll just let us fix this, upgrade that, and force you to buy some new stuff" should have been the tagline for Vista.
  • by Chutulu ( 982382 ) on Friday March 28, 2008 @08:25AM (#22892318)
    Ever tried installing ATI drivers in Linux???
  • Re:Not surprised (Score:5, Insightful)

    by morgan_greywolf ( 835522 ) * on Friday March 28, 2008 @08:31AM (#22892366) Homepage Journal

    The Linux drivers for NVIDIA suck too; NVIDIA clearly takes a long time to get up to speed on new operating systems, and it's one reason I no longer use them. Having said that, they're pretty damn solid, so it's most likely because Vista is so mucked up when it comes to drivers.
    Well, in my experience (not trolling), they have historically sucked somewhat less than the ATI drivers, which have been known to cause freezes when switching to a console and the like, due to bugs in the driver, the firmware, AMD processors (ironically enough), various chipsets, and all sorts of other things.

    The problem is that in the race to produce the biggest, baddest, fastest video cards for gamers, ATI/AMD and NVIDIA have often sacrificed stability for performance. I don't know about you, but I'd gladly trade off a couple of FPS for a card that was rock-solid stable.
  • by pdusen ( 1146399 ) on Friday March 28, 2008 @08:38AM (#22892410) Journal
    Even if that is true, what the hell does that have to do with the topic?
  • by BlowHole666 ( 1152399 ) on Friday March 28, 2008 @08:48AM (#22892488)
    Well, if NVIDIA is the only one with MAJOR driver problems... let's look at the math. If 80% of the drivers built with the DDK work, while the 20% that don't (including NVIDIA's) were built with the same DDK, then I would think the 20% simply didn't write their drivers correctly.
  • by Anonymous Coward on Friday March 28, 2008 @08:50AM (#22892504)
    I assume the drivers for such a critical component were officially 'certified' by Microsoft. In that case, it's not NVIDIA's fault alone, and Microsoft should also be jointly accountable for the problem, since such certified drivers are supposed to be thoroughly tested by MS.

    /That, or Windows should just stop warning users when they install uncertified drivers, since the warning doesn't really mean anything either way.
  • Re:Not surprised (Score:4, Insightful)

    by CastrTroy ( 595695 ) on Friday March 28, 2008 @09:06AM (#22892620)
    Which is why, on my Linux box, I prefer having an Intel video card. I don't do much (if any) gaming on it, so graphics don't really matter too much to me. I would rather have something that is really stable than something that gets me 400 FPS (when the refresh rate is only 60-100 Hz).
  • Re:Certified (Score:5, Insightful)

    by Shados ( 741919 ) on Friday March 28, 2008 @09:19AM (#22892720)
    Indeed. And it's why you had to see internal emails to know that MS was saying it was NVIDIA's fault. Considering that any time Windows crashes, MS gets the blame (even though a significant amount of the time it's not directly Windows' fault... Creative, I'm looking at you), if they felt it REALLY wasn't their fault, they would have said so reeeeally quick.

    If they didn't, it's partly because they took the blame, as they should.
  • What's comical for Microsoft is that they went and changed the driver model for everyone in their new OS, and then blamed the resulting bugfest from that imposed change on all of their business partners. Way to go, Microsoft! You guys are a bunch of class acts!
  • Re:O RLY? (Score:2, Insightful)

    by pestie ( 141370 ) on Friday March 28, 2008 @09:40AM (#22892910)
    That's not how I remember it, actually. In the early/mid 90's I worked with a bunch of machines that had ATI Mach32/Mach64-based cards, and those things were great! They gave pretty much flawless, blazing-fast 2D performance. Of course, if you're talking about 1995 - 1997 or so, when 3D became a big deal (the era of 3Dfx Voodoo cards, etc.), that I'm not so sure about. For some reason I kept getting stuck with crappy machines that had atrocities like Trident video chipsets. Don't even get me started on how much I hate Trident.
  • Re:I'm relieved (Score:4, Insightful)

    by sm62704 ( 957197 ) on Friday March 28, 2008 @09:44AM (#22892950) Journal
    Which raises the question of why the people they licensed it from demand that it not be disclosed. What are they ashamed of?
  • Re:I'm relieved (Score:5, Insightful)

    by kebes ( 861706 ) on Friday March 28, 2008 @09:58AM (#22893112) Journal

    I also wonder why closed-source vendors don't open their code. They don't have to release it under the GPL; they can retain all their copyrights and just publish the source. How could it hurt them? They retain copyrights and presumably patents, so it's not like anyone could copy them.
    Only the companies know for sure why they keep it closed-source, but explanations that have been suggested at various times include:
    1. The drivers contain code licensed from third-parties, such that opening the source would require extensive contracts, negotiations, and more licensing. Probably most of these third-party software vendors won't agree to have their code opened for the same reasons that all closed-source companies keep their source closed.
    2. Modern video cards (and other hardware too, probably) contain a surprising amount of their logic and "acceleration magic" in the driver. The card itself, though dedicated to a particular hardware task, is quite general and thus the code controlling the card contains many of the important 'tricks' to get good performance. (In fact I've been told that the difference between some cards and higher models is only in the driver.) In such cases, releasing the software code would be like releasing the hardware circuit diagram: it would reveal many of their trade secrets (some of which may be patent-protected, others not).
    3. Even if it would be illegal, some people would modify and redistribute the code. Hobbyist hackers would alter the code and recompile. This might allow end-users to bypass restrictions on the card, enable other features (effectively upgrade the card by bypassing lockouts), and so on. This makes lock-in harder, and might reduce the frequency that people upgrade their hardware.
    4. Their code, in all likelihood, violates a large number of competitor patents. As long as the violations are buried inside a binary, no one will notice. Opening the code would make it easy for a competitor to spot violations and sue. Probably all the companies violate each other's hardware and software patents, but they maintain an uneasy balance by all being secretive. If one company released too much information, the others would use it against them.
    5. The company may worry about other liabilities that they become exposed to when users and competitors can peruse the codebase.

    As I said, only the companies know for sure. But there are plenty of plausible reasons for why a hardware company wouldn't want to release driver source code. They are not great reasons (many of us would be more willing to buy the hardware if it had more documentation and/or open code), but they make business sense.
  • by ThirdPrize ( 938147 ) on Friday March 28, 2008 @10:00AM (#22893134) Homepage
    Yes, because we could do so much better than the NVIDIA engineers who designed the chips.
  • Re:Not surprised (Score:5, Insightful)

    by MrNemesis ( 587188 ) on Friday March 28, 2008 @10:01AM (#22893144) Homepage Journal
    Someone has already pointed out that if you want a rock-solid stable video card under Linux, buy a board with an Intel G965 or G33/35 chipset, so I won't make that argument (although I will say the drivers aren't completely rock solid and lack many of the options I'm used to with the nVidia driver, like OGL vsync to stop "tearing" when I play full screen video).

    However, I will say that ATI's Linux drivers have come on leaps and bounds since AMD took the helm. They're still sucky, but they're now only about twice as sucky as nVidia's, as opposed to the binary equivalent of disemboweling yourself with a grapefruit spoon. The fact that, thanks to AMD publishing the specs for the silicon, a fully OSS, clean-room, accelerated driver is now possible is also a colossal boon, and I suspect that within a few months the RadeonHD driver will be featureful and stable enough to be more than adequate for most people, once the distros start picking up on it.

    Then, of course, it'd be nice if someone could write a way of accelerating video so that all us Linux users without eleventy billion jiggahurtz processors could play back 1080p H.264...
  • by a_nonamiss ( 743253 ) on Friday March 28, 2008 @10:11AM (#22893240)
    Let's see... 1,000,000 knowledgeable geeks vs a couple dozen at nVidia... Yeah, I'd say we could.

    They might have more direct knowledge of the hardware, but there is strength in numbers.
  • by morcego ( 260031 ) on Friday March 28, 2008 @10:35AM (#22893480)

    Let's see... 1,000,000 knowledgeable geeks vs a couple dozen at nVidia... Yeah, I'd say we could.


    So, you are saying 1,000,000 knowledgeable geeks would be working to fix the driver? Talk about wishful thinking.

    I say 10, at most.
  • Well... (Score:5, Insightful)

    by ledow ( 319597 ) on Friday March 28, 2008 @10:37AM (#22893494) Homepage
    NVIDIA vs ATI drivers - I don't really care.
    "It worked for me" - I don't really care.
    Statistics on the cause of crashes - I don't really care.
    Anybody running unsigned drivers and experiencing crashes - I don't really care.

    Hang on. Let me explain.

    The fact that you can STILL crash a Windows machine with a dodgy driver - that I care about. I thought everything was supposed to be userspace. I thought the error-handling was supposed to be better. I thought that Windows was supposed to be more stable and secure. I thought people who were using signed drivers were supposed to be "approved" and relatively crash-free.

    Unsigned drivers? You can't support them no matter who you are, unless you're confident they are PURE userspace - they could be doing anything (like the 3DFX drivers that used to open up access to all sorts of things they shouldn't, in order for a primitive user-space component to actually drive the hardware). That's why you have to click that "CONTINUE Anyway" button with the dire warning. That's the Windows equivalent of kernel tainting. Once you've done that, nobody cares. The fact that most XP drivers are still uncertified is a bit of a problem, but I can understand the reasons why. But you can't blame MS for crashes in uncertified drivers under XP. I thought Vista was supposed to be different, though.

    If a certified driver is crashing that often, then you have an entirely different matter. The certification effectively becomes worthless. Nobody trusts it. Therefore every driver manufacturer ignores certification and just tells users to click "Continue". Then you will have nothing BUT uncertified drivers. Catch-22.

    Blue screens should not happen. They certainly shouldn't happen often enough that people have coined the term "blue-screen" or BSOD to mean a crash. When they DO happen, when the driver goes absolutely nuts and starts stomping memory, aren't things like DEP and the user-space driver model supposed to STOP that happening and recover in some half-decent fashion? Or shouldn't the machine at least report what the cause was and provide the user with some hint of what went wrong (i.e. "You installed an uncertified driver. Tough.")?

    Let's compare for a second - Linux kernels crash too. They crash much more often if third-party drivers are installed, and nobody really cares about that except the third party and their users. When they do crash, there's not much you can do, but most of the time you'll get all sorts of debugging information and usually you can carry on. You might lose X, which may or may not load up again - I have a laptop that likes to crash X if I run more than one copy of Xine at a time, but the worst that happens is X dies, restarts, and then carries on working for hours/days/weeks as if nothing had happened (and yes, I need to update the kernel/X on that machine!), and things keep on working as best they can. You can do pretty much what you like in terms of software, but the worst that'll happen, if you're not actually loading a kernel module, patching a kernel, or playing with kernel-level features, is a software crash that chucks you back to the command line. Sometimes you might even end up taking out X, like in my example above.

    You can rip out the hard drive and *make* the kernel crash, but most of the time things will carry on, just without the component you ripped out (i.e. the IDE layer may die, but it'll still keep running as best it can without it). Even when Linux comes to a complete halt and freezes, you have debugging information and logs with which to narrow down the cause yourself, without needing to consult Linus himself.

    When Windows crashes (even with certified drivers and clean installs), there's bugger all to go on. Half the time the event log doesn't show anything at all. The second you see a blue screen, the computer is down and there's little arguing. There's zero information to go on. You have no idea what caused the crash at all, because usually all you get is a generic STOP error and a hex code.
  • Re:Not surprised (Score:3, Insightful)

    by geekoid ( 135745 ) <dadinportlandNO@SPAMyahoo.com> on Friday March 28, 2008 @10:40AM (#22893522) Homepage Journal
    That's fine; then it's not about your use.

    Your comment regarding FPS shows a little bit of ignorance about why having a high MAX FPS is important.

    When you have 60 people with effects going off all over the place, that 400 FPS suddenly becomes 60 FPS, which is what you want. 30 FPS looks a little choppy, an effect of page flipping. (A rough sketch of that arithmetic follows this comment.)

    For the record, I haven't had stability problems with nVidia for over 10 years.

    As for this report, let's not forget MS didn't give final specs to many companies until they were very close to releasing. And it seems they released earlier than some people in MS wanted to.

    Not excusing nVidia, just pointing out that it's a little more complicated than "nVidia screwed up our perfectly stable release of the greatest OS ever! (Don't forget Windows 7 is coming)"
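
    A minimal back-of-the-envelope sketch of that headroom argument, in Python, using made-up numbers (the ~6.7x slowdown factor is purely illustrative; it comes from neither the article nor the comment):

        # Sketch: a busy scene with many players and effects multiplies the
        # per-frame render time, so a card's FPS on a quiet scene is headroom
        # that gets eaten under load.

        def loaded_fps(idle_fps: float, slowdown: float) -> float:
            """FPS once a heavy scene makes each frame 'slowdown' times costlier."""
            frame_time = (1.0 / idle_fps) * slowdown   # seconds per frame under load
            return 1.0 / frame_time

        # Purely illustrative: assume a 60-player firefight slows rendering ~6.7x.
        print(round(loaded_fps(400, 6.7)))   # ~60 FPS: still keeps up with a 60 Hz display
        print(round(loaded_fps(100, 6.7)))   # ~15 FPS: noticeably choppy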
     
  • by gallwapa ( 909389 ) on Friday March 28, 2008 @10:54AM (#22893680) Homepage
    So let me get this straight: when X crashes you lose your current session, right? Which means that OOo document you were working on just went "poof"; your media player shuts down, along with all your other apps running within the context of the X session.

    Now, your uber OS may have stayed "on" in that it could reload all that crap without having to spend 20 seconds rebooting, but for all intents and purposes, from a user perspective, your whole OS just freaking crashed.
  • by Akatosh ( 80189 ) on Friday March 28, 2008 @10:55AM (#22893694) Homepage
    Plus the quality assurance team of 999,990.
  • by pxuongl ( 758399 ) on Friday March 28, 2008 @11:38AM (#22894198)
    just a million monkeys tapping away at a million typewriters...
  • Point Missed. (Score:2, Insightful)

    by pleappleappleap ( 1182301 ) on Friday March 28, 2008 @12:55PM (#22895188) Homepage
    All of your applications probably still crashed.
  • by kylef ( 196302 ) on Friday March 28, 2008 @03:01PM (#22896930)

    Most of these driver incompatibilities were actually caused by Microsoft changing the driver structure at the last minute, which basically shot a lot of the manufacturers in the foot at the starting line.

    Actually Microsoft had been talking to the graphics IHVs about the new Longhorn "Advanced Driver Model" as early as spring 2005. Both ATI and nVidia had representatives (i.e., developers) working closely with Redmond during that time. The Longhorn/Vista display model became known as "WDDM" and was more or less locked down, from what I understand, by late 2005. By the time of WinHEC 2006 (April), they were already talking about WDDM 2.0, as you can see from this presentation [microsoft.com]. If you take a look at the slide deck, ATI's Tim Kelley actually delivered part of the presentation on WDDM 2.0.

    Frankly, I don't think nVidia invested enough energy in making high-quality Vista drivers in time for launch. They had approximately a full year of Betas, the same time that ATI and Intel had. The Vista Beta and RC programs had hundreds of thousands of users around the world, for which Microsoft collected crash dump data (which is the same type of data mentioned in this article, collected BEFORE launch). Yet even with this time, and the user crash dump reports, clearly by launch in January 2007 nVidia still wasn't ready with robust drivers.

    The evidence here really does point at nVidia, no matter how much you want it to point at Microsoft.

  • Re:MOD PARENT DOWN (Score:4, Insightful)

    by PitaBred ( 632671 ) <slashdot&pitabred,dyndns,org> on Friday March 28, 2008 @03:53PM (#22897842) Homepage
    What's that "oldid" part of that URL? I don't see that in most Wikipedia links I follow... You wouldn't be trying to pull a fast one by noting that Wikipedia has been defaced in the past (and quickly fixed [wikipedia.org] - it was up a whole 15 minutes), yet trotting it out as an operational fact, would you? Because that would be dishonest, and I wouldn't think an AC would be dishonest.
