AMD Graphics Chips Could Last 10X To 100X Longer

An anonymous reader writes "According to a research report out of UCLA, released this morning, NVidia's high-lead bump packaging could last anywhere from 1/10th to 1/100th as long as AMD's advanced eutectic bump approach. (TG Daily has picked up the claim.) NVidia is currently in the midst of a $200M recall of bad GPUs, and the report suggests that the issue could be much deeper than NVidia's PR department would have us believe." The report lends credence to the strident claims of the Inquirer's Charlie Demerjian, which we discussed a month back.
  • Sweet! (Score:5, Funny)

    by blueturffan ( 867705 ) on Monday September 29, 2008 @06:04PM (#25199297)
    I'm all for longer life chips, but what are Grahiphics ?
    • Re:Sweet! (Score:5, Funny)

      by Anonymous Custard ( 587661 ) on Monday September 29, 2008 @06:06PM (#25199323) Homepage Journal

      I don't know, but they sound terhiphic.

    • Re:Sweet! (Score:4, Funny)

      by josteos ( 455905 ) on Monday September 29, 2008 @06:10PM (#25199359)

      Watching grammar nazis making fun of innocent mistakes just makes me sickick!

    • Re:Sweet! (Score:5, Insightful)

      by atari2600 ( 545988 ) on Monday September 29, 2008 @06:20PM (#25199467)

      I've had an ATI X1950 Pro for 3 years now and while the card works great, the newer games render it near obsolete. So yes, I can have a card forever but what good is that going to do me if I need to upgrade anyway?

      Resale value would suck, and why would anyone want to spend $50 on a 3-year-old card when they can get a 1-year-old "better" card for $90? (I pulled the numbers out of thin air, but you get the idea.)

      • Comment removed based on user account deletion
        • Re: (Score:2, Interesting)

          by Anonymous Coward
          While it's great that all your cards lasted so long, none of that is relevant. The eutectic/high-lead solder thing is really only hitting the post-G80 cards as far as anyone knows. G80 (8800GTX and first gen 8800GTS) and earlier used a different solder mix. The current Nvidia high-lead solders are failing at an unusual rate, which is what's being discussed.

          Thanks for your meaningless data, though!
        • Re: (Score:2, Interesting)

          by hamster_nz ( 656572 )

          Well, I just built a home NAS server, 2TB of disk, Gigabit NIC, S3 Virge PCI 2MB graphics... who could ask for anything more!

      • by billstewart ( 78916 ) on Monday September 29, 2008 @07:35PM (#25200065) Journal

        I'm skeptical about the report, especially given the lack of any accompanying field studies. The useful life of a piece of computer equipment is usually 3-5 years; for high-end graphics cards it's probably shorter, because the main customers are gamers who need cutting-edge performance to kill orcs with.

        So does "10-100 times longer" mean that significant fractions of nVidias are failing in 10 days - 3 months due to bad solder joints? Or does it mean that the solder joints in an ATI will last 30-300 years, long after anybody except a few retro gamers are interested in a graphics system that's mounted on a card in a separate box and doesn't interface directly to their optic nerves?

          It could just mean that, if failures occur along a normal distribution (which they probably do), the failure rate at each point in time is approximately 10-100x higher than for the ATI cards, which would be a Big Deal.

          Most companies offer at least a year-long warranty; if they have significant failures in that year, like 10-100x higher than normal, that may put too much pressure on their warranty policy.

          And let's not forget nVidia's partners in selling cards (you know, all the non-nVidia nVidia cards). Those people may see high failure rates of nVidia parts, and all of a sudden using another chipset just got a heckuva lot more attractive.

          So, the moral of the story is, there is no set 'time' that a card will die. It's not like after 10 months all of them will just conk out. But if there are higher failure rates than normal in their warranty period, not to mention harm done to their reputation, it could end up costing them greatly.
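
          Back-of-the-envelope, the warranty pressure is easy to see. A minimal sketch in Python, where only the 10-100x multiplier comes from the report; the baseline failure rate, unit cost, and volume are made-up illustrative numbers:

              # Hypothetical figures: only the 10x-100x multiplier comes from the report.
              baseline_fail_rate = 0.02   # assumed in-warranty failure rate
              unit_cost = 100.0           # assumed replacement cost per card, USD
              units_sold = 1_000_000      # assumed volume

              for multiplier in (1, 10, 100):
                  rate = min(1.0, baseline_fail_rate * multiplier)
                  cost = rate * units_sold * unit_cost
                  print(f"{multiplier:>3}x: {rate:6.1%} fail in warranty, ~${cost:,.0f} in replacements")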

          • by strstrep ( 879828 ) on Monday September 29, 2008 @10:20PM (#25201071)

            Typical failure models use an exponential distribution, rather than a Gaussian distribution, to model time-to-failure.

            • Re: (Score:2, Informative)

              Practically speaking, using an exponential distribution means this:

              If you can expect a card to last (on average) one year when it's new, then, given that it's N months old, you can still expect it to last one year. An exponential distribution has no history.

              It works surprisingly well.
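
              A minimal simulation of that memoryless property (Python; the one-year mean life is just the figure from the comment above):

                  import random

                  MEAN_LIFE = 1.0   # years; "expect a card to last one year"
                  N = 1_000_000

                  # Draw N exponential lifetimes with the given mean.
                  lifetimes = [random.expovariate(1.0 / MEAN_LIFE) for _ in range(N)]

                  # Remaining life of the cards that already survived 6 months:
                  survivors = [t - 0.5 for t in lifetimes if t > 0.5]

                  print(sum(lifetimes) / len(lifetimes))   # ~1.0 year
                  print(sum(survivors) / len(survivors))   # also ~1.0 year: no history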

          • Re: (Score:3, Interesting)

            I can confirm the putting-pressure-on-the-warranty part. Dell just ran out of replacement Nvidia cards for the D620. 15-day wait list if yours fails.

            • Warranties can be a significant selling point. The reason I pretty much just buy Seagate drives is that they're pretty much the only ones willing to give me a decent warranty. It's worth it to me to pay a couple dollars more for a drive with a 5-year warranty rather than just get a 3-year one.

              But really it isn't so much about warranty coverage as it is about what that really means. A company isn't going to set a warranty period that's so long that they're going to have to replace a significant numbe

        • obsolescence has nothing to do with the physical life-span of a video card. when a video card becomes obsolete depends primarily on the user and the application. what these articles are referring to is the physical life of the video card before it is expected to fail.

          and not everyone uses their computer primarily for gaming. outside of gaming, technical obsolescence does not occur so quickly. at the moment i'm using a 5-6 year old workstation at work and at home. i do graphic design and web development, and

      • Re: (Score:3, Interesting)

        by adolf ( 21054 )

        why would anyone want to spend $50 on a 3-year-old card when they can get a 1-year-old "better" card for $90

        Perhaps someone only has $60, and still wants to eat for the rest of the day.

        *shrug*

        Generally, your "insightful" rhetorical question is absurd, like this: Why would anyone want to spend $50,000 on a 3-year-old Corvette when they can get a better 1-year-old Corvette C6 Z06 for $90,000?

      • Re: (Score:3, Insightful)

        I think that the real killer is laptops. Virtually nobody ever upgrades a laptop's GPU (indeed, this is generally impossible), and the laptop's life is pretty much equivalent to the GPU's life, barring very expensive service.

        With desktop systems it isn't so bad; by the time the card dies, equivalent performance will be considerably cheaper (or the card will still be under warranty), and swapping it out will take maybe 10 minutes. With laptops, not so much.
      • I agree. This will not scare me away from an NVidia video card. That expensive laptop with NVidia chips, however, has me thinking twice. If something goes bad right after my warranty expires I'm in a difficult place. Warranted or not - all this talk of NVidia problems will leave me nervous. And I'm sure I'm not the only one. It will be bad for NVidia if this turns out to be true - and bad for NVidia if it doesn't.
        • by Knara ( 9377 )

          FWIW, I've been using a 9600GT for about a year now with no real problems. There was a thread back either here or on Ars asking people to chime in about their 8xxx/9xxx cards and if they'd had a problem with them, and it was a big thread of people saying they didn't have any issues. So... not sure if on a *practical* level this is really a huge issue for users.

          But, I'm not religiously devoted to either Nvidia or ATI (though historically I've had much better luck with Nvidia's driver software than ATI's).

      • by EzInKy ( 115248 )


        I've had an ATI X1950 Pro for 3 years now and while the card works great, the newer games render it near obsolete. So yes, I can have a card forever but what good is that going to do me if I need to upgrade anyway?

        Resale value would suck, and why would anyone want to spend $50 on a 3-year-old card when they can get a 1-year-old "better" card for $90? (I pulled the numbers out of thin air, but you get the idea.)

        Graphics cards have other uses besides games. I've spent over $100 a number of times for Radeon X850

      • by ponos ( 122721 )

        I've had an ATI X1950 Pro for 3 years now and while the card works great, the newer games render it near obsolete. So yes, I can have a card forever but what good is that going to do me if I need to upgrade anyway?

        The point is that if the expected life of the card is 5 years, then 5% will fail at 1 year, for example (this is a guess, assuming a certain variance between parts). If, however, the expected life of the card is 50 years, only 0.0001 will fail at 1 year. And I think we can agree that failure at 1 ye
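
        Under the simplest lifetime model, the exponential (not quite what the poster above assumed, which involved part-to-part variance, so treat this purely as an illustration), the effect of a longer mean life on first-year failures is easy to quantify in Python:

            import math

            def frac_failed_by(t_years, mean_life):
                """P(T <= t) for an exponential lifetime with the given mean."""
                return 1.0 - math.exp(-t_years / mean_life)

            for mean in (5, 50):
                print(f"mean life {mean:>2} yr: {frac_failed_by(1, mean):.1%} fail in year one")

            # mean life  5 yr: 18.1% fail in year one
            # mean life 50 yr: 2.0% fail in year one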

      • You may not care whether your current graphics card is obsolete after 3 years today but, with the combination of how powerful GPUs have become and the diminishing returns on further investment in graphics technology, in the near future your GPU will be good for much longer than 3 years. I'm not saying that this will happen tomorrow, but 3D graphics have come a long way since One Must Fall and Doom, and I suspect that within 5 to 10 years we will have hit a point where even the most insane graphics whore won't

    • by Perf ( 14203 )

      Bad case of hiccups while typing?

    • It's nort my jorb te tell yeahs, but praps ole Coach Z [homestarrunner.com] conterberted to da summrry.
  • by atari2600 ( 545988 ) on Monday September 29, 2008 @06:09PM (#25199347)

    I was going to Google for that word but then I realized that kdawson was involved.

  • Huh? (Score:2, Interesting)

    by TinFoilMan ( 1371973 )

    The report lends credence to the strident claims of the Enquirer's Charlie Demerjian

    As in National Enquirer?
    As in Real news?

    --
    Oh Well, Neutral Karma and all . . .

  • by lobiusmoop ( 305328 ) on Monday September 29, 2008 @06:23PM (#25199507) Homepage

    "Silicon doesn't wear out; microchips were effectively immortal. The Wig took notice of the fact. Like every other child of his age, however, he knew that silicon became obsolete, which was worse than wearing out"

    • by Detritus ( 11846 ) on Monday September 29, 2008 @07:12PM (#25199867) Homepage
      The silicon may not wear out but I've seen pictures from an electron microscope that show that the metal interconnects can deteriorate and fail. See electromigration [wikipedia.org].
      • by IorDMUX ( 870522 ) <mark.zimmerman3@gm a i l . c om> on Tuesday September 30, 2008 @12:36AM (#25201789) Homepage
        If you're building chips where electromigration is an issue within any half-reasonable time span, you're doing it wrong.
        • Care to explain? +5 interesting because you made a statement?

          Overclocking of processors, especially when using higher than nominal voltage, causes electromigration between their transistors and significantly shortens the chips' lifetime.

          GPUs tend to be overclocked frequently (hell, they sell overclocked GPUs as is). Defining a reasonable time span can be difficult given the speed of technology: you spend $100 on a card today, it breaks in 3 years, and if you want to buy another of that same card, it will most likely be $20.

          But to prove my point, electromigration happens, and IS an issue within ANY time span.
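
          The standard engineering model here is Black's equation, which ties an interconnect's median time-to-failure to current density and temperature. A sketch in Python; the constants are illustrative placeholders, not measured values for any GPU:

              import math

              K_B = 8.617e-5  # Boltzmann constant, eV/K

              def black_mttf(j, temp_k, a=1.0, n=2.0, e_a=0.8):
                  # Black's equation: MTTF = A * J**-n * exp(Ea / (kB * T)).
                  # a, n, and e_a are process-dependent; these values are merely plausible.
                  return a * j ** -n * math.exp(e_a / (K_B * temp_k))

              nominal = black_mttf(j=1.0, temp_k=350)        # stock clocks and volts
              overclocked = black_mttf(j=1.3, temp_k=370)    # more current, hotter die
              print(f"relative lifetime when overclocked: {overclocked / nominal:.2f}x")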

          • Re: (Score:3, Informative)

            by idontgno ( 624372 )

            But to prove my point, electromigration happens, and IS an issue within ANY time span.

            Your point is valid. On-point, even. But what makes Nvidia's alleged misdeeds significant is that electromigration, along with other factors, makes the interconnects in the 8xxx series GPUs fail in an unreasonably short time span. Without elaborate external mitigation strategies*, within warranty.

            And that's the other factor in the significance in this story: Nvidia is alleged** to have made a point of downplaying, denying

          • by IorDMUX ( 870522 )
            I design ICs for a major electronics design company, and we run every chip through myriad tests and analyses to make sure that electromigration is one of the least of a chip's concerns. Aside from the one-in-ten-million chance, any properly designed chip will fail long before electromigration comes into play, due to some other mechanism such as ESD or even (seriously) gamma-ray bombardment.

            Yes, electromigration does happen, but the glass in your windows is slowly migrating with gravity, as well. However you
  • ...but with all the issues that have come out of nVidia products recently, I think it may be about time to switch over to ATI (again).

    I actually started with ATI. Then I installed linux and started actually using it. I realized that the fglrx drivers just weren't worth the hassle, so my next card was nVidia, and I've stuck with them until now mainly for that reason. However, with the open source ATI drivers gaining support, it might be time to switch over.

    Writing this I've come to notice that the actual
    • I had an early ATI Radeon card (7xxx something I think), which had terrible driver support, but I lived with it (and subsequent headaches of bad driver errors, the old "uninstall BEFORE you install the new drivers" nightmare). Then I got a 9000 Pro which was OK, until it became obsolete. I switched to an nVidia FX 5700 LE ("Lame Edition") which had good driver support but was woefully underpowered. Then I went back to ATI (what can I say, I was trying to support a "Canadian company") for an x800, which was

      • by spoco2 ( 322835 ) on Monday September 29, 2008 @07:04PM (#25199813)

        I have an older 9550 ATI in one box and my more powerful machine has an 8800 nVidia in it.

        As far as large, clunky drivers go, ATI is king of the hill... their setup requires .NET to install, the bloated and resource-hungry Catalyst Control Center... it's ugly.

        nVidia, on the other hand, seems far more lightweight and fits in better with Windows.

        Performance-wise, though, I haven't really had anything to complain about, and I can't think of instances of actually cursing the drivers for not working...

        I've never been loyal to either, really; when it comes time to do an upgrade I research on the web what card people say gives the best bang for the buck in my price range. I don't give a hoot who makes it.

        • At least the 9550 is old enough that you can use the 5.9 drivers (the last ones before the .NET based "what were they thinking?" trainwreck of a driver). I have a box with a 9600Pro in it, and I use what is now a really old version of the driver and I have no problems with it. But as always, YMMV.

        • by minvaren ( 854254 ) on Monday September 29, 2008 @09:14PM (#25200705)
          How to fix bloated ATI drivers:

          1] Download full CCC installer.
          2] Don't install the CCC, just the drivers.
          3] Download a copy of ATI Tray Tools [guru3d.com].
          • by spoco2 ( 322835 )

            I'm going to give ATI Tray Tools a whirl.

            I'm not sure why your steps say "Download full CCC installer"... why not just download the display driver only? They always give the option. The full CCC doesn't even give temp readings from the card, so I'm keen to give Tray Tools a go; this being a media PC in my entertainment unit, I like to keep things quiet, but in doing so you sometimes have to keep an eye on temps.

    • by kesuki ( 321456 )

      I've been a fan of ATI for a long time. There was a time when I tried out nvidia, but I picked the wrong company, and it tainted my feelings about nvidia's approach of letting other people make the cards while they make the chips. Since then I've been using ATI in everything except systems where price was more of a factor. But given the news against nvidia, I probably won't be building any nvidia systems ever again.

      Then again, I haven't had many people have me build them systems lately, and it's not just

    • Give it another 6 months; the 'almost latest' radeon and mesa in 8.10 can just about support kde4 composition (and I'd guess compiz) with a few problems, but the advances made in the last year mean I'm fairly confident that by 9.04 composting will be fully supported and by the time I leave university I'll be able to game on the opensource drivers.

      • but the advances made in the last year mean I'm fairly confident that by 9.04 composting will be fully supported and by the time I leave university I'll be able to game on the opensource drivers.

        I don't know what kind of advances they are making, but lots of those so-called green folk would love to have an open source system that also composts... now that is what I call an eco-system...

        Maybe they found a use for all those old floppy drives.

  • More data please! (Score:4, Interesting)

    by unix_geek_512 ( 810627 ) on Monday September 29, 2008 @06:29PM (#25199563)

    What does 1/10th and 1/100th actually mean in standard solar days?

    Can someone please provide a plot of the various solders and their performance vs. temperature and time?

    I would like to see the plots for:

    90Pb10Sn
    60Pb40Sn
    97Sn2.5Ag0.5Cu
    99.3Sn0.7Cu
    96Sn4Ag
    99.25Sn0.75Cu

    What is the risk associated with Tin? Especially Tin whiskers.

    What kind of solders does the slashdot community use?

    • by TheRealMindChild ( 743925 ) on Monday September 29, 2008 @06:38PM (#25199623) Homepage Journal
      Solder? More like Duct Tape or it ain't worth saving.
    • Re:More data please! (Score:4, Informative)

      by dr2chase ( 653338 ) on Monday September 29, 2008 @06:47PM (#25199677) Homepage

      I have an old (circa 1980) roll of rosin-core eutectic solder that I don't use any more because of the lead.

      I also recently purchased some MG Chemicals 4900-112G [mgchemicals.com]; it is 96.3Sn, 0.7Cu, 3Ag, with a "no clean" flux. It works OK with my old soldering iron, flows nicely, no idea how it does with tin whiskers. I'm not getting a lot of trouble with cold joints, and I do push my luck (lots of free-hand work, for instance, in-place soldering of LEDs for under-cabinet lights).

      • Re: (Score:1, Insightful)

        by Anonymous Coward

        I've been doing occasional hand soldering for years; I work as an electronics designer. I was worried about lead, got my blood levels checked a few times, no problem, and have now given up. Unless you chew on the stuff as you work, my statistical sample of 1 suggests you're probably OK. (Wash your hands before lunch.) I'd be more worried about the fumes from the fluxes...

        • by Kneo24 ( 688412 )

          Flux fumes really aren't that bad, unless you're working with excessive amounts of it.

          In any case, you're correct in that all you need to do with the lead solder is to make sure you wash your hands before you eat.

          • Not using lead costs me roughly nothing, and I don't have to take special precautions installing electronics and working in a kitchen above and beyond gathering up stray bits of solder. The way I see it, unless I have a specific reason to worry about "tin whiskers", the old ways are not better.

            I do miss the smell of rosin flux :-)

            • by Kneo24 ( 688412 )

              I never said the old way was better; I was merely agreeing with the AC that you needn't worry about lead-based solder if you take one careful precaution.

              Even then, the amount of lead you could potentially ingest wouldn't harm most people. I work with electronics every day and I see people not washing their hands and handling their food with dirty hands. I've seen these people do this for years, and not a single one of them has had lead poisoning. There probably are some people out there that are more sen

    • by BACPro ( 206388 )

      50/50.
      Paste flux.
      Lots of hot water to wash with...

    • Re: (Score:3, Informative)

      I use 70/30 and 60/40 both at home and at work. I have some 70/30 circuits I soldered up in 1974 that are still working.
      High-tin solders are harder to work with: they don't flow as easily and don't seem willing to bead up on a pad, so if you try to self-locate a small package -- a BGA or LLP -- using solder, it won't: it'll just bridge all over the place. Thankfully, at work we provide engineering samples, not commercial stuff, so we don't have to worry about RoHS and can keep using leaded solder.

  • Old (Score:5, Insightful)

    by sexconker ( 1179573 ) on Monday September 29, 2008 @06:33PM (#25199585)

    Nvidia has already switched away from high-lead solder.

    The Inquirer reported on this whole fiasco.

    People shit on The Inquirer a lot, but there are 3 awesome things about that site:

    Their writers do not sign NDAs.

    They have writers all over the world - not someone they send out, but people who live there.

    Their writers intimately know people in the industry - from the people up top to the people at the factory floor.

    • Nvidia has already switched away from high-lead solder.

      So, should I plan on buying a new card soon, when my solder snaps, or do I have a good one? How do I find out? Is this why my 7600 gave up the ghost so quickly? Have I already been a victim of this problem?

    • Re:Old (Score:4, Funny)

      by Babbster ( 107076 ) <aaronbabb&gmail,com> on Monday September 29, 2008 @07:40PM (#25200103) Homepage

      Their writers intimately know people in the industry - from the people up top to the people at the factory floor.

      Isn't screwing one's sources against the journalistic ethic?

      Try the veal!

      • Isn't screwing one's sources against the journalistic ethic?

        I can be intimate without screwing, you insensitive clod!

    • by aeoo ( 568706 )

      Ahhh, I've always loved The Inquirer and I used to love The Register when Mike Magee was there.

      Dang, I looked it up, looks like Mike Magee's new site is itexaminer.com. I guess I have to check it out. :) I just like Mike's approach.

  • Whatever. (Score:2, Informative)

    by snarfies ( 115214 )

    I have tried three different ATI cards - all three ATI cards up and died on me within one year of usage, and I don't even play any graphics-heavy FPS games. After my most recent ATI card croaked (A Radeon X550, I think) I switched to Nvidia. So far my current Nvidia card has outlasted any ATI cards I've ever owned.

    • Hey unless my SLI nVidia cards die before I upgrade in another 6 months to a year, who cares? I didn't read TFA, but if we're talking 1 year vs 100 years, I couldn't care less. They work, they're fast and stable.
    • by Keill ( 920526 )

      Maybe the make/brand matters?

      I have a 4-year-old Sapphire 9800 Pro still going strong in this computer (well, as strong as it gets; it's woefully underpowered now).

    • "I HAVE BEEN A LOYAL USER OF BRAND X, BUT ALL OF BRAND X'S CHIPS SPONTANEOUSLY COMBUSTED WITHIN 24 HOURS. I BOUGHT A BRAND Y CHIP AND I HAVE BEEN RUNNING IT FOR AT LEAST 100 YEARS WITHOUT ANY PROBLEMS"

      Do you realize how many people spout this exact same thing on both sides? It's a completely meaningless, stupid metric because 1. graphics cards don't usually just die and 2. changes between generations are so massive that even if one had a known failure due to heat or something it would be accounted for in th

  • by Somegeek ( 624100 ) on Monday September 29, 2008 @06:42PM (#25199655)

    This study does NOT specifically address or study AMD's or NVidia's chips.

    It does not specifically address or test the exact chemical makeup of chips belonging to AMD or NVidia.

    The conclusions being drawn as to the relative life spans of those manufacturers' chips appear to belong strictly to the bloggers who want a big headline, and not to the authors of the study. The study authors specifically note that in order to determine the life span of real chips, the real chips in question should be studied. Quote:

    "For life-time prediction, the real microstructure of these two kinds of flip chip solder joint should be studied and actual failure rate should be measured. "

    The study states that they are ignoring various factors that would come into play in the real world in order to simplify the study, and that they are making a number of assumptions about various testing conditions and about the makeup of the materials themselves.

    From reading the study linked, it's not even clear to me that they actually tested anything, and it appears from their wording to be only a theoretical exercise.

    In no way should the results of this study be used to state that brand X's chips will have a longer lifespan than brand Z's chips.

  • Yes, you too can extend your chip 10 to 100 times for the low price of $39.95!
  • Fans? (Score:4, Insightful)

    by Twigmon ( 1095941 ) on Monday September 29, 2008 @06:52PM (#25199719) Homepage

    The chips on my cards have always outlasted the fans on my cards. I have owned both nVidia and ATI cards.

    Just because the chip - or at least, one aspect of the chip - *could* last longer doesn't mean the card will.

    • Re: (Score:3, Informative)

      by argent ( 18001 )

      Fans can be replaced. I have replaced the fans on video cards on a number of occasions.

  • Solid state devices should last decades.

  • by Profane MuthaFucka ( 574406 ) <busheatskok@gmail.com> on Monday September 29, 2008 @07:34PM (#25200059) Homepage Journal

    From TFA:

    "£GDl/h' = 13.5/10. Clearly the strain in the eutectic SnPb layer (in the composite solder joint) is about one order of magnitude higher than that in the homogeneous eutectic SnPb solder joint."

    What fucking dipshits! I can't believe those morons at NVidia didn't realize this. Any judge is going to take one look at this in a class-action lawsuit, and NVidia is HISTORY, man!

  • by FurtiveGlancer ( 1274746 ) <AdHocTechGuy&aol,com> on Monday September 29, 2008 @08:27PM (#25200387) Journal
    Now I'll have to seriously consider switching to unleaded.
  • Hi, I have a laptop under warranty with an nvidia 6200M GS. It might be crashing when it gets hot. Yesterday it locked up while I was playing the war3 demo in wine, which I have done since ubuntu 5.10 without problem on another computer. When it is hot, sometimes I will see what looks like a mesh of fabric on the LCD. This fabric is grey with a black background.

    What do I say to Acer? Do I act dumb, or say I know about the GPUs with low-temperature solder balls that will all fail, and that mine is going? Do I

  • by Raynor ( 925006 ) on Monday September 29, 2008 @08:48PM (#25200501) Journal
    "Since the plastic energy produced in the eutectic SnPb layer in the composite joint is about 100 times larger than that in the homogeneous eutectic SnPb joint, we expect the cycle times needs to fail the latter will be 100 times longer. But the above model is ideal, nevertheless it is reasonable to say that it is at the least 10 times longer. " A) They didn't test ANYTHING. B) They admit their ideal model is probably not realistic, and actual difference could be much less C) They admit this does not mean actual life-time of products and explicitly suggest testing in that regard. D) /. needs to stop posting sensationalist crap like this. The article is interesting, the title is bullshit. If I want sensationalism I will turn on my TV.
  • Comment removed (Score:3, Informative)

    by account_deleted ( 4530225 ) on Monday September 29, 2008 @08:56PM (#25200547)
    Comment removed based on user account deletion
  • After the dive AMD took today in the markets, they better be long lasting, because it's going to be hard to find replacements.
  • by Dr. Mu ( 603661 ) on Monday September 29, 2008 @10:20PM (#25201069)

    With the lead content, they're not RoHS-compliant.

    • Re: (Score:3, Informative)

      In the RoHS directive there is an exception for high-lead solder used in flip-chip style packages.

      TFA mentions a mixed scenario where you mix high-lead and eutectic solder. I'm not sure if that is exempt, and also not sure that this combination has been used by nVidia.

  • What recall? (Score:3, Interesting)

    by celest ( 100606 ) <mekki@@@mekki...ca> on Tuesday September 30, 2008 @09:36AM (#25204421) Homepage

    "NVidia is currently in the midst of a $200M recall of bad GPUs"

    Last I checked, they reserved $200M on their financial sheets in case they needed to deal with the chips. I've heard nothing about an official recall; the only thing I can find is a lot of angry resellers who are demanding one.

    Correct me if I'm wrong?
