AMD Graphics Chips Could Last 10X To 100X Longer
An anonymous reader writes "According to a research report out of UCLA, released this morning, NVidia's high-lead bump packaging could last anywhere from 1/10th to 1/100th as long as AMD's advanced eutectic bump approach. (TG Daily has picked up the claim.) NVidia is currently in the midst of a $200M recall of bad GPUs, and the report suggests that the issue could be much deeper than NVidia's PR department would have us believe." The report lends credence to the strident claims of the Inquirer's Charlie Demerjian, which we discussed a month back.
Re:Sweet! (Score:5, Insightful)
I've had an ATI X1950 Pro for 3 years now, and while the card works great, the newer games render it nearly obsolete. So yes, I can have a card forever, but what good is that going to do me if I need to upgrade anyway?
Resale value would suck, and why would anyone want to spend $50 on a 3-year-old card when they can get a 1-year-old "better" card for $90? (I pulled the numbers out of thin air, but you get the idea.)
Old (Score:5, Insightful)
Nvidia has already switched away from high-lead solder.
The Inquirer reported on this whole fiasco.
People shit on The Inquirer a lot, but there are 3 awesome things about that site:
Their writers do not sign NDAs.
They have writers all over the world - not someone they send out, but people who live there.
Their writers intimately know people in the industry - from the people up top to the people at the factory floor.
Study does not relate to AMD vs NVidia (Score:5, Insightful)
This study does NOT specifically address or study AMD's or NVidia's chips.
It does not specifically address or test the exact chemical makeup of chips belonging to AMD or NVidia.
The conclusions being drawn as to the relative life spans of those manufacturers' chips appear to belong strictly to the bloggers who want a big headline, not to the authors of the study. The study authors specifically note that in order to determine the life span of real chips, the real chips in question should be studied. Quote:
"For life-time prediction, the real microstructure of these two kinds of flip chip solder joint should be studied and actual failure rate should be measured. "
The study states that they are ignoring various factors that would come into play in the real world in order to simplify the study, and that they are making a number of assumptions about various testing conditions and about the makeup of the materials themselves.
From reading the study linked, it's not even clear to me that they actually tested anything, and it appears from their wording to be only a theoretical exercise.
In no way should the results of this study be used to state that brand X's chips will have a longer lifespan than brand Z's chips.
Fans? (Score:4, Insightful)
The chips on my cards have always outlasted the fans on my cards. I have owned both nVidia and ATI cards.
Just because the chip - or at least one aspect of the chip - *could* last longer doesn't mean the card will.
Re:Down to the drivers (Score:5, Insightful)
I have an older 9550 ATI in one box and my more powerful machine has an 8800 nVidia in it.
As far as large, clunky drivers go, ATI is king of the hill... their setup that requires .NET to install, the bloated and resource-hungry Catalyst Control Center... it's ugly.
nVidia, on the other hand, seems to be far more lightweight and fits in better with Windows.
Performance-wise, though, I haven't really had anything to complain about with either, and I can't think of instances of actually cursing the drivers for not working...
I've never been loyal to either, really. When it comes time to do an upgrade, I research on the web what card people are saying gives the best bang for the buck in my price range; I don't give a hoot who makes it.
Re:More data please! (Score:1, Insightful)
I've been doing occasional hand soldering for years; I work as an electronics designer. I was worried about lead, got my blood levels checked a few times, no problem, and have now given up worrying. Unless you chew on the stuff as you work, my statistical sample of 1 suggests you're probably OK. (Wash your hands before lunch.) I'd be more worried about the fumes from the fluxes...
So Do nVidias last 3 months, or ATIs 30-300 years? (Score:5, Insightful)
I'm skeptical about the report, especially given the lack of any field studies to back it up. The useful life of a piece of computer equipment is usually 3-5 years; for high-end graphics cards it's probably shorter, because the main customers are gamers who need cutting-edge performance to kill orcs with.
So does "10-100 times longer" mean that significant fractions of nVidias are failing in 10 days - 3 months due to bad solder joints? Or does it mean that the solder joints in an ATI will last 30-300 years, long after anybody except a few retro gamers are interested in a graphics system that's mounted on a card in a separate box and doesn't interface directly to their optic nerves?
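The two readings above can be checked with quick back-of-envelope arithmetic. A minimal Python sketch, assuming a hypothetical 3-year useful life (all numbers here are illustrative, not measured failure data):

```python
# Back-of-envelope comparison of two readings of "AMD's joints last
# 10-100x longer than nVidia's", assuming a 3-year useful card life.
# The 3-year figure and both readings are assumptions, not measurements.

USEFUL_LIFE_YEARS = 3            # assumed typical useful life of a card
RATIO_LOW, RATIO_HIGH = 10, 100  # claimed eutectic vs high-lead lifetime ratio

# Reading 1: nVidia's joints fail within the card's useful life,
# so they'd last 1/100 to 1/10 of 3 years:
nvidia_low_days = USEFUL_LIFE_YEARS * 365 / RATIO_HIGH   # shortest case, in days
nvidia_high_months = USEFUL_LIFE_YEARS * 12 / RATIO_LOW  # longest case, in months

# Reading 2: nVidia's joints last roughly the full useful life,
# so AMD's would last 10x to 100x of 3 years:
amd_low_years = USEFUL_LIFE_YEARS * RATIO_LOW
amd_high_years = USEFUL_LIFE_YEARS * RATIO_HIGH

print(f"Reading 1: nVidia joints fail in ~{nvidia_low_days:.0f} days "
      f"to ~{nvidia_high_months:.1f} months")
print(f"Reading 2: AMD joints last {amd_low_years}-{amd_high_years} years")
```

Either way, the headline ratio alone says nothing useful without an absolute failure time to anchor it, which is exactly what the study doesn't measure.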
Re:Sweet! (Score:3, Insightful)
With desktop systems it isn't so bad; by the time the card dies, equivalent performance will be considerably cheaper (or the card will still be under warranty), and swapping it out will take maybe 10 minutes. With laptops, not so much.