
AMD Graphics Chip Shortage Hits PC Vendors 97

Posted by Soulskill
from the at-least-tsmc-is-consistent dept.
CWmike writes "An offshore AMD foundry is having trouble ramping up production of a new 40-nanometer GPU, forcing PC makers to delay shipments of desktop and laptop computers, AMD confirmed today. TSMC is struggling to get up to speed manufacturing AMD's 5800 series, 40-nm GPUs, according to Jim McGregor, an analyst at In-Stat. He added that the foundry is in full production, but so far yields are below expectation. Matt Davis, a spokesman for AMD, confirmed that TSMC is having issues with production of the chips. He added that it's not clear how far behind the foundry is on production expectations. 'The design is sound. It's just a matter of trying to get TSMC to a point where they can yield. They're feeling the manufacturing crunch,' said Davis. 'We're a little bit under yield but we're working back into a manufacturing schedule we want for these parts. TSMC can only kick them out so fast at this point.' He said that PC vendors are being affected but declined to say how many vendors are feeling the pinch or which ones. 'It's the end of the whip,' he added. '[The vendors] are going to have a hard time.'" A post at Anandtech suggests we'll see price hikes for the 5800-series Radeons until this situation sorts itself out.
  • TSMC (Score:5, Informative)

    by cheesybagel (670288) on Friday November 06, 2009 @07:18PM (#30011462)
    NVIDIA also manufactures their GPUs at TSMC. TSMC is the largest foundry, but it has competitors like UMC, Chartered and SMIC. TSMC probably has more revenue than all those combined however...
    • by DavMz (1652411)

      Things don't look good for SMIC, though. It seems they will lose badly to TSMC in the trade-secret theft suit...

  • by haruchai (17472) on Friday November 06, 2009 @07:21PM (#30011478)

    They're not called Chipzilla for nothing. I can't remember the last time Intel had poor yields (or admitted to it),
    but this has been an issue for pretty much everyone else for years, particularly AMD.

    • Re: (Score:3, Insightful)

      by cheesybagel (670288)
      AMD actually used to have some of the best fabs in the business. They managed to have good yields and mixed production in the same plant. AMD started using copper before Intel, for example. That part of the business was spun off as Global Foundries. But yeah, Intel has the best production research and facilities in the industry. It is just that they don't share their fabs with anyone else.
      • by tlhIngan (30335) <[ten.frow] [ta] [todhsals]> on Saturday November 07, 2009 @12:41AM (#30012614)

        AMD actually used to have some of the best fabs in the business. They managed to have good yields and mixed production in the same plant. AMD started using copper before Intel, for example. That part of the business was spun off as Global Foundries. But yeah, Intel has the best production research and facilities in the industry. It is just that they don't share their fabs with anyone else.

        True, but AMD also had a problem with capacity: they had to have good yields because their fabs were often running at full capacity and always backordered. I can't remember a time when AMD had excess production capacity. Heck, it was often why AMD's chips were poor overclockers: they were binned at the highest speed they were stable at and sold, because demand was such that there were no spare chips.
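        The binning described above can be sketched in a few lines. This is a rough illustration only; the speed grades and test results are invented, not actual AMD bins:

```python
# Sketch of frequency binning: each die is tested and sold at the
# highest speed grade it is stable at. Grades and test data are
# invented for illustration; real binning is far more involved.

SPEED_GRADES_MHZ = [2600, 2400, 2200, 2000]  # hypothetical bins, fastest first

def bin_die(max_stable_mhz):
    """Return the highest grade the die passes, or None if it fails all."""
    for grade in SPEED_GRADES_MHZ:
        if max_stable_mhz >= grade:
            return grade
    return None

# When demand exceeds supply, every die ships at its top bin, which is
# why there is little hidden overclocking headroom left.
tested_dice_mhz = [2650, 2410, 2050, 1900]
bins = [bin_die(mhz) for mhz in tested_dice_mhz]
print(bins)  # [2600, 2400, 2000, None]
```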

        Also why Apple didn't go AMD - Apple has way too much experience being burned by Motorola and IBM both being unable to supply chips in heavy demand. And AMD would've killed for the Apple contract given the way Apple orders parts. But it would pretty much mean that there would be no AMD chips for anyone else.

        Heck, it might've been why Microsoft switched from AMD to Intel for the original Xbox. Production problems caused a very expensive redesign for Microsoft and nVidia (to create an Intel compatible chipset).

        Intel's got huge fab capacity, and can oversupply quite easily. In fact, there's so much oversupply that Intel often holds back production of faster chips and waits for AMD to catch up to keep prices up. Also why Intel can do special fab runs for customers (like how all Apple's chips support VT, or the special chip in the MacBook Air, etc).

        The only real production problems I remember are the special 1.13GHz Pentium 3 processors, which were basically just overclocked Pentium 3s; Intel was called out on it when systems were crashing.

        • by edxwelch (600979)

          > Also why Apple didn't go AMD
          Yet Apple have dumped Nvidia and are now using AMD GPUs

          • by toddestan (632714)

            It's easy for them to switch back to nVidia at any point, which is why I see some of these "Why Apple didn't use AMD chips" posts as so silly. It would be very easy for them to use both Intel and AMD (and VIA too) and switch between them as needed.

          • by bjb (3050) *
            Yeah, but this happens with almost every revision of their hardware. If you look at the historical specs of Apple hardware, they've gone back and forth almost every year. On top of that, for the PowerMac / Mac Pro (read: you have a choice of graphics chip), they've always offered nVidia and ATi options.
        • Apple has way too much experience being burned by Motorola and IBM both being unable to supply chips in heavy demand.

          As I recall, Motorola and IBM had no problem with regular supply. The problem was that Apple was the only major customer for desktop/laptop-suitable PowerPC processors, and those vendors quite reasonably expected long-term order commitments for these products while Apple wanted more flexibility. With Intel, Apple is just one of many customers and while it has less control over x86 processor d

      • That was the fatal mistake for AMD: spinning off their fabs, or letting someone at the top make such a bad strategic decision. Soon it's gonna be Intel alone, and we'll be back to $1000 CPUs again. It was because of AMD, Cyrix, WinChip and Transmeta that we had $25 PC-compatible GHz CPUs, or something near that. AMD is the only viable competitor still around. ARM is efficient but too slow by today's standards. VIA and IBM are still kind of around, but they are neither in the field, nor were CPUs ever one of their core b
    • by Sycraft-fu (314770) on Friday November 06, 2009 @08:08PM (#30011700)

      Part of the problem in particular with this one seems to be the process. TSMC has decided to blaze their own trail, as it were, and is going outside the ITRS roadmap. You'll note it says 40nm chips and that's not a typo. They have a 40nm process, whereas pretty much everything else (like Intel and AMD CPUs) is at 45nm currently and working on moving to 32nm.
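      For scale, a half-node step like 45nm to 40nm shrinks linear dimensions by about 11%, which compounds to roughly 21% less die area. The arithmetic below is idealized and the die size is a made-up example, not actual TSMC or GPU numbers:

```python
# Idealized half-node arithmetic: a linear shrink of s scales die
# area (and so, roughly, dice per wafer) by s**2. The die size is a
# made-up example, not a real GPU.

linear_shrink = 40 / 45            # ~0.889 linear scale factor
area_factor = linear_shrink ** 2   # ~0.79: same design in ~79% the area

die_area_45nm_mm2 = 300.0          # hypothetical die at 45nm
die_area_40nm_mm2 = die_area_45nm_mm2 * area_factor

print(round(area_factor, 2), round(die_area_40nm_mm2, 1))  # 0.79 237.0
```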

      Ok, well, this roadmap with set nodes isn't for nothing. You don't do semiconductor manufacturing in a vacuum; the foundries buy hardware from a number of companies to be able to make their fabs work. As such it is useful if everyone has a common goal to work toward. If machines for one step are built for one process and machines for another are built for a different process, you have problems.

      Well TSMC has decided to go ahead and make their own process, not something part of the ITRS standard. Ok well that means they are buying some custom equipment or modifying the procedure or the like.

      The result? Well it seems to be poor yields. They had a lot of trouble bringing it online, took longer than they planned, and now it doesn't work as well as they'd hoped.

      This isn't entirely surprising. How well it works out for them in the long run remains to be seen. They do have the smallest process on the market now as far as I'm aware, and both nVidia and ATi are placing orders using it. However, I wonder if they'll be shopping elsewhere for future cards, given the problems this one is having. They can't change what they've got now (a design for one process doesn't work on another as-is), but they can change what they do in the future.

      You are also correct, Intel rocks at fabs. They generally beat just about everyone to market on a new node, and they seem to be able to keep yields high enough to meet demand and keep prices at whatever level they like.

      • Re: (Score:3, Interesting)

        by Anonymous Coward

        True, but there was an article about why they (knowingly) do it this way. (Somewhere at AnandTech, afaicr; perhaps the GlobalFoundries article?)

        I believe the gist of it was that the pace of GPU refreshes is much faster than CPUs', and consequently it makes economic sense to both design for and migrate to so-called half-node production steps. Both AMD (ATI) and NVIDIA have been doing it this way for a while now, and I believe it has burned them in the past as well.

        But judging by the fact that they continue down t

        • Re: (Score:3, Interesting)

          by Sycraft-fu (314770)

          Well they've only done it once before, that was with TSMC's 55nm process. Prior to that, all GPUs I am aware of were on ITRS nodes. In terms of the 55nm chips they did get a bit burned on supply but it worked pretty well, more or less. Of course while 55nm was non standard, it wasn't blazing new ground. When TSMC was bringing 55nm online, Intel already had 45nm products for sale. Also from my understanding their 55nm process was more or less a shrink on the 65nm process they have. Not much changes in terms

    • by Chris Burke (6130) on Friday November 06, 2009 @08:57PM (#30011920) Homepage

      They're not called Chipzilla for nothing. I can't remember the last time Intel had poor yields (or admitted to it),
      but this has been an issue for pretty much everyone else for years, particularly AMD.

      Oh, they've had poor yields at times. But they can often make up for it -- a big part of being 'zilla -- with their sheer manufacturing capacity. Low yields just means their costs are higher, not that they can't supply customers. It has happened though that they had to "paper launch" products in the past. Though saying they've had poor yields should not be taken to imply that their fab tech isn't absolutely top notch -- low yields happens to everyone. ;) But it's that fab tech times their fab size that makes them chipzilla.
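      The point about capacity compensating for yield is just arithmetic: good dice out equals wafer starts times gross dice per wafer times yield. A toy comparison, with every number invented:

```python
# Toy model: a huge fab at mediocre yield can still out-ship a small
# fab at excellent yield. Every number here is invented.

def good_dice(wafer_starts, gross_dice_per_wafer, yield_pct):
    """Good dice out per period, using an integer yield percentage."""
    return wafer_starts * gross_dice_per_wafer * yield_pct // 100

big_fab = good_dice(wafer_starts=50_000, gross_dice_per_wafer=400, yield_pct=60)
small_fab = good_dice(wafer_starts=10_000, gross_dice_per_wafer=400, yield_pct=90)

print(big_fab, small_fab)  # 12000000 3600000
```

Low yield raises the big fab's cost per die, but it still ships more than three times as many parts.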

    • by Pulzar (81031)

      Just three years ago, they had a major shortage of single-channel chipsets and had to buy a boatload of them from ATI for Intel-branded motherboards.

      • Yes, and that Intel motherboard with the ATI chipset was an absolute disaster. Silly me, I bought one of those because it was very low cost. Well, the time I spent troubleshooting it was not worth the savings in dollars. D101GGC? Something like that... Luckily, that was the end of ATI providing chipsets for Intel. Of course then ATI was bought out by AMD, which meant no more ATI chipsets.
    • by evilviper (135110)

      I can't remember the last time Intel had poor yields (or admitted to it),
      but this has been an issue for pretty much everyone else for years, particularly AMD.

      This is utter nonsense. Intel has chip shortages *almost* every year. They are WORSE than AMD and others in this regard.

      September 2005
      http://www.tomshardware.com/news/manufacturers-report-intel-chipset-shortage,1410.html [tomshardware.com]

      May 2008
      http://www.slashgear.com/intel-atom-demand-prompts-chipset-shortages-0111422/ [slashgear.com]

      Sep 2009
      http://en.newspeg.com/Intel-G31-c [newspeg.com]

  • The big vendors who I trust already have built their inventory and this is just a temporary glitch in their manufacturing process. It's hardly something to be concerned about.

    For Joe's Custom PCs and Feed Lot (or Dell), this may be a problem.

    Should you go with an OEM who is well known and sells large volumes? Or should you stick with mom 'n pop PC assembly shops? I think it's like asking whether you should buy American or Chinese. Sure, one is cheaper but is it worth the lead poisoning?

    • Or wait till the next generation comes out when all the current generation stuff is 1/2 the price and everyone has plenty of stock.

      At least that's what works for me.

    • We're talking graphics cards. There's no really big vendor, and no reason to stockpile; on the contrary, the newer stuff commands a premium and is to be sold ASAP, before it becomes last month's news. Prices are already rising at retail, and OEMs will surely raise them too, even if they have plenty of parts.

      Regarding the mom n' pop vs large OEM comment... I rather think the contrary: would you rather eat at a chain restaurant, or at a mom n' pop one ? Do you always buy standard, chain-made stuff assuming it's

      • would you rather eat at a chain restaurant, or at a mom n' pop one ?

        Depends. Are they both buying their ingredients from the same supplier and using the same recipes?

  • From a faked board to rumors about really bad yields, nVidia won't show up until next year. Sure, it'll probably be faster, but they clearly had to sacrifice something to focus on high-end computing with features like ECC and double-precision. My 4890 is serving me pretty well for now.
    • Re: (Score:3, Interesting)

      by cheesybagel (670288)
      NVIDIA's scared enough of Intel and Larrabee to be doing stuff like this [intelsinsides.com].
    • by JoeSixpack00 (1327135) on Friday November 06, 2009 @07:56PM (#30011636)
      This is how nVidia always manages to stay on top: assumption.

      I don't know why, but people always assume that nVidia parts are at least equal to, and for the most part better than, ATi's. Granted they have been in the past, but anyone savvy enough to know about graphics cards should also know how much things can change with every generation.

      I've heard people actually say "It's safe to say that the HD 3800 was pretty much a failure". That had to be one of the dumbest comments I've ever heard from a so-called "true gamer".
      • Re: (Score:1, Insightful)

        by Anonymous Coward

        You ATI fanboys are funny.

        At least nvidia knows how to make a decent driver.

        • And hide it too.
        • by Deosyne (92713)

          I've been told by a couple of friends running Crossfire rigs that ATI has finally been pulling their head from their ass on driver flakiness in the past couple of years. Given that, along with the issues I've run into with a couple of the past few NVIDIA driver revisions and the whole PhysX clusterfuck, I'm willing to give ATI another chance after swearing off them for the past few years. Well, except for the fact that I can't get a 5850 anywhere and the prices are sliding on up because of it.

          I'm still wil

        • At least ATI GPUs don't melt. http://news.cnet.com/8301-13554_3-10020782-33.html [cnet.com]
      • Re: (Score:3, Interesting)

        by KillShill (877105)

        They always manage to stay on top because they are a monopolist in the gfx industry. They are the Inte£/Micro$oft of their respective industry.

        Remember the partial precision era (5800)? They just happened to continue using PP well up to the 8 series...

        3Dmark? They threatened to leave the sponsors group when things didn't go their way, a few years back.

        They have PhysX in 3Dmark, when no one else has it in hardware to artificially boost benchmark scores (which basically sells hardware to 99% of non-enth

        • by Bacon Bits (926911) on Friday November 06, 2009 @09:45PM (#30012122)

          Inte£/Micro$oft
          $vidia

          Way to nuke any possibility of credibility, dude. Using currency symbols in company names just makes you look like a nutjob, regardless of how accurate your accusations might be. Never mind that any company of nVidia's, Intel's, Microsoft's, or indeed even ATI/AMD's size has "a very long history of dirty tricks, anti-competitive and anti-consumer behavior". Pick the card that works best for your needs. Giving the name on the box more press -- even bad press -- simply makes the brand name that much more valuable than the hardware you're buying.

          • by haruchai (17472)

            Being a nutjob never hurt Rush Limbaugh or any number of right-wingnuts.

          • "history of dirty tricks, anti-competitive and anti-consumer behavior"

            Two words: lock-in. These companies are terrified of interoperability, support for future processors or graphics cards, or anything in that vein.

            Meanwhile they are searching diligently for ways to guarantee profits with minimal effort. The best way to do this? Muck about with software so your chips always look good while still ending up slow.

            If you make software updates break your old products you create a product treadmill.

            We'v
        • Re: (Score:3, Informative)

          by TheRaven64 (641858)

          They always manage to stay on top because they are a monopolist in the gfx industry

          Market share numbers from Q2 2009:

          • Intel: 51.20%
          • nVidia: 29.2%
          • AMD: 18.14%

          Sure, nVidia is an evil monopolist, what with having a market share slightly more than half of the company with the majority market share and a third larger than their nearest competitor.

          I know it's fashionable to call everyone a monopolist on Slashdot, but the term has real meanings in both law and economics. Neither definition can apply to a company that has both a market share under 50% and a competitor with a larger market

          • by petermgreen (876956) <plugwash.p10link@net> on Saturday November 07, 2009 @08:14AM (#30013850) Homepage

            It really depends how you define the market. Yes intel makes a lot of motherboard chipsets and most of those come with integrated graphics with 3D capability that ranges from appalling to mediocre.

            If you define the market as all GPUs sold, even those used in machines that never need 3D acceleration or those that are there because they are part of the chipset but are disabled by a better card (which is what I suspect your stats do), then it doesn't at all surprise me that Intel comes out on top.

            OTOH if you define the market as GPUs sold for use on separate cards (that is, GPUs that customers buy willingly because they want more than their onboard graphics offers), then afaict ATI and nVidia are the only real players left.

            P.S. this post does not take any position, positive or negative, on whether nVidia is an evil monopolist; just that I don't think it's reasonable to count crappy integrated graphics and chips for gaming cards as the same market.

          • Monopolist = Company I don't like that sells more hardware/software than company I do like.

      • Re: (Score:3, Interesting)

        by drinkypoo (153816)

        nVidia manages to stay in my systems on the assumption that it will work better in Linux. So far, I have never been wrong about this; ATI has always been an abject nightmare for me, while nVidia has usually worked. Note that I am not the fanboy who will say it "just works" which would be a lie. But, it can be made to work. I've been flip-flopping between ATI and nVidia and back in the day had 3dfx and even Permedia and PowerVR at times and I've spent most of my time with nVidia and never regretted it. I've

      • One word: Drivers (Score:1, Informative)

        by Anonymous Coward

        AMD/ATi's video card drivers suck, have sucked and continue to suck. This is a real shame because ATi's hardware has really picked up (the 9800 was a ground-breaker, the X???? series was pretty lame but 3xxx/4xxx/5xxx is blowing nVidia out of the water which probably serves them right for over capitalising on the high-end scientific computation market).

        The problem with ATi's drivers for me personally are:
        Windows
        Problems with OpenGL performance compared to DirectX (annoying)
        No per application profiles (deal

        • Re: (Score:1, Informative)

          by Anonymous Coward

          www.phoronix.com/vr.php?view=11880
          http://www.phoronix.com/scan.php?page=news_item&px=NzY0NA
          http://www.phoronix.com/scan.php?page=article&item=960&num=1
          http://www.phoronix.com/scan.php?page=news_item&px=NzAxNg

          If you want to help Linux, buy AMD.

          • It is certainly good news that AMD is opening up the specs for ATI graphics, but ultimately I'm going to base my buying decisions on what works best now, not what may work best in a few years' time.

            http://wiki.x.org/wiki/radeonhd [x.org] has the following claims

            The following subsystems have not been implemented yet or show some limitations:

            * 3D acceleration is active by default only on R5xx and RS6xx right now. Experimental support for R6xx and R7xx is available, but not for the faint of

      • by CAIMLAS (41445)

        I've not bought an ATI card (intentionally - unless it was bundled with something) since the All In Wonder cards were new, and I've been buying Nvidia since my G400 bit the dust years ago. But they do have their act together in all but the driver department, IMO.

        The biggest thing for me is thermal footprint. In many ways, it demonstrates the overall quality of the design, I think.

        I met a Nvidia engineer in the Denver (I think) airport last winter. We sat and talked for about half an hour while we waited for

  • by Anonymous Coward

    It's Intel's fault. Help us out, Cuomo!

    • by mykos (1627575)
      I dunno about that. Intel was found to have broken laws in Europe and they employ quite a few people over there. Perhaps it is possible that breaking the law can cause charges to be brought against you. Who knows?
  • by physburn (1095481) on Friday November 06, 2009 @07:54PM (#30011620) Homepage Journal
    Just as AMD started turning back into profit, and gained the graphics card business, they had to run out of chip production. It's a pity really they're using TSMC; I believe GlobalFoundries can do 40nm standard silicon either now or soon, so AMD should perhaps switch to their part-owned foundry. Hope AMD sorts out the problem soon; I'd hate to be on a one-CPU-maker planet.


    • by mykos (1627575)
      They really should. Relying on a business that their competitors have even larger contracts with opens them up to taking damage from bribery, like if Nvidia were to pay TSMC a little extra for a bigger chunk of the 40nm silicon (assuming this isn't happening already!). Turning production to Globalfoundries could help them avoid such an issue.
    • Re: (Score:3, Interesting)

      by Chris Burke (6130)

      I believe GlobalFoundries can do 40nm standard silicon either now or soon, so AMD should perhaps switch to their part-owned foundry.

      No, they can't. Global Foundries can do 45nm, and soon 32nm, but not 40. Also, Global Foundries uses SOI while TSMC is bulk.

      I'm sure AMD will use GF eventually for their graphics chips, but for right now, I'm also sure it will take less time for TSMC to sort themselves out than it would to modify the design for a very different process.

      Also, don't expect graphics by itsel

    • by TheRaven64 (641858) on Saturday November 07, 2009 @06:46AM (#30013556) Journal
    They'll probably fix it soon. One of the reasons for spinning off The Foundry Company was to make it easier for AMD to use other foundries for production. I'd imagine that their next chips will be sent to two or more foundries, with penalty clauses in the contracts if they can't keep up with demand and bonuses if the others can take up the slack when one can't.

      I'd hate to be on a one-CPU-maker planet

      You mean one x86 CPU manufacturer. TI, Samsung, Qualcomm, and a dozen other companies all make ARM chips and these outsell x86 by a large margin.

      • Re: (Score:3, Funny)

        by evilviper (135110)

        You mean one x86 CPU manufacturer. TI, Samsung, Qualcomm, and a dozen other companies all make ARM chips and these outsell x86 by a large margin.

        Frito Lay chips outsell ARM chips by a large margin...

        Of course they're vastly different things, and not remotely comparable, but you don't seem to care about that in the slightest...

  • In years past TSMC would be on the phone to the semiconductor equipment manufacturers, requesting their engineers be on the next jet to help resolve the problem. Alas, there is no money for travel anymore and most of the engineers have been laid off. A sign of the times, I guess.

    • by freak132 (812674)
      If the engineers have been let go from the semiconductor equipment manufacturers, perhaps TSMC should hire one or two at a lower salary.
  • Sorry if this is a little off-topic, but...

    Does anyone know when AMD/ATI will be releasing notebook versions of its 5000-line chips, and how they're expected to compare to chips currently on the market from them and from nVidia?

  • A post at Anandtech suggests we'll see price hikes for the 5800-series Radeons until this situation sorts itself out.

    Price hikes? No. Probably not. They were already sold out pretty much everywhere. So it's more likely we just won't see any until this is worked out. I'm not saying there won't be assholes selling 5870s for $800 on eBay, I mean, there ALWAYS are. I'm just saying it's not like Newegg is suddenly going to have them back in stock but for a hundred bucks more.

    • Re: (Score:3, Insightful)

      by XanC (644172)

      I'm not saying there won't be assholes selling 5870s for $800 on eBay

      How does that make them assholes?

      • by Ant P. (974313)

        They buy for $400, sell for $800, then geometrically increase their purchases, fucking up the supply chain for legitimate users.

  • Honestly. Fab problems when pushing technology forward have been a hallmark of AMD's business for nearly a decade now. Why should this surprise anyone?

    Yeah. The new series of graphics controllers may be the bees' knees, and may make nVidia cry for mommy, but until people can...y'know...OBTAIN THEM, it's all just smoke and mirrors.

    • AMD is not having yield problems because, since spinning off The Foundry Company, AMD does not have any fabs. The company that AMD contracted to produce their chips is having yield problems. Still not entirely unexpected (they are using a very ambitious technique), but being able to switch suppliers when this kind of thing happened was part of the reason for spinning off TFC...
  • by Targon (17348) on Saturday November 07, 2009 @06:38AM (#30013532)

    AMD has needed new fabs to increase capacity for a long time now. After AMD purchased ATI, I always found it odd that there wasn't more of a push to build more fabs and bring their GPU production in-house. At the least, NVIDIA should also be suffering from TSMC having problems, even though they may not be feeling the crunch at the moment.

    • NVIDIA are suffering a lot from TSMC having problems. The reason no one is going to be able to buy a Fermi-based chip until at least April (and very few people until July or so) is that it's an even bigger and more complex part on the same process. If they can't get enough 5800s out the door, they sure as hell can't make enough Fermis.

  • An offshore AMD foundry...

    How far offshore is that in nautical miles? I am not surprised they are having troubles, what with all the salt in the air.
