
Why AMD Could Win The Coming Visual Computing Battle

Vigile writes "The past week has been rampant with discussion of the new war brewing between NVIDIA and Intel, but one big player was left out of the story: AMD. It would seem that both sides have written this competitor off, but PC Perspective thinks quite the opposite. The company is having financial difficulties, but AMD already has the technologies that both NVIDIA and Intel are striving to build or acquire: a mainstream CPU, a competitive GPU, high-quality IGP solutions, and technology for hybrid processing. The article postulates that both Intel and NVIDIA are overlooking a still-competitive opponent, which could turn out to be a drastic mistake."
This discussion has been archived. No new comments can be posted.
  • This was written by an AMD shareholder, of course. Guilty as charged here as well.
    • by Anonymous Coward on Thursday April 17, 2008 @01:10PM (#23108162)
      I'm sure those AMD shares will come in handy some day... I, for instance, am out of paper towels.
      • by konputer ( 568044 ) <slashdot@[ ]puter.org ['kon' in gap]> on Thursday April 17, 2008 @01:15PM (#23108250) Homepage
        I'm still rooting for AMD. I think that they can pull themselves out of the mess they made. Why? No sane reason. But whenever the U.S. economy decides to come back up, so will AMD.
        • by account_deleted ( 4530225 ) on Thursday April 17, 2008 @02:18PM (#23109234)
          Comment removed based on user account deletion
        • AMD won't come back until they make some key product marketing and customer service changes. I have now had three local shops drop AMD products because RMAs take a month or longer, and they don't want the hassle of fronting a new product while waiting for the RMA to be returned, so they end up with outdated parts on the shelf. Intel products are essentially a one-week, no-questions-asked turnaround.

          They have to front the product or suffer a reputation sting where they could sell a product, have it work for a m
      • by Creepy ( 93888 )

        I'm sure those AMD shares will come in handy some day... I, for instance, am out of paper towels.
        Just use dollar bills for the time being.
    • by hey! ( 33014 ) on Thursday April 17, 2008 @01:44PM (#23108700) Homepage Journal
      There's always an element of drawing the bullseye around the bullet hole in business planning. Your position is never quite what you'd want it to be (with rare exceptions), so your job, in part, is to imagine a bright future that, through an incredible stroke of luck, starts right where you're standing right now.

      The thing is, while that is all necessary and good as part of business planning, individual investors really ought not to make investment decisions based on this kind of planning, unless they have their own teams of researchers and analysts and their own sources of information.

      If you know nothing about the technology, you can't really examine something like this critically. If you know a great deal about it, you are even less qualified to make prognostications, because your opinion about what is good technology messes with your opinion about what makes good business sense.

      Mark Twain was a very intelligent man, who lost his entire fortune investing in a revolutionary typesetting system. The things that made him a great writer made him a lousy investor: imagination, contrariness, a willingness to buck convention. Of course, exactly the same qualities describe a superb investor. The thing that really did him in was overestimating his knowledge of a market he was peripherally involved in.

      It was natural for Twain to be interested in the process of printing books and periodicals, and to be familiar enough with the process of typesetting in general to see the potential, but not quite intimately enough to see the pitfalls. He would have been better off investing in something he had absolutely no interest or prior experience in.
      • There are two critical differences between Mark Twain's investing in a typesetting machine and me investing in AMD: I'm the target customer for AMD, and I'm not going to invest principally in this one company. I would assume from your description of what happened to Mark Twain that he wasn't heavily involved with the people who used typesetting machines and therefore made the purchasing decisions for them. On the other hand, I'm intimately familiar with all the reasons that people choose one company over an
        • Re: (Score:3, Interesting)

          by hey! ( 33014 )
          Oh, certainly. I wasn't making a specific point about you.

          If you've ever been on the product management end of the stick, though, the biggest danger is overestimating the number of people who think as you do or visualize their needs as you do. That's why it's dangerous for people with lots of technical knowledge to use it to guide their investments. You can overcome this, but it's a serious trap.

          That's why I don't invest in tech companies at all; whenever I have it hasn't worked out.

          I did pretty well in
          • Re: (Score:2, Funny)

            That's why I only invest in alcohol and gambling stocks.
          • I don't think involvement in the technical market is the problem. I think the volatility and unpredictable nature of the tech market is the problem.

            I can't think of that many people outside the tech industry who have enough knowledge to get rich investing in it through anything more than luck.

            Warren Buffett is an overused example, but I'll bring him up anyway: he avoids tech stocks entirely in favor of things he does understand, like insurance, soft drinks, retail stores, restaurants, and recently, railways.
    • Re: (Score:1, Informative)

      by Vigile ( 99919 ) *

      This was written by an AMD shareholder, of course.

      Guilty as charged as well, here.
      I am absolutely not an AMD shareholder. Nor do I own anything in Intel or NVIDIA.
    • by Mr_eX9 ( 800448 )
      Circumstantial ad hominem...nothing to see here, move along.
  • by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Thursday April 17, 2008 @01:02PM (#23108016)
    Year-over-year growth has ceased, and this past quarter shows a 0.2% decline in revenue.
    • by brunes69 ( 86786 ) <slashdot@nOSpam.keirstead.org> on Thursday April 17, 2008 @01:10PM (#23108152)
      Only a 0.2% decline in revenue in the midst of what many consider an already-begun recession ain't too bad.
      • But how are they doing compared to Intel, Via, ARM, etc.? That's what matters. There may well be a recession in America, and to a lesser extent (so far anyway) in Europe, but that is nothing compared to the huge market increases in India and China.
    • by moderatorrater ( 1095745 ) on Thursday April 17, 2008 @01:16PM (#23108268)
      Apparently you didn't RTFA, because they describe the problems that AMD is having and then go on to say why those problems may be surmounted. In other words, you're overlooking the obvious position AMD is in. nVidia doesn't have a CPU line, let alone one of the top CPUs in the market in performance. Intel doesn't have a GPU that's competitive in performance. With the market moving towards greater integration and interaction between the CPU and the GPU, there's only one company that can deliver both.

      So it's going to come down to whether or not AMD has the ability right now to keep pushing their product lines and innovating fast enough to beat Intel and nVidia to the punch. Their financial situation hurts their chances, but it doesn't negate them completely.
      • Re: (Score:3, Interesting)

        Exactly. I think AMD should have gotten Nvidia from the start, as ATI was always an Intel fan club, but the bucks landed and Intel cut the leading integrated chipset vendor (ATI was always ahead of Nvidia in low-end installs that did just what Intel told them to) out with their integrated graphics push. Nvidia was much more in line with AMD in terms of chipsets that properly complemented AMD's HyperTransport tech. The GPU/CPU is what AMD made HyperTransport for; it will be neat to see it implemented. Theor
        • Except Nvidia wouldn't sell out, so AMD couldn't buy them... If they merged, the Nvidia CEO insisted on running the show... So no, I still think AMD made the right choice of graphics company.
  • Flamewar in 3...2...1...
  • Catch & Release... (Score:5, Insightful)

    by Deadfyre_Deadsoul ( 1193759 ) on Thursday April 17, 2008 @01:14PM (#23108232) Journal
    AMD was supposed to have been dead and written off how many times in the past few years? ATI as well?

    It's nice to know that they still maintain an edge, even though they have nowhere near the capital on hand that nVidia and Intel do.

    I for one have always liked underdogs... :)
    • AMD was supposed to have been dead and written off how many times in the past few years? ATI as well?

      It's nice to know that they still maintain an edge, even though they have nowhere near the capital on hand that nVidia and Intel do.

      Heck, 10 years ago the press had anointed 3dfx as king, while nVidia was a barely-mentioned also-ran indistinguishable from the half-dozen other 3D chipset manufacturers. These companies stumbling on one major release is no big deal. If they stumble on two sequential releases l

  • I thought AMD was dead [slashdot.org]!
  • Cash Crunch (Score:5, Interesting)

    by Guppy ( 12314 ) on Thursday April 17, 2008 @01:34PM (#23108538)
    I used to know an engineer who worked for AMD, and one of the things he would tell me about was the problems with the merger with ATI. There were a lot of manufacturing and engineering differences between the two companies that made it difficult to combine designs from the two. In addition, AMD's poor financial situation meant they didn't have enough time and money to complete the "Fusion" CPU/GPU combo -- one of the main drivers behind the merger in the first place.

    He said that the company will still bring something out, and that something will still go by the codename "Fusion", but it will not be the product originally envisioned at the time the companies decided to merge. He speculated maybe some kind of Multi-Chip Module -- essentially just a separate CPU and a separate GPU die mounted into the same packaging.
    • Like Intel's quad-core offering (two dual-core chips)? That may not be so bad after all.

    • by vivin ( 671928 ) <vivin.paliath@nOsPam.gmail.com> on Thursday April 17, 2008 @02:17PM (#23109218) Homepage Journal
      ... and I recall during company meetings we would be told that Intel was "keeping an eye on nVidia". AMD, not so much. Intel sees nVidia as a new and strong threat.
    • by account_deleted ( 4530225 ) on Thursday April 17, 2008 @02:55PM (#23109726)
      Comment removed based on user account deletion
      • Somebody could make a GPU with HyperTransport connections instead of the standard PCI-E. HT is designed for just that purpose. Perhaps AMD/ATI will try it out.
        • by LoRdTAW ( 99712 )
          HyperTransport won't help with rendering as much as faster GPUs would. Remember, GPUs have their own memory on very wide buses that can move data at over 20GB per second. The integrated CPU/GPU stuff is most likely targeted at low-end users: cheap PCs, media centers, and embedded systems.

          Personally, I would like to see SLI/CrossFire-like setups actually give you a near-100% boost in rendering speeds. From all the benchmarks I have seen, you barely get a 20-30% speed increase for a 100% price increase.
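          A back-of-the-envelope Amdahl-style model shows why that happens. Assuming only a fraction p of each frame's work actually splits across the two GPUs (the value of p here is purely illustrative):

              S(p) = \frac{1}{(1-p) + p/2}, \qquad S(0.5) = \frac{1}{0.75} \approx 1.33

          That is, if only half the frame time parallelizes, doubling the GPU count buys you roughly the 20-30% those benchmarks show.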
    • Remember that Intel spent billions developing the Itanium, whilst one of their cash-strapped subsidiaries in Israel came up with the Core Duo.

      Just because Fusion might not be the glorious flagship envisioned by AMD doesn't mean that it'll flop.
  • by Ilyon ( 1150115 ) on Thursday April 17, 2008 @01:42PM (#23108666)

    I respect AMD and had faith in their ability to make a comeback in the past, but there's a new wrinkle this time: Apple.

    Apple computer sales are growing at 2.5 times the industry rate, and they use Intel CPUs. With all the growth in the PC market going to Intel CPUs, is there much room for an AMD comeback?

    I can see two ways for AMD to make a comeback. If Apple's agreement to use Intel CPUs expires and AMD can win some business with Apple, AMD can latch on to Apple's growth. But Apple chose Intel for its ability to ramp up production. Will AMD be able to provide the same? Will AMD be willing to give up other customers to meet Apple's demand?

    If Apple chooses this route, how big of an architecture change will this be? I've no doubt Apple can provide developer tools to aid the migration, but will Core 2 optimizations easily translate to AMD optimizations?

    Will Apple take the risk of supporting both architectures? They are very active in LLVM development, which allows dynamic optimization of code. If LLVM works as well as many hope, Apple could deliver software in a common binary format that automatically adapts to any architecture using LLVM. This would be quite novel. Apple would benefit from ongoing competition between Intel and AMD while giving AMD a fighting chance in a market increasingly dominated by Apple.
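    As a sketch of how that common-binary idea would work in practice (assuming the stock LLVM toolchain; the file name and command lines are illustrative, not Apple's actual pipeline):

        /* hello.c -- compile once to target-neutral LLVM bitcode, then lower
         * per machine at install or load time. Illustrative commands:
         *   llvm-gcc -O2 -emit-llvm -c hello.c -o hello.bc   (front end -> bitcode)
         *   llc -mcpu=core2 hello.bc -o hello_intel.s        (lower for Core 2)
         *   llc -mcpu=k8    hello.bc -o hello_amd.s          (lower for AMD K8)
         *   lli hello.bc                                     (or JIT the bitcode directly)
         */
        #include <stdio.h>

        int main(void)
        {
            printf("same bitcode, CPU-specific code generation\n");
            return 0;
        }

    The point is that vendor-specific optimization happens in the last step, so shipping one binary for both Intel and AMD stops being a packaging problem.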

    The other potential AMD savior is Linux. Can the open source community deliver software that can take advantage of AMD's CPU-GPU architecture spectacularly enough to give AMD the sales it needs?

    If Apple weren't in Intel's camp, I would invest in AMD with confidence in a turnaround, but I think the fate of AMD lies largely with adoption by Apple or Linux.

    What do you think?

    • by dreamchaser ( 49529 ) on Thursday April 17, 2008 @02:24PM (#23109312) Homepage Journal
      OS X runs just fine on SSE3-equipped AMD CPUs as it stands. It's not supported, of course, but it runs just fine. Apple could easily support AMD even if they didn't optimize for them quite as much as they do for the Core 2 architecture. Frankly, I'm not sure the difference would be all that noticeable compared to the existing performance delta between the two main x86 architectures.
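      If you want to check the SSE3 claim on your own box, here is a minimal sketch (assuming gcc on x86; the test follows the CPUID leaf-1 convention, where ECX bit 0 indicates SSE3):

          /* sse3check.c -- report whether the CPU advertises SSE3.
           * Build: gcc -o sse3check sse3check.c */
          #include <stdio.h>
          #include <cpuid.h>   /* gcc's __get_cpuid helper */

          int main(void)
          {
              unsigned int eax, ebx, ecx, edx;
              if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
                  fprintf(stderr, "CPUID leaf 1 not available\n");
                  return 1;
              }
              printf("SSE3: %s\n", (ecx & 1) ? "yes" : "no"); /* ECX bit 0 */
              return 0;
          }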
      • by Creepy ( 93888 )
        I'd be curious to know how well AMD CPUs run software OpenGL (which actually uses those SSE instructions). Supposedly AMD supports it; in fact, they released SSE5 while Intel continues working on its replacement, AVX [wikipedia.org]
        • Uh... AMD released a PowerPoint presentation about SSE5, and if they are lucky it will be shipping at about the same time the AVX instructions appear in production Intel silicon.
              If AMD's PowerPoint slides magically came true, then Barcelona would have been out in April... of 2007, and it would have actually been faster and less power-hungry than Intel's chips. Unfortunately, presentations != reality.
          • by Creepy ( 93888 )
            You're right - SSE5 was only a spec, and it looks like it is expected in the same timeframe as AVX. I actually heard someone talk about them both in a podcast, but I was at work and apparently wasn't paying enough attention. So much for multitasking :)

            Anyhow, that wasn't my point - I was wondering if it worked in software mode.
        • Software-only OpenGL is horribly slow without a GPU to accelerate it, no matter how many SIMD extensions you have at your disposal.
          • by Creepy ( 93888 )
            Apple also supports hybrid rendering, so if you aren't doing too much in software, you may be able to get away with it. I actually use software mode for testing, but my Mac is ancient (it can't even run X.5), so it isn't much good.
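            For anyone testing the same way, a quick sketch that reports which renderer a context actually landed on (assumes GLUT; on a Mac the header is GLUT/glut.h, and an "Apple Software Renderer" string means you're in software mode):

                /* glcheck.c -- create a minimal context and print the active renderer.
                 * Build (Linux): gcc glcheck.c -lglut -lGL -o glcheck */
                #include <stdio.h>
                #include <GL/glut.h>

                int main(int argc, char **argv)
                {
                    glutInit(&argc, argv);
                    glutCreateWindow("renderer check");  /* glGetString needs a live context */
                    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
                    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
                    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
                    return 0;
                }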
    • by moderatorrater ( 1095745 ) on Thursday April 17, 2008 @02:35PM (#23109448)

      What do you think?
      That Linux is a dominant player in the server market and that Apple is pretty much negligible in either market. With how similar Intel and AMD chips tend to be, I don't know that there's anything stopping Apple from switching to AMD at any time. Either way, it's a relatively small chunk of the desktop market.

      The other potential AMD savior is Linux. Can the open source community deliver software that can take advantage of AMD's CPU-GPU architecture spectacularly enough to give AMD the sales it needs?
      This is an interesting question. When AMD comes out with their chips, if they really want to impress people with their abilities, they would do well to get some coders porting Folding@Home to the new chips. It was impressive to see what ATI cards could do with that code, and it would be a great way to showcase the chips' abilities for computationally heavy programs that run on servers (thereby breaking into that market).

      On the desktop end, they would have to get something working to showcase the performance in games. Unfortunately, open source doesn't have a lot of 3D games floating around.

      Whatever happens, I think they're going to have to show something that works well with Windows or else they're going to flop. If it works well enough with Windows and they can show substantial performance improvements, then get manufacturing capacity up, they might be able to land an Apple contract. That would be huge for publicity and as a single contract, but in the overall market it's not going to make or break them.
      • AMD was (I haven't checked the statistics lately) making large gains in the server market... In fact, they tend to take a large chunk of both the low-end PC and server markets, with weak laptop sales (not for lack of good mobile CPUs, though) and average mid-to-high-end PC sales.

        They don't have much need to show off server muscle; Intel still has to work to compete with AMD on performance in that market, where Core 2 didn't make as strong a comeback for them.
    • AMD never sold CPUs to Apple, and given your Apple growth = 2.5x total PC growth figure with Apple's still very small market share, the non-Apple PC market is growing too (which we already know, anyway). So AMD's potential market is expanding. I think that's probably not a bad thing for them.

      As to your later comments, Intel and AMD CPUs still follow the x86 architecture that makes them play nice with the same software. I imagine Mac software would work just fine on an AMD chip, and I seem to recall reading abo
  • by MOBE2001 ( 263700 ) on Thursday April 17, 2008 @01:43PM (#23108668) Homepage Journal
    Nvidia has a better chance to compete successfully against Intel because their executives do not think like Intel. AMD, OTOH, is a monkey-see-monkey-do company. Many of their executives (e.g., Dirk Meyer) and lead engineers came from Intel and they only see the world through Intel glasses. Having said that, this business of mixing coarse-grain MIMD and fine-grain SIMD cores on a single die to create a heterogeneous processor is a match made in hell. Anybody with a lick of sense can tell you that universality should be the primary goal of multicore research and that incompatible processing models should not be encouraged let alone slapped together. Programming those hybrid processors will be more painful than pulling teeth with a crowbar. Heck, breaking programs down into threads is a pain in the ass. Why would anybody want to make things worse?

    The best strategy, IMO, is to work on a universal processor that combines the strengths of both MIMD and SIMD models while eliminating their obvious weaknesses. AMD needs somebody with the huevos to say, "fooey with this Intel crap! Let's carve our own market and create a completely new technology for a completely new paradigm, parallel processing". Is Hector Ruiz up to the task? Only time will tell. For a different take on the multicore and CPU/GPU issue, read Nightmare on Core Street [blogspot.com].
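    For readers without a parallel-architecture background: MIMD cores run independent instruction streams (threads), while SIMD applies one instruction across a whole vector of data. A tiny sketch of the SIMD half using SSE intrinsics (assumes gcc on an SSE-capable x86; the values are purely illustrative):

        /* simd_add.c -- four float additions in one SSE instruction; the MIMD
         * alternative would be splitting the work across threads/cores.
         * Build: gcc -msse -O2 simd_add.c -o simd_add */
        #include <stdio.h>
        #include <xmmintrin.h>  /* SSE intrinsics */

        int main(void)
        {
            float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, out[4];
            __m128 va = _mm_loadu_ps(a);              /* load 4 floats */
            __m128 vb = _mm_loadu_ps(b);
            _mm_storeu_ps(out, _mm_add_ps(va, vb));   /* one instruction, 4 adds */
            printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
            return 0;
        }

    Mixing the two models on one die means the programmer has to decide, per loop, which style the work belongs to -- which is the pain the parent comment is pointing at.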
    • Re: (Score:2, Insightful)

      Let's carve our own market and create a completely new technology for a completely new paradigm, parallel processing".
      Parallel processing is a new paradigm? Since when? The 1960s called, they want you to stop stealing their ideas.
    • by Vigile ( 99919 ) *
      I don't agree with this: it was AMD that led Intel into the world of on-die memory controllers as well as removing the front side bus from PC architecture.
      • by samkass ( 174571 ) on Thursday April 17, 2008 @02:11PM (#23109136) Homepage Journal
        From the introduction of the Athlon by AMD (the first really "modern" x86 CPU, which finally eliminated most of the CISC disadvantages), through on-die memory controllers and dragging Intel kicking and screaming into the 64-bit world, right up until AMD's lack of a solid response to Core, I'd say AMD led Intel's thinking. Now they're the followers again.
        • by Vigile ( 99919 ) *
          Yeah, I forgot about 64-bit - another instance in which Intel was lacking.

          And in truth, AMD APPEARS to be ahead in the move to fusing a CPU and GPU architecture into something new.
        • by moosesocks ( 264553 ) on Thursday April 17, 2008 @04:19PM (#23110988) Homepage
          Intel got lucky with Core. It was never on their roadmap as a flagship desktop chip.

          It's effectively a multicore version of a laptop-adapted Pentium III with a bunch of modern features tacked on.

          Nobody ever envisioned that this would work as well as it did, and Intel only started paying attention to the idea once their lab in Israel was producing low-power mobile chips that were faster than their flagship Pentium 4 desktop chips.

          AMD didn't have an answer to Core, because Intel themselves were largely ignorant of the fact that the P6 architecture that they had previously deemed obsolete was adaptable to more modern systems. AMD saw Itanium and Pentium 4 in Intel's roadmaps, and knew that it had nothing to fear, as the products they had developed were vastly superior to both.
      • by nxtw ( 866177 )
        But looking back, what practical advantage does an on-die memory controller have for the end-user? HyperTransport has less latency and more bandwidth, but Intel Core CPUs remain competitive performance-wise without these features.

        It was never the monumental change many made it out to be for desktop systems; it's another incremental improvement in performance.
    • Re: (Score:3, Interesting)

      by ThePhilips ( 752041 )

      That reminds me of AMD before the Opteron release. Their CPUs sucked because they were always catching up with Intel. Their finances sucked. Luckily for them, Intel made a strategic mistake (called Itanic [wikipedia.org]), thus giving AMD the opportunity and enough time to release a completely new architecture - AMD64.

      I wonder if AMD will get lucky a second time - in a repeated "nothing to lose" situation.

      /me crossing fingers.

    • by samkass ( 174571 )
      I don't think you're obviously correct. You may turn out to be correct, but there is a lot to be said for heterogeneous processors on a single die.

      Some thoughts:
      1. A very low-power, slow core tied to a super heavy-duty number cruncher on the same die, both using the same instruction set. One could imagine an OS shutting off the big core when all it has to do is blink the cursor, to save power, but firing it back up when you click "Compute". Done right, it seems like this could give you a laptop with a day or
    • Your blogs have many words but no concrete examples of "true" parallel processing applied to problems currently viewed as best solved by sequential algorithms. If your truly parallel processors are passing messages to achieve proper sequencing, how is that better than implicit sequencing for the class of problems that require sequential processing?

      If you are saying that all problems can be parallelized with a net gain in elapsed time to solve, please provide the math proof that supports that position. Don
    • by imgod2u ( 812837 )
      Funny, this is actually what Intel's doing. Nehalem is built on scalable cores that can be stitched together to form massive processing arrays. Imagine something very much like ARM cores today, but with an improved system interconnect. Start thinking tiles instead of cores. Start thinking routing instead of a system bus. Start thinking FPGA-fabric-like devices instead of SoCs.
  • We as consumers can only hope that this will be true.
  • by EjectButton ( 618561 ) on Thursday April 17, 2008 @01:50PM (#23108812)
    Nvidia and Intel are well aware of AMD and have not "written this competitor off". The only one ignoring AMD is the technology press because they are generally too stupid to focus on more than two things at a time. Most articles are presented in a context of "x is going to overtake y" "technology x is a y-killer". Conflict sells and overly simplistic conflict sells to a wider audience.

    AMD has some financial problems and their stock may sink for a while, but they are not about to go bankrupt. If anyone should be worried about their long-term prospects, it's Nvidia. Intel and AMD both have complete "platforms," in that they can build a motherboard with their own chipset, their own GPU, and their own CPU in it. Nvidia has a GPU and not a whole lot more; their motherboard chipsets are at an obvious disadvantage when they must be designed exclusively for processors whose design is controlled by their direct competitors.

    Nvidia's strength has been that on the high end they blow away Intel GPUs in terms of speed and features. Intel has been slowly catching up, and their next iteration will be offered both onboard and as a discrete card, and will have hardware-assisted H.264 decoding.

    Nvidia's advantage over ATI has been that ATI has generally had inferior drivers regardless of what platform you were using. Since AMD took over, ATI has been improving their driver situation significantly, both with respect to their proprietary drivers and their recent release of specs for the open-source version. Meanwhile, Nvidia seems to have been doing everything they can to trash the reputation of their drivers over the last year, both with their awful Vista drivers and the buggy/sloppy control panel they have forced on everyone.

    The consensus lately is that we are looking at a future where you will have a machine with lots of processor cores and cpu/gpu/physics/etc functions will be tightly coupled. This is a future that does not bode well for Nvidia since the job of making competitive chipsets for their opponents will get tougher while they are at the same time the farthest from having their own platform to sell.
    • by EMeta ( 860558 )
      Speaking of which, does anybody know of anything I can do to get rid of or minimize how much I have to see Nvidia's control panel?
    • Re: (Score:3, Funny)

      by GregPK ( 991973 )
      I think it's just a move with the CUDA engine that needs refinement. As it matures, driver issues will subside.

      AMD is making a break for the open-source arena. I gave Hector that advice a while ago. Apparently, he was listening in his anonymous drunken stupor on the financial forums. AMD is poised to make a stand in the next 2 to 3 years.
    • Nvidia's advantage over ATI has been that ATI has generally had inferior drivers regardless of what platform you were using. Since AMD took over, ATI has been improving their driver situation significantly, both with respect to their proprietary drivers and their recent release of specs for the open-source version. Meanwhile, Nvidia seems to have been doing everything they can to trash the reputation of their drivers over the last year, both with their awful Vista drivers and the buggy/sloppy control panel they have forced on everyone.

      While this is good for Linux, are they making similar improvements in the Windows arena? I may be biased, but I always preferred nvidia drivers to ATI ones.
      If ATI sorts out their drivers, then they will be able to cash in on the market that needs these chips, which also happens to be the one where the money is: laptops.

      My only problem with AMD is that their CPUs don't seem to scale as low as Intel ones; my current 2.0GHz only drops to 800MHz, but my Intel one would drop from 1.6GHz to 200MHz, not sure how t

      • by nxtw ( 866177 )

        My only problem with AMD is that their CPUs don't seem to scale as low as Intel ones; my current 2.0GHz only drops to 800MHz, but my Intel one would drop from 1.6GHz to 200MHz. Not sure how this reflects in power usage, though.

        Your Intel CPU would drop from 1600 MHz to 200 MHz? Are you sure?

        My 1200MHz Core 2 Duo ULV only drops to 800 MHz. My 4 year old Pentium M system dropped from 1400 MHz to 600 MHz.

        Thinking of power usage, is anybody working on moving wireless onto CPUs, or saving power on wifi in other way

        • Your Intel CPU would drop from 1600 MHz to 200 MHz? Are you sure?

          Fairly sure, but I don't have it with me so I can't confirm it 100%; it was a Celeron.

          That makes no sense. Wifi is going to use power no matter what; it takes power to transmit.

          but it also uses a lot of interrupts; my powertop on an idle internet connection looks something like:

          Top causes for wakeups: 10s
          46.6% (113.4) : wifi0
          11.4% ( 27.8) firefox-bin : futex_wait (hrtimer_wakeup)
          9.3% ( 22.6) kontact : schedule_timeout (process_timeout)
          7.9% ( 19.2) kicker : schedule_timeout (process_timeout)
          4.8% ( 11.7) Xorg : do_setitimer (it_real_fn)
          4.5% ( 11.0) : acpi

          Surely something can be done to cut down on those CPU wakeups. Or the development of a power-saving protocol that doesn't use battery when nothing is being sent/received.

          • by nxtw ( 866177 )

            Surely something can be done to cut down on those CPU wakeups. Or the development of a power-saving protocol that doesn't use battery when nothing is being sent/received.

            Are more CPU wakeups bad? If anything, it appears this means the CPU is sleeping the rest of the time. NICs have to do something with those packets.

            Many NICs already have interrupt moderation.
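            For the curious, those moderation settings are visible from userspace. Here is a minimal sketch that reads a NIC's coalescing parameters through the Linux SIOCETHTOOL ioctl (the interface name eth0 is an assumption, and not every driver implements the call):

                /* coalesce.c -- query interrupt-coalescing settings for eth0.
                 * Build: gcc coalesce.c -o coalesce ; typically needs root. */
                #include <stdio.h>
                #include <string.h>
                #include <unistd.h>
                #include <sys/ioctl.h>
                #include <sys/socket.h>
                #include <net/if.h>
                #include <linux/ethtool.h>
                #include <linux/sockios.h>

                int main(void)
                {
                    int fd = socket(AF_INET, SOCK_DGRAM, 0);
                    struct ethtool_coalesce ec = { .cmd = ETHTOOL_GCOALESCE };
                    struct ifreq ifr;

                    memset(&ifr, 0, sizeof ifr);
                    strncpy(ifr.ifr_name, "eth0", IFNAMSIZ - 1);  /* assumed interface */
                    ifr.ifr_data = (char *)&ec;

                    if (ioctl(fd, SIOCETHTOOL, &ifr) < 0) {
                        perror("ETHTOOL_GCOALESCE");  /* driver may not support it */
                        close(fd);
                        return 1;
                    }
                    printf("rx-usecs: %u  rx-frames: %u\n",
                           ec.rx_coalesce_usecs, ec.rx_max_coalesced_frames);
                    close(fd);
                    return 0;
                }

            Raising the rx-usecs value batches more packets per interrupt, trading a little latency for fewer wakeups.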
  • by Alzheimers ( 467217 ) on Thursday April 17, 2008 @02:13PM (#23109156)
    Why AMD + ATI should win: HyperTransport. Putting the GPU on the same bus as the CPU should theoretically eliminate whatever roadblocks the PCI bus created. Plus, allowing for die-to-die communication and treating the GPU as a true co-processor instead of a peripheral should open up huge possibilities for performance boosts.

    Why AMD + ATI won't win: AMD won't risk alienating their OEM partners who also manufacture Intel motherboards and NVidia boards. Also, it's AMD.
    • by Kelz ( 611260 )
      If you combined a current CPU with a current GPU:

      It'd overheat. Like crazy. Current GPUs usually run 40°C hotter than most CPUs.
      • Re: Heat (Score:4, Insightful)

        by IdeaMan ( 216340 ) on Thursday April 17, 2008 @03:38PM (#23110370) Homepage Journal
        OK, let's talk about heat.

        Putting both GPU and CPU in close proximity to each other should help, not hinder. I think you mistook the GP for saying they'd be on the same die, but he said bus, not die.
        It may be that they need to be separated a couple of inches from each other to allow room for fanout of the CPU signals to the rest of the board, rather than sharing a socket. If they weren't separated and the chip packaging was the same height, one heat sink could be designed to cover both chips. That reduces the parts count for the fan and heatsink and therefore increases reliability.

        Having something on a plug-in card with such an extreme cooling requirement just doesn't make sense. You aren't allowed much space for heat sink design between it and the next slot. Having the GPU on the motherboard gives case/motherboard designers more room for heatsink design.
    • Why AMD + ATI should win: HyperTransport.

      Another possible reason AMD + ATI won't win: it's too late. Intel's QuickPath Interconnect (QPI) is coming later this year when Nehalem ("tock") is launched.

      Putting the GPU on the same bus as the CPU should theoretically eliminate whatever roadblocks the PCI bus created. Plus, allowing for die-to-die communication and treating the GPU as a true co-processor instead of a peripheral should open up huge possibilities for performance boosts.

      Intel has said (and shown in their diagrams) that some versions of Nehalem will have integrated graphics. However, their big GPU statement isn't coming until 2009-2010 in the form of Larrabee. Even if Larrabee is delayed, it might be too late for AMD's Fusion. By the time Fusion launches, Intel should have their interconnect and GPU ready.

      What you n [arstechnica.com]

    • by imgod2u ( 812837 )
      The CPU-to-GPU interconnect was never really much of a bottleneck. When AGP went from 4x to 8x (double the speed), there was barely any improvement. The same is true of going to PCI-E. Yes, that is because all graphics cards include a huge pool of obscenely fast DRAM on board, such that the GPU almost never needs to go over the peripheral bus, but the kind of bandwidth required here simply isn't reproducible even with high-speed interconnects like HyperTransport or QuickPath.

      From a performance perspective, graphic
  • by LWATCDR ( 28044 ) on Thursday April 17, 2008 @02:31PM (#23109404) Homepage Journal
    Right now AMD has some great CPUs on the low end and the best integrated graphics solution.
    A huge number of PCs never play a game more graphically intensive than Tetris and are never used to transcode video!
    Right now on Newegg you can pick up an Athlon 64 X2 4000+ for $53.99.
    The cheapest Core 2 Duo is $124.99. Yes, it is faster, but will you notice? Most people probably will not.

  • by Eldragon ( 163969 ) on Thursday April 17, 2008 @02:55PM (#23109724)
    What is overlooked by most of the PC enthusiast press is that AMD still offers an excellent price/performance ratio that Intel does not match.

    We have AMD to thank for the fact that high-end CPUs from Intel cost $300 instead of $1000 right now.
  • KISS (Score:2, Funny)

    by faragon ( 789704 )
    Intel has to buy Nvidia or AMD to be competitive in the long run. There is no try: if AMD survives, it could badly hurt Intel with CPU+IGP solutions.
  • by dtjohnson ( 102237 ) on Thursday April 17, 2008 @07:31PM (#23112692)
    AMD processors are great. I've used them exclusively for the last 9 years. But AMD has a fundamental business problem that will prevent them from competing as the article says: they are out of money.

    It takes a lot of money to build the state-of-the-art fabrication facilities needed to be in the business that AMD and Intel are in. AMD builds a new fab and then sells the products so cheap that they never come close to recovering the money they spent to build them. Then they go out to investors for more money and the cycle starts again. After doing this a few times, their debt piles up, their stock tanks, and their ability to borrow money slips away. The bottom line is that whatever new cool product AMD builds will have to be made in their current fabs, and in the fast-moving semiconductor business, if you're not updating your fabs, you're dying.

    AMD slashed prices over and over to get market share from Intel and max out their production, but their sales prices were way below their fab replacement costs. Intel said fine... have some more rope. Now... no more new fabs. AMD just never learned how to sell their products and their technology any way other than with a low price. Yes, Intel didn't play fair and pressured computer companies to buy Intel, but AMD's problems were far deeper than that. AMD needed an accountant to tell them 'wait a minute... your fab will only last for 5 years, so you've got to sell that product for 50 percent more than you are or you won't stay in business.' Yes, it's a competitive market, and Intel sets the price for their competing products, and AMD can't control that... now. But I've also watched AMD sell their products for dirt-cheap prices even when they had Intel in a hammerlock... and I'd scratch my head at how little money AMD made even in those good times, when AMD was setting the price points.
    • Your view is simplistic... AMD wouldn't have made as much of a gain as it did if they had kept prices higher. As it was, for some time Intel was underpricing AMD, forcing them to cut more and more... as only a company 40 times AMD's size can do. AMD chips cost less (on average) when they first launched the Athlon way back when, but Intel responded by reducing prices... which they can cut much lower than AMD can while still turning a profit.

      This is a perfect example of a market that had a monopoly player for
  • Buy low, sell high... we all know that.

    But what they don't tell you is that they intentionally crash the stock. It's all about 3- and 7-year cycles.

    They make the company look bad but they are technologically sitting on gold and silk furniture.

    Devalue the stock and make people want to dump it while you invest your money elsewhere. Then you buy up all the trashed stock a couple years later and pump it back to what it should be, taking a minimum 60 percent return within a year.
