
Why AMD Could Win The Coming Visual Computing Battle 161

Vigile writes "The past week has been rampant with discussion on the new war that is brewing between NVIDIA and Intel, but there was one big player left out of the story: AMD. It would seem that both sides have written this competitor off, but PC Perspective thinks quite the opposite. The company is having financial difficulties, but AMD already has the technologies that both NVIDIA and Intel are striving to build or acquire: mainstream CPU, competitive GPU, high quality IGP solutions and technology for hybrid processing. This article postulates that both Intel and NVIDIA are overlooking a still-competitive opponent, which could turn out to be a drastic mistake."
This discussion has been archived. No new comments can be posted.


  • by brunes69 ( 86786 ) <[slashdot] [at] [keirstead.org]> on Thursday April 17, 2008 @02:10PM (#23108152)
    Only a 0.2 decline in revenues in the midst of what many consider an already begun recession ain't too bad.
  • Catch & Release... (Score:5, Insightful)

    by Deadfyre_Deadsoul ( 1193759 ) on Thursday April 17, 2008 @02:14PM (#23108232) Journal
    AMD has supposedly been dead and written off how many times in the past few years? ATI as well?

    It's nice to know that they still maintain an edge, even though they have nowhere near the capital on hand that nVidia and Intel do.

    I for one always liked Underdogs... :)
  • by konputer ( 568044 ) <{slashdot} {at} {konputer.org}> on Thursday April 17, 2008 @02:15PM (#23108250) Homepage
    I'm still rooting for AMD. I think that they can pull themselves out of the mess they made. Why? No sane reason. But whenever the U.S. economy decides to come back up, so will AMD.
  • by What Would NPH Do ( 1274934 ) on Thursday April 17, 2008 @02:47PM (#23108762)

    "Let's carve our own market and create a completely new technology for a completely new paradigm, parallel processing."
    Parallel processing is a new paradigm? Since when? The 1960s called, they want you to stop stealing their ideas.
  • by samkass ( 174571 ) on Thursday April 17, 2008 @03:11PM (#23109136) Homepage Journal
    From the introduction of the Athlon by AMD (the first really "modern" x86 CPU that finally eliminated most of the CISC disadvantages), through on-die memory controllers and dragging Intel kicking and screaming into the 64-bit world, right up until AMD's lack of a solid response to Core, I'd say AMD led Intel's thinking. Now they're the followers again.
  • by Alzheimers ( 467217 ) on Thursday April 17, 2008 @03:13PM (#23109156)
    Why AMD + ATI should win: HyperTransport. Putting the GPU on the same bus as the CPU should theoretically eliminate whatever roadblocks the PCI bus created. Plus, allowing for die-to-die communication and treating the GPU as a true co-processor instead of a peripheral should open up huge possibilities for performance boosts.

    Why AMD + ATI won't win: AMD won't risk alienating their OEM partners who also manufacture Intel motherboards and NVidia boards. Also, it's AMD.
  • by Anonymous Coward on Thursday April 17, 2008 @03:20PM (#23109284)

    ...work on a universal processor that combines the strengths of both MIMD and SIMD models while eliminating their obvious weaknesses.
    And just what, pray tell, is that model? Really, I'm interested. How do you get the best of both with the weaknesses of neither?
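    The MIMD/SIMD distinction the commenters are trading barbs over can be made concrete with a small sketch. This is purely illustrative (the function names and the thread-pool framing are mine, not anything from an actual AMD, Intel, or NVIDIA design): SIMD applies one operation across many data lanes at once, while MIMD runs independent instruction streams on independent data.

    ```python
    # Illustrative only: the two parallel models contrasted above.
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    # SIMD-style: a single vectorized operation applied to every element at once.
    def simd_style(xs: np.ndarray) -> np.ndarray:
        return xs * 2.0 + 1.0  # one "instruction", many data lanes

    # MIMD-style: two unrelated instruction streams running concurrently.
    def mimd_style():
        with ThreadPoolExecutor(max_workers=2) as pool:
            a = pool.submit(sum, range(1000))      # stream 1: a reduction
            b = pool.submit(max, [3, 1, 4, 1, 5])  # stream 2: a different task
            return a.result(), b.result()
    ```

    A "universal" processor, as the original poster asks about, would have to schedule both patterns well at once; GPUs of this era lean heavily toward the first.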
  • by LWATCDR ( 28044 ) on Thursday April 17, 2008 @03:31PM (#23109404) Homepage Journal
    Right now AMD has some great CPUs on the low end and the best integrated graphics solution.
    A huge number of PCs never play a game more graphically intensive than Tetris and are never used to transcode video!
    Right now on newegg you can pick up an Athlon X2 64 4000 for $53.99
    The cheapest Core2Duo is $124.99. Yes it is faster but will you notice? Most people probably will not.

  • by moderatorrater ( 1095745 ) on Thursday April 17, 2008 @03:35PM (#23109448)

    What do you think?
    That Linux is a dominant player in the server market and that Apple is pretty much negligible in either. With how similar Intel and AMD chips tend to be, I don't know that there's anything stopping Apple from switching to AMD at any time. Either way, it's a relatively small chunk of the desktop market.

    The other potential AMD savior is Linux. Can the open source community deliver software that can take advantage of AMD's CPU-GPU architecture spectacularly enough to give AMD the sales it needs?
    This is an interesting question. When AMD comes out with their chips, if they really want to impress people with their abilities, they would do well to get some of the coders behind Folding@Home working on the new chips. It was impressive to see what ATI cards could do with that code, and it would be a great way to showcase the chips' abilities to the computationally heavy programs that run on servers (thereby breaking into that market).

    On the desktop end they would have to get something working to showcase the performance in games. Unfortunately, open source doesn't have a lot of 3d games floating around.

    Whatever happens, I think they're going to have to show something that works well with Windows or else they're going to flop. If it works well enough with Windows and they can show substantial performance improvements, then get manufacturing capacity up, they might be able to land an Apple contract. It would be huge for publicity and for a single contract, but for the overall market, it's not going to make or break them.
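    Folding@Home's appeal as a showcase comes from the shape of its workload: the expensive inner loop is an independent per-pair energy/force computation, which vectorizes naturally onto stream processors like ATI's. A toy sketch of that data-parallel pattern (the potential and all names here are illustrative, not Folding@Home's actual code):

    ```python
    # Toy illustration of a data-parallel molecular-dynamics energy sum; every
    # particle pair is independent, which is why this kind of loop maps well to GPUs.
    import numpy as np

    def pairwise_energy(positions: np.ndarray) -> float:
        """Sum a toy Lennard-Jones-style term (epsilon = sigma = 1) over all pairs."""
        diff = positions[:, None, :] - positions[None, :, :]  # (n, n, 3) displacements
        r2 = (diff ** 2).sum(axis=-1)                         # squared pair distances
        i, j = np.triu_indices(len(positions), k=1)           # count each pair once
        inv6 = 1.0 / r2[i, j] ** 3                            # (1/r)^6 per pair
        return float(np.sum(inv6 ** 2 - inv6))                # (1/r)^12 - (1/r)^6
    ```

    Every pair's term is computed with no dependence on any other pair, so the same code runs as wide as the hardware allows, which is exactly the property a CPU-GPU hybrid would want to advertise.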
  • by Eldragon ( 163969 ) on Thursday April 17, 2008 @03:55PM (#23109724)
    What is overlooked by most of the PC enthusiast press is that AMD still offers an excellent price/performance ratio that Intel does not match.

    We have AMD to thank for the fact that high-end CPUs from Intel cost $300 instead of $1000 right now.
  • Re: Heat (Score:4, Insightful)

    by IdeaMan ( 216340 ) on Thursday April 17, 2008 @04:38PM (#23110370) Homepage Journal
    Ok let's talk about heat.

    Putting both GPU and CPU in close proximity to each other should help, not hinder. I think you mistook the GP for saying they'd be on the same die, but he said bus, not die.
    It may be that they need to be separated a couple of inches from each other to allow room for fanout of the CPU signals to the rest of the board rather than having them in the same socket. If they weren't separated, and the chip packaging was the same height, they could design one heat sink over both chips. This reduces the parts count for the fan and heatsink and therefore increases reliability.

    Having something on a plug in card with such an extreme cooling requirement just doesn't make sense. You aren't allowed much space for heat sink design between it and the next slot. Having the GPU on the motherboard gives case/motherboard designers more room for the heatsink design.
  • by moosesocks ( 264553 ) on Thursday April 17, 2008 @05:19PM (#23110988) Homepage
    Intel got lucky with Core. It was never on their roadmap as a flagship desktop chip.

    It's effectively a multicore version of a laptop-adapted Pentium III with a bunch of modern features tacked on.

    Nobody ever envisioned that this would work as well as it did, and Intel only started paying attention to the idea once their lab in Israel was producing low-power mobile chips that were faster than their flagship Pentium 4 desktop chips.

    AMD didn't have an answer to Core, because Intel themselves were largely ignorant of the fact that the P6 architecture that they had previously deemed obsolete was adaptable to more modern systems. AMD saw Itanium and Pentium 4 in Intel's roadmaps, and knew that it had nothing to fear, as the products they had developed were vastly superior to both.
  • by BobPaul ( 710574 ) * on Thursday April 17, 2008 @09:30PM (#23113088) Journal
    Hence why I mentioned VIA specifically ;)

    Partnering with VIA gives nVidia about as much CPU as Intel already has GPU, though... Having class A components (even if they're really only A- or B+) in house could prove to be a big advantage for AMD.
