Why AMD Could Win The Coming Visual Computing Battle
Vigile writes "The past week has been rampant with discussion of the new war brewing between NVIDIA and Intel, but one big player has been left out of the story: AMD. It would seem that both sides have written this competitor off, but PC Perspective thinks quite the opposite. The company is having financial difficulties, but AMD already has the technologies that both NVIDIA and Intel are striving to build or acquire: a mainstream CPU, a competitive GPU, high-quality IGP solutions, and technology for hybrid processing. The article postulates that both Intel and NVIDIA are overlooking a still-competitive opponent, which could turn out to be a drastic mistake."
Cash Crunch (Score:5, Interesting)
He said that the company will still bring something out, and that something will still go by the codename "Fusion", but it will not be the product originally envisioned at the time the companies decided to merge. He speculated it might be some kind of multi-chip module -- essentially just a separate CPU die and a separate GPU die mounted in the same package.
Apple's role in AMD-Intel war (Score:4, Interesting)
I respect AMD and had faith in their ability to make a comeback in the past, but there's a new wrinkle this time: Apple.
Apple computer sales are growing at 2.5 times the industry rate, and they use Intel CPUs. With all the growth in the PC market going to Intel CPUs, is there much room for an AMD comeback?
I can see two ways for AMD to make a comeback. If Apple's agreement to use Intel CPUs expires and AMD can win some business with Apple, AMD can latch on to Apple's growth. But Apple chose Intel for its ability to ramp up production. Will AMD be able to provide the same? Will AMD be willing to give up other customers to meet Apple's demand?
If Apple chooses this route, how big of an architecture change will this be? I've no doubt Apple can provide developer tools to aid the migration, but will Core 2 optimizations easily translate to AMD optimizations?
Will Apple take the risk of supporting both architectures? They are very active in LLVM development, which allows dynamic optimization of code. If LLVM works as well as many hope, Apple could deliver software in a common binary format that automatically adapts to any architecture using LLVM. This would be quite novel. Apple would benefit from ongoing competition between Intel and AMD while giving AMD a fighting chance in a market increasingly dominated by Apple.
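To make that idea a bit more concrete (purely a sketch of how it could work, not anything Apple has announced): with LLVM you can compile source once to architecture-neutral bitcode and only pick the target CPU at install or download time. Assuming the LLVM tools clang and llc are available, something like the following would specialize the same bitcode for an Intel or an AMD chip; the file names and CPU choices are just examples.

    # Sketch only: one front-end pass to bitcode, then per-CPU back-end passes.
    import subprocess

    def build_bitcode(src, bc):
        # C source -> LLVM bitcode; no target CPU has been chosen yet.
        subprocess.run(["clang", "-O2", "-emit-llvm", "-c", src, "-o", bc], check=True)

    def lower_for_cpu(bc, cpu, obj):
        # The same bitcode, scheduled and tuned for one specific CPU.
        subprocess.run(["llc", "-mcpu=" + cpu, "-filetype=obj", bc, "-o", obj], check=True)

    build_bitcode("app.c", "app.bc")
    lower_for_cpu("app.bc", "core2", "app_intel.o")  # tuned for Intel Core 2
    lower_for_cpu("app.bc", "k8", "app_amd.o")       # tuned for AMD Athlon 64 / Opteron

The point is that the vendor ships one artifact (the bitcode) and the per-architecture tuning happens on the user's end, which is what would let Apple stay neutral between Intel and AMD.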
The other potential AMD savior is Linux. Can the open source community deliver software that can take advantage of AMD's CPU-GPU architecture spectacularly enough to give AMD the sales it needs?
If Apple weren't in Intel's camp, I would invest in AMD with confidence in a turnaround, but I think the fate of AMD lies largely with adoption by Apple or Linux.
What do you think?
Monkey See, Monkey Do (Score:4, Interesting)
The best strategy, IMO, is to work on a universal processor that combines the strengths of both MIMD and SIMD models while eliminating their obvious weaknesses. AMD needs somebody with the huevos to say, "fooey with this Intel crap! Let's carve our own market and create a completely new technology for a completely new paradigm, parallel processing". Is Hector Ruiz up to the task? Only time will tell. For a different take on the multicore and CPU/GPU issue, read Nightmare on Core Street [blogspot.com].
Re:... vested interest. (Score:5, Interesting)
The thing is, while that is all necessary and good as part of business planning, individual investors really ought not to make investment decisions based on this kind of planning, unless they have their own teams of researchers and analysts and their own sources of information.
If you know nothing about the technology, you can't really examine something like this critically. If you know a great deal about it, you are even less qualified to make prognostications, because your opinion about what is good technology messes with your opinion about what makes good business sense.
Mark Twain was a very intelligent man who lost his entire fortune investing in a revolutionary typesetting system. The things that made him a great writer made him a lousy investor: imagination, contrariness, a willingness to buck convention. Of course, exactly the same qualities describe a superb investor. The thing that really did him in was overestimating his knowledge of a market he was peripherally involved in.
It was natural for Twain to be interested in the process of printing books and periodicals, and to be familiar enough with the process of typesetting in general to see the potential, but not quite intimately enough to see the pitfalls. He would have been better off investing in something he had absolutely no interest or prior experience in.
they have not "written them off" (Score:5, Interesting)
AMD has some financial problems and their stock may sink for a while, but they are not about to go bankrupt. If anyone should be worried about their long-term prospects, it's Nvidia. Intel and AMD both have complete "platforms," in that they can build a motherboard with their own chipset and their own GPU and stick their own CPU in it. Nvidia has a GPU and not a whole lot more; their motherboard chipsets are at an obvious disadvantage, since they have to be designed exclusively around processors whose design is controlled by their direct competitors.
Nvidia's strength has been that, on the high end, their GPUs blow away Intel's in terms of speed and features. Intel has been slowly catching up, and their next iteration will be offered both onboard and as a discrete card, and will have hardware-assisted H.264 decoding.
Nvidia's advantage over ATI has been that ATI has generally had inferior drivers regardless of what platform you were using. Since AMD took over, ATI has been improving their driver situation significantly, both with respect to their proprietary drivers and with their recent release of specs for the open source version. Meanwhile, Nvidia seems to have been doing everything they can to trash the reputation of their drivers over the last year, both with their awful Vista drivers and with the buggy, sloppy control panel they have forced on everyone.
The consensus lately is that we are looking at a future where you will have a machine with lots of processor cores, and CPU/GPU/physics/etc. functions will be tightly coupled. That future does not bode well for Nvidia, since the job of making competitive chipsets for their opponents' processors will only get tougher, while they are at the same time the farthest from having their own platform to sell.
Re:Monkey See, Monkey Do (Score:3, Interesting)
That reminds me of AMD before the Opteron release. Their CPUs sucked because they were always playing catch-up with Intel. Their finances sucked. Luckily for them, Intel made a strategic mistake (called Itanic [wikipedia.org]), giving AMD the opportunity and enough time to release a completely new architecture - AMD64.
I wonder if AMD will get lucky a second time, in a repeat of the "nothing to lose" situation.
I used to work at Intel... (Score:4, Interesting)
Re:... vested interest. (Score:3, Interesting)
If you've ever been on the product management end of the stick, though, the biggest danger is overestimating the number of people who think as you do or visualize their needs as you do. That's why it's dangerous for people with lots of technical knowledge to use it to guide their investments. You can overcome this, but it's a serious trap.
That's why I don't invest in tech companies at all; whenever I have it hasn't worked out.
I did pretty well in the financial services sector for some time, although I'll admit I had more than my fair share of luck. I simply chose that as one of my investments because money bores me. I'm mostly out now, but I'm thinking of getting back in now that a disaster is making people scared of these stocks. That's the ticket: if you balance your portfolio, every time an industry goes down, you end up buying. Every time it goes up, you end up selling. Which is just another way of buying low and selling high.
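If you want to see the mechanics of that, here's a rough Python sketch; the sector names, target weights, and dollar figures are made up, it just shows how holding fixed weights forces you to buy whatever has fallen and trim whatever has run up:

    # Hypothetical numbers: keep each holding at a fixed target weight.
    targets = {"financials": 0.25, "tech": 0.25, "bonds": 0.50}

    def rebalance(holdings):
        """Return the buy (+) / sell (-) amount, in dollars, for each asset."""
        total = sum(holdings.values())
        return {name: targets[name] * total - value
                for name, value in holdings.items()}

    # Financials just dropped, tech just rallied:
    holdings = {"financials": 1500.0, "tech": 3500.0, "bonds": 5000.0}
    print(rebalance(holdings))
    # {'financials': 1000.0, 'tech': -1000.0, 'bonds': 0.0}
    # i.e. buy the sector that fell, sell the one that ran up.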
It doesn't pay to get excited by a single company's brilliant potential. I'm actually contemplating getting out of stocks altogether because it's too tempting to be clever rather than patient. It's really a lot of trouble for an amateur to try to outthink the market; it's better to outwait it.
Re:Cash Crunch (Score:1, Interesting)
I've had gaming rigs run for 5 years on the same mobo and chip, while only selectively upgrading the graphics card & RAM once or twice along the way.
The day they force me to buy some static CPU/GPU combo module for $500+ every year to keep up with the software is the day my PC becomes a media/web browsing machine and my gaming consoles take over that facet of my entertainment.
I rather like being able to upgrade a single cheap piece of hardware every year or two to keep my rig running nicely. That's the whole beauty of PCs over consoles. It'll be a shame if they funk that up.
Re:Sorry, you overlooked the obvious (Score:2, Interesting)
Re:... vested interest. (Score:4, Interesting)
They were better and more expensive than their nVidia counterparts at several points over the last few years.
And whilst Phenom has been a crash, AMD led Intel by a considerable margin through the Athlon 64/Opteron days, before Intel got the "Core" architecture up and running. They left the giant rival chipmaker in the dust, struggling to figure out a way to make the P4 competitive.
BTW, before anyone accuses me of fanboyism, I'll mention I'm posting from a Core 2 Duo box with an nVidia chip. The last upgrade cycle was AMD and ATI all the way, and I hope we do get back to the state where we have multiple players really able to compete and continually outdo each other.
That's good for all of us.
Re:Sorry, you overlooked the obvious (Score:3, Interesting)
The one thing AMD fans miss is that AMD had a huge jump in multi-processor, multi-core technology 4 years ago with Opteron... and Microsoft backed up Intel with late products and underutilized systems that made the Opterons less of a benefit... until Intel caught up... now multi-core is cool. AMD should have pushed Linux and other OSes a lot harder with Opteron instead of thinking MS would "innovate" with their new technology. The real problem is that MS is holding back the market (look at XP for the Eee PC), trying to hang on. It has to get slightly worse before it gets better.