Why AMD Could Win The Coming Visual Computing Battle
Vigile writes "The past week has been rampant with discussion on the new war that is brewing between NVIDIA and Intel, but there was one big player left out of the story: AMD. It would seem that both sides have written this competitor off, but PC Perspective thinks quite the opposite. The company is having financial difficulties, but AMD already has the technologies that both NVIDIA and Intel are striving to build or acquire: mainstream CPU, competitive GPU, high quality IGP solutions and technology for hybrid processing. This article postulates that both Intel and NVIDIA are overlooking a still-competitive opponent, which could turn out to be a drastic mistake."
... vested interest. (Score:1, Flamebait)
Re:... vested interest. (Score:5, Funny)
Re:... vested interest. (Score:5, Insightful)
Comment removed (Score:4, Funny)
Re: (Score:2)
They have to front the product or suffer a reputation sting where they could sell a product, have it work for a m
Re:... vested interest. (Score:4, Interesting)
They were better and more expensive than their nVidia counterparts at several points over the last few years.
And whilst Phenom has been a flop, AMD led Intel by a considerable margin through the Athlon64/Opteron days, before Intel got the "Core" architecture up and running. They left the giant rival chipmaker in the dust, struggling to figure out a way to make the P4 competitive.
BTW, before anyone accuses me of fanboyism, I'll mention I'm posting from a Core 2 Duo box with an nVidia chip. The last upgrade cycle was AMD and ATI all the way, and I hope we do get back to the state where we have multiple players really able to compete and continually outdo each other.
That's good for all of us.
Re: (Score:2)
Re:... vested interest. (Score:5, Interesting)
The thing is, while that is all necessary and good as part of business planning, individual investors really ought not to make investment decisions based on this kind of planning, unless they have their own teams of researchers and analysts and their own sources of information.
If you know nothing about the technology, you can't really examine something like this critically. If you know a great deal about it, you are even less qualified to make prognostications, because your opinion about what is good technology messes with your opinion about what makes good business sense.
Mark Twain was a very intelligent man, who lost his entire fortune investing in a revolutionary typesetting system. The things that made him a great writer made him a lousy investor: imagination, contrariness, a willingness to buck convention. Of course, exactly the same qualities describe a superb investor. The thing that really did him in was overestimating his knowledge of a market he was peripherally involved in.
It was natural for Twain to be interested in the process of printing books and periodicals, and to be familiar enough with the process of typesetting in general to see the potential, but not quite intimately enough to see the pitfalls. He would have been better off investing in something he had absolutely no interest or prior experience in.
Re: (Score:2)
Re: (Score:3, Interesting)
If you've ever been on the product management end of the stick, though, the biggest danger is overestimating the number of people who think as you do or visualize their needs as you do. That's why it's dangerous for people with lots of technical knowledge to use it to guide their investments. You can overcome this, but it's a serious trap.
That's why I don't invest in tech companies at all; whenever I have it hasn't worked out.
I did pretty well in
Re: (Score:2, Funny)
Re: (Score:2)
I can't think of that many people outside the tech industry who have enough knowledge to get rich investing in it through anything more than luck.
Warren Buffett is an overused example, but I'll bring him up anyway: he avoids tech stocks entirely in favor of things he does understand, like insurance, soft drinks, retail stores, restaurants and, recently, railways.
Re: (Score:1, Informative)
Guilty as charged as well, here.
Re: (Score:2)
Sorry, you overlooked the obvious (Score:3, Informative)
Re:Sorry, you overlooked the obvious (Score:5, Insightful)
Re: (Score:2)
Re:Sorry, you overlooked the obvious (Score:5, Informative)
So it's going to come down to whether or not AMD has the ability right now to keep pushing their product lines and innovating fast enough to beat Intel and nVidia to the punch. Their financial situation hurts their chances, but it doesn't negate them completely.
Re: (Score:3, Interesting)
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:2, Interesting)
Re: (Score:2, Insightful)
Partnering with VIA gives nVidia about as much CPU as Intel already has GPU, though... Having class A components (even if they're really only A- or B+) in house could prove to be a big advantage for AMD.
Re: (Score:2)
Dons an asbestos suit... (Score:1)
Catch & Release... (Score:5, Insightful)
It's nice to know that they still maintain an edge, even though they have nowhere near the capital on hand that nVidia and Intel do.
I for one always liked underdogs...
Re: (Score:2)
Heck, 10 years ago the press had anointed 3dfx as king while nVidia was a barely-mentioned also-ran indistinguishable from the half-dozen other 3D chipset manufacturers. These companies stumbling on one major release is no big deal. If they stumble on two sequential releases l
More like zombie visual computing (Score:2, Funny)
Re:More like zombie visual computing (Score:5, Funny)
Re: (Score:3, Funny)
Cash Crunch (Score:5, Interesting)
He said that the company will still bring something out, and that something will still go by the codename "Fusion", but it will not be the product originally envisioned at the time the companies decided to merge. He speculated maybe some kind of Multi-Chip Module -- essentially just a separate CPU and a separate GPU die mounted into the same packaging.
Re: (Score:2)
I used to work at Intel... (Score:4, Interesting)
Comment removed (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
Personally, I would like to see SLI/CrossFire-like setups actually give you a near-100% boost in rendering speed. From all the benchmarks I have seen, you barely get a 20-30% speed increase for a 100% price increase.
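As a rough back-of-envelope model of why multi-GPU scaling falls short (my own illustration, not from the thread; the 50% figure is an assumption), Amdahl's law applied to frame time:

```python
# Back-of-envelope model of multi-GPU scaling (Amdahl's law applied to
# rendering): only the parallelizable fraction of frame time speeds up.

def speedup(parallel_fraction: float, n_gpus: int) -> float:
    """Overall speedup when only `parallel_fraction` of the work scales."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_gpus)

# If only ~50% of frame time is GPU-bound work that SLI can split,
# two GPUs give about a 1.33x speedup -- roughly the 20-30% gain
# seen in benchmarks, for a 100% price increase.
print(round(speedup(0.5, 2), 2))   # → 1.33
```

The model is crude, but it shows why doubling GPUs never doubles frame rates: CPU work, driver overhead, and inter-GPU synchronization all sit in the serial fraction.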
Re: (Score:2)
Just because Fusion might not be the glorious flagship envisioned by AMD doesn't mean that it'll flop.
Apple's role in AMD-Intel war (Score:4, Interesting)
I respect AMD and had faith in their ability to make a comeback in the past, but there's a new wrinkle this time: Apple.
Apple computer sales are growing at 2.5 times the industry rate, and they use Intel CPUs. With all the growth in the PC market going to Intel CPUs, is there much room for an AMD comeback?
I can see two ways for AMD to make a comeback. If Apple's agreement to use Intel CPUs expires and AMD can win some business with Apple, AMD can latch on to Apple's growth. But Apple chose Intel for its ability to ramp up production. Will AMD be able to provide the same? Will AMD be willing to give up other customers to meet Apple's demand?
If Apple chooses this route, how big of an architecture change will this be? I've no doubt Apple can provide developer tools to aid the migration, but will Core 2 optimizations easily translate to AMD optimizations?
Will Apple take the risk of supporting both architectures? They are very active in LLVM development, which allows dynamic optimization of code. If LLVM works as well as many hope, Apple could deliver software in a common binary format that automatically adapts to any architecture using LLVM. This would be quite novel. Apple would benefit from ongoing competition between Intel and AMD while giving AMD a fighting chance in a market increasingly dominated by Apple.
The other potential AMD savior is Linux. Can the open source community deliver software that can take advantage of AMD's CPU-GPU architecture spectacularly enough to give AMD the sales it needs?
If Apple weren't in Intel's camp, I would invest in AMD with confidence in a turnaround, but I think the fate of AMD lies largely with adoption by Apple or Linux.
What do you think?
Re:Apple's role in AMD-Intel war (Score:5, Informative)
Re: (Score:2)
Re: (Score:2)
If AMD's powerpoint slides magically came true then Barcelona would have been out in April... of 2007, and it would have actually been faster and less power-hungry than Intel's chips. Unfortunately, presentations != reality.
Re: (Score:2)
Anyhow, that wasn't my point - I was wondering if it worked in software mode.
Re: (Score:2)
Re: (Score:2)
Re:Apple's role in AMD-Intel war (Score:5, Insightful)
On the desktop end they would have to get something working to showcase the performance in games. Unfortunately, open source doesn't have a lot of 3D games floating around.
Whatever happens, I think they're going to have to show something that works well with Windows, or else they're going to flop. If it works well enough with Windows and they can show substantial performance improvements, then get manufacturing capacity up, they might be able to land an Apple contract. It would be huge for publicity and for a single contract, but for the overall market, it's not going to make or break them.
Re: (Score:2)
They don't have much need to show off server muscle; Intel still has to work to compete with AMD on performance in that market, where Core 2 didn't make as strong a comeback for them.
Re: (Score:2)
As to your later comments, Intel and AMD CPUs still follow the x86 architecture that makes them play nice with the same software. I imagine Mac software would work just fine on an AMD chip, and I seem to recall reading abo
Re: (Score:2)
I know it's 25% of laptop sales by revenue, but I'm not sure on the unit counts.
Re: (Score:2)
IIRC it's near 10% now. Nearly 20% of laptop sales, too.
As others have pointed out, those numbers are from USA retail sales, which doesn't include international sales and non-retail sales (e.g. direct, business). I think total worldwide unit sales is what's important for your original point about Apple's role in the AMD-Intel war.
According to Gartner's latest numbers [gartner.com], Apple had 6.6% of USA sales in Q1 2008, up from 5.2% in Q1 2007. IDC says [idc.com] Apple has 6% USA market share, up from 4.9%.
Apple's worldwide market share wasn't listed because they weren't in the t
Re: (Score:2)
Re: (Score:2)
Monkey See, Monkey Do (Score:4, Interesting)
The best strategy, IMO, is to work on a universal processor that combines the strengths of both MIMD and SIMD models while eliminating their obvious weaknesses. AMD needs somebody with the huevos to say, "fooey with this Intel crap! Let's carve our own market and create a completely new technology for a completely new paradigm: parallel processing." Is Hector Ruiz up to the task? Only time will tell. For a different take on the multicore and CPU/GPU issue, read Nightmare on Core Street [blogspot.com].
Re: (Score:2, Insightful)
Re: (Score:2, Funny)
Re: (Score:2, Funny)
Re: (Score:2)
Re: (Score:2)
Re:Monkey See, Monkey Do (Score:5, Insightful)
Re: (Score:2)
And in truth, AMD APPEARS to be ahead in the move to fusing a CPU and GPU architecture into something new.
Re:Monkey See, Monkey Do (Score:4, Insightful)
It's effectively a multicore version of a laptop-adapted Pentium III with a bunch of modern features tacked on.
Nobody ever envisioned that this would work as well as it did, and Intel only started paying attention to the idea once their lab in Israel was producing low-power mobile chips that were faster than their flagship Pentium 4 desktop chips.
AMD didn't have an answer to Core, because Intel themselves were largely ignorant of the fact that the P6 architecture that they had previously deemed obsolete was adaptable to more modern systems. AMD saw Itanium and Pentium 4 in Intel's roadmaps, and knew that it had nothing to fear, as the products they had developed were vastly superior to both.
Re: (Score:2)
It was never the monumental change many made it out to be for desktop systems; it's another incremental improvement in performance.
Re: (Score:3, Interesting)
That reminds me of AMD before the Opteron release. Their CPUs sucked because they were always catching up with Intel. Their finances sucked. Luckily for them, Intel made a strategic mistake (called Itanic [wikipedia.org]), thus giving AMD the opportunity and enough time to release a completely new architecture - AMD64.
I wonder if AMD will get lucky a second time - in the repeated "nothing to lose" situation.
Re: (Score:2)
Some thoughts:
1. A very-low power, slow core tied to a super heavy-duty number cruncher on the same die that use the same instruction set. One could imagine an OS shutting off the big core when all it has to do is blink the cursor to save power, but firing it back up when you click "Compute". Done right, it seems like this could give you a laptop with a day or
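A rough sketch of the battery-life argument behind that idea (all numbers are invented for illustration; this is not data about any real chip):

```python
# Hypothetical illustration: battery life of a laptop that powers off a
# big core and runs only a tiny low-power core when idle. Numbers are
# made up; the point is how much duty cycle dominates average draw.

def battery_hours(capacity_wh: float, idle_w: float,
                  busy_w: float, busy_fraction: float) -> float:
    """Average runtime given a duty cycle between idle and busy power draw."""
    avg_w = busy_w * busy_fraction + idle_w * (1.0 - busy_fraction)
    return capacity_wh / avg_w

# 60 Wh battery; 2 W with only the small core awake, 25 W crunching.
# At a 5% duty cycle the machine averages ~3.15 W -- about 19 hours.
print(round(battery_hours(60, 2, 25, 0.05), 1))   # → 19.0
```

Most interactive workloads really are "blink the cursor" most of the time, which is why gating the big core pays off so dramatically in this toy model.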
Examples, please (Score:2)
If you are saying that all problems can be parallelized with a net gain in elapsed time to solve, please provide the math proof that supports that position. Don
Re: (Score:2)
Re: (Score:2)
Gosh. You make it sound like AMD is being accused of a crime. The point I was making is that, even if there are only a few former Intel leaders at AMD, the fact remains that AMD was formed to compete against Intel on Intel's own turf, x86 compatibility. Not that there was anything wrong with that in those days. AMD did pretty well considering what they were up against. My point is that the computer world has changed drastically to the point that even Intel is gett
Re: (Score:2)
Yea, AMD would really suffer if this small, insignificant market was the only bunch who liked their products.
Hope (Score:2)
they have not "written them off" (Score:5, Interesting)
AMD has some financial problems and their stock may sink for a while, but they are not about to go bankrupt. If anyone should be worried about their long-term prospects, it's Nvidia. Intel and AMD both have complete "platforms", as in they can build a motherboard with their own chipset, their own GPU, and stick their own CPU in it. Nvidia has a GPU and not a whole lot more; their motherboard chipsets are at an obvious disadvantage if they need to design chipsets exclusively for processors whose design is controlled by their direct competitors.
Nvidia's strength has been that on the high end they blow away Intel GPUs in terms of speed and features. Intel has been slowly catching up, and their next iteration will be offered both onboard and as a discrete card, and will have hardware-assisted H.264 decoding.
Nvidia's advantage over ATI has been that ATI has generally had inferior drivers regardless of what platform you were using. Since AMD took over, ATI has been improving their driver situation significantly, both with respect to their proprietary drivers and their recent release of specs for the open source version. Meanwhile, Nvidia seems to have been doing everything they can to trash the reputation of their drivers over the last year, both with their awful Vista drivers and the buggy/sloppy control panel that they have forced on everyone.
The consensus lately is that we are looking at a future where you will have a machine with lots of processor cores and cpu/gpu/physics/etc functions will be tightly coupled. This is a future that does not bode well for Nvidia since the job of making competitive chipsets for their opponents will get tougher while they are at the same time the farthest from having their own platform to sell.
Re: (Score:2)
Re: (Score:3, Funny)
AMD is making a break for the open source arena. I gave Hector that advice a while ago. Apparently, he was listening in his anonymous drunken stupor on the financial forums. AMD is poised to make a stand in the next 2 to 3 years.
Re: (Score:2)
Nvidia's advantage over ATI has been that ati has generally had inferior drivers regardless of what platform you were using, since AMD took over ATI has been improving their driver situation significantly both with respect to their proprietary drivers and their recent release of specs for the open source version. Meanwhile Nvidia seems to have been doing everything they can to trash the reputation of their drivers over the last year both with their awful Vista drivers and their buggy/sloppy control panel that they have forced on everyone.
While this is good for Linux, are they making similar improvements in the Windows arena? I may be biased, but I always preferred Nvidia drivers to ATI ones.
If ATI sorts out their drivers, then they will be able to cash in on the market that needs these chips, which also happens to be the one where the money is: laptops.
My only problem with AMD is that their CPUs don't seem to scale as low as Intel ones; my current 2.0GHz only drops to 800MHz, but my Intel one would drop from 1.6GHz to 200MHz, not sure how t
Re: (Score:2)
Your Intel CPU would drop from 1600 MHz to 200 MHz? Are you sure?
My 1200MHz Core 2 Duo ULV only drops to 800 MHz. My 4 year old Pentium M system dropped from 1400 MHz to 600 MHz.
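For context on why a lower clock floor matters: CMOS dynamic power scales roughly as C·V²·f, so dropping frequency usually also allows a lower voltage, and the savings compound. A toy calculation (the voltages here are made up, not measurements from any of these chips):

```python
# Rough CMOS dynamic-power model: P is proportional to C * V^2 * f.
# All numbers are illustrative; real chips also have static leakage.

def dynamic_power(cap_rel: float, volts: float, freq_mhz: float) -> float:
    """Relative dynamic power for a given capacitance, voltage and clock."""
    return cap_rel * volts**2 * freq_mhz

full   = dynamic_power(1.0, 1.30, 1600)   # 1.6 GHz at a nominal 1.30 V
scaled = dynamic_power(1.0, 1.00, 800)    # 800 MHz at a reduced 1.00 V

# Halving the clock plus the voltage drop cuts dynamic power to ~30%.
print(round(scaled / full, 2))   # → 0.3
```

This is why frequency floors alone don't tell the whole story: two chips with the same minimum clock can differ a lot in idle draw depending on how far the voltage comes down.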
Re: (Score:2)
Your Intel CPU would drop from 1600 MHz to 200 MHz? Are you sure?
Fairly sure, but I don't have it with me so I can't confirm it 100%; it was a Celeron.
That makes no sense. Wifi is going to use power no matter what; it takes power to transmit.
But it also uses a lot of interrupts; my powertop output on an idle internet connection looks something like:
Top causes for wakeups: 10s
46.6% (113.4) : wifi0
11.4% ( 27.8) firefox-bin : futex_wait (hrtimer_wakeup)
9.3% ( 22.6) kontact : schedule_timeout (process_timeout)
7.9% ( 19.2) kicker : schedule_timeout (process_timeout)
4.8% ( 11.7) Xorg : do_setitimer (it_real_fn)
4.5% ( 11.0) : acpi
Surely something can be done to cut down on those CPU wakeups, or to develop a power-saving protocol that doesn't use battery when nothing is being sent/received.
Re: (Score:2)
Are more CPU wakeups bad? It appears this means that the CPU is sleeping more. NICs have to do something with those packets.
Many NICs already have interrupt moderation.
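The arithmetic of interrupt moderation is simple: coalescing N packets into one interrupt divides the wakeup rate by N. A toy model (the packet rate is made up to echo the powertop figures above; real NICs also add a latency cost per coalesced batch):

```python
# Toy model of NIC interrupt moderation: batching packets per interrupt
# cuts CPU wakeups roughly by the batch factor. Numbers are illustrative.

def wakeups_per_sec(packets_per_sec: float, packets_per_interrupt: int) -> float:
    """Interrupt (wakeup) rate when the NIC coalesces packets."""
    return packets_per_sec / packets_per_interrupt

print(wakeups_per_sec(1134, 1))    # → 1134.0  unmoderated: one wakeup per packet
print(wakeups_per_sec(1134, 10))   # → 113.4   coalescing 10 packets per interrupt
```

The trade-off is latency: each coalesced packet waits up to the moderation window before the CPU sees it, which is why NICs expose this as a tunable rather than a fixed behavior.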
Re: (Score:2)
It looks more like their design might be for small integrated devices maybe a li
Why AMD + ATI should win, plus why they won't (Score:5, Insightful)
Why AMD + ATI won't win: AMD won't risk alienating their OEM partners who also manufacture Intel motherboards and NVidia boards. Also, it's AMD.
Re: (Score:2)
It'd overheat. Like crazy. Current GPUs usually run 40°C hotter than most CPUs.
Re: Heat (Score:4, Insightful)
Putting both GPU and CPU in close proximity to each other should help, not hinder. I think you mistook the GP for saying they'd be on the same die, but he said bus, not die.
It may be that they need to be separated a couple of inches from each other to allow room for fanout of the CPU signals to the rest of the board rather than having them in the same socket. If they weren't separated, and the chip packaging was the same height, they could design one heat sink over both chips. This reduces the parts count for the fan and heatsink and therefore increases reliability.
Having something on a plug in card with such an extreme cooling requirement just doesn't make sense. You aren't allowed much space for heat sink design between it and the next slot. Having the GPU on the motherboard gives case/motherboard designers more room for the heatsink design.
Re: (Score:2)
Why AMD + ATI Should win: Hypertransport.
Another possible reason AMD + ATI won't win: it's too late. Intel's QuickPath Interconnect (QPI) is coming later this year when Nehalem ("tock") is launched.
Putting the GPU on the same bus as the CPU should theoretically eliminate whatever roadblocks the PCI bus created. Plus, allowing for die-to-die communication and treating the GPU as a true co-processor instead of a peripheral should open up huge possibilities for performance boosts.
Intel has said (and shown in their diagrams) that some versions of Nehalem will have integrated graphics. However, their big GPU statement isn't coming until 2009-2010 in the form of Larrabee. Even if Larrabee is delayed, it might be too late for AMD's Fusion. By the time Fusion launches, Intel should have their interconnect and GPU ready.
What you n [arstechnica.com]
Re: (Score:2)
From a performance perspective, graphic
AMD has some great solutions (Score:4, Insightful)
A huge number of PCs never play a game more graphically intensive than Tetris and are never used to transcode video!
Right now on Newegg you can pick up an Athlon 64 X2 4000+ for $53.99.
The cheapest Core 2 Duo is $124.99. Yes, it is faster, but will you notice? Most people probably will not.
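A quick price-per-performance check using those Newegg prices (the 25% performance edge for the Core 2 Duo is my assumption for illustration, not a benchmark result):

```python
# Dollars per unit of relative performance; lower is better value.
# Prices from the comment above; the 1.25x Core 2 speed factor is assumed.

def price_per_perf(price_usd: float, relative_perf: float) -> float:
    """Cost per unit of performance relative to a baseline chip."""
    return price_usd / relative_perf

amd   = price_per_perf(53.99, 1.00)    # Athlon 64 X2 4000+ as the baseline
intel = price_per_perf(124.99, 1.25)   # cheapest Core 2 Duo, assumed 25% faster

print(round(amd, 2), round(intel, 2))  # → 53.99 99.99
```

Even granting Intel a healthy speed advantage, the AMD chip delivers nearly twice the performance per dollar under these assumptions, which is the whole budget argument in one line.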
Price per Performance keeps AMD alive (Score:5, Insightful)
We have AMD to thank for the fact that high-end CPUs from Intel cost $300 instead of $1000 right now.
KISS (Score:2, Funny)
AMD is out of money... (Score:3, Informative)
Re: (Score:2)
This is a perfect example of a market that had a monopoly player for
As far as stock, they intentionally did it (Score:2)
But what they don't tell you is they intentionally crash the stock. It's all about 3 and 7 year cycles.
They make the company look bad but they are technologically sitting on gold and silk furniture.
Devalue the stock and make people want to dump it while you invest your money elsewhere. Then you buy up all the trashed stock a couple years later and pump it back to what it should be, taking a minimum 60 percent return within a year.
Re: (Score:1)
Re: (Score:1)
Silly me.
Re: (Score:1, Offtopic)
And no, the information I'm sharing isn't illegal.
Re: (Score:2)
Re: (Score:2, Informative)
Re:AMD bought out ATI? (Score:4, Informative)
Re: (Score:2)
They both have their places. Hammers, screwdrivers, all that jazz.
Re: (Score:2)
Re: (Score:2)
$30 versus $X.
You see, as X approaches infinity, the value of the thumb drive increases infinitely!
I mean, it's not like you actually need permanent copies of old Linux distros. Once they're used, they're no longer necessary - like 99.9% of everything you burn to a DVD.
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
S939 was around a long time.
AM2, AM2+ are compatible.
Intel has had just LGA775 for a long time - but with each new processor release, new power requirements forced new motherboards to be purchased/manufactured anyway.
For someone who goes through a LOT of motherboards, I still give AMD the edge here.
Re: (Score:2)
The crucial difference is that Intel is a one-stop CPU shop: they produce everything in house. AMD is much, much smaller than Intel and often has to resort to outsourcing.
What is important to Apple is that Intel delivers a complete solution in several markets. AMD has lots of holes in its offerings which need to be filled by 3rd-party components to make a product out of them. For Apple it is important to keep things secret until a product reaches mark