Nvidia Claims Intel's Larrabee Is "a GPU From 2006"
Barence sends this excerpt from PC Pro:
"Nvidia has delivered a scathing criticism of Intel's Larrabee, dismissing the multi-core CPU/GPU as wishful thinking — while admitting it needs to catch up with AMD's current Radeon graphics cards. 'Intel is not a stupid company,' conceded John Mottram, chief architect for the company's GT200 core. 'They've put a lot of people behind this, so clearly they believe it's viable. But the products on our roadmap are competitive to this thing as they've painted it. And the reality is going to fall short of the optimistic way they've painted it. As [blogger and CPU architect] Peter Glaskowsky said, the "large" Larrabee in 2010 will have roughly the same performance as a 2006 GPU from Nvidia or ATI.' Speaking ahead of the opening of the annual NVISION expo on Monday, he also admitted Nvidia 'underestimated ATI with respect to their product.'"
Doh of the Day (Score:5, Insightful)
Good, learn from that and don't make that same mistake again!
Larrabee [...] will have roughly the same performance as a 2006 GPU from Nvidia or ATI.'
DOH!
Intel isn't aiming at gamers (Score:5, Insightful)
So why is NVIDIA on the defensive?
Intel is aiming at number crunchers (note that their chip uses doubles, not floats). They don't want NVIDIA to steal that market with CUDA.
When Intel says "graphics", they mean movie studios, etc.
If Larrabee eventually turns into a competitor for NVIDIA, all well and good, but that's not their goal at the moment.
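On the doubles-versus-floats point above, here's a minimal sketch of why the number-crunching crowd insists on double precision (plain C, nothing Larrabee- or CUDA-specific, and the iteration count is just an illustrative choice). Accumulate ten million small terms and single precision drifts visibly, while double precision stays on target:

    #include <stdio.h>

    int main(void) {
        /* Sum 0.1 ten million times; the exact answer is 1,000,000.
           Illustrative toy, not a benchmark of any real chip. */
        float  fsum = 0.0f;
        double dsum = 0.0;
        for (int i = 0; i < 10000000; i++) {
            fsum += 0.1f;   /* 24-bit mantissa: rounding error piles up */
            dsum += 0.1;    /* 53-bit mantissa: error stays negligible */
        }
        printf("float : %f\n", fsum);  /* visibly overshoots 1000000 */
        printf("double: %f\n", dsum);  /* ~1000000.0 */
        return 0;
    }

On hardware that only does floats fast, you either live with that drift or burn cycles working around it, which is exactly the market opening being described here.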
Better than NVIDIA's proprietary hardware (Score:2, Insightful)
At least Intel documents their hardware. Fuck NVIDIA and their stupid proprietary hardware!
Re:Intel isn't aiming at gamers (Score:5, Insightful)
So why is NVIDIA on the defensive?
I think the Nvidia people think pretty highly of themselves, and rightfully so, and Intel has recently been making a number of bold claims without backing them up. In a poker analogy, Nvidia is calling Intel's bluff.
Classic case of disruption (Score:4, Insightful)
Ten years ago you would see Nvidia GPUs in everything from the low end to the high end. Today, not so much - Intel dominates the low-end spectrum, with ATI hanging onto a somewhat insignificant market share. Larrabee is Intel moving upmarket. Sure, it might not perform as well as the latest Nvidia or ATI high-end GPU, but it might be good enough in terms of performance, or have other benefits (better OSS support), to win over some of Nvidia's current market share. Considering it's supposedly the Pentium architecture recycled, it's also reasonable to assume the design will be relatively cost-effective, allowing Intel to sell at very competitive prices while still maintaining healthy profit margins.
It's a classic case of disruption. Intel enters and Nvidia is happy to leave because there's a segment above that's much more attractive to pursue. Continue along the same lines until there's nowhere for Nvidia to run, at which point the game ends - circle of disruption complete. See also Silicon Graphics, Nvidia's predecessor in many ways.
Re:Intel isn't aiming at gamers (Score:2, Insightful)
So why is NVIDIA on the defensive?
They're substituting rhetoric for an actual competitive product. Right now they're crapping their pants because they gambled everything on Vista, which is failing spectacularly, whereas both ATi and Intel have a two-year head start on supporting Ubuntu out of the box. You can say Ubuntu isn't Linux, but it's what all those Dell buyers are going to see.
Re:Intel isn't aiming at gamers (Score:0, Insightful)
You mean like ATI has open-sourced and open-specced most of its hardware?
Yes, I can see how that would give Intel a massive advantage...
I'll tell you what will be scathing... (Score:2, Insightful)
If, in the future, the trend is for all GPUs to be integrated.
Intel, Nvidia, AMD and ATI...
Who is the odd one out there?
Re:in raw power he's probably right (Score:3, Insightful)
but the extra programmability Larrabee has, as it's just a bunch of CPUs with some GPU instructions
Agreed -- why stick to GPU applications when you have a general-purpose multicore machine? How about getting those new instructions into general usage -- remember how MMX was originally introduced for stuff we now run on GPUs.
As for traditional GPU applications, there's already an OpenGL driver for the Cell SPUs in development. A similar driver for a generic multicore machine would be nice, particularly if it's not limited to Larrabee and x86. Of course we already have software implementations of OpenGL, but I wonder how well those scale with dozens of CPUs.
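On that scaling question: software rasterization is embarrassingly parallel, since rows (or tiles) of the framebuffer are independent, so in principle throughput grows with core count until memory bandwidth saturates. A toy sketch of the work-splitting pattern, in C with OpenMP (build with -fopenmp; shade() is a made-up stand-in for real per-pixel work, not code from any actual OpenGL implementation):

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define W 1024
    #define H 768

    /* Stand-in for real per-pixel rasterization/shading work. */
    static uint32_t shade(int x, int y) {
        return ((uint32_t)((x ^ y) & 0xFF)) * 0x010101u; /* gray XOR pattern */
    }

    int main(void) {
        uint32_t *fb = malloc((size_t)W * H * sizeof *fb);
        if (!fb) return 1;

        /* Rows are independent, so they can be handed to different cores;
           with N cores this loop finishes roughly N times sooner, until
           the framebuffer writes hit the memory-bandwidth wall. */
        #pragma omp parallel for schedule(static)
        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++)
                fb[(size_t)y * W + x] = shade(x, y);

        printf("corner pixel: 0x%08x\n", fb[0]);
        free(fb);
        return 0;
    }

The same pattern works without OpenMP (the pragma is simply ignored), which is why a generic multicore software renderer doesn't have to be tied to Larrabee or x86.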
Re:Intel isn't aiming at gamers (Score:3, Insightful)
"movie studios"? Yeah, I'm sure Intel is putting a GPU in every Intel CPU (80% of the desktop market) just to make a couple of companies happy.
Why wouldn't Intel want to be a competitor to Nvidia? Intel has been the top seller of graphics chips, ahead of Nvidia or ATI, for some years. I'd say that they have been competing for a looong time now.
Intel has been very successful with their integrated graphics chips because most people in the world only need a chip that can draw Windows. Apparently, now they want to go beyond that. Larrabee can't catch Nvidia, but it will be "fast enough" to become an important target for game developers. Nvidia will always keep the "top-performance" tip of the market, but that tip is becoming smaller and smaller.
Re:Classic case of disruption (Score:5, Insightful)
>ATI hanging onto a somewhat insignificant market share.
C'mon, 17 million units shipped in a quarter and ~20% of the market is hardly 'a somewhat insignificant market share' in a market with four major players (the other three being Intel, nVidia and VIA).
For comparison, take Matrox: they have an insignificant market share, at about 100K units per quarter.
Re:Better than NVIDIA's proprietary hardware (Score:3, Insightful)
They can also leave security holes unpatched, like the issue a year or so back where you had a remote arbitrary code execution vulnerability in the driver, triggered by making it display pixmaps with certain characteristics (look at an image online and your machine's compromised). If there's an issue like this in an old card, and they don't release a patch, then you can't safely use the card at all, except maybe in VESA mode.
Re:Intel isn't aiming at gamers (Score:4, Insightful)
It's obvious that the graphics angle is really just a Trojan horse; they're using graphics as the reason to get it into the largest number of hands possible, but what they really want to do is to keep people writing for the x86 instruction set, rather than OpenCL, DirectX 11, or CUDA. Lock-in with the x86 instruction set has served them too well in the past.
In other words, general compute was an area where things were slipping out of their grasp; this is a means to shore things up.
It's a sound business strategy. But I have to agree with blogger-dude; I don't see them being overly competitive with NVIDIA and AMD's latest parts for rasterized graphics anytime soon.
Re:Classic case of disruption (Score:3, Insightful)
Intel was always the biggest graphics chip provider - and I don't think it was ever below 40% of the total market (by unit numbers, at least). With all their expensive and cheap graphics chips, ATI and NVidia were unable to dethrone Intel's integrated graphics division.
Re:Gee, How "Forward Thinking" of You, NVidia! (Score:4, Insightful)
I think Fusion is an alternative to current integrated graphics, not to separate high-performance GPUs. After all, it shares the drawback of having to steal bandwidth from the regular RAM, where discrete graphics cards have their own memory.
For a moderately power-hungry graphics chip (think 20 watts), the advantage is that Fusion can share the CPU cooler with a moderately power-hungry CPU, while integrating the GPU elsewhere on the board will require a separate cooler. That takes extra board area and money.
So I think Fusion might be able to perform somewhat better than other integrated graphics for the same price. Which means it threatens the widely used Intel boards with integrated graphics rather than NVidia.
To mention something else (but still mostly on topic):
In the Linux market, AMD is currently building a lot of goodwill by providing documentation and Open Source driver code. That might become an advantage over NVidia too.
Re:Intel isn't aiming at gamers (Score:5, Insightful)
NVidia is on the defensive for the simple reason that it needs to be. Not because Intel has a product that threatens NVidia, but because Intel is using classic vaporware strategies to undermine NVidia (and AMD/ATI). Intel is basically throwing around promises, and by virtue of its reputation a lot of people listen and believe those promises. With 'amazing Intel GPU technology just around the corner', some people might delay buying NVidia hardware. NVidia is trying to prevent that from happening.
Re:Intel isn't aiming at gamers (Score:5, Insightful)
Right now they're crapping their pants because they gambled everything on Vista, which is failing spectacularly, whereas both ATi and Intel have a two-year head start on supporting Ubuntu out of the box.
Traditionally (before the AMD buyout) ATI has had terrible support under Linux. nVidia has been delivering their binary drivers, and in my experience they have been easy to install, stable and fully functional for much longer than ATI's. By the way, the latest 177.68 drivers should now have fixed the recent KDE4 performance issues. From what I've understood, the fglrx (closed source) ATI driver has made considerable progress, so maybe now they're equal, but closed source versus closed source, nVidia has nothing to be ashamed of. For example, ATI just this month added CrossFire support, while SLI has been supported since 2005; that's more like three years behind than two years ahead.
Of course, ATI is now opening up their specs, but it's going slowly. For example, open source developers have not yet received [phoronix.com] the 3D specs for the last-generation R600 cards, much less the current generation. And after those specs are released, some very non-trivial drivers must be written as well, meaning it could take another few years before we see fully functional open source drivers. Also, this strategy is less than a year old, so if they're two years ahead they're moving fast. None of this should make nVidia executives the least bit queasy.
They are crapping their pants because ATI has delivered two kick-ass parts in the 4850 and 4870, and there's very little reason to buy anything else these days. They are crapping their pants because Intel won't give them a Nehalem system bus license. They're crapping their pants because the 800lb gorilla that's Intel is entering a market everyone else has been leaving. They're crapping their pants because the US economy is shaky and people might not spend big $$$ on graphics cards. But over Linux support? Put it into perspective, and you'll see it's an extremely minor issue.
Re:Classic case of disruption (Score:3, Insightful)
No, he is saying that ATI has an insignificant portion of the low-end market, which is true. Both ATI and NVIDIA cards are now seen as upgrades to the default Intel chipset in practically every laptop sold, whereas in the past they provided both the low-end and high-end cards for that market.
Core + Larrabee on same chip? (Score:3, Insightful)
As Moore's law makes silicon cheaper, what are you going to do with it? More cache, more cores... why not a GPU? Concurrent software that utilizes multiple cores is not yet mainstream (maybe never), so that leaves cache and GPU.
In a way, the existence of separate GPUs is just a sign that the CPU wasn't powerful enough to deliver the graphics the market wanted (and would pay for). When CPUs are powerful enough (clock speed or multi-core), they'll subsume the GPU, as they did maths co-processors and cache. I.e. the silicon would be partitioned into CPU, GPU and cache - but it would all be on the one chip (called the "CPU", no doubt).
Intel already owns a fair bit of the integrated graphics market. They have great access to channels. Even if this is only half as good as a separate GPU, they will increase market share. I can't see what could stop them... except maybe a patented technology that can't be worked around. Some manufacturers of separate GPUs will survive in specialized niches. Some.
Re:Better than NVIDIA's proprietary hardware (Score:4, Insightful)
Linux support done right is a well written GPL-compliant driver.
Re:Intel isn't aiming at gamers (Score:4, Insightful)
Um...
Correct me if I'm wrong, but are you going around talking about the performance of 2D parts?
In 2008?
Seriously?
Are you sure you want to do that?
I mean, you'll look stupid.
Seriously. A straight framebuffer device can pretty much update at full speed while using only a fraction of the PCI-E bus and a fraction of the processing power of a modern CPU.
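The arithmetic backs that up. A back-of-the-envelope sketch (the resolution, color depth, and bus figures are illustrative assumptions, not measurements):

    #include <stdio.h>

    int main(void) {
        /* Assumed desktop: 1920x1200, 32-bit pixels, fully redrawn at 60 Hz. */
        double frame_bytes = 1920.0 * 1200.0 * 4.0;
        double gb_per_sec  = frame_bytes * 60.0 / 1e9;

        /* Rough first-generation PCI-E x16 bandwidth, one direction. */
        double pcie_gb_per_sec = 4.0;

        printf("full-speed 2D redraw: %.2f GB/s (%.0f%% of the link)\n",
               gb_per_sec, 100.0 * gb_per_sec / pcie_gb_per_sec);
        /* -> about 0.55 GB/s, i.e. roughly 14% of the bus. */
        return 0;
    }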
Re:Intel isn't aiming at gamers (Score:3, Insightful)
I hope so, but Intel has had comprehensive driver docs [intellinuxgraphics.org] out for a long time, and their driver still sucks.
Re:That's what Intel is SAYING.... (Score:3, Insightful)
What chips? Intel hasn't demonstrated any Larrabee hardware yet. They've published some specs, but we don't even know how many cores it will have or their clock speed.
Re:What bullshit. (Score:2, Insightful)
Two problems there: firstly, Larrabee will probably run closer to 1.5-2 GHz, while right now many GPUs run in the <1 GHz range, believe it or not. Secondly, performance may not scale linearly with GHz; other bottlenecks like memory bandwidth might get in the way.
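A toy roofline-style calculation makes the bandwidth point concrete (all figures here are made-up for illustration, not real Larrabee or GPU specs): once memory can't feed the ALUs, doubling the clock buys nothing.

    #include <stdio.h>

    static double min(double a, double b) { return a < b ? a : b; }

    int main(void) {
        /* Made-up numbers for illustration only. */
        double peak_1x = 500.0;      /* GFLOPS at base clock */
        double peak_2x = 1000.0;     /* GFLOPS with the clock doubled */
        double mem_bw  = 100.0;      /* memory bandwidth, GB/s */
        double bytes_per_flop = 0.5; /* the workload's memory appetite */

        /* Achieved rate is capped by raw compute or by how fast memory
           can feed it, whichever is lower. */
        double feed = mem_bw / bytes_per_flop; /* 200 GFLOPS ceiling */
        printf("1x clock: %.0f GFLOPS\n", min(peak_1x, feed));
        printf("2x clock: %.0f GFLOPS\n", min(peak_2x, feed));
        /* Both print 200: past the memory ceiling, GHz stops mattering. */
        return 0;
    }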
I do agree that the potential for flexibility in the software rendering is very exciting, but Intel will have trouble getting people to write custom code to take full advantage of it. What Larrabee really needs is to be in an Xbox 720 or PS4. In that situation, people could go crazy writing totally custom renderers, and it could really change how graphics is done.
Re:AMD is in the Best Position (Score:3, Insightful)
Shouldn't that be four Cortex A9 chips, on the grounds that the A9 has SMP support but the A8 doesn't?
That said... the reason the ARM market is so big is that it goes after a lower-power market, with cell phones and other battery-powered gadgets being the canonical example. That market has not yet felt a real need for SMP. So even if such an NVidia chip gets off the ground, it's unclear how much it would sell.
The first widely available ARM Cortex chips are TI's OMAP3 family, as seen in the Beagleboard [beagleboard.org] and, possibly more relevant in this context, the open source Pandora [wikipedia.org] gaming thingie. NVidia? Haven't really heard of them in these contexts, though maybe it's just their usual closed-source mindset.
Re:Intel isn't aiming at gamers (Score:4, Insightful)
Unlike Nvidia and AMD, Intel can bundle whatever it wants and cover the costs in CPU and chipset manufacturing. If Intel wants, they can give away whatever specs are needed in order to corner the market, a la Microsoft, and bend the software industry to its will. In some ways even AMD can bundle GPUs and CPUs in value packs, but Nvidia is out cold for now, and that scares the piss out of them.
Re:Intel isn't aiming at gamers (Score:3, Insightful)
On the other hand, I "live with" Intel graphics hardware and hate it. I commend them for supplying docs, but I don't think it does much good without hardware worth writing quality drivers for.
Hopefully ATI's hardware is, and this time around ATI/AMD's open source commitment is sincere.
Re:Better than NVIDIA's proprietary hardware (Score:3, Insightful)
Who is forcing you to upgrade?
I stuck with my Ti4200 for years till it stopped working. Could I get a new one, and did I want a new one? No. The newer AGP replacement was faster and supported DX9 stuff.
Basically:
1) You don't have to upgrade your hardware if you don't upgrade your software. If you want to upgrade your software, how's it the hardware manufacturer's problem that your old hardware stops working after that?
2) After a while your hardware rots away anyway.
3) X years later, hardly anyone is selling old hardware anymore (except on eBay) - good luck getting your VESA Local Bus graphics card. I already have difficulty getting PCI graphics cards, and the trickle of AGP cards is drying up too.
You can say #3 is the result of the evil tactics of Nvidia et al., but most shops are going to stock for the 99% who want new stuff, not the < 1% who want the old stuff.
Who wants to keep supporting older stuff? It costs money and time. Even the open source devs stop supporting older stuff.
If I want to buy a PC today, I can't get one that uses SDRAM or AGP (anyone remember VLB?).
But I'm not crying. My latest PC is actually cheaper (even accounting for inflation and currency devaluation) and a lot faster than my previous PC. It runs cooler and probably uses less power too.
BTW I used to have an Apple IIGS. But unless you're a collector, it's usually better to run an Apple IIGS emulator on modern hardware.
Re:Intel isn't aiming at gamers (Score:3, Insightful)
That doesn't say bad things about the video card; it says bad things about X11 and KDE.
People have been drawing anti-aliased text for over a decade. Windows 95 with the Plus! pack had a version, Windows 98 had a version, Windows XP introduced subpixel rendering, and all of that was fast enough to run on a P2-500 with a crappy Intel video chip. I remember having anti-aliased fonts on Windows 95 running on my 386.
Not just Microsoft, either. BeOS can run at meteoric speeds using only the VESA driver. Guess what? Full anti-aliased fonts.
It seems pretty ignorant to blame the hardware when everyone else on the planet has been able to get anti-aliased fonts to work with a fraction of the video bandwidth and processing power.
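For what it's worth, the per-pixel work behind anti-aliased text really is tiny: blend the text color over the background, weighted by the glyph's coverage value. A minimal sketch (the coverage values are made-up, standing in for one row of a glyph edge):

    #include <stdint.h>
    #include <stdio.h>

    /* Blend foreground over background by a 0-255 coverage value.
       This is essentially the whole per-pixel cost of AA text. */
    static uint8_t blend(uint8_t fg, uint8_t bg, uint8_t cov) {
        return (uint8_t)((fg * cov + bg * (255 - cov)) / 255);
    }

    int main(void) {
        uint8_t coverage[4] = {0, 96, 192, 255}; /* glyph edge ramp */
        uint8_t fg = 0, bg = 255;                /* black text, white page */

        for (int i = 0; i < 4; i++)
            printf("pixel %d: gray %u\n", i, blend(fg, bg, coverage[i]));
        return 0;
    }

A multiply, two adds and a shift-scale per pixel, which is why even a 1990s CPU with a dumb framebuffer could manage it.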