Nvidia Claims Intel's Larrabee Is "a GPU From 2006"
Barence sends this excerpt from PC Pro:
"Nvidia has delivered a scathing criticism of Intel's Larrabee, dismissing the multi-core CPU/GPU as wishful thinking — while admitting it needs to catch up with AMD's current Radeon graphics cards. 'Intel is not a stupid company,' conceded John Mottram, chief architect for the company's GT200 core. 'They've put a lot of people behind this, so clearly they believe it's viable. But the products on our roadmap are competitive to this thing as they've painted it. And the reality is going to fall short of the optimistic way they've painted it. As [blogger and CPU architect] Peter Glaskowsky said, the "large" Larrabee in 2010 will have roughly the same performance as a 2006 GPU from Nvidia or ATI.' Speaking ahead of the opening of the annual NVISION expo on Monday, he also admitted Nvidia 'underestimated ATI with respect to their product.'"
Doh of the Day (Score:5, Insightful)
Good, learn from that and don't make that same mistake again!
Larrabee [...] will have roughly the same performance as a 2006 GPU from Nvidia or ATI.'
DOH!
Comment removed (Score:5, Interesting)
Re: (Score:3, Funny)
It does you no good to be able to process 16 billion pixels / second, when you can only get the data for 4 billion per second from your memories.
[voice accent="scotty"]
It's the memories, capt'n! They canna' take it any longer!
[/voice]
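The grandparent's bandwidth point can be put as back-of-the-envelope arithmetic. All numbers below are illustrative assumptions, not the specs of any real card: the point is only that the effective fill rate is the minimum of what the shaders can emit and what the memory system can feed.

```python
# Sketch of the bandwidth-limited fill rate argument.
# All figures are round-number assumptions for illustration only.

MEM_BANDWIDTH_BPS = 64e9   # hypothetical memory bandwidth, bytes/s
BYTES_PER_PIXEL = 16       # e.g. colour read+write plus Z read+write

# Pixels per second the memory system can actually feed:
pixels_fed = MEM_BANDWIDTH_BPS / BYTES_PER_PIXEL          # 4e9

SHADER_FILL_RATE = 16e9    # pixels/s the shader cores could emit in theory

# The chip only goes as fast as its slowest stage:
effective_rate = min(SHADER_FILL_RATE, pixels_fed)
print(f"effective fill rate: {effective_rate / 1e9:.0f} Gpixel/s")  # 4, not 16
```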
Repudiating my own quote (Score:5, Informative)
First, what's being described as a quote is actually just John Montrym's summary from my original post, which is here:
http://news.cnet.com/8301-13512_3-10006184-23.html [cnet.com]
What I actually described as equating to "the performance of a 2006-vintage... graphics chip" was a performance standard defined by Intel itself: running the game F.E.A.R. at 60 fps in 1,600 x 1,200-pixel resolution with four-sample antialiasing.
Intel used this figure for some comparisons of rendering performance. If Larrabee ran at 1 GHz, for example, Intel's figures show that it would take somewhere from 7 to 25 Larrabee cores to reach that 60 Hz frame rate.
Larrabee will probably run much faster than that, at least on desktop variants.
Well... rather than writing the whole response here, I think I'd rather write it up for my blog and publish it there. Please surf on over and check it out:
http://news.cnet.com/8301-13512_3-10024280-23.html [cnet.com]
Comments are welcome here or there.
. png
no wonder its slow (Score:5, Funny)
Re:no wonder its slow (Score:5, Funny)
Re: (Score:3, Funny)
No, he ran out of pink.
Intel isn't aiming at gamers (Score:5, Insightful)
So why is NVIDIA on the defensive?
Intel is aiming at number crunchers (note that their chip uses doubles, not floats). They don't want NVIDIA to steal that market with CUDA.
When Intel says "graphics", they mean movie studios, etc.
If Larrabee eventually turns into a competitor for NVIDIA, all well and good, but that's not their goal at the moment.
Re:Intel isn't aiming at gamers (Score:5, Insightful)
Shinatosh
Re:Intel isn't aiming at gamers (Score:5, Insightful)
So why is NVIDIA on the defensive?
I think the Nvidia people think pretty highly of themselves, rightfully so, and Intel has recently been making a number of bold claims without backing them up. In a poker analogy, Nvidia is calling Intel's bluff.
Re:Intel isn't aiming at gamers (Score:4, Interesting)
I wouldn't call it that.
I'd call it a knee-jerk reaction to a non-issue.
Nvidia are getting very scared now that ATi are beating them senseless. I run both ATi and Nvidia, so don't go down the "you're just a fanboy" angle either.
I've seen chip makers come and go; this is just another attempt by Nvidia to shore up support for their product, but this time they can't turn to ATi and say "look how crap their chips are" - they have to say it about Intel, who are aiming the chips at corporate markets.
To be honest, the best bang for buck at the lower end of the market for 2D seems to be the Intel chips. One thing that does tend to surprise people is the complete lack of performance that the Nvidia chipsets have when not in 3D. ATi don't seem to have these problems, having built on a solid base of 2D graphics engines in the '90s (the Rage/RageII is at least one reason people went with Macs back then). Nvidia is really feeling the pinch, with ATi taking up the higher end of the market (pro gear/high-end HD) and Intel shoring up the lower end (GMA, etc.). Nvidia are pretty much stuck with consumers buying their middle-of-the-line gear (8600/9600).
When you aim high, it tends to hurt when you fall from grace; the whole 8800-to-9800 leap was abysmal at best, unlike their main competitor, who really pulled their finger out to release the 3xxx and 4xxx series.
All in all this seems like a bit of pork barrelling on Nvidia's part to distract from the complete lack of performance in their $1000 video card range. If anything this type of bullshit will be rewarded with a massive consumer (yes, geek and gamer) backlash.
I know my products, I know their limitations - I don't need some exec talking crap to tell me, and base level consumers will never read it.
Re:Intel isn't aiming at gamers (Score:4, Insightful)
Um...
Correct me if I'm wrong, but are you going around talking about the performance of 2D parts?
In 2008?
Seriously?
Are you sure you want to do that?
I mean, you'll look stupid.
Seriously. A straight framebuffer device can pretty much update at full speed while using only a tiny fraction of the PCI-E bus and a tiny fraction of the processing power of a modern CPU.
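The framebuffer claim above checks out with rough arithmetic. The figures below are round-number assumptions (32-bit colour, 60 Hz full-screen redraws, roughly 4 GB/s for a first-generation PCIe x16 link), not measurements:

```python
# Rough sketch: a plain 2-D framebuffer barely dents a modern bus.
# All figures are round-number assumptions for illustration.

width, height = 1600, 1200
bytes_per_pixel = 4          # 32-bit colour
refresh_hz = 60              # redraw the whole screen every frame

traffic = width * height * bytes_per_pixel * refresh_hz  # bytes/s
pcie_x16 = 4e9               # ~4 GB/s each way for a PCIe 1.x x16 link

# ~461 MB/s, around 12% of the link, even in this worst case
print(f"full-screen updates: {traffic / 1e6:.0f} MB/s "
      f"({100 * traffic / pcie_x16:.0f}% of a PCIe x16 link)")
```

In practice a 2D desktop redraws far less than the full screen per frame, so real traffic is lower still.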
Re:Intel isn't aiming at gamers (Score:4, Interesting)
Re: (Score:3, Insightful)
That doesn't say bad things about the video card, it says bad things about X11 and KDE.
People have been drawing anti-aliased text for over a decade. Windows 95 with the plus pack had a version, Windows 98 had a version, Windows XP introduced subpixel rendering, and all of that was fast enough to run on a P2-500 with a crappy Intel video chip. I remember having anti-aliased fonts on Windows 95 running on my 386.
Not just Microsoft, either. BeOS can run at meteoric speeds using only the Vesa driver. Guess what
Re:Intel isn't aiming at gamers (Score:4, Insightful)
Unlike Nvidia and AMD, Intel can bundle whatever it wants and cover the costs in CPU and chipset manufacturing. If Intel wants, they can give away whatever specs are needed in order to corner the market, a la Microsoft, and bend the software industry to its will. In some ways even AMD can bundle GPUs and CPUs in value packs, but Nvidia is out cold for now, and that scares the piss out of them.
Re: (Score:3, Informative)
Nvidia are getting very scared now that ATi are beating them senseless.
Nvidia is really feeling the pinch, with ATi taking up the higher end of the market (pro gear/high-end HD) and Intel shoring up the lower end (GMA, etc.). Nvidia are pretty much stuck with consumers buying their middle-of-the-line gear (8600/9600).
When you aim high, it tends to hurt when you fall from grace; the whole 8800-to-9800 leap was abysmal at best, unlike their main competitor, who really pulled their finger out to release the 3xxx and 4xxx series.
I guess you're referring to AMD/ATI's successful HD 4000 launch about a month ago, but you also seem to be omitting NVIDIA's GTX 200 series. At the high end (non-workstation), the GTX 280 outperforms the HD 4870 [arstechnica.com]. The $550 HD 4870 X2 (released about two weeks ago) outperforms the $450 GTX 280, but consumes a heck of a lot more power [anandtech.com].
Also, NVIDIA seems to have been beating AMD/ATI senseless for years. According to Jon Peddie Research [jonpeddie.com], for total graphics chips, NVIDIA had 31.4% market share in Q2 2008 vs. AM
Re: (Score:2)
And AMD/ATi is the Ferrari ( AMD are a sponsor of them.... )
Re: (Score:2)
Re: (Score:2, Insightful)
So why is NVIDIA on the defensive?
They're substituting rhetoric for an actual competitive product. Right now they're crapping their pants because they gambled everything on Vista, which is failing spectacularly, whereas both ATi and Intel have a two-year head start on supporting Ubuntu out of the box. You can say Ubuntu isn't Linux, but it's what all those Dell buyers are going to see.
Re: (Score:2, Informative)
Re: (Score:2)
ATI recently released specs for their R600 chips.
Their driver might suck big time - as does its open-source counterpart - yet in the long term ATI has a huge advantage right now. In my eyes, sincere Linux support is a huge advantage, though I game exclusively on Windows.
Needless to say, the dialog ATI has established with its Linux users and the OSS developer community will also contribute positively to their proprietary drivers.
Both Intel and nVidia - proprietary driver companies - should be on defensive ri
Re: (Score:3, Interesting)
Re: (Score:3, Insightful)
I hope so, but Intel has released comprehensive driver docs [intellinuxgraphics.org] for a long time, and their driver still sucks.
Re: (Score:3, Insightful)
On the other hand, I "live with" Intel graphics hardware and hate it. I commend them for supplying docs, but I don't think it does much good without hardware worth writing quality drivers for.
Hopefully ATI hardware is, and this time around ATI/AMD's open source commitment is sincere.
Re:Intel isn't aiming at gamers (Score:5, Informative)
Both Intel and nVidia - proprietary driver companies - should be on defensive right now.
You obviously don't know much about Intel's commitment to open source in its drivers. Rather than just recently dumping some of the specs for its chips on the community at large, Intel has actively paid developers to maintain top-quality, 100% FOSS drivers for Linux and X11 for years, making its commitment light-years ahead of ATI or Nvidia. Ever hear of Keith Packard, you know, the leading developer of the entire X system? He's an Intel employee. For all the accolades that ATI gets for dumping a bunch of specs on the web, Intel has put vastly more time & money into supporting OSS, but still gets labeled as "closed source" by fanboys.
Re:Intel isn't aiming at gamers (Score:4, Insightful)
Re: (Score:2)
both ATi and Intel have a two-year head start on supporting Ubuntu out of the box. You can say Ubuntu isn't Linux, but it's what all those Dell buyers are going to see.
I was not aware that there are specific "Ubuntu" drivers. Can you confirm that is what you mean? Otherwise that last sentence does not make sense to me.
Support for binary drivers in Ubuntu (Score:3, Informative)
The nVidia drivers are binary only, so they are not available in the standard source repositories and are not compiled and included by default in most open-source distributions.
Ubuntu has made the necessary arrangements and provides, out of the box, a tool that can automatically download and install binary drivers from within the usual setup tool.
I think that's what the parent poster may be referring to.
That means that, instead of having to manually download a package and execute it (from the command line) - which isn
Re:Intel isn't aiming at gamers (Score:5, Insightful)
Right now they're crapping their pants because they gambled everything on Vista, which is failing spectacularly, whereas both ATi and Intel have a two-year head start on supporting Ubuntu out of the box.
Traditionally (before the AMD buyout) ATI had terrible support under Linux; nVidia has been delivering their binary drivers, and in my experience they have been easy to install, stable and fully functional for much longer than ATI's. By the way, the latest 177.68 drivers should now have fixed the recent KDE4 performance issues. From what I've understood, the fglrx (closed source) ATI driver has made considerable progress, so maybe now they're equal, but closed source vs. closed source, nVidia has nothing to be ashamed of. For example, ATI just this month added CrossFire support, while SLI has been supported since 2005; that's more like three years behind than two years ahead.
Of course, ATI is now opening up their specs, but it's going slowly. For example, the developers have not yet received [phoronix.com] the 3D specs for the last-generation R600 cards, much less the current generation. And after those specs are released, some very non-trivial drivers must be written as well, meaning it could take another few years before we see fully functional open source drivers. Also, this strategy is less than a year old, so if they're two years ahead they're moving fast. None of this is something that should make nVidia executives the least bit queasy.
They are crapping their pants because ATI has delivered two kick-ass parts in the 4850 and 4870, and there's very little reason to buy anything else these days. They are crapping their pants because Intel won't give them a Nehalem system bus license. They're crapping their pants because the 800lb gorilla that's Intel is entering a market everyone else has been leaving. They're crapping their pants because the US economy is shaky and people might not spend big $$$ on graphics cards. But over Linux support? Put it into perspective, and you'll see it's an extremely minor issue.
Re: (Score:2)
NVIDIA may be crapping their pants for many reasons, but Ubuntu support isn't one of them.
Re: (Score:3, Informative)
nVidia never did - they finally said "screw it, we can't," after which Microsoft removed that requirement. Interestingly enough, that means there's no real barrier to DX10 being ported to XP, but we'll leave that alone for now.
Meanwhile for the last few years in my experie
Re: (Score:3, Insightful)
"movie studios"? Yeah, I'm sure Intel is putting a GPU in every Intel CPU (80% of the desktop market) just to make a couple of companies happy.
Why wouldn't Intel want to be Nvidia's competitor? Intel has been the top seller of graphics chips, more than Nvidia or ATI, for some years. I'd say that they have been competing for a looong time now.
Intel has been very successful with their integrated graphics chips because most people in the world only need a chip that can draw Windows. Apparently, now they want
Movie studios ... or anybody who uses 3D studio (Score:2)
Artists need faster render times more than they need faster on-screen interaction. Larrabee would be a good mixture for them.
Re: (Score:2)
And what percentage of the customer base are artists? 0.00000000001?
Artists, engineers, scientists... (Score:2)
There's a lot more than you think.
Re: (Score:3, Interesting)
I'm fixated on what engineers use their computers for.
I design things all day, and all I've got, all I need, is an ancient Intel 865 video chipset built into the motherboard of my Dell Optiplex.
I don't want or need a GPU, neither does anyone else in our department.
Re: (Score:3, Informative)
That's not true. From their paper [intel.com]:
And it's definitely aimed largely at games: the paper gives performance studies of DirectX 9 rendering from Half Life 2, FEAR and Gears of War.
Re:Intel isn't aiming at gamers (Score:4, Informative)
Did you not see the bit right after the text you bolded? ... double-precision float...
Re:Intel isn't aiming at gamers (Score:4, Insightful)
It's obvious that the graphics angle is really just a Trojan horse; they're using graphics as the reason to get it into the largest number of hands possible, but what they really want is to keep people writing for the x86 instruction set, rather than OpenCL, DirectX 11, or CUDA. Lock-in with the x86 instruction set has served them too well in the past.
In other words, general compute was an area where things were slipping out of their grasp; this is a means to shore things up.
It's a sound business strategy. But I have to agree with blogger-dude; I don't see them being overly competitive with NVIDIA and AMD's latest parts for rasterized graphics anytime soon.
Re:Intel isn't aiming at gamers (Score:5, Insightful)
NVidia is on the defensive for the simple reason that it needs to be. Not because Intel has a product that threatens NVidia, but because Intel is using classic vaporware strategies to undermine NVidia (and AMD/ATI). Intel is basically throwing around promises, and by virtue of its reputation a lot of people listen and believe those promises. With 'amazing Intel GPU technology just around the corner', some people might delay buying NVidia hardware. NVidia is trying to prevent that from happening.
That's what Intel is SAYING.... (Score:2)
But the chips tell a different story.
Larrabee will be a low-end gaming device but a high-end number cruncher.
Re: (Score:3, Insightful)
What chips? Intel hasn't demonstrated any Larrabee hardware yet. They've published some specs, but we don't even know how many cores it will have or their clock speed.
Better than NVIDIA's proprietary hardware (Score:2, Insightful)
At least Intel documents their hardware. Fuck NVIDIA and their stupid proprietary hardware!
Glass
Re: (Score:3, Informative)
NVIDIA's proprietary hardware is what's capable of playing the games I want to play at the frame rate and quality I want.
For goodness sake already, why won't people stop being so ideological and just USE the damn hardware if it works better than the alternative. Pick what you need from a practical viewpoint, NOT on ideology. Life's too short to waste effort on the ideology of a fucking graphics chipset already!
Re:Better than NVIDIA's proprietary hardware (Score:4, Interesting)
The fix would be trivial (just recompile the current version), but Nvidia clearly would rather sell me a new card.
Re: (Score:2)
I feel your pain, cos when I was running a desktop machine with an NVIDIA card I had problems of my own (e.g. standby not resuming properly, occasional glitches, etc). However, what pisses me off is when people assume that open-source drivers such as the Intel drivers are somehow better. Shit, the Intel graphics drivers in Linux don't properly support all the GL extensions that the Windows drivers do, there's a documented but as of yet unfixed issue with the gamma levels when using XV for video playback, I'v
Re: (Score:3, Informative)
I'm using the 71.86.04 driver (released in January 2008) for my Riva TNT2 on an old PII-366. Works fine on a vanilla 2.4.x kernel.
Re: (Score:2)
Re: (Score:2)
The current workaround: don't update X11 until Nvidia updates their proprietary GLX library for the legacy drivers. This nicely illustrates the problem with closed source drivers.
There will come a time - not chosen by you, but by the manufacturer - when the hardware you bought will stop functioning correctly.
Re: (Score:2)
Pick what you need from a practical viewpoint, NOT on ideology.
But Free software is practical. If it breaks, you can fix it. With closed source, you have to rely on someone else to fix it for you.
Re: (Score:2, Informative)
And for those of us who don't code and do other things for a living, I guess we are just shit out of luck then?
I spend my money on hardware and drivers. In other words, I'm paying someone else to do it right.
Re:Better than NVIDIA's proprietary hardware (Score:4, Insightful)
Linux support done right is a well written GPL-compliant driver.
Re: (Score:3, Insightful)
Who is forcing you to upgrade?
I stuck with my Ti4200 for years till it stopped working. Could I get a new one, and did I want a new one? No. The newer AGP replacement was faster and supported DX9 stuff.
Basically:
1) You don't have to upgrade your hardware if you don't upgrade your software. If you want to upgrade your software, how's it the hardware manufacturer's problem that your old hardware stops working aft
Re: (Score:2)
Re: (Score:2)
"Xerox's 9700 laser printer prints the documents I want to print with the speed and quality that I want them printed."
"For goodness sake already, why won't people stop being so ideological and just USE the damn hardware if it works better than the alternative. Pick what you need from a practical viewpoint, NOT on ideology. Life's too short to waste effort on the ideology of a fucking laser printer already!"
History [faifzilla.org]. Remember, what you choose today might just change wha
Re: (Score:3, Insightful)
Re: (Score:2)
Intel? documents??
They have published a few specs - but only after a number of online petitions and PR pressure.
As far as specs go, nVidia in some respects is less hated than Intel: the latter has an even longer history of keeping everything confidential, sometimes not sharing even with partners.
That's of course different in the markets Intel is trying to enter right now, e.g. telecom: there they are very nice and polite, often sending updated specs to you even without asking.
But as desktop market concerned, make no
Gee, How "Forward Thinking" of You, NVidia! (Score:5, Interesting)
"OH MY GOD! CPU AND GPU ON ONE DIE IS STOOOOOOOOPIIIIIDDDDDEDEDDDD!!!1111oneoneone"
How stupid is it really? So what if the average consumer knows very little about their PC? That doesn't necessarily mean it won't end up in a person's PC.
If they were really forward thinking, they could see it as an effort to bridge the gap between low-end PCs and high-end PCs. Now maybe, at some point in the future, people can do gaming a little better on those PCs.
Games that were nigh unplayable would now run slightly more smoothly. With advances in this design, it could really work out better.
Sure, for the time being, I don't doubt that the obvious choice would be to have a discrete-component solution for gaming. However, there might come a point where that isn't in the gamer's best interests anymore. I'm not a soothsayer, I don't know.
Still, I can't help but imagine how Intel's and AMD's ideas can only help everyone as a whole.
Re: (Score:2)
The one thing I see is that it's merging the two different product types that have different rates of advancement. I would think that might not be such a good idea. This might make the performance gulf between typical systems and gaming systems a lot larger.
Re:Gee, How "Forward Thinking" of You, NVidia! (Score:4, Insightful)
I think Fusion is an alternative to current integrated graphics, not to separate high-performance GPUs. After all it shares the drawback of having to steal bandwidth from the regular RAM, where discrete graphics cards have their own memory.
For a moderately power-hungry graphics chip (think 20 watt) the advantage is that Fusion can share the CPU cooler with a moderately power-hungry CPU, while integrating the GPU elsewhere on the board will require a separate cooler. That takes extra board area and money.
So I think Fusion might be able to perform somewhat better for the same price than other integrated graphics. Which means it threatens the widely used Intel boards with integrated graphics rather than NVidia.
To mention something else (but still mostly on topic):
In the Linux market, AMD is currently building a lot of goodwill by providing documentation and Open Source driver code. That might become an advantage over NVidia too.
Re: (Score:2)
Re: (Score:2)
Right now GPUs cannot be used widely by software because they are relatively expensive and support is sparse.
The point is to integrate the GPU functions deeper into system, allowing cheap low-end integrated boards to also have GPUs.
nVidia tries hard to keep GPU acceleration exclusive to the high end. Intel and AMD/ATI want it to hit the low end - the market where most of the money is.
Larrabee as a rasteriser... (Score:4, Informative)
Classic case of disruption (Score:4, Insightful)
Ten years ago you would see Nvidia GPUs in everything from low- to high-end. Today, not so much - Intel dominates the low-end spectrum, with ATI hanging onto a somewhat insignificant market share. Larrabee is Intel moving upmarket. Sure, it might not perform as well as the latest Nvidia or ATI high-end GPU, but it might be enough in terms of performance, or have other benefits (better OSS support), to win over some of Nvidia's current market share. Considering it's supposedly the Pentium architecture recycled, it's also reasonable to assume the design will be relatively cost-effective and allow Intel to sell at very competitive prices while still maintaining healthy profit margins.
It's a classic case of disruption. Intel enters and Nvidia is happy to leave because there's a segment above that's much more attractive to pursue. Continue along the same lines until there's nowhere for Nvidia to run, at which point the game ends - circle of disruption complete. See also Silicon Graphics, Nvidia's predecessor in many ways.
Re:Classic case of disruption (Score:5, Insightful)
>ATI hanging onto a somewhat insignificant market share.
C'mon, 17 million units shipped in a quarter and ~20% of the market is hardly 'a somewhat insignificant market share' in a market with four major players (Intel, nVidia, ATI, VIA).
For comparison, take Matrox, they have insignificant market share with about 100K/q
Re: (Score:3, Insightful)
No, he is saying that ATI has an insignificant portion of the low-end market, which is true. Both ATI and NVIDIA cards are now seen as upgrades to the default Intel chipset in practically every laptop sold, whereas in the past they provided both the low-end and high-end cards for that market.
Re: (Score:3, Insightful)
Intel always was the biggest graphics chip provider - I don't think they were ever below 40% of the total market (by unit count, at least). With all their expensive and cheap graphics chips, ATI and NVidia were unable to dethrone Intel's integrated graphics division.
Re:Classic case of disruption (Score:5, Insightful)
Take away their price-fixing... (Score:2)
...and ATI/AMD easily bests nVidia. Somehow, I'm not surprised.
I always require intel chipsets when I purchase... (Score:2)
Because the drivers are open source and work out of the box on every modern Linux distro.
I like my compiz eye-candy and Intel delivers more than enough performance for it.
I'll tell you what will be scathing.. (Score:2, Insightful)
If, in the future, the trend evolves that all gpu's are integrated.
Intel, nvidia, AMD and ATI...
Who is the odd one out there?
AMD is in the Best Position (Score:5, Interesting)
Lots of people here and analysts have written off AMD. I think AMD is in a great position if they can survive their short term debt problems which is looking increasingly likely.
Consider the following:
AMD is in a great position like no other company to capitalize on the coming CPU / GPU convergence. Everyone jeered when AMD bought ATI but it is looking to be a great strategic move if they can execute on their strategy.
AMD has the best mix of technology, they just have to put it to good use.
Re:AMD is in the Best Position (Score:5, Interesting)
Well, if you read the reviews, AMD's integrated graphics solution, the 780G, kicks ass. Only the very newest Intel integrated chipset is slightly better, but that uses around 20 W compared to the AMD chipset's 1 W.
And when you move up to 790gx with side port ram.. (Score:2)
And when you move up to the 790GX with side-port RAM, AMD systems get even faster without using system RAM. Also, Intel's poor drivers are unlikely to make any Intel GPU good.
Re: (Score:3, Interesting)
Just looking at this from a manufacturing side:
AMD is roughly two years behind Intel in semiconductor process technology. Due to this and other reasons (SOI, R&D/SG&A vs. revenue) they are in a very bad cost position. Even if they have a better design, Intel can easily offset this with pure manufacturing power.
The playground is more level for Nvidia vs. ATI since both rely on foundries.
It's tough to tell whether ATI/AMD will be able to capitalize on this situation. They are very lucky to have a n
Re: (Score:2)
Re:AMD is in the Best Position (Score:5, Interesting)
Nvidia doesn't have x86 design / manufacturing experience, an x86 license, or even x86 technology they want to announce
True. They do, however, have an ARM Cortex A8 system-on-chip, sporting up to four Cortex cores and an nVidia GPU in a package that consumes under 1 W (down to under 250 mW for the low-end parts). Considering that the ARM market is currently an order of magnitude bigger than the x86 market and growing two to three times faster, I'd say they're in a pretty good position.
Re:AMD is in the Best Position (Score:4, Interesting)
True. They do, however, have an ARM Cortex A8 system-on-chip, sporting up to four Cortex cores and an nVidia GPU in a package that consumes under 1 W (down to under 250 mW for the low-end parts). Considering that the ARM market is currently an order of magnitude bigger than the x86 market and growing two to three times faster, I'd say they're in a pretty good position.
True, but the ARM market also has many more players. ARM will license their core to anyone, so you have Intel VS AMD VS VIA in x86 land and TI vs Philips (NXP now -- I LOVE this chip) VS Marvell (not a licensee, but they have a crummy chip for free) VS NVIDIA? VS Analog (that's kind of funny) VS IBM VS Fujitsu VS Freescale VS STM VS Cirrus VS Atmel VS Broadcom VS Nintendo VS Sharp VS Samsung VS ... VS there's probably even Xilinx in there for good measure.
So, the market is larger, but the competition is stiffer.
That said, if they made an EEE like machine with NVidia's graphics and 4x cortex cores, I'd buy one.
Re: (Score:3, Insightful)
Shouldn't that be four Cortex A9 cores, on the grounds that the A9 has SMP support but the A8 doesn't?
That said, the reason the ARM market is so big is that it goes after a lower-power market, with cell phones and other battery-powered gadgets being the canonical example. That market has not yet felt a real need for SMP. So even if such an NVidia chip gets off the ground, it's unclear how much it would sell.
The first widely available ARM Cortex
Just missing good mobo chipsets (Score:2)
Spider platform (Score:3, Informative)
AMD is missing someone to develop and manufacture good motherboard chipsets.
Haven't been following the news recently ?!?
ATI/AMD's latest series of chipsets (the 790) is quite good. That's the reason VIA announced it was dropping that market in the first place.
The only problem is that nVidia's SLI is currently a proprietary technology requiring licensing. That's why a lot of players still buy nVidia's chipsets and avoid ATI's - not that they're bad, on the contrary, but they lack the license required for SLI.
This SLI problem is also explaining why nVidia may have to consider sto
Phenom landscape is different (Score:3, Interesting)
Back in the early K8 days, it seemed like 99% of the boards on the market for AMD CPUs had nVidia/VIA chipsets.
And since the Phenom and the AM2+ socket appeared, 99% of the boards on the market for these use nVidia/ATI chipsets.
The few VIA-based motherboards you can see are usually based on derivatives of the KT800 chipset that was already available back in the early K8 days (as the memory controller is on the CPU and the chipset only communicates over HyperTransport, one can pretty much mix'n'match most chipsets almost regardless of the processor generation).
And these mainboards are targeted to the budget segment (usually fe
Re:AMD is in the Best Position (Score:5, Interesting)
Re: (Score:2, Interesting)
Lots of people here and analysts have written off AMD. I think AMD is in a great position if they can survive their short term debt problems which is looking increasingly likely.
Everyone forgets that AMD purchased ATI not for their GPU line (that was a bonus) but for their north/south bridge chipsets, as it offered them the ability to finally provide a complete solution as Intel does. For example, Asus only designs the PCB used in their motherboards, while using either an Intel chipset (Socket 775/ICH9/GMA3100) or an AMD one (Socket AM2) with either an ATI northbridge or the Nvidia nForce chipset. The only thing AMD gets from this is the socket and CPU sale, while ATI or Nvidia gets the rest of the b
THIS JUST IN! (Score:2, Funny)
Company says competitor's product sucks! News at 11.
So Larrabee large will be the equivalent... (Score:2)
of a GeForce 8 series (which came out in 2006)???
EXCELLENT!
Thanks, AMD, for suggesting Intel. I'm gonna save lots of money on my next motherboard by not needing an nVidia graphics card!
What bullshit. (Score:5, Interesting)
From the SIGGRAPH paper, they need something like 25 cores to run GoW at 60Hz. Those are 1GHz cores for comparison purposes, though. LRB will probably run at something like 3GHz, meaning you'd only need 8-9 cores to run GoW at 60, and with benchmarks stretching up to 48 cores you can see that this has the potential to be very fast indeed.
More importantly, the LRB has much better utilization since there aren't any fixed-function divisions in the hardware. E.g. most of the time you're not using the blend units, so why have all that hardware for doing floating-point maths in the blend units when 99% of the time you're not actually using it? On LRB everything is utilized all the time. Blending, interpolation, stencil/alpha testing, etc. are all done using the same functionality, meaning that when you turn something off (like blending) you get better performance rather than just leaving parts of your chip idle.
I'd also like to point out that having a software pipeline means faster iteration, meaning that they have a huge opportunity to simply out-optimize nvidia and amd, even for the D3D/OGL pipelines.
Furthermore, imagine intel supplying half a dozen "profiles" for their pipeline where they optimize for various scenarios (e.g. deferred rendering, shadow-volume-heavy rendering, etc.). The user can then try each with their games and run each game with a slightly different profile. More importantly, however, new games could just spend 30 minutes figuring out which profile suits them best, set a flag in the registry somewhere, and automatically get a big boost on LRB cards. That's a tiny amount of work to get LRB-specific performance wins.
The next step in LRB-specific optimizations is to allow developers to essentially set up a LRB-config file for their title with lots of variables and tuning (remember that LRB uses a JIT compiled inner-loop that combines the setup, tests, pixel shader etc.). This would again be a very simple thing to do (and intel would probably do it for you if your title is high profile enough), and could potentially give you a massive win.
And then of course the next step after that is LRB-specific code. I.e. you write stuff outside D3D/OGL to leverage the LRB specifically. This probably won't happen for many games, but you only need to convince Tim Sweeney and Carmack to do it, and then most of the high-profile games will benefit automatically (through licensing). My guess is that you don't need to do much convincing. I'm a graphics programmer myself and I'm gagging to get my hands on one of these chips! If/when we do, I'll be at work on weekends and holidays coding up cool tech for it. I'd be surprised if Sweeney/Carmack aren't the same.
I think LRB can be plenty competitive with nvidia and amd using the standard pipelines, and there's a very appealing low-friction path for developers to take to leverage the LRB specifically, with varying degrees of effort.
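The core-count arithmetic above can be sketched out in a few lines (a back-of-the-envelope estimate that assumes performance scales perfectly linearly with clock speed, which real hardware rarely achieves; the 25-cores-at-1GHz figure is the one cited from the SIGGRAPH paper, and the 3GHz clock is pure speculation):

```python
import math

def cores_needed(cores_at_1ghz: int, clock_ghz: float) -> int:
    """Cores needed to match the 1 GHz baseline at a given clock,
    assuming perfectly linear scaling with frequency."""
    return math.ceil(cores_at_1ghz / clock_ghz)

print(cores_needed(25, 1.0))  # 25 cores at the paper's 1 GHz baseline
print(cores_needed(25, 3.0))  # 9 cores at a speculative 3 GHz clock
```

Which is where the "8-9 cores" figure comes from: 25 / 3 rounds up to 9.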
Re: (Score:2)
Re:What bullshit. (Score:4, Interesting)
Some notes from Tim Sweeney in a discussion on this:
"Note that the quoted core counts for AMD and NVIDIA are misleading.
A GPU vendor quoting a "240 cores" is actually referring to a 15-core chip, with each core supporting 16-wide vectors (15*16=240). This would be roughly comparable to a 15-core Larrabee chip.
Also keep in mind, a game engine need not use an architecture such as this heterogeneously. A cleaner implementation approach would be to compile and run 100% of the codebase on the GPU, treating the CPU solely as an I/O controller. Then, the programming model is homogeneous, cache-coherent, and straightforward.
Given that GPUs in the 2009 timeframe will have multiple TFLOPs of computing power, versus under 100 GFLOPS for the CPU, there's little to lose by underutilizing the CPU.
If Larrabee-like functionality eventually migrates onto the main CPU, then you're back to being purely homogeneous, with no computing power wasted.
I agree that a homogeneous architecture is not just ideal, but a prerequisite to most developers adopting large-scale parallel programming.
In consumer software, games are likely the only applications whose developers are hardcore enough to even contemplate a heterogeneous model. And even then, the programming model is sufficiently tricky that the non-homogeneous components will be underutilized.
The big lesson we can learn from GPUs is that a powerful, wide vector engine can boost the performance of many parallel applications dramatically. This adds a whole new dimension to the performance equation: it's now a function of Cores * Clock Rate * Vector Width.
For the past decade, this point has been obscured by the underperformance of SIMD vector extensions like SSE and Altivec. But, in those cases, the basic idea was sound, but the resulting vector model wasn't a win because it was far too narrow and lacked the essential scatter/gather vector memory addressing instructions.
All of this shows there's a compelling case for Intel and AMD to put Larrabee-like vector units into future mainstream CPUs, gaining 16x more performance on data-parallel code very economically.
Tim Sweeney
Epic Games"
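Sweeney's Cores * Clock Rate * Vector Width model is easy to illustrate with a quick throughput estimate (the figures below are hypothetical round numbers chosen for illustration, not real chip specs):

```python
# Peak throughput under the model: performance = cores * clock * vector width
# (times ops per lane per cycle). All numbers are illustrative, not vendor specs.

def peak_gflops(cores, clock_ghz, vector_width, ops_per_lane=1):
    return cores * clock_ghz * vector_width * ops_per_lane

# A "240-core" GPU counted honestly as 15 cores with 16-wide vectors:
gpu = peak_gflops(cores=15, clock_ghz=1.5, vector_width=16)
# A 4-core CPU limited to 4-wide SSE:
cpu = peak_gflops(cores=4, clock_ghz=3.0, vector_width=4)
print(gpu, cpu)  # prints 360.0 48.0
```

The wide-vector chip wins by 7.5x despite running at half the clock, which is exactly the "whole new dimension" Sweeney describes.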
Good future ahead (Score:2)
Regardless of Larrabee's performance being crap when it's released in 2010, this is a step in the right direction, even if Intel doesn't know how to make very good graphics cards. In time, I'm sure, there won't be a difference between CPU/GPU, and all the memory will be shared (for the majority of systems). We'll look back saying "wow, why would anyone want that extra memory on the graphics card just sitting there while my system has maxed its main physical memory?". If Intel drops into the graphics m
I'd prefer a stable release from 2006... (Score:5, Interesting)
so... (Score:2, Funny)
So Intel will only be about four years behind current graphics systems when it comes out.
In that case, it's probably the biggest leap they'll have ever made.
Curious how the shoe is on the other foot (Score:2)
FTA:
Funny how at the 9500 Pro's release there was such a focus on not using raw video horsepower to draw frames, but instead using occlusion culling and other tricks to save time/memory/bandwidth and speed up rendering.
Versus the GeForce line's dependence on raw horsepower, drawing everything in a scene.
Brute force or thinking ahead?
I have to admit, I like the idea that Kneo24 had up
From 2006? (Score:2, Interesting)
Re: (Score:3, Interesting)
Core +Larrabee on same chip? (Score:3, Insightful)
As Moore's law makes silicon cheaper, what are you going to do with it? More cache, more cores... why not a GPU? Concurrent software that utilizes multiple cores is not yet mainstream (and maybe never will be), so that leaves cache and GPU.
In a way, the existence of separate GPUs is just a sign that the CPU wasn't powerful enough to deliver the graphics the market wanted (and would pay for). When CPUs are powerful enough (clock speed or multi-core), they'll subsume the GPU, as they did maths co-processors and cache. I.e. the silicon would be partitioned into CPU, GPU, and cache, but it would all be on the one chip (called the "CPU", no doubt).
Intel already owns a fair bit of the integrated graphics market. They have great access to channels. Even if this is only half as good as a separate GPU, they will increase market share. I can't see what could stop them... except maybe a patented technology that can't be worked around. Some manufacturers of separate GPUs will survive in specialized niches. Some.
The trick is apparently in the non-linear addition (Score:2)
Here is one area in which NVIDIA clearly has the upper hand over Intel. They apparently have figured out how to perform incremental [reference.com] functions in a non-linear fashion!
...
Great. Just 13 years after Fermat's Last Theorem [wikipedia.org] was finally proven, NVIDIA throws a new challenge at the mathematicians! Now they'll never get any real work done...
Re: (Score:3, Insightful)
but the extra programmability Larrabee has, as it's just a bunch of CPUs with some GPU instructions
Agreed -- why stick to GPU applications, when you have a general purpose multicore machine? How about getting those new instructions into general usage -- remember how MMX was originally introduced for stuff we now run on GPUs.
As for traditional GPU applications, there's already an OpenGL driver for the Cell SPUs in development. A similar driver for a generic multicore machine would be nice, particularly if it's not limited to Larrabee and x86. Of course, we already have software implementations of OpenGL.
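The "software OpenGL on a generic multicore machine" idea boils down to treating rendering as data-parallel work. A toy sketch (not a real rasterizer, and nothing Larrabee- or x86-specific; the trivial gradient "shader" is purely illustrative):

```python
# Toy illustration: shade each scanline of a framebuffer as an
# independent unit of work, farmed out across available cores.
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade_scanline(y):
    # A trivial "pixel shader": a horizontal 0..255 gradient.
    return [(x * 255) // (WIDTH - 1) for x in range(WIDTH)]

def render():
    # Scanlines have no dependencies on each other, so they can be
    # shaded concurrently on however many cores the machine has.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(shade_scanline, range(HEIGHT)))

framebuffer = render()
print(len(framebuffer), len(framebuffer[0]))  # prints 48 64
```

The same structure applies whether the parallel substrate is Cell SPUs, Larrabee cores, or plain CPU threads; only the per-pixel work changes.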
Re: (Score:2)
If you think GPGPU is hard to program, you are using the wrong tools. Check out RapidMind. It's very easy.
C//
Re: (Score:2)
So you're claiming perceptions can never change, so Intel should just give up? More competition in this arena can only be good for us.