Nvidia Claims Intel's Larrabee Is "a GPU From 2006"

Barence sends this excerpt from PC Pro: "Nvidia has delivered a scathing criticism of Intel's Larrabee, dismissing the multi-core CPU/GPU as wishful thinking — while admitting it needs to catch up with AMD's current Radeon graphics cards. 'Intel is not a stupid company,' conceded John Montrym, chief architect for the company's GT200 core. 'They've put a lot of people behind this, so clearly they believe it's viable. But the products on our roadmap are competitive to this thing as they've painted it. And the reality is going to fall short of the optimistic way they've painted it. As [blogger and CPU architect] Peter Glaskowsky said, the "large" Larrabee in 2010 will have roughly the same performance as a 2006 GPU from Nvidia or ATI.' Speaking ahead of the opening of the annual NVISION expo on Monday, he also admitted Nvidia 'underestimated ATI with respect to their product.'"
  • by HonkyLips ( 654494 ) on Sunday August 24, 2008 @09:09AM (#24725681)
    A recent journal article on ArsTechnica points to an Intel blog on Larrabee: http://arstechnica.com/journals/hardware.ars/2008/05/01/larrabee-engineer-on-personal-blog-larrabee-is-all-about-rasterization [arstechnica.com] Curious.
  • by FoolsGold ( 1139759 ) on Sunday August 24, 2008 @09:36AM (#24725789)

    NVIDIA's proprietary hardware is what's capable of playing the games I want to play at the frame rate and quality I want.

    For goodness' sake, why won't people stop being so ideological and just USE the damn hardware if it works better than the alternative? Pick what you need from a practical standpoint, NOT on ideology. Life's too short to waste on the ideology of a fucking graphics chipset!

  • by sammyF70 ( 1154563 ) on Sunday August 24, 2008 @09:56AM (#24725887) Homepage Journal
    You should get your facts straight: ATI's Linux drivers are atrocious (that might have changed in the last six months or so, but I wouldn't bet on it). Between the two, only nVidia has halfway decent drivers for its products.
  • by Excors ( 807434 ) on Sunday August 24, 2008 @10:18AM (#24726017)

    Intel is aiming at number crunchers (note that their chip uses doubles, not floats).

    That's not true. From their paper [intel.com]:

    Larrabee gains its computational density from the 16-wide vector processing unit (VPU), which executes integer, single-precision float, and double-precision float instructions.

    And it's definitely aimed largely at games: the paper gives performance studies of DirectX 9 rendering from Half Life 2, FEAR and Gears of War.
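    To make the "16-wide" point concrete, here is a toy sketch, an analogy only and in no way Larrabee code: one vector instruction operates on 16 single-precision lanes at once, which is exactly the game-friendly float path the paper describes alongside doubles. NumPy is used here purely for illustration.

```python
# Toy analogy only (not Larrabee code): a 16-wide VPU means one vector
# instruction operates on 16 single-precision lanes at once.
import numpy as np

VPU_WIDTH = 16  # lanes per vector instruction, per the Larrabee paper

a = np.arange(64, dtype=np.float32)     # single-precision, as in games
b = np.full(64, 2.0, dtype=np.float32)
out = np.empty_like(a)

# Consume the arrays one 16-wide vector at a time, as a SIMD unit would.
for i in range(0, a.size, VPU_WIDTH):
    out[i:i + VPU_WIDTH] = a[i:i + VPU_WIDTH] * b[i:i + VPU_WIDTH]

assert np.allclose(out, a * b)
```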

  • by owlman17 ( 871857 ) on Sunday August 24, 2008 @10:22AM (#24726035)

    I'm using the 71.86.04 driver (released in January 2008) for my Riva TNT2 on an old PII-366. Works fine on a vanilla 2.4.x kernel.

  • by Anonymous Coward on Sunday August 24, 2008 @10:31AM (#24726079)

    And for those of us who don't code and do other things for a living, I guess we're just shit out of luck then?

    I spend my money on hardware and drivers. In other words, I'm paying someone else to do it right.

  • by DrYak ( 748999 ) on Sunday August 24, 2008 @11:51AM (#24726573) Homepage

    The nVidia drivers are binary-only, so they are not available in the standard source repositories and are not compiled and included by default in most open-source distributions.

    Ubuntu has made the necessary arrangements and provides, out of the box, a tool that can automatically download and install binary drivers from within the usual setup tool.

    I think that's what the parent poster was referring to.

    That means that, instead of having to manually download a package and execute it (from the command line) - which isn't complicated but requires some interaction with the computer - installing a binary driver under Ubuntu simply means clicking "yes" on a dialog asking "the following hardware requires non-free proprietary drivers; would you like to install them?"
    It's made trivial enough that computer-illiterate users can still do it easily - well, almost. The users still need to realize that maybe they should get some software to make the graphics work better.

    Behind the scenes, clicking "yes" automatically adds the non-free driver repository to apt and selects the necessary packages for installation - roughly as sketched below.

    The result is similar (although implemented differently) to openSUSE's one-click install, where you click a link on a web page to a file whose name ends in ".ymp", and the corresponding repositories are added to YaST and the packages are selected for installation.
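    For the curious, here is a minimal sketch of what that "yes" click amounts to. This is an assumption-level illustration, NOT Ubuntu's actual implementation; the repository line and the package name ("nvidia-glx-new") varied by release and are assumed here.

```python
# Sketch of the restricted-drivers flow: enable the non-free component,
# refresh the package index, install the packaged binary driver.
# Repository line and package name are assumptions, not Ubuntu's code.
import subprocess

SOURCES_LIST = "/etc/apt/sources.list"
RESTRICTED_LINE = "deb http://archive.ubuntu.com/ubuntu hardy restricted\n"
DRIVER_PACKAGE = "nvidia-glx-new"  # assumed package name for this era

def enable_restricted_component():
    """Append the non-free 'restricted' component if it isn't enabled yet."""
    with open(SOURCES_LIST) as f:
        if RESTRICTED_LINE in f.read():
            return
    with open(SOURCES_LIST, "a") as f:
        f.write(RESTRICTED_LINE)

def install_driver():
    """Refresh the package index, then pull in the packaged binary driver."""
    subprocess.check_call(["apt-get", "update"])
    subprocess.check_call(["apt-get", "install", "-y", DRIVER_PACKAGE])

if __name__ == "__main__":
    enable_restricted_component()  # needs root, like the real tool
    install_driver()
```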

  • by Anpheus ( 908711 ) on Sunday August 24, 2008 @11:55AM (#24726591)

    Did you not see the bit right after the text you bolded? "... double-precision float ..."

  • Spider platform (Score:3, Informative)

    by DrYak ( 748999 ) on Sunday August 24, 2008 @12:08PM (#24726667) Homepage

    AMD is missing someone to develop and manufacture good motherboard chipsets.

    Haven't you been following the news recently?!

    ATI/AMD's latest series of chipsets (the 790) is quite good. That's the reason VIA announced it was dropping that market in the first place.

    The only problem is that nVidia's SLI is currently a proprietary technology that requires licensing. That's why a lot of players still buy nVidia's chipsets and avoid ATI's - not that ATI's are bad, on the contrary; they merely lack the license required for SLI.

    This SLI problem also explains why nVidia may have to consider dropping Intel chipsets: they never licensed their SLI technology to Intel for SLI-compatible Intel-made chipsets (either forcing gamers to use nVidia chipsets, or requiring convoluted hacks with SLI chips acting as bridges between the main northbridge and the GPUs, as in Skulltrail).
    And Intel is now retaliating by refusing nVidia access to QuickPath Interconnect.

    So either nVidia will have to drop the Intel chipset market (and only produce SLI bridges like in the Skulltrail hack), or nVidia will have to open SLI up for licensing, and thus lose an interesting market it had managed to lock up.
    Hence the rumors you mention.

    Nonetheless, they aren't going to stop producing chipsets for AMD (still popular among gamers) or for VIA (they have even announced a new chipset able to play DX10 games and run Vista in all its Aero glory on VIA's Isaiah ITX platforms).

  • by Ideaphile ( 678292 ) on Sunday August 24, 2008 @03:39PM (#24728679)
    Although I appreciate the attention from NVIDIA and Slashdot, I can't support that alleged quote from my blog (http://speedsnfeeds.com).

    First, what's being described as a quote is actually just John Montrym's summary from my original post, which is here:

    http://news.cnet.com/8301-13512_3-10006184-23.html [cnet.com]

    What I actually described as equating to "the performance of a 2006-vintage... graphics chip" was a performance standard defined by Intel itself: running the game F.E.A.R. at 60 fps in 1,600 x 1,200-pixel resolution with four-sample antialiasing.

    Intel used this figure for some comparisons of rendering performance. If Larrabee ran at 1 GHz, for example, Intel's figures show that it would take somewhere from 7 to 25 Larrabee cores to reach that 60 Hz frame rate.
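    As a back-of-the-envelope illustration - and note the 2 GHz and 3 GHz clocks below are my assumptions for the sake of the example, not Intel figures - linear scaling of those numbers with clock speed would look like this:

```python
# Back-of-the-envelope only: assume rendering throughput scales linearly
# with core count and clock. The 7-25 cores at 1 GHz are Intel's figures
# (cited above); the 2 GHz and 3 GHz clocks are assumptions.
CORES_NEEDED_AT_1GHZ = (7, 25)  # to hit 60 fps in F.E.A.R., per Intel

for clock_ghz in (1.0, 2.0, 3.0):
    low = CORES_NEEDED_AT_1GHZ[0] / clock_ghz
    high = CORES_NEEDED_AT_1GHZ[1] / clock_ghz
    print(f"at {clock_ghz:.1f} GHz: roughly {low:.0f} to {high:.0f} cores")
```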

    Larrabee will probably run much faster than that, at least on desktop variants.

    Well... rather than writing the whole response here, I think I'll write it up for my blog and publish it there. Please surf on over and check it out:

    http://news.cnet.com/8301-13512_3-10024280-23.html [cnet.com]

    Comments are welcome here or there.

    . png
  • by CajunArson ( 465943 ) on Sunday August 24, 2008 @11:54PM (#24732603) Journal

    Both Intel and nVidia - proprietary driver companies - should be on the defensive right now.

    You obviously don't know much about Intel's commitment to open source in its drivers. Far from merely dumping some chip specs on the community at large, Intel has actively paid developers to maintain top-quality, 100% FOSS drivers for Linux and X11 for years, putting its commitment light-years ahead of ATI's or Nvidia's. Ever heard of Keith Packard, you know, the lead developer of the entire X system? He's an Intel employee. For all the accolades ATI gets for dumping a bunch of specs on the web, Intel has put vastly more time and money into supporting OSS, yet still gets labeled "closed source" by fanboys.

  • by Almahtar ( 991773 ) on Monday August 25, 2008 @07:33AM (#24734953) Journal
    Interestingly enough, ATI seemed to pander to Vista more than nVidia did. When DX10's driver compatibility requirements still included memory virtualization, ATI managed to work it into their drivers after a lot of time and money.

    nVidia never did - they finally said "screw it, we can't" - after which Microsoft removed the requirement. Interestingly, that means there's no real barrier to DX10 being ported to XP, but we'll leave that alone for now.

    Meanwhile, for the last few years, nVidia's Linux drivers have in my experience been leaps and bounds ahead of ATI's. So I really don't see where you get the feeling nVidia was more invested in Vista than Intel or ATI.
  • by MojoStan ( 776183 ) on Monday August 25, 2008 @10:21AM (#24736539)

    Nvidia are getting very scared now that ATi are beating them senseless.

    Nvidia is really feeling the pinch, with ATi taking over the higher end of the market (pro gear/high-end HD) and Intel shoring up the lower end (GMA, etc). Nvidia is pretty much stuck with consumers buying its middle-of-the-line gear (8600/9600).

    When you aim high, you tend to get hurt badly when you fall from grace; the whole 8800-to-9800 leap was abysmal at best, unlike their main competitor, who really pulled their finger out to release the 3xxx and 4xxx series.

    I guess you're referring to AMD/ATI's successful HD 4000 launch about a month ago, but you also seem to be omitting NVIDIA's GTX 200 series. At the high end (non-workstation), the GTX 280 outperforms the HD 4870 [arstechnica.com]. The $550 HD 4870 X2 (released about two weeks ago) outperforms the $450 GTX 280, but consumes a heck of a lot more power [anandtech.com].

    Also, NVIDIA seems to have been beating AMD/ATI senseless for years. According to Jon Peddie Research [jonpeddie.com], for total graphics chips, NVIDIA had 31.4% market share in Q2 2008 vs. AMD's 18.1%. In Q2 2007, NVIDIA had 32.5% and AMD had 19.5%.

    For notebook GPUs, NVIDIA led AMD 23.6% to 17.9% in Q2 2008, and 27.0% to 17.4% in Q1 2008. For "graphics add-in boards" [jonpeddie.com], NVIDIA led AMD 65% to 35% in Q1 2008.

    The "leap" from NVIDIA 8000 series to 9000 series (which was hardly "abysmal") is more appropriately compared to ATI's leap from HD 2000 series to HD 3000 series. NVIDIA's 8000 series was better than ATI's HD 2000 series. ATI responded with the slightly improved (but well-priced) HD 3000 series, and NVIDIA countered with their 9000 series and price cuts to 8800GT/GTS, which drove down the prices of ATI's best-performing cards even more.

    Until AMD released the HD 4000 series a month ago, AMD couldn't produce cards that could compete with NVIDIA's $400+ cards. After the successful HD 4000 launch, NVIDIA was forced to slash prices, but this is a very recent development. Personally, I'd choose an ATI card over NVIDIA now, but AMD/ATI had been getting whipped by NVIDIA for years.
