Graphics Software

Nvidia's Chief Scientist on the Future of the GPU 143

teh bigz writes "There's been a lot of talk about integrating the GPU into the CPU, but David Kirk believes that the two will continue to co-exist. Bit-tech got to sit down with Nvidia's Chief Scientist for an interview that discusses the changing roles of CPUs and GPUs, GPU computing (CUDA), Larrabee, and what he thinks about Intel's and AMD's futures. From the article: 'So what would happen if multi-core processors keep increasing their core counts? Does David believe this would give consumers enough power for what most of them need and, as a result, erode Nvidia's consumer installed base? "No, that's ridiculous -- it would be at least a thousand times too slow [for graphics]," he said. "Adding four more cores, for example, is not going anywhere near close to what is required."'"
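The core of Kirk's claim is that per-pixel graphics work is massively data-parallel, which is also what GPU computing via CUDA exploits. As a rough illustration only (not from the article): the sketch below shades a frame by splitting its rows across a handful of CPU threads. The resolution and the toy "shader" are made-up assumptions; the point is simply that a GPU runs this same kind of independent per-pixel work across thousands of hardware threads rather than a few cores.

```cpp
// Minimal sketch (illustrative assumptions only): per-pixel work is
// "embarrassingly parallel" -- every pixel can be shaded independently.
// A CPU splits the frame across a handful of cores; a GPU dispatches the
// same kind of work as thousands of concurrent threads.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

constexpr int kWidth  = 1920;   // assumed resolution, for illustration
constexpr int kHeight = 1080;

// Toy per-pixel "shader": each output depends only on its own coordinates.
std::uint32_t shade(int x, int y) {
    return static_cast<std::uint32_t>((x * 7 + y * 13) & 0xFF);
}

// Shade a horizontal band of rows [y_begin, y_end).
void shade_rows(std::vector<std::uint32_t>& frame, int y_begin, int y_end) {
    for (int y = y_begin; y < y_end; ++y)
        for (int x = 0; x < kWidth; ++x)
            frame[static_cast<std::size_t>(y) * kWidth + x] = shade(x, y);
}

int main() {
    std::vector<std::uint32_t> frame(static_cast<std::size_t>(kWidth) * kHeight);

    // Split the ~2 million pixels across however many CPU cores we have
    // (typically 2-4 on a 2008 desktop).
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const int rows_per_core =
        (kHeight + static_cast<int>(cores) - 1) / static_cast<int>(cores);

    std::vector<std::thread> workers;
    for (unsigned i = 0; i < cores; ++i) {
        const int y0 = static_cast<int>(i) * rows_per_core;
        const int y1 = std::min(kHeight, y0 + rows_per_core);
        if (y0 < y1)
            workers.emplace_back(shade_rows, std::ref(frame), y0, y1);
    }
    for (auto& t : workers) t.join();
    return 0;
}
```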
This discussion has been archived. No new comments can be posted.

  • Ugh. (Score:2, Insightful)

    by Anonymous Coward on Wednesday April 30, 2008 @01:41PM (#23253130)
    From TFA: "The ability to do one thing really quickly doesn't help you that much when you have a lot of things, but the ability to do a lot of things doesn't help that much when you have just one thing to do. However, if you modify the CPU so that it's doing multiple things, then when you're only doing one thing it's not going to be any faster."

    David Kirk takes 2 minutes to get ready for work every morning because he can shit, shower and shave at the same time.
  • by arbiter1 ( 1204146 ) on Wednesday April 30, 2008 @01:55PM (#23253304)
    Truthfully, the only real application for a GPU/CPU hybrid would be in laptops, where they can get away with using lower-end GPU chips.
  • Future is set (Score:4, Insightful)

    by Archangel Michael ( 180766 ) on Wednesday April 30, 2008 @02:02PM (#23253390) Journal
    The pattern set by the whole CPU / math co-processor integration showed the way. For those old enough to remember, once upon a time the CPU and the math co-processor were separate socketed chips: you had to add the co-processor to the motherboard yourself to get hardware floating-point math.

    The argument back then was eerily similar to the one now made by Nvidia's chief, namely that the average user wouldn't "need" a math co-processor. Then along came the spreadsheet, and suddenly that point was moot.

    Fast forward to today: if we had a dedicated GPU integrated with the CPU, it would eventually simplify things so that the next "killer app" could count on a commonly available GPU.

    Sorry, NV, but AMD and Intel will be integrating the GPU into the chip, bypassing bus issues and streamlining the timing. I suspect that video processing will be the next "killer app". YouTube is just a precursor of what's coming shortly.

  • And as we all knew (Score:3, Insightful)

    by aliquis ( 678370 ) on Wednesday April 30, 2008 @02:20PM (#23253572)
    Only Amiga made it possible! (Thanks to custom chips, not in spite of them.)

    It doesn't seem likely that one generic part would be better at something than many specialized ones. Sure, CPU+GPU would just be everything in one chip, but why would that be better than many chips? Maybe if it had RAM on-die as well and that enabled a faster FSB.
  • Re:Why not.... (Score:3, Insightful)

    by nick_davison ( 217681 ) on Wednesday April 30, 2008 @02:50PM (#23253918)

    Instead of 4 CPU cores on a quad-core chip, why not put 2xCPU cores and 2xGPU cores?
    Because now they have to make [number of CPU options] x [number of GPU options] variants rather than [number of CPU options] + [number of GPU options] (a quick sketch of that arithmetic follows at the end of this comment).

    Even taking a small subset of the market:
    8600GT, 8800GT, 8800GTS, 6600, 6700, 6800

    Six products sit on shelves. Users buy what they want. When a competitor to, say, the 8600GT comes out, Best Buy has to discount one product line.

    To give users the same choices as an integrated solution, that'd be 9 variants:

    8600GT/6600 - Budget
    8600GT/6700 - Typical desktop user
    8600GT/6800 - Photoshop user/media encoder
    8800GT/6600 - Poor gamer
    8800GT/6700 - Mid range gamer
    8800GT/6800 - Serious desktop user who likes to game
    8800GTS/6600 - Exclusive but somewhat poor gamer
    8800GTS/6700 - Gaming enthusiast
    8800GTS/6800 - Hardcore power gamer/3D Modeller

    Most users are now left scratching their heads as to whether the similarly priced 8600GT/6800 or the 8800GTS/6600 is better or worse for them than the also similarly priced 8800GT/6700.

    Plus, every time one part of the market is perceived as less valuable, the stores have to re-price many different SKUs.

    Now add in the gamer who bought a $200 GPU and a $300 CPU a little while before a great new mid-range GPU option turns up. He can toss his $200 investment, which sucks, but that's probably all the upgrading he needs. The guy who bought the $450 combined unit (we'll grant a small discount for buying a single part) now has to toss both. Plus he most likely has to buy a new motherboard and memory, because memory speed requirements and processor sockets change faster than Britney Spears' moods.
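As a side note on the SKU arithmetic in the comment above, here is a minimal sketch of why separate parts add while integrated parts multiply. The model names are the ones the comment uses; everything else is purely illustrative.

```cpp
// Sketch of the parent comment's counting argument: a retailer stocking
// separate parts carries (GPU models + CPU models) boxes, while integrated
// CPU/GPU parts covering the same choices need (GPU models x CPU models).
#include <iostream>
#include <string>
#include <vector>

int main() {
    const std::vector<std::string> gpus = {"8600GT", "8800GT", "8800GTS"};
    const std::vector<std::string> cpus = {"6600", "6700", "6800"};

    std::cout << "Separate parts:   " << gpus.size() + cpus.size()
              << " SKUs on the shelf\n";
    std::cout << "Integrated parts: " << gpus.size() * cpus.size()
              << " SKUs on the shelf\n";

    // Enumerate the combinations the comment lists.
    for (const auto& g : gpus)
        for (const auto& c : cpus)
            std::cout << "  " << g << "/" << c << "\n";
    return 0;
}
```

With three GPUs and three CPUs that is 6 boxes versus 9; with realistic product stacks on each side, the gap grows quickly.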
  • by Anonymous Coward on Wednesday April 30, 2008 @03:08PM (#23254228)
    I think a better question is "Why wouldn't we have a separate multi-core GPU along with the multi-core CPU?" While I agree that nVidia is obviously going to protect its own best interests, I don't see the GPU/CPU separation going away completely. Obviously there will be combination-core boards in the future for lower-end graphics, but the demand on GPU cycles is only going to increase as desktops/games/apps get better. However, one of the huge reasons that video cards are a productive industry is that there are plenty of high-end graphical demands out there, from hardcore gamers to AutoCAD applications. Ever seen the number of cycles/graphical processing power it takes to run a digital 911 map? Unbelievable!

    Seriously, if there is anything history has taught us, it's that there's room for integrated (low-end) and dedicated (high-end) graphics at the same time, as they serve different niches.

    Oh, and never get involved in a land war in Asia ;-)
  • Re:FOR NOW (Score:3, Insightful)

    by Dolda2000 ( 759023 ) <fredrik@dolda200 0 . c om> on Wednesday April 30, 2008 @04:53PM (#23256142) Homepage
    Why would one even want to have a GPU on the same die as the CPU? Maybe I'm just being dense here, but I don't see the advantage.

    On the other hand, I certainly do see possible disadvantages. For one thing, the CPU and GPU would presumably share one bus interface in that case, which could mean less parallelism in the system.

    I absolutely love your sig, though. :)
