Nvidia's Chief Scientist on the Future of the GPU
teh bigz writes "There's been a lot of talk about integrating the GPU into the CPU, but David Kirk believes the two will continue to co-exist. Bit-tech got to sit down with Nvidia's Chief Scientist for an interview that discusses the changing roles of CPUs and GPUs, GPU computing (CUDA), Larrabee, and what he thinks about Intel's and AMD's futures. From the article: 'So what happens if multi-core processors keep increasing their core counts? Does David believe that would give consumers enough power to deliver what most of them need and, as a result, erode Nvidia's consumer installed base? "No, that's ridiculous — it would be at least a thousand times too slow [for graphics]," he said. "Adding four more cores, for example, is not going anywhere near close to what is required."'"
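For anyone who hasn't looked at CUDA, here is a minimal, purely illustrative kernel sketch (not from the article, and the names are made up for the example) of the kind of data-parallel work Kirk is describing: a single launch puts roughly a million threads in flight, which is the scale of parallelism he argues "four more cores" on a CPU cannot come close to.

```cpp
// Illustrative CUDA sketch: one thread per array element.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread scales exactly one element of the array.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;                     // ~1M elements
    float *d = nullptr;
    cudaMalloc((void **)&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));

    // 4096 blocks of 256 threads: about a million threads launched at once,
    // versus the handful of extra cores a CPU generation might add.
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaDeviceSynchronize();

    cudaFree(d);
    printf("done\n");
    return 0;
}
```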
Ugh. (Score:2, Insightful)
David Kirk takes 2 minutes to get ready for work every morning because he can shit, shower and shave at the same time.
Re:CPU-based GPU will not work as well as long as (Score:2, Insightful)
Future is set (Score:4, Insightful)
The argument back then was eerily similar to the one the NV chief is making now, namely that the average user wouldn't "need" a Math Co-Processor. Then along came the spreadsheet, and suddenly that point was moot.
Fast forward to today: if we had a dedicated GPU integrated with the CPU, it would eventually simplify things enough that the next "killer app" could count on a commonly available GPU.
Sorry, NV, but AMD and Intel will be integrating the GPU into the chip, bypassing bus issues and streamlining the timing. I suspect that video processing will be the next "killer app". YouTube is just a precursor of what's coming shortly.
And as we all knew (Score:3, Insightful)
It doesn't seem likely that one generic item would be better at something than many specific ones. Sure, a CPU+GPU would put everything in one chip, but why would that be better than many chips? Maybe if it had RAM on board as well and that enabled a faster FSB.
Re:Why not.... (Score:3, Insightful)
Even taking a small subset of the market:
8600GT, 8800GT, 8800GTS, 6600, 6700, 6800
Six products sit on shelves. Users buy what they want. When a competitor to, say, the 8600GT comes out, Best Buy has to discount one product line.
To give users the same choices with an integrated solution, you'd need nine variants:
8600GT/6600 - Budget
8600GT/6700 - Typical desktop user
8600GT/6800 - Photoshop user/media encoder
8800GT/6600 - Poor gamer
8800GT/6700 - Mid range gamer
8800GT/6800 - Serious desktop user who likes to game
8800GTS/6600 - Exclusive but somewhat poor gamer
8800GTS/6700 - Gaming enthusiast
8800GTS/6800 - Hardcore power gamer/3D Modeller
Most users are now left scratching their heads as to whether the similarly priced 8600GT/6800 or the 8800GTS/6600 is better or worse for them than the also similarly priced 8800GT/6700.
Plus, every time one part of the market is perceived as less valuable, the stores have to reprice many different SKUs.
Now add in the gamer who bought a $200 GPU and a $300 CPU a little while before a great new mid-range GPU option turns up. He can toss his $200 investment, which sucks, but that's probably all the upgrade requires. The guy who bought the $450 combined unit (we'll grant a small discount for buying one part instead of two) now has to toss both. Plus he most likely has to buy a new motherboard and memory, because memory speed requirements and processor sockets change faster than Britney Spears' moods can swing.
Re:Why wouldn't you have a gpu core in a multiple (Score:2, Insightful)
Seriously, if there is anything that history has taught us, it's that there's room for integrated (low-end) and dedicated (high-end) graphics at the same time, as they serve different niches.
Oh, and never get involved in a land war in Asia
Re:FOR NOW (Score:3, Insightful)
On the other hand, I certainly do see possible disadvantages with it. For one thing, the CPU and GPU would presumably be sharing one bus interface in that case, which could mean less parallelism in the system.
I absolutely love your sig, though. :)