Nvidia's Chief Scientist on the Future of the GPU
teh bigz writes "There's been a lot of talk about integrating the GPU into the CPU, but David Kirk believes that the two will continue to co-exist. Bit-tech got to sit down with Nvidia's Chief Scientist for an interview that discusses the changing roles of CPUs and GPUs, GPU computing (CUDA), Larrabee, and what he thinks about Intel's and AMD's futures. From the article: 'What would happen if multi-core processors increase core counts further, though? Does David believe that this will give consumers enough power to deliver what most of them need and, as a result, erode Nvidia's consumer installed base? "No, that's ridiculous — it would be at least a thousand times too slow [for graphics]," he said. "Adding four more cores, for example, is not going anywhere near close to what is required."'"
NV on the warpath? (Score:4, Interesting)
http://www.pcper.com/article.php?aid=530 [pcper.com]
Must be part of the "attack Intel" strategy?
FOR NOW (Score:3, Interesting)
Re:CPU based GPU will not work as good as long as (Score:1, Interesting)
NVidia are putting a brave face on it, but they're not fooling anybody.
Why wouldn't you have a gpu core in a multiple ... (Score:4, Interesting)
A logical improvement at this point would be to start specializing cores for specific types of jobs. When the processor hands out work, it would preferentially assign each task to the core best suited for that type of processing.
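As a rough illustration of what that preferential assignment could look like in software, here is a minimal sketch of an affinity-based dispatcher. The core types, the AffinityScheduler class, and the fallback policy are all hypothetical, invented for this example rather than taken from any real OS or driver:

#include <functional>
#include <iostream>
#include <map>
#include <queue>

// Illustrative core/task categories -- not any real scheduler API.
enum class CoreType { Scalar, Vector, Graphics };

struct Task {
    CoreType preferred;               // the kind of core this work runs best on
    std::function<void()> work;
};

// One queue per specialized core type; tasks are routed by preference.
class AffinityScheduler {
    std::map<CoreType, std::queue<Task>> queues_;
public:
    void submit(Task t) {
        CoreType key = t.preferred;
        queues_[key].push(std::move(t));
    }

    // A core of the given type drains its own queue first, then falls back
    // to the other queues so specialized cores never sit idle.
    void run_core(CoreType self) {
        auto run_from = [](std::queue<Task>& q) {
            while (!q.empty()) { q.front().work(); q.pop(); }
        };
        run_from(queues_[self]);
        for (auto& [type, q] : queues_)
            if (type != self) run_from(q);
    }
};

int main() {
    AffinityScheduler sched;
    sched.submit({CoreType::Graphics, [] { std::cout << "shade pixels\n"; }});
    sched.submit({CoreType::Scalar,   [] { std::cout << "parse script\n"; }});
    sched.run_core(CoreType::Graphics);  // graphics-leaning work handled first here
    sched.run_core(CoreType::Scalar);
}

The design choice that matters is the fallback: a specialized core drains its own queue first but can still pick up foreign work, so specialization doesn't leave silicon idle.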
VIA (Score:3, Interesting)
This has been in the back of my mind for a while... Could NV be looking at the integrated roadmap of ATI/AMD and thinking, long term, that perhaps they should consider more than a simple business relationship with VIA?
Every time I walk out to my car I see raytracing. (Score:3, Interesting)
You can't do that without raytracing, you just can't, and if you don't, it looks fake. You get "shiny effect" windows with scenery painted on them, and that tells you "that's a window," but it doesn't make it look like one. It's like putting in stick figures and saying that's how you model humans.
And if Professor Slusallek could do that in real time with a hardwired raytracer... in 2005, I don't see how nVidia's going to do it with even 100,000 GPU cores in a cost-effective fashion. Raytracing is something that hardware does very well, and it's highly parallelizable, but both Intel and nVidia are attacking it in far too brute-force a fashion, using the wrong kinds of tools.
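To make the point about painted-on windows concrete: a reflection only looks right if it is computed from what is actually in the scene and where the viewer is standing, and that is exactly the secondary-ray step a ray tracer performs. Below is a toy sketch of that step, reduced to a single sphere standing in for the car window; the vector and sphere types are made up for this example and bear no resemblance to a production tracer:

#include <cmath>
#include <iostream>
#include <optional>

// Toy vectors and spheres -- just enough to show the reflection step
// that distinguishes ray tracing from a painted-on environment texture.
struct Vec { double x, y, z; };
Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec operator*(double s, Vec a) { return {s * a.x, s * a.y, s * a.z}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec center; double radius; };

// Distance along the ray to the nearest hit, if any.
std::optional<double> hit(Vec origin, Vec dir, const Sphere& s) {
    Vec oc = origin - s.center;
    double b = dot(oc, dir);
    double c = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - c;
    if (disc < 0) return std::nullopt;
    double t = -b - std::sqrt(disc);
    if (t > 1e-6) return t;
    return std::nullopt;
}

// The key step: at the hit point, spawn a *new* ray in the mirrored
// direction and trace it back into the scene. Whatever that ray hits is
// what you see in the "window."
Vec reflect(Vec dir, Vec normal) {
    return dir - 2.0 * dot(dir, normal) * normal;
}

int main() {
    Sphere window{{0, 0, -5}, 1.0};      // stand-in for the car window
    Vec eye{0, 0, 0}, dir{0, 0, -1};     // viewer looking straight at it
    if (auto t = hit(eye, dir, window)) {
        Vec p = eye + *t * dir;
        Vec n = (1.0 / window.radius) * (p - window.center);
        Vec r = reflect(dir, n);
        std::cout << "reflection ray: " << r.x << ' ' << r.y << ' ' << r.z << '\n';
    }
}

The reflect() call is the whole argument in miniature: the reflected ray is traced back into the real scene, so what shows up in the window changes as the scene and the camera move, which a texture painted onto the glass can never do.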
Realtime Ray Tracing and Multicore CPUs (Score:5, Interesting)
Re:Very surprising (Score:3, Interesting)
Moving to a combined CPU/GPU wouldn't obsolete NVidia's product line. Quite the opposite, in fact. NVidia would get to become something called a fabless semiconductor company [wikipedia.org]. Basically, companies like Intel could license GPU designs from NVidia and integrate them into their own CPU dies. This means that Intel would handle the manufacturing and NVidia would see larger profit margins. NVidia (IIRC) already does this with their 3D chips targeted at ARM chips and cell phones.
The problem is that a GPU's architecture looks nothing like a CPU's. The GPU is designed for massive parallelism, while CPUs have traditionally been designed for single-threaded operation. While CPUs are definitely moving in the multithreaded direction and GPUs are moving in the general-purpose direction, it's still too early to combine them. Attempting to do so would get you the worst of both worlds rather than the best (i.e. a Shared Memory Architecture [wikipedia.org]).
So I don't think NVidia's chief scientist is off on this. (If he were, we'd probably already see GPU integration in the current generation of game consoles, all of which use custom chips.) The time will come, but it's not here yet.
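As a concrete illustration of the architectural gap the parent describes, here is the same trivially parallel pixel operation written CPU-style (one core, one loop) and GPU-style (many independent work items). Plain std::thread is only a stand-in for a GPU's hardware thread scheduler, so treat this as a sketch of the programming model, not of real GPU code:

#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// The same embarrassingly parallel workload, written both ways.

// CPU style: one fast core walks the whole buffer in order.
void brighten_serial(std::vector<float>& pixels, float gain) {
    for (float& p : pixels) p *= gain;
}

// GPU style: the buffer is split into independent chunks, each handled
// by its own worker -- no chunk depends on any other.
void brighten_parallel(std::vector<float>& pixels, float gain,
                       unsigned workers = std::thread::hardware_concurrency()) {
    if (workers == 0) workers = 1;
    std::vector<std::thread> pool;
    std::size_t chunk = (pixels.size() + workers - 1) / workers;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end = std::min(pixels.size(), begin + chunk);
        if (begin >= end) break;
        pool.emplace_back([&pixels, gain, begin, end] {
            for (std::size_t i = begin; i < end; ++i) pixels[i] *= gain;
        });
    }
    for (auto& t : pool) t.join();
}

int main() {
    std::vector<float> image(1920 * 1080, 0.5f);
    brighten_serial(image, 1.1f);
    brighten_parallel(image, 1.1f);
}

The parallel version only pays off because no pixel depends on any other; most desktop software isn't shaped like that, which is why the two designs have stayed separate so far.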
Re:Why wouldn't you have a gpu core in a multiple (Score:1, Interesting)
Intel has already figured out that the vast majority of home users have finally caught on that they don't NEED more processing power. Intel knows it has to find some other way to keep people buying more in the future. How many home users need more than a C2D E4500? Will MS Word, a web browser, and an email client change so much in the next 3-5 years that they will demand more horsepower than is available today?
Then again, you might need 32 CPU cores on a single die if you want to run that AT&T browser.
Re:VIA (Score:2, Interesting)
Re:FOR NOW (Score:3, Interesting)
Think about low-end computers: IMHO, putting the GPU on the same die as the CPU will provide better performance per cost than embedding it in the motherboard.
And a huge number of computers ship with integrated video, so this is an important market too.