Intel Reveals the Future of the CPU-GPU War
Arun Demeure writes "Beyond3D has once again obtained new information on Intel's plans to compete against NVIDIA and AMD's graphics processors, in what the Chief Architect of the project presents as a 'battle for control of the computing platform.' He describes a new computing architecture based on the many-core paradigm with super-wide execution units, and the reasoning behind some of the design choices. Looks like computer scientists and software programmers everywhere will have to adapt to these new concepts, as there will be no silver bullet to achieve high efficiency on new and exotic architectures."
We need a new architecture (Score:5, Interesting)
No one can seriously argue that x86 is the best architecture ever made; we all know it's not... but it is the one with the most research behind it. We need the top companies in the industry — Intel, AMD, MS, etc. — to sit down and design an entirely new specification going forward.
New processor architecture, a new form factor, a new power supply, etc...
Google has demonstrated that a single-voltage PSU is more efficient, and completely doable. There is little reason we still use internal cards to add functionality to our systems; couldn't they be more like cartridges, so you don't need to open the case?
Why not do away with most of the legacy technology in one fell swoop and update the entire industry to a new standard?
P.S. I know why: money. There's too much investment in the old to make creating the new worthwhile. But I can dream, can't I?
Similar to what Sun is doing with Niagara (Score:4, Interesting)
The first Niagara CPUs were terrible at floating-point math, so they were only good for web servers. The next generation, I hear, is supposed to be better at FPU ops.
Re:Great! (Score:5, Interesting)
The easiest way to make sure a product doesn't meet expectations is to raise expectations.
Re:Sure there is (Score:3, Interesting)
In the meantime, you were given a nice example — ATC systems and telephone exchanges — and you can read a research paper about MapReduce [216.239.37.132]. If you don't believe me, believe the peer-reviewed research.
Re:Sure there is (Score:2, Interesting)
Of course Google's programmers can write cool parallel programs with this powerful library, but it's not a functional programming library! It's a C++ library that borrows some map and reduce ideas from functional languages.
I'd rephrase the objection to the OP as: "Show me the climate simulator in Haskell. Show me the ML hypersonic flow code for computing reentry flow over the shuttle."
Re:Sure there is (Score:2, Interesting)
When people write games, they do all kinds of crazy stunts to ensure they have as few multiplications as possible. Can you really trust a compiler to get the code right for that tight inner loop? Figuring out parallelism might be hard, but game programming has always been hard.
Also, you avoided mentioning memory. It doesn't matter if Haskell uses marginally less memory if it's in the wrong place when you need it. Is that texture in RAM, VRAM, or swap? That sort of thing makes a big difference in terms of performance. And games must maintain a certain framerate. Sometimes it's completely unacceptable to use swap, even if the time gets made up later.
I'm not against functional languages; it's just a question of using the right tool for the job. You use high-level languages for high-level tasks, and low-level languages for low-level tasks. If you're writing a compiler or an AI or a raytracer, where realtime performance is not an issue, sure, functional languages are great. But games have always been married to the hardware, and I don't see how that could change any time soon.