Intel Reveals the Future of the CPU-GPU War
Arun Demeure writes "Beyond3D has once again obtained new information on Intel's plans to compete against NVIDIA and AMD's graphics processors, in what the Chief Architect of the project presents as a 'battle for control of the computing platform.' He describes a new computing architecture based on the many-core paradigm with super-wide execution units, and the reasoning behind some of the design choices. Looks like computer scientists and software programmers everywhere will have to adapt to these new concepts, as there will be no silver bullet to achieve high efficiency on new and exotic architectures."
Sure there is (Score:5, Insightful)
Cell (Score:3, Insightful)
Astroturf (Score:5, Insightful)
Arun Demeure writes "Beyond3D has once again obtained new information...
If you are going to submit your own articles [beyond3d.com] to Slashdot, at least have the decency to admit this instead of talking about yourself in the third-person.
Re:We need a new architecture (Score:5, Insightful)
Re:Sure there is (Score:5, Insightful)
This kind of pisses me off. Functional programming enthusiasts are always telling other people that they should be using functional languages, but they never write anything significant in those languages themselves.
Re:yay (Score:5, Insightful)
So you can bash Intel graphics all you like, but for F/OSS users they could end up as the only game in town. We're not usually playing the latest first-person shooters; performance only needs to be "good enough".
Re:Sure there is (Score:5, Insightful)
Extra credit if you can do transaction-level device control over USB.
Schwab
Not a troll -- Meta-Mod unfair (Score:1, Insightful)
The 950 is barely passable, especially with Vista.
Not really Intel's fault. Their target was the "barely passable" segment, leaving the rest of the field to the real GPU makers. Intel's main reason to offer this was probably that OEMs needed a one-stop-shopping solution from Intel.
My Dell has the 950 and Vista Business and I wish I had upgraded to a more powerful GPU.
BTW, I am not the same AC as the original post.
Re:Sure there is (Score:3, Insightful)
Re:Sure there is (Score:2, Insightful)
Re:Sure there is (Score:3, Insightful)
Re:Sure there is (Score:5, Insightful)
Frag [haskell.org], an FPS written in Haskell with Yampa for an undergrad dissertation.
Haskell Doom [cin.ufpe.br], which is pretty obvious.
A few more examples [cin.ufpe.br].
I dunno if that satisfies your requirements or not. Though I don't quite get how this is relevant to the GP's post. This seems like more of a gripe with Haskell than anything. But if I've missed something, please elaborate.
Re:Sure there is (Score:5, Insightful)
I've done some rudimentary reading on functional programming languages -- mostly Haskell and LISP (which is sorta FP) -- and I believe you when you cite all the claimed benefits. The architecture of the languages certainly enables them.
However, every time I've tried to get a handle on Haskell, all the examples presented tend to be abstract. In other words, they contrive a problem that Haskell is fairly well-suited to solving, and then write a solution in Haskell, using data structures and representations entirely internal to Haskell. "Poof! Elegance!" Well, um...
I'm a gaming, graphics, and device driver geek, and so my explorations of new stuff tend to lean heavily in that direction. I'm interested in more "concrete" expressions of software operation. Could Haskell offer new or interesting possibilities in network packet filtering? Perhaps, but first you have to read reams of text on how to bludgeon the language into reading and writing raw bits.
The other issue with FP is that it tends to treat all problems as a collection of simultaneous equations -- things that can be evaluated at any time, in any order. There's a huge class of computing problems that can't be described that way: you can't unprint a page on the line printer. There are facilities for sequencing/synchronizing operations (Haskell's monads, for instance), but I get the impression that FP's elegance starts to fall apart when you start using them.
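For what it's worth, sequencing in Haskell is less fragile than it sounds. Here's a minimal toy sketch (my own example, not from any project in this thread) of the line-printer case: the pure part can be evaluated in any order, while `do` notation nails down the order of the effects.

```haskell
-- Pure part: pair each line with its 1-based number.
-- No effects here, so evaluation order doesn't matter.
numberLines :: [String] -> [String]
numberLines = zipWith (\n s -> show n ++ ": " ++ s) [1 :: Int ..]

-- Effectful part: `do` notation fixes the order of the prints.
-- Line 1 always hits the printer before line 2; you can't unprint it.
printJob :: [String] -> IO ()
printJob doc = do
  mapM_ putStrLn (numberLines doc)  -- one print per line, in list order
  putStrLn "-- end of job --"
```

In GHCi, `printJob ["hello", "world"]` prints the two numbered lines and then the footer, in that order, every time.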
Understand that my exposure to FP in general and Haskell in particular is less than perfunctory, and I am very likely misunderstanding a great deal. I'd like to learn and understand more about FP, but so far I haven't encountered the "Ah-hah!" example yet.
Schwab
Intel against NVIDIA/ATI/AMD? OSS? (Score:5, Insightful)
NVIDIA's Linux drivers are pretty good, but ATI/AMD's are god-awful, and both NVIDIA's and AMD/ATI's are much more difficult to use than Intel's.
I'd love to see an Intel GPU/CPU platform that was performance competitive with ATI/AMD or NVIDIA's offerings.
Good for linux (Score:3, Insightful)
Re:Sure there is (Score:3, Insightful)
Again, no one disagrees with your idea that writing in a functional style is a good idea for parallel programming, but the OP said that we should give up on two specific languages and pick up a functional one. Clearly a program in a functional style can be written in C++ (which is, more or less, a superset of C89, one of the two languages the OP mentioned). The challenge to the OP was to show the world a massively parallel program of significance written in a functional language, not one written in a procedural language but in a functional style. You showed us the latter; we'd still like to see the former.
Re:Sure there is (Score:2, Insightful)
It's an old argument (Score:4, Insightful)
Back in the '70s, people like Jack Dennis used to promise DARPA that they could parallelize the old Fortran code used for complex military simulations by converting it to a pure functional language. It would be wonderful! Well, they couldn't, and it wasn't.
The above notwithstanding, IF you can coerce a problem into a form in which a functional language can be effectively employed, the benefits can be huge. The code tends to be more elegant and more readable; algorithms that would be difficult to write in an imperative language like C become easy; data structure manipulation is trivial; and so on. Arguments that functional languages are "slow" have been debunked. Arguments that functional languages must be interpreted are wrong.
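As a concrete illustration of the "data structure manipulation is trivial" claim, here's a toy example of my own (not from the thread): a persistent binary search tree in Haskell, with insertion and an in-order traversal in a dozen lines and no pointer surgery or malloc in sight.

```haskell
data Tree a = Leaf | Node (Tree a) a (Tree a)

-- insert builds a new tree that shares all unchanged subtrees with
-- the old one; nothing is mutated, so the old tree stays valid.
insert :: Ord a => a -> Tree a -> Tree a
insert x Leaf = Node Leaf x Leaf
insert x t@(Node l v r)
  | x < v     = Node (insert x l) v r
  | x > v     = Node l v (insert x r)
  | otherwise = t  -- already present, reuse the whole subtree

-- in-order traversal yields the elements in sorted order
toList :: Tree a -> [a]
toList Leaf = []
toList (Node l v r) = toList l ++ [v] ++ toList r
```

In GHCi, `toList (foldr insert Leaf [3,1,2])` gives `[1,2,3]`. The equivalent C version would be a page of pointer bookkeeping, and it couldn't keep the old tree around for free.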
And all the syntactic nonsense of C++ and the rest of the "object oriented" languages can be (mercifully) shed. Pure functional languages are object-oriented by nature. However, functional languages do have their own idiosyncrasies, such as the infamous Lisp "quote" and implementation-dependent funarg problems. So there are cobwebs still.
To sum up: if you have a hard algorithmic problem to solve, a functional language will probably be a better choice, even if you end up re-coding the algorithm in an imperative language later. If you have a device driver to write, though, roll up your sleeves and get out the C manual. But first: make sure to put a debug wrapper around your mallocs (and pad your malloc blocks with patterns on both sides) so you can trap double-frees, underwrites, and overwrites. It will pay many dividends.
Re:Great! (Score:3, Insightful)
Except when it ain't. Lemme see, an entire new programming model... haven't we heard this song before? Something about HP & Intel going down with the Itanic? OK, Intel survived the experience, but HP is pretty much out of the processor game and barely hanging on otherwise.
Yes, it would be great if we could finally escape the legacy baggage of x86, but it ain't going to happen anytime soon. The pain isn't yet great enough to convince people to migrate.