AMD Banks On Flood of Stream Apps
Slatterz writes "Closely integrating GPU and CPU systems was one of the motivations for AMD's $5.4bn acquisition of ATI in 2006. Now AMD is looking to expand its Stream project, which uses graphics chip processing cores to perform computing tasks normally sent to the CPU, a process known as General Purpose computing on Graphics Processing Units (GPGPU). By leveraging thousands of processing cores on a graphics card for general computing calculations, tasks such as scientific simulations or geographic modelling, which are traditionally the realm of supercomputers, can be performed on smaller, more affordable systems. AMD will release a new driver for its Radeon series on 10 December which will extend Stream capabilities to consumer cards." Reader Vigile adds: "While third-party consumer applications from CyberLink and ArcSoft are due in Q1 2009, in early December AMD will release a new Catalyst driver that opens up stream computing on all 4000-series parts and a new Avivo Video Converter application that promises to drastically increase transcoding speeds. AMD also has partnered with Aprius to build 8-GPU stream computing servers to compete with NVIDIA's Tesla brand."
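The GPGPU idea is simple enough to sketch: a "kernel" is one function applied independently to every element of an array, and instead of looping, the driver launches one hardware thread per element. A minimal CPU-side caricature in C — the function names are made up for illustration, not AMD's Stream API:

```c
#include <stddef.h>

/* One GPGPU-style "kernel": the same operation, applied independently
   to element i. On a GPU every i runs as its own hardware thread; here
   we emulate the launch with a plain loop. (Illustrative sketch only,
   not any real API.) */
void saxpy_kernel(size_t i, float a, const float *x, float *y) {
    y[i] = a * x[i] + y[i];        /* classic SAXPY: y = a*x + y */
}

void saxpy_launch(size_t n, float a, const float *x, float *y) {
    for (size_t i = 0; i < n; i++) /* on the GPU this loop disappears:  */
        saxpy_kernel(i, a, x, y);  /* the driver schedules n threads    */
}
```

Because every element is independent, the same kernel scales from one core to thousands with no change to the code — which is exactly why embarrassingly parallel workloads like simulation and transcoding map so well to graphics hardware.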
Re:Useless without free drivers! (Score:3, Informative)
We're not talking about video games here. Some people use computers for important work, not just for screwing around.
Hope it's not like the last transcoding software.. (Score:4, Informative)
Last time I looked at the Catalyst/Avivo hardware transcoding software it was somewhat less usable than I'd hoped. The thing that killed it for me was the lack of batching or command-line options. It actually turned out to be less time-consuming to use a software transcoder with batching and leave it running overnight and while I was at work than to walk back to the PC after every finished run to set up the next file for transcoding on the video card. The quality of the hardware-transcoded video was a bit patchy as well.
Something I would be interested in is integrated support in burning software to speed up the transcoding side of DVD video burning. Unfortunately it doesn't look like that's happening any time soon. I think the problem is that by the time the technology has matured enough to be viable, the increase in CPU speed will have made it redundant.
Re:Open standard API (Score:4, Informative)
OpenCL will be the standard; it should support regular CPUs as well as ATI and NVIDIA GPUs, and maybe Cell if someone bothers to write a backend.
Re:Useless without free drivers! (Score:5, Informative)
Yes, thank you for telling me. I use mine for cancer research. That includes GPGPU, by the way. Yes, I'm serious.
I don't believe I know anyone who uses the source code for their video driver. All the GPGPU people use one of the GPU programming languages. The hard core ones use assembly. The young 'uns will grow up with CUDA. None of those requires the source code for the driver.
Re:Open standard API (Score:3, Informative)
NVIDIA's Tesla products all support single precision IEEE-754 floating point, and their 10-series supports double precision.
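For anyone wondering what double precision buys you in practice: single precision carries a 24-bit significand, so above 2^24 it can no longer represent every integer, while double (53 bits) still can. A quick C check (helper names are mine, for illustration):

```c
/* Returns 1 if adding 1.0 to x is lost to rounding in single
   precision (i.e. x + 1 is indistinguishable from x as a float). */
int plus_one_lost_float(float x) {
    float y = x + 1.0f;   /* the store forces rounding to the 24-bit significand */
    return y == x;
}

int plus_one_lost_double(double x) {
    double y = x + 1.0;   /* double represents integers exactly up to 2^53 */
    return y == x;
}
```

At 16777216 (2^24), single precision already swallows the +1; double doesn't hit the same wall until 2^53. That gap is why double support on the 10-series matters for serious numerical work.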
Re:When I was a kid, my mom used to tell me (Score:5, Informative)
I'm a graphic designer, so I run Windows. I haven't touched Linux or Unix in over half a decade. But I'm not a selfish jackass who thinks that only my needs are important, and that as long as they're met everyone else should just go to hell.
There's nothing arrogant about expecting hardware manufacturers to support the three most popular OSes: Windows, OS X, and Linux. And it's precisely because people understand that hardware manufacturers can't be expected to support every single OS out there (even well-known ones like Solaris, FreeBSD, BeOS, etc.) that people are pushing for open source drivers.
Your mom may not have told you this, but businesses depend on their customers to make money. So listening to consumers and meeting consumer demands is generally a good idea (ever heard of market research?). By allowing their hardware to be used on a wider range of platforms, they broaden the market for their products.
AMD isn't in the business of selling video card drivers, just the video cards. That is why they have open-sourced their Radeon drivers in the past. And if we were all as simple-minded as your mom, no one would ever speak up for themselves. Hardware manufacturers aren't run by mind readers.
Re:Useless without free drivers! (Score:3, Informative)
I think you're letting your personal ideology cloud your view of the world around you.
I think you missed the sounds of sarcasm. Not too hard to do, as it was confusingly mixed with some other, simpler HHOS-style text.
Of COURSE nobody would trust their critical systems to, say, an OS they don't have the source for!
Most major companies don't. They happily run employee desktops on Microsoft Windows, because they can easily swap them out when they break.
Not a critical system, then. Critical systems are the machines that cause serious problems when they fail.
They run critical legacy systems on IBM mainframes (or whatever). And they run new critical systems on platforms that are almost entirely FOSS.
IBM sells more Linux than AIX today, and they sell quite a bit of Linux across their line, at least on the systems-formerly-known-as-S/390-and-RS/6000. I'm not sure if I'm disagreeing with you, or proving your point, but whatever.
All I know for sure is that the EULA for Windows prohibits using it to control a nuclear reactor, or at least it used to, and it bloody well should.
Re:Open standard API (Score:4, Informative)
NVIDIA's Tesla products all support single precision IEEE-754 floating point, and their 10-series supports double precision.
Nvidia is moving to OpenCL compliance, as well.
Re:AMD Is Out to Lunch (Score:5, Informative)
There are a lot of tall claims here, but the one that sticks out as most needing justification is that "The industry seems to have decided that the best approach to parallel computing is to mix two incompatible parallel programming models (vector SIMD and CPU multithreading)". GPUs mix these models fine, and I haven't seen anyone bitching about the thread schedulers on them, or about not being able to use every transistor on a single stream processor at the same time. How you can claim these models are incompatible, when in fact it's the only working model we have and it works fine for those using it, is beyond me. You criticize the SIMD model, but the GPU is not SIMD: it is a host of many different SIMD processors, and that in turn makes it MIMD.
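To make that concrete: each SIMD processor applies one instruction across a group of lanes in lockstep, while different processors can be running entirely different kernels at the same time — which is MIMD. A CPU-side caricature (the names and the 4-lane width are made up for illustration; real wavefronts are much wider):

```c
#include <stddef.h>

#define LANES 4   /* width of one SIMD group; real GPU wavefronts are wider */

typedef void (*kernel_fn)(float *lanes, float operand);

/* SIMD level: one instruction applied in lockstep to every lane. */
void simd_add(float *lanes, float operand) {
    for (size_t i = 0; i < LANES; i++)
        lanes[i] += operand;             /* same op, different data */
}

void simd_scale(float *lanes, float operand) {
    for (size_t i = 0; i < LANES; i++)
        lanes[i] *= operand;
}

/* MIMD level: independent SIMD groups may execute different kernels
   at once, so the whole chip is MIMD built out of SIMD blocks. */
void run_groups(kernel_fn *kernels, float **groups, float operand, size_t n) {
    for (size_t g = 0; g < n; g++)       /* each g = one SIMD processor */
        kernels[g](groups[g], operand);
}
```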
Moving on to what you suggest, I fail to see how superscalar out-of-order execution is not MIMD, and we've been doing that shit for years. The decoder pulls in a crap-ton of things to do, assigns them to work units, and they get crunched in parallel. Multiple instruction streams, multiple data streams, a smart CPU trying to crunch it all. Intel tried to take it a step further with EPIC (Explicitly Parallel Instruction Computing), and look how that fucking turned out: how many people here even know what Itanium is?
The "how to solve the parallel programming crisis" link is pretty hilarious. Yes, lists of interlocking queues are badass. Unfortunately the naive implementation discussed at the link makes no allowance for cache locality. In all probability the first implementation will involve data corruption and crap performance. Ultimately the post devolves into handwaving bullshit that "the solution I am proposing will require a reinvention of the computer and of software construction methodology". This is laughable. Just because stream processing isn't insanely easy doesn't mean we have to reinvent it just so we aren't burdened with dealing with multiple tasks. Even if you do reinvent it, as say XMT has done, you still have to cope with many of the same issues (XMT's utility, in my mind, is as a bridge between vastly-superscalar and less-demanding EPIC).
Good post; I just strongly disagree. The GPU is close to the KISS philosophy: the hardware is dumb as a brick and extremely wide, and it's up to the programmers to take advantage of it. I find this to be ideal. I've seen lots of muckraking shit saying "this is hard and we'll inevitably build something better/easier," but a lot of people thought the internet was too simple and stupid to work, too.
Re:what for the CPU? (Score:3, Informative)
It is still capable of high-speed rendering, so it can still be used as a GPU.