
AMD Banks On Flood of Stream Apps

Slatterz writes "Closely integrating GPU and CPU systems was one of the motivations for AMD's $5.4bn acquisition of ATI in 2006. Now AMD is looking to expand its Stream project, which uses graphics chip processing cores to perform computing tasks normally sent to the CPU, a process known as General Purpose computing on Graphics Processing Units (GPGPU). By leveraging thousands of processing cores on a graphics card for general computing calculations, tasks such as scientific simulations or geographic modelling, which are traditionally the realm of supercomputers, can be performed on smaller, more affordable systems. AMD will release a new driver for its Radeon series on 10 December which will extend Stream capabilities to consumer cards." Reader Vigile adds: "While third-party consumer applications from CyberLink and ArcSoft are due in Q1 2009, in early December AMD will release a new Catalyst driver that opens up stream computing on all 4000-series parts and a new Avivo Video Converter application that promises to drastically increase transcoding speeds. AMD also has partnered with Aprius to build 8-GPU stream computing servers to compete with NVIDIA's Tesla brand."
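For the unfamiliar, the idea behind GPGPU/stream computing is to express a computation as the same small operation applied independently across a large array, so it can be spread over the GPU's many cores. Below is a rough, illustrative sketch of the concept only: AMD's Stream SDK actually uses Brook+ and CAL rather than CUDA, and CUDA is used here just because its syntax is compact. The kernel replaces what would otherwise be a serial CPU loop.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Each GPU thread handles one array element; the grid of threads
    // replaces the serial CPU loop.
    __global__ void scale_add(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            c[i] = 2.0f * a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        float *a, *b, *c;
        cudaMallocManaged(&a, n * sizeof(float));   // memory visible to CPU and GPU
        cudaMallocManaged(&b, n * sizeof(float));
        cudaMallocManaged(&c, n * sizeof(float));
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        scale_add<<<(n + 255) / 256, 256>>>(a, b, c, n);   // offload to the GPU
        cudaDeviceSynchronize();

        printf("c[0] = %f\n", c[0]);   // expect 4.0
        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }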

Comments Filter:
  • by ceoyoyo ( 59147 ) on Thursday November 13, 2008 @09:42PM (#25756037)

    Uh, what? Just like your video card is useless for displaying graphics without open source drivers?

  • by TubeSteak ( 669689 ) on Thursday November 13, 2008 @09:59PM (#25756203) Journal

    Surely I'm not the only one who thinks this'll be useless without open-source drivers, so you can actually make your fancy cluster use these vector-processing units.

    You may or may not be surprised by this, but not all of the magic happens in hardware, which is why you don't see open-source drivers for a lot of stuff.

    Sometimes it just makes sense to put the optimizations in the driver, so that when you tweak them later, you don't have to flash the BIOS.

  • by Wesley Felter ( 138342 ) <wesley@felter.org> on Thursday November 13, 2008 @10:04PM (#25756261) Homepage

    Plenty of people seem to be getting serious work done with NVidia's proprietary Linux CUDA drivers.

  • blah blah blah (Score:2, Insightful)

    by coryking ( 104614 ) * on Thursday November 13, 2008 @10:15PM (#25756347) Homepage Journal

    You chose to run on a platform knowing full well these things aren't likely to be supported. Very little sympathy from me. Sorry.

    If you want to encourage more drivers on your platform of choice, perhaps you might consider making it easier for hardware companies to target your kernel. Maybe consider, oh I don't know, a stable, predictable ABI?

    Maybe lose the attitude as well. The world doesn't owe you or your OS choices anything. All you can do is focus your efforts on making your platform of choice attractive to those whose support you wish to seek.

    For hardware vendors, it is easy. Simply make it cheap and easy to write drivers on your platform of choice. Ask yourself, "Is it cheap and easy for hardware vendors to target my platform?" If the answer is "No", then figure out what you can do to make it cheap and easy for them. If that is impossible because it violates some guiding values of your platform, well shucks, either be a man and deal with it, or reconsider your value system. Whining about how the world isn't giving you what it owes you doesn't help anybody.

  • by Anonymous Coward on Thursday November 13, 2008 @10:28PM (#25756493)

    No standard API yet, because the NVIDIA chips only work on integers and the Stream processors actually can do double precision. From a scientific computing standpoint, portability of my codes is almost as simple as a patching process to get the keywords correct (never mind the memory handling and feedback, that is still pretty different), and that is only because of the floating point. It's a pain in the you-know-what trying to keep numbers straight when you have to multiply everything by 10000.
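    The "multiply everything by 10000" workaround described above is ordinary fixed-point arithmetic: faking fractional values with scaled integers when the hardware can't do doubles. A hypothetical CUDA sketch (not from any vendor SDK, values made up) of the bookkeeping involved versus a native double-precision sum:

        #include <cstdio>
        #include <cuda_runtime.h>

        #define SCALE 10000   // four implied decimal places

        // Integer "fixed-point" addition: 1.2345 is stored as 12345.
        __global__ void add_fixed(const int *a, const int *b, int *c) {
            *c = *a + *b;
        }

        // The same sum in native double precision; no scaling bookkeeping needed.
        __global__ void add_double(const double *a, const double *b, double *c) {
            *c = *a + *b;
        }

        int main() {
            int *fa, *fb, *fc;
            double *da, *db, *dc;
            cudaMallocManaged(&fa, sizeof(int));    cudaMallocManaged(&fb, sizeof(int));    cudaMallocManaged(&fc, sizeof(int));
            cudaMallocManaged(&da, sizeof(double)); cudaMallocManaged(&db, sizeof(double)); cudaMallocManaged(&dc, sizeof(double));
            *fa = 12345; *fb = 20000;      // 1.2345 and 2.0000, pre-multiplied by SCALE
            *da = 1.2345; *db = 2.0;

            add_fixed<<<1, 1>>>(fa, fb, fc);
            add_double<<<1, 1>>>(da, db, dc);
            cudaDeviceSynchronize();

            printf("fixed-point: %d -> %.4f\n", *fc, (double)*fc / SCALE);  // 32345 -> 3.2345
            printf("double:      %.4f\n", *dc);                             // 3.2345
            return 0;
        }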

  • by ceoyoyo ( 59147 ) on Thursday November 13, 2008 @10:30PM (#25756507)

    And my point is, why? All you need is a decent API. Claiming it's useless without open source drivers is just a silly ruse by an open source zealot to advance an agenda.

    Open source has a lot of things going for it, but its more fanatical followers are not among them.

  • by Louis Savain ( 65843 ) on Thursday November 13, 2008 @10:38PM (#25756579) Homepage

    And so are Intel and Nvidia. Vector processing is indeed the way to go, but GPUs use a specific and highly restrictive form of vector processing called SIMD (single instruction, multiple data). SIMD is great only for data-parallel applications like graphics but slows to a crawl on general purpose parallel programs. The industry seems to have decided that the best approach to parallel computing is to mix two incompatible parallel programming models (vector SIMD and CPU multithreading) in one heterogeneous processor, the GPGPU. This is a match made in hell and they know it. Programming those suckers is like pulling teeth with a crowbar.

    Neither multithreading nor SIMD vector processing is the solution to the parallel programming crisis. What is needed is a multicore processor in which all the cores perform pure MIMD vector processing. Given the right dev tools, this sort of homogeneous processing environment would do wonders for productivity. This is something that Tim Sweeney [wikipedia.org] has talked about recently (see Twilight of the GPU [slashdot.org]). Fortunately, there is a way to design and program parallel computers that does not involve the use of threads or SIMD. Read How to Solve the Parallel Programming Crisis [blogspot.com] for more.

    In conclusion, I will say that the writing is on the wall. Both the CPU and the GPU are on their deathbeds, but AMD and Intel will be the last to get the news. The good thing is that there are other players in the multicore business who will get the message.
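    As a rough sketch of the SIMD restriction described in that comment, in CUDA terms only because that toolkit is the one most readers here will have seen (AMD's hardware groups threads into wavefronts rather than warps, but the behaviour is similar): when neighbouring threads follow the same path the vector units stay busy; when they branch differently, the hardware runs the two paths one after the other with lanes idling. All names and loop counts below are made up for illustration.

        #include <cuda_runtime.h>

        // SIMD/SIMT-friendly: every thread executes the same instruction on
        // its own element, so the wide vector units stay fully occupied.
        __global__ void uniform_work(float *data, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n)
                data[i] = data[i] * 2.0f + 1.0f;
        }

        // "General-purpose" style: neighbouring threads branch differently.
        // Threads in a warp share one instruction stream, so both paths are
        // executed serially, with half the lanes idle on each pass.
        __global__ void divergent_work(float *data, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            if ((i & 1) == 0) {
                for (int k = 0; k < 100; ++k) data[i] += 1.0f;   // even threads
            } else {
                for (int k = 0; k < 100; ++k) data[i] *= 0.5f;   // odd threads
            }
        }

        int main() {
            const int n = 1 << 20;
            float *data;
            cudaMallocManaged(&data, n * sizeof(float));
            for (int i = 0; i < n; ++i) data[i] = 1.0f;
            uniform_work<<<(n + 255) / 256, 256>>>(data, n);
            divergent_work<<<(n + 255) / 256, 256>>>(data, n);
            cudaDeviceSynchronize();
            cudaFree(data);
            return 0;
        }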

  • by waferhead ( 557795 ) <[moc.oohay] [ta] [daehrefaw]> on Thursday November 13, 2008 @10:41PM (#25756603)

    You do realize this article has ~absolutely nothing to do with gaming, or even normal users, right?

    The systems discussed using CUDA or GPGPU will probably spend ~100% of their lives running flat out, doing simulations or such.

    Visualize a Beowulf Cluster of these. Really.

  • by coryking ( 104614 ) * on Thursday November 13, 2008 @10:48PM (#25756649) Homepage Journal

    Already, 99% of people don't use 99% of the power of their CPUs 99% of the time

    So by your logic, those people would be happy with a computer that was 1% as fast as what it is now?

    Make no mistake, once you actually hit that 1% of the time when you need 100% of your CPU, the more the better. I can think of two horsepower-intensive things a normal, everyday Joe now expects his computer to do:

    1) Retouching photos
    2) Retouching and transcoding video (from camera/video camera -> DVD)

    Don't underestimate transcoding video either. More and more people will be using digital video cameras and expect to be able to output to DVD or Blu-ray.

  • by ceoyoyo ( 59147 ) on Thursday November 13, 2008 @11:00PM (#25756729)

    Like I said about zealots making things up....

    Your argument about the business world not using non-open source is spot on. Excellent example. Of COURSE nobody would trust their critical systems to, say, an OS they don't have the source for! Never mind closed source apps! Naturally they only buy video cards that have open source drivers too.

  • by lysergic.acid ( 845423 ) on Thursday November 13, 2008 @11:17PM (#25756829) Homepage

    what agenda are they advancing? the agenda of being able to use this feature on the platform they are running?

    sure, if all hardware manufacturers were in the habit of releasing Unix & Linux drivers, then closed-source binaries and a decent API would be fine. but the reality is that many manufacturers do not have good Linux/Unix support. that is fine. but if they want to leave it to the community to develop the Linux/Unix drivers themselves, then it would be really helpful to have open source Windows drivers to use as a template.

    it's not useless to you since you're running Windows, but not everyone uses a Windows platform for their research. for those people it would be useless without either open source drivers or a set of Linux/Unix drivers. i mean, if you're already running a Beowulf cluster of Linux/BSD/Solaris machines then it might not be practical to convert them to a Windows cluster (can you even run a Beowulf cluster of Windows machines?), not to mention the cost of buying 64 new Windows licenses and porting all of your existing applications to Windows.

    it's probably an exaggeration to say that closed-source drivers are useless. and perhaps AMD will release Linux/Solaris/Unix drivers. but if they're not going to, then open sourcing the Windows drivers and the hardware specs would be the next best thing. and the outcry for open source drivers isn't without some merit, since past Linux support by AMD/ATI with proprietary drivers has left much to be desired, with Linux drivers only receiving updates half as often as the Windows drivers and consistently underperforming against comparable graphics cards.

  • by ceoyoyo ( 59147 ) on Thursday November 13, 2008 @11:30PM (#25756927)

    It's fine to lobby for open source drivers. It's also great if you want to run something on your chosen platform and you want the company who makes the hardware to support that. Both of those I can wholeheartedly support.

    Claiming that something is useless without open source drivers is either dishonest or deluded. As I said, I don't think the important goals of the open source movement are served by either lying or ranting about your delusions.

  • by coryking ( 104614 ) * on Thursday November 13, 2008 @11:32PM (#25756943) Homepage Journal

    with Linux drivers only receiving updates half as often as the Windows drivers and consistently underperforming against comparable graphics cards

    If something hurts, stop doing it.

    You expect the world to cater to your lifestyle choices. You made the choice to run a platform that isn't well supported by video card manufacturers. Either stop using the platform, or find video cards that work on your platform. What if there are no good video cards for your platform? Tough luck. Sorry. You should have considered that before installing the OS, eh?

    It is beyond arrogant to expect the world to cater to your choice of operating system.

  • by White Flame ( 1074973 ) on Friday November 14, 2008 @12:14AM (#25757183)

    Wait a minute. Typically the SIMD of GPU commands is for handling vector triples (coordinates or colors) and matrices, which translates directly into the supercomputer tasks being talked about in TFA: "tasks such as scientific simulations or geographic modelling".

    GPUs nowadays have hundreds of parallelized vector/matrix processors and the drivers & hardware take care of scheduling them all through those pipelines for you. Within the targeted fields, I can't see a downside of this sort of development.
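    To make that concrete, here is an illustrative CUDA sketch (all names and values below are made up for the example) of graphics-style vector-triple and matrix arithmetic reused for a simulation-flavoured task: applying a 3x3 rotation to a large cloud of sample points, one GPU thread per point.

        #include <cstdio>
        #include <cuda_runtime.h>

        // Apply a 3x3 row-major matrix to each point; the same math a GPU
        // does when transforming vertices for rendering.
        __global__ void transform_points(const float3 *in, float3 *out,
                                         const float *m, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) {
                float3 p = in[i], r;
                r.x = m[0]*p.x + m[1]*p.y + m[2]*p.z;
                r.y = m[3]*p.x + m[4]*p.y + m[5]*p.z;
                r.z = m[6]*p.x + m[7]*p.y + m[8]*p.z;
                out[i] = r;
            }
        }

        int main() {
            const int n = 1 << 16;
            float3 *in, *out;
            float *m;
            cudaMallocManaged(&in,  n * sizeof(float3));
            cudaMallocManaged(&out, n * sizeof(float3));
            cudaMallocManaged(&m,   9 * sizeof(float));
            const float rot[9] = {0, -1, 0,  1, 0, 0,  0, 0, 1};  // 90-degree rotation about z
            for (int k = 0; k < 9; ++k) m[k] = rot[k];
            for (int i = 0; i < n; ++i) { in[i].x = 1.0f; in[i].y = 0.0f; in[i].z = 0.0f; }

            transform_points<<<(n + 255) / 256, 256>>>(in, out, m, n);
            cudaDeviceSynchronize();

            printf("first point: (%f, %f, %f)\n", out[0].x, out[0].y, out[0].z);  // expect (0, 1, 0)
            cudaFree(in); cudaFree(out); cudaFree(m);
            return 0;
        }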

  • by Fulcrum of Evil ( 560260 ) on Friday November 14, 2008 @01:17AM (#25757529)

    You expect the world to cater to your lifestyle choices.

    Of course - we are a customer base, and we expect to have our needs catered to.

    You made the choice to run a platform that isn't well supported by video card manufacturers. Either stop using the platform, or find video cards that work on your platform. What if there are no good video cards for your platform? Tough luck. Sorry. You should have considered that before installing the OS, eh?

    Third choice: lobby for support from major chip manufacturers. What makes you think a large group of users is powerless, anyway? 5% of 100M is 5MM people, with some of them having cause to buy 100+ units of product.

    It is beyond arrogant to expect the world to cater to your choice of operating system.

    Don't need the world. Just need a couple companies.

  • by LordMyren ( 15499 ) on Friday November 14, 2008 @05:09AM (#25758405) Homepage

    Yes yes yes and yes.

    However, AMD already said it was backing OpenCL. I'm pissed as fuck I didn't hear anything about OpenCL this press cycle, but they're the only major graphics company to have ever stated they were getting behind OpenCL: I'm holding onto hope.

    You're right: no one uses Brook. Trying to market it as in any way part of the future is a joke and a mistake: a bad one hopefully brought on by a $2.50 share price and pathetic marketing sods. On the other hand, I think people using CUDA are daft too; it's pre-programmed obsolescence, marrying yourself to proprietary tech that one company, no matter how hard they try, will never prop up all by themselves.

    OpenCL isn't due out until Snow Leopard, which is rumored to be next spring. There's still a helluva lot of time.

  • by Brit_in_the_USA ( 936704 ) on Friday November 14, 2008 @11:35AM (#25760663)
    The traction for this will come when someone releases open-source audio (and, later, video) encoder libraries using GPU acceleration based upon this (or another) abstraction layer.

    MP3, OGG, FLAC - get these out the door (especially the first one) and a host of popular GUI and CLI encoders would jump on the bandwagon. If there are huge speed gains and no incompatibility issues, because the abstraction layer and drivers are *stable* and retain *backwards compatibility* with new releases, then more people will see the light and there will be pressure to do the same with video encoders. Before you know it the abstraction layer would become the de facto standard and all GPU makers would follow suit - at that point we (the consumer) would win, in having something that works on more than one OS and more than one particular card from one particular GPU maker, and we can get on with some cool innovations.
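    For what it's worth, here is a purely hypothetical sketch of what such a stable abstraction layer could look like at the ABI level. None of these names are real; it just illustrates one common way to keep backwards compatibility: a versioned table of entry points that the application probes before calling, so a driver/runtime update can add functions without breaking binaries built against v1.

        #include <cstdio>

        extern "C" {
        struct gpu_encode_api_v1 {
            unsigned version;                                   // ABI version; fields are never reordered
            int (*encode_mp3)(const float *pcm, int samples,
                              unsigned char *out, int out_cap); // returns bytes written, <0 on error
        };
        }

        // Stub "driver side" for the sketch; a real implementation would
        // dispatch the encoding work to the GPU behind this boundary.
        static int stub_encode_mp3(const float *, int samples, unsigned char *, int) {
            return samples / 10;   // pretend we produced some bytes
        }

        static gpu_encode_api_v1 api = { 1, stub_encode_mp3 };

        int main() {
            // Application side: check the ABI version it was built against, then call.
            if (api.version >= 1) {
                float pcm[4410] = {0};
                unsigned char out[1024];
                int n = api.encode_mp3(pcm, 4410, out, sizeof(out));
                printf("encoded %d bytes\n", n);
            }
            return 0;
        }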

"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein

Working...