Simulating Supernovae with Graphics Cards

astroboy writes "As graphics cards get more powerful, Los Alamos and Utah scientists have developed a package, Scout, to use those usually-languishing FLOPs to do simulations, and to visualize of them on the on the run. As an example, they have released a movie of part of the evolution of a core-collapse supernova."
This discussion has been archived. No new comments can be posted.

  • by TripMaster Monkey ( 862126 ) * on Saturday June 11, 2005 @06:55PM (#12791319)


    From TFA:


    To make the technology much more powerful, McCormick is working on a version of Scout that will work when several computers are linked together.

    I guess they can.

    ^_^
  • ...not to directly link to movies in the article yet? I predict their site's gonna do a little core-collapsing of its own in the next few minutes...
  • by Bloodlent ( 797259 ) <iron_chef_sanjiNO@SPAMyahoo.com> on Saturday June 11, 2005 @07:01PM (#12791348)
    But they can't bring back Suprnova? Dammit! How am I gonna get my Desperate Housewives?!
  • by TripMaster Monkey ( 862126 ) * on Saturday June 11, 2005 @07:02PM (#12791356)


    From TFA:


    Peter Schröder, a computer simulation expert at the California Institute of Technology, believes graphics processors have great potential for scientific research. "There is a real market driving this hardware that we can use for scientific computation," he told New Scientist.

    Actually, Peter and his buds just got sick of getting scragged in DeathMatch because the video cards in their lab computers are teh SUXX0R.
    Now, they have a blank check to get whatever video cards they want.

    ^_^
  • worst video ever....
  • by Chris Tucker ( 302549 ) on Saturday June 11, 2005 @07:10PM (#12791410) Homepage
    Well, you've destroyed those nice scientists computers.

    Go to your rooms and I want you to think long and hard about what you've done!
  • by a_greer2005 ( 863926 ) on Saturday June 11, 2005 @07:13PM (#12791437)
    I have seen graphics cards go supernova - just overclock one and you can see it too...FOR REAL, no 3D sim crap,..
    • I tried... When I OC too high it just locks the computer up and scrambles the onscreen fonts... For a real show turn off the lights, voltmod a card, remove the hsf during HL2, and pour saltwater on it when it starts smoking. ;)
  • boot up the GIMP: filters>light effects>supernova dunno what the big deal is?
  • Movie torrent (Score:4, Informative)

    by slavemowgli ( 585321 ) on Saturday June 11, 2005 @07:15PM (#12791444) Homepage
    BitTorrent for the movie, in case of Slashdotting: here [schneelocke.net]
    • by Anonymous Coward on Saturday June 11, 2005 @07:26PM (#12791492)
      MOTION PICTURE ASSOCIATION OF AMERICA, INC.
      15503 VENTURA BOULEVARD
      ENCINO, CALIFORNIA 91436
      UNITED STATES
      PHONE: (818) 728-8127
      Email: MPAA23@pacbell.net
      Anti-Piracy Operations

      Date: June 11, 2005

      Dear slavemowgli:

      The Motion Picture Association of America is authorized to act on behalf of the following copyright owners:

      Columbia Pictures Industries, Inc. Disney Enterprises, Inc. Metro-Goldwyn-Mayer Studios Inc. Paramount Pictures Corporation TriStar Pictures, Inc. Twentieth Century Fox Film Corporation United Artists Pictures, Inc. United Artists Corporation Universal City Studios, Inc. Warner Bros., a Division of Time Warner Entertainment Company, L.P.

      We have knowledge that you posted a torrent to one of our client's movie (The Scout : http://www.imdb.com/title/tt0111094/ [imdb.com]) and are demanding that you withdraw this link at once.

      Failure to do so will make you lose more than just your modpoints :P ...

  • torrents and mirrors (Score:2, Informative)

    by xbmodder ( 805757 )
    the PDF http://xbmodder.us/scout.pdf [xbmodder.us] the torrent for the .mov: http://xbmodder.us/Scout.mov.torrent [xbmodder.us] The torrent for avi (divx4) http://xbmodder.us/Scout.avi.torrent [xbmodder.us]
  • by smoany ( 832744 ) on Saturday June 11, 2005 @07:21PM (#12791478)
    Usually, I think that New Scientist is pretty accurate as far as laymen-science articles go, but they've let a big mistake slip be.

    From the article:
    "The Scout programming language, developed at Los Alamos National Laboratory (LANL) in California, US, lets scientists run complex calculations on a computer's graphics processing unit (GPU) instead of its central processing unit (CPU)."

    Los Alamos National Labs (LANL) is based in (fittingly) Los Alamos, New Mexico. It is currently operated by the University of California, which holds the contract to manage the lab. This may have caused the confusion.

    Also, Lawrence Livermore National Labs (LLNL) is based in Northern California, so that may have caused the confusion as well.

    Not a terribly serious concern, but their fact's should be straight. The lab is not in California, it is in New Mexico... Editors: shame on you!
    • by smoany ( 832744 )
      Of course, in my hurry to post my response, I let a few big editing slips pass by...

      It should be "Slip by" not "Slip be"

      Also, it should read "facts" not "fact's".

      Oh well. I never said I was good at editing, only that New Scientist should have been.
    • Actually, smoany, in an effort to be even MORE correct, you might be wrong in one way. I live in Livermore, CA and it is home to not one, but TWO national laboratories: LLNL and Sandia National Laboratories. Although LLNL and Los Alamos might do some collaborative work, Sandia, in fact, is Los Alamos' sister lab. Now, I'm not sure on all the details, but if I had to make an educated guess I'd assume that you meant Sandia Lab in Livermore, CA. Check this out, you'll notice that the two labs are separated by

    • [...] but they've let a big mistake slip be.

      [...]

      Not a terribly serious concern, but their fact's should be straight. [...] Editors: shame on you!


      Shame on your editor! :)
  • Supernovae are really just galactic servers that have been royally slashdotted.
  • by geneing ( 756949 ) on Saturday June 11, 2005 @07:40PM (#12791552)
    If I understand correctly, graphics cards don't implement the IEEE floating point standard. This means that you can expect all kinds of weird problems with complicated floating point computations: http://www.cs.berkeley.edu/~wkahan/ieee754status/754story.html [berkeley.edu]. I wonder how they know they can trust the results of their simulations.
    • by mmp ( 121767 )
      NVIDIA's GPUs are only one or two bits short of perfect 32 bit IEEE floats. (ATI's are still at 24 bit floats.)

      See Karl Hillesland and Anselmo Lastra's cool work on measuring this error on current GPUs, GPU Floating-Point Paranoia [unc.edu] for much more information.

      -matt
      • It's not a question of the size of a fp number. There are many subtle points in designing "safe" floating point arithmetic.

        IEEE 754 compliance makes fp operations slower, which is why hardware doesn't often support it (famous example: the Cray, where SQRT(1-COS(X)) could return with the error "root of a negative number").

        Roundoff errors might not matter for graphics (who cares about being one pixel off?), but it is a huge problem for numerical computations.

        Also, does GPU signal overflow/underflow/division by
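        The cancellation hazard this subthread describes is easy to demonstrate even on a CPU. A minimal NumPy sketch (my own illustration, not from the article) of how 1 - cos(x) loses digits in single precision, while an algebraically equivalent form stays accurate:

        ```python
        import numpy as np

        x = np.float32(1e-3)

        # Naive form: cos(x) is within ~5e-7 of 1.0, so the subtraction cancels
        # almost all of the significant bits of a float32 result.
        naive = np.float32(1.0) - np.cos(x)

        # The identity 1 - cos(x) = 2*sin(x/2)^2 avoids the cancellation.
        stable = np.float32(2.0) * np.sin(x / np.float32(2.0)) ** 2

        # float64 reference value
        exact = 1.0 - np.cos(np.float64(1e-3))

        print(naive, stable, exact)
        ```

        On typical hardware the naive float32 result is off by a few percent, while the rewritten form agrees with the float64 reference to single-precision accuracy; this is the same class of roundoff problem the Cray SQRT(1-COS(X)) anecdote is about, independent of how many bits the format has.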

    • Err, what? All graphics cards that implement ARB_color_buffer_float have to implement IEEE 32-bit floats, as stipulated by the ARB extension specification. (Of course, this is assuming that the scientists are using the color buffer to encode information.)
      http://oss.sgi.com/projects/ogl-sample/registry/ARB/color_buffer_float.txt [sgi.com]
      Basically, any up-to-date ATi or NVidia gfx cards are capable of true IEEE 32-bit floating point numbers. What really worries me about the research is that they're not using 64-bit!
    • Well, maybe they have a reference? They check their results against known quantities.

      but then again, they may be "unethical" scientists...
    • I did RTFA, and they don't actually run the simulation on the graphics card. On the contrary, they had to downsample the data from 320^3 to 256^3 just to fit it into the GPU's memory. All they did in the GPU was a bit of post-processing (and the rendering, which looks nice enough).

      In a more general sense, I wouldn't "trust" the result of a hydro-only simulation of a SN explosion in any detail. Too much physics left out, and a lot of chaotic dynamics which are only barely resolved (or not at all). An e
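
      For scale, the downsampling numbers above work out neatly if you assume one 32-bit float per grid cell (an assumption on my part; the actual per-cell payload isn't stated):

      ```python
      def grid_mib(n, bytes_per_cell=4):
          """Memory for an n^3 simulation grid, assuming 4 bytes (float32) per cell."""
          return n ** 3 * bytes_per_cell / 2 ** 20

      print(grid_mib(320))  # 125.0 MiB - tight on a 2005-era card's free VRAM
      print(grid_mib(256))  # 64.0 MiB - leaves room for framebuffer and textures
      ```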

  • I was on a flight out of SLC and the guy next to me was working on modelling explosions with some software they had developed - I believe it was this. Quite interesting and we had a nice discussion during the flight.

    The software can do a lot of simulations that previously took a lot more horsepower.
  • by mikael ( 484 ) on Saturday June 11, 2005 @07:52PM (#12791628)
    ... Are they designing a nova bomb?
  • Interesting video clip, but somewhat disappointing! I think Marvin the Martian said it correctly: "Where's the kaboom? There was supposed to be an earth-shattering kaboom!".
  • Source? License? (Score:4, Insightful)

    by tbo ( 35008 ) on Saturday June 11, 2005 @08:33PM (#12791791) Journal
    I know this is slashdot, and I appreciate all the Beowulf cluster jokes, especially since they're actually appropriate here, but nobody is asking any meaningful questions. By my calculations, the noise-to-signal ratio is illegal div_by_zero.

    Where can I get Scout? What is the license? What platforms are supported? I'm working on an open-source scientific computing package for doing quantum simulations [sourceforge.net], and I'd like to use Scout for visualization, but this article provides no information on where to get Scout or even if the licensing would allow me to use it.

    It's also not clear exactly how you'd link Scout up with an existing app. Does Scout produce machine code that you stick into your app somehow? Are there C or C++ wrappers for using Scout?
  • It's probably just me, but did anyone else think the way the model ended up in the last frame of the movie looked remarkably like a table lamp?

    Something distinctly Douglas Adams about it all. Maybe they were infinite improbability constants being entered in the console panel.
  • Sad days. (Score:3, Interesting)

    by imsabbel ( 611519 ) on Saturday June 11, 2005 @08:53PM (#12791881)
    Slashdot should rename itself to "news for computer kiddies and laid-off cynical IT veterans who lost touch with technology".

    Both this story and the last one (the quad core one) were nice technical stuff, perfect for nerds.
    And lets take a look here. at the time of that posting , only 2 or 3 comments are even remotely touching the subject. The rest is stupid jokes and dumb ranting.
    The quad core article is even worse, were the only non-joke posters are to stupid to tell apart SMT and dualcore.

    Also it seems to be a sad trend that the initial reaction to ANYTHING even slightly technical/scientific seems to be a self-preservation ("I'm not stupid, this stuff is just ununderstandable!!!11") joke posting.
    • Both this story and the last one (the quad core one) were nice technical stuff, perfect for nerds. And lets take a look here. at the time of that posting , only 2 or 3 comments are even remotely touching the subject. The rest is stupid jokes and dumb ranting. The quad core article is even worse, were the only non-joke posters are to stupid to tell apart SMT and dualcore.
      You're aware it's Saturday right? A weekend day in the US, most people are off work. Even geeks and nerds tend to go do something fun
    • I know! And the rest of the posters are too stupid to know the difference between the words to and too!
  • by Frumious Wombat ( 845680 ) on Saturday June 11, 2005 @09:34PM (#12792081)
    Every few years it seems that some variant of using the GPU comes back for scientific computing. I seem to remember in the early 90s a group using the graphics card for the additional memory it could provide. I run quantum-chemistry simulations for a living (basically large quantities of matrix algebra), so anything that could speed up calculations currently taking weeks would be appreciated.

    Personally, I'd like to see someone port BLAS (or the ATLAS variant) to a set of standard gpus, so that we could speed up matrix ops. I've been hoping for a more general-purpose solution making it to market, such as the old Celerity strap-on vector unit except for modern IA32/AMD64/PPC, but this may be the better solution.

    For those of us who don't have a budget for a Power5 or Cray system, maybe a pair of PCI-e cards running the matrix algebra and FFT routines would be the way to go.
    • It's been found [stanford.edu] that GPUs, despite their impressive floating point capabilities, can't compare to heavily-optimized and cache coherent CPU implementations of large matrix operations, such as ATLAS. The exception is when the result is to be displayed anyway, as in scientific visualization and Scout. The real drawback of GPUs is the readback speeds. When the result is done, if it isn't to be displayed, it must be read back into the CPU memory. This is notoriously inefficient. PCIe is improving this, but it's
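    • The point about heavily-optimized CPU BLAS is easy to see in miniature: even a Python loop that still uses vectorized inner products is orders of magnitude slower than the single cache-blocked BLAS call NumPy makes, before any GPU readback cost enters the picture. (A rough sketch; the exact gap depends on the BLAS build NumPy is linked against.)

      ```python
      import time
      import numpy as np

      n = 200
      a = np.random.rand(n, n).astype(np.float32)
      b = np.random.rand(n, n).astype(np.float32)

      # Naive row-by-column multiply (the inner products still use NumPy,
      # which flatters the "naive" side considerably).
      t0 = time.perf_counter()
      c_naive = np.empty((n, n), dtype=np.float32)
      for i in range(n):
          for j in range(n):
              c_naive[i, j] = a[i, :] @ b[:, j]
      t1 = time.perf_counter()

      # One call into the optimized BLAS sgemm behind NumPy's matmul.
      c_blas = a @ b
      t2 = time.perf_counter()

      print(f"loops: {t1 - t0:.3f}s   BLAS: {t2 - t1:.5f}s")
      ```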
  • At this point I would like to mention that the University of Utah is awesome. *quickly shovels any mention of Fleischmann and/or Pons under the rug* ;)
  • seti@home (Score:2, Interesting)

    In a similar vein, the seti@home project [berkeley.edu] is currently developing a new project called "Astropulse" [berkeley.edu] to scan the skies for optical signals from ET. This is also designed to use GPU code [sourceforge.net] to perform the signal analysis. (It would be interesting to see how this would perform on a PS3 [wikipedia.org], especially now the PS3 is rumoured to ship with Linux pre-installed [theinquirer.net])
  • to visualize of them on the on the run?
  • There have been a few posts complaining (accurately) that the majority of the response to this story has been all jokes and no thinking. The reason for the Beowulf clusters we all joke about is to do big math problems, including simulations of proteins and other big molecules, weather and climate, cosmology stuff like supernovae, etc. FLOPS are our friends, and we should make better use of them, especially cheap ones like the FLOPS in graphics cards (see http://www.eet.com/showArticle.jhtml?articleID=5530 [eet.com]
  • The program running this simulation has the Sh**iest user interface.
