Supercomputing Hardware

Japan Wants to Build 10 Petaflop Supercomputer

deepexplorer writes "Japan wants to take back the fastest-supercomputer spot. It plans to develop a supercomputer that can operate at 10 petaflops, or 10 quadrillion calculations per second, about 73 times faster than Blue Gene. The current fastest supercomputer, the partially finished Blue Gene, is capable of 136.8 teraflops; the target when finished is 360 teraflops."
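The "73 times" figure in the summary is straightforward arithmetic from the two flop counts it quotes; a quick sketch:

```python
# Flop counts quoted in the story summary above.
blue_gene_now = 136.8e12   # 136.8 teraflops, partially finished Blue Gene
japan_target = 10e15       # 10 petaflops

# Ratio between the proposed machine and today's leader.
print(round(japan_target / blue_gene_now))  # → 73
```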
Comments Filter:
  • by Anonymous Coward on Monday July 25, 2005 @09:59PM (#13162306)
    Well I want a Stargate, but that doesn't mean I'm gonna get one. I bet OpenOffice.org will still take 5 minutes to start on it.
  • by Anonymous Coward on Monday July 25, 2005 @09:59PM (#13162308)
    The supercomputer will be pocket-sized and run on two AA batteries.
  • Ahh (Score:5, Funny)

    by pHatidic ( 163975 ) on Monday July 25, 2005 @10:00PM (#13162313)
    I see they are upgrading to get ready for Longhorn.
  • by Anonymous Coward on Monday July 25, 2005 @10:00PM (#13162314)
    These 136.8 teaflops could have been avoided if the proper specifications were used before hardware development and programming began. Essential tea technical info. [bbc.co.uk]
    • by Anonymous Coward
      Actually, this is a key component of the infinite improbability drive when combined with not teaflops.
  • by Raul654 ( 453029 ) on Monday July 25, 2005 @10:02PM (#13162323) Homepage
    BlueGene/L is the fastest supercomputer at the moment; however, BlueGene/C (which, for the record, I'm working on as part of my PhD) will be finished very soon (it was supposed to be out of the foundry by the end of August, but the project is running slightly behind schedule). I'm told there are, as yet, no plans to publish any performance benchmarks.
  • by account_deleted ( 4530225 ) on Monday July 25, 2005 @10:03PM (#13162330)
    Comment removed based on user account deletion
  • by daveschroeder ( 516195 ) * on Monday July 25, 2005 @10:03PM (#13162331)
    ...and I want a pony.

    Guess which two things aren't happening anytime soon?
  • by Valarauk ( 670014 ) on Monday July 25, 2005 @10:05PM (#13162346)
    I mean seriously... Doom 4 isn't even out yet.
  • by Anonymous Coward on Monday July 25, 2005 @10:06PM (#13162352)
    Big deal, the white mice have had this beat for years...
  • by icepick72 ( 834363 ) on Monday July 25, 2005 @10:09PM (#13162367)
    Japan wants to gain the fastest supercomputer spot back.
    Japan wants to develop ...

    Japan wants a lot of things now doesn't it. Well, Japan will just have to be a good little country and maybe Santa will come.

  • by medep ( 830402 ) on Monday July 25, 2005 @10:12PM (#13162381)
    japan is thinking "but we just bought this computer, it's obsolete already? shit a brick!" anyone in the market for a slightly used supercomputer?
  • Man... (Score:3, Funny)

    by Lobster Cowboy ( 605052 ) on Monday July 25, 2005 @10:19PM (#13162410)
    The Japanese are really sensitive about the whole "small penis" thing.
    • Re:Man... (Score:2, Informative)

      by guardiangod ( 880192 )
      If you think about it, the US has the same "penis thing," more or less.

      Japan has the ESC, which reached 36 TFlops in 2002. The US, not wanting to be left behind, built Blue Gene, which will reach 1 PFlops by 2006 (28x in 4 years relative to the ESC).

      Japan, in response, wants to build one that reaches 10 PFlops by 2010 (10x in 6 years relative to Blue Gene).

      Japanese - surpasses opponent by 10 times in 6 years.
      US - surpasses opponent by 28 times in 4 years.

      Now, tell me, which one has the more serious disease?
  • wow... (Score:4, Funny)

    by idiotdevel ( 654397 ) on Monday July 25, 2005 @10:23PM (#13162424)
    yeah... um... so I'm guessing OpenOffice would at least startup semi-fast on that machine

  • I can just picture the case mod.
  • by IoN_PuLse ( 788965 ) on Monday July 25, 2005 @10:28PM (#13162449) Homepage
    If only there were a supercomputer that could revise news posts before they go live. It could be in the form of *gasp* an editor!?
  • by Duncan3 ( 10537 ) on Monday July 25, 2005 @10:30PM (#13162456) Homepage
    That much heat in one place has got to wake up something doesn't it?
  • Overclock (Score:2, Insightful)

    by 3770 ( 560838 )
    Big deal!!!

    I only have to overclock my Pentium 4 about 833,000 times over to beat that little pocket calculator.

    (Pentium 4 3.06 GHz has a theoretical max of 12 Gigaflops)
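Taking the parenthetical 12 Gigaflops peak at face value, the required multiplier works out to about 833,000; a quick check using only the numbers from the comment:

```python
p4_peak = 12e9   # claimed theoretical peak of a 3.06 GHz Pentium 4
target = 10e15   # 10 petaflops

# How many times over the P4 would need to be "overclocked".
print(round(target / p4_peak))  # → 833333
```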
  • PETA (Score:5, Funny)

    by ndansmith ( 582590 ) on Monday July 25, 2005 @10:35PM (#13162472)
    No animals will be harmed in the production of this computer.
  • A superconducting supercomputer. Too expensive, but maybe we need to build one to see how they work.
    http://www.hq.nasa.gov/hpcc/insights/vol6/supercom.htm [nasa.gov]

    Using 'general' processors is cheap but the wrong direction, according to the best supercomputer expert from Stanford. He designed some Cray computers.

    http://content.techweb.com/wire/26802955 [techweb.com]
    • The designer of custom computers says that going general purpose is a bad idea? Say it isn't so.

      I think Cray had their chance, many times. They succeeded many times but also failed many times. There are certainly drawbacks to cheap modular supercomputers (read: clusters), but the cost of a true supercomputer is so high that not many universities, governments or corporations can afford them or justify spending the difference.

      The problem is that the cost of developing custom computer chips (CPUs and supp
  • by EvilLile ( 669198 ) on Monday July 25, 2005 @10:49PM (#13162549)
    I guess you could say they're Peta-philes.
  • by mswope ( 242988 ) on Monday July 25, 2005 @10:53PM (#13162565) Journal
    How many BogoMips is that?
    • Re:10 Petaflops? (Score:3, Informative)

      by eluusive ( 642298 )
      How did that get rated insightful? Do you mods have no idea what BogoMips are? It stands for BOGUS MIPS. It's how Linux deals with certain timing issues: basically, how long it takes to go through a loop that does absolutely nothing. It has no meaning in terms of flops or even MIPS. http://en.wikipedia.org/wiki/Bogomips [wikipedia.org]
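The idea behind the kernel's calibration can be sketched in a few lines. The real thing is a C delay loop inside the kernel; this is only a rough Python analogue, useful to show why the number says nothing about flops:

```python
import time

def bogo_loop_rate(iterations=5_000_000):
    """Time a do-nothing loop, BogoMips-style: the result tracks how
    fast this machine spins an empty loop, not how fast it does math."""
    start = time.perf_counter()
    for _ in range(iterations):
        pass
    elapsed = time.perf_counter() - start
    return iterations / elapsed / 1e6  # millions of loop iterations/sec

# No floating-point work happens in the loop, so the rate
# cannot be converted to flops (or even real MIPS).
print(f"~{bogo_loop_rate():.0f} 'BogoMips'")
```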
  • by suitepotato ( 863945 ) on Monday July 25, 2005 @10:57PM (#13162579)
    First, it seems almost powerful enough that it might start and run Adobe Premiere within four or five hours instead of six or seven.

    Second, Kingdom of Loathing would finally have zero lag on the server side.

    Third, it might be slightly more resistant to Slashdotting, and building a router out of one of these might complete the defense.

    Fourth, by the time this ends up on my desktop, Duke Nukem Forever will be in beta.

    Other than that, should make wonderful blurb filler regarding chess matches with Russians for kids' science news periodicals.
  • Bugger when your tea flops. Especially a high tea.
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Monday July 25, 2005 @11:06PM (#13162611)
    Comment removed based on user account deletion
  • by patio11 ( 857072 ) on Monday July 25, 2005 @11:20PM (#13162668)
    I'm sure you guys have heard of our propensity for building bridges here? Including long bridges to islands with no real need for them, built in multiples sufficient to carry the entire population of the island off at a single time? Which are then built to withstand typhoons and earthquakes (well, OK, THAT'S not irrational). This is the same thing, except for the tech industry. And the US government does the same thing -- NASA and a good deal of the Department of Defense R&D fund are basically slush funds to keep engineers employed in the hope that they come up with something useful in the meantime (and I would be remiss if I didn't point out that pork is well-appreciated come election time).

    I don't really know why we love gigantic computers, though. I live in a prefecture which is Japan's answer to rural Iowa and we built a 1,300 node distributed supercomputer without any idea of a feasible application to run on it -- we ended up computing a few zillion solutions to N-Queens before mothballing the project (I was hoping for enough CPU time to take the world record back from the real supercomputer at the Japanese university that currently holds it, but unfortunately it was not to be).

    • by demachina ( 71715 ) on Tuesday July 26, 2005 @12:48AM (#13162922)
      Much of the U.S. fixation goes back to the signing of the Comprehensive Nuclear Test Ban Treaty in 1996. The U.S. had a bunch of powerful labs full of top scientists whose job in life was to build and test nuclear weapons. This treaty pretty much put them out of business. Clinton distracted them by giving them millions of dollars to build gigantic supercomputers. The goal was to simulate nuclear explosions, predict how the U.S. nuclear stockpile would age, and ensure it would still work if the need arose, without ever testing it again. They used to prove this by taking one out and setting it off in Nevada to make sure it still worked. Now they write simulations. Maybe they are very good at those simulations and they can in fact ensure the nuclear arsenal is safe and potent. Unfortunately, if they never set one off again, they will never know if their simulations are any good. They might just be wasting billions of dollars.

      In many respects the national labs are like NASA, they are high tech job programs for deep thinkers who would be dangerous if they were unemployed like their counterparts in Russia.

      So they build giant computers, and hopefully figure out useful code to run on them, though it's not clear that they have anything useful to run on them. There are always weather sims and protein folding to do.

      The worst problem is the tyranny of Moore's law: these machines take years to complete, and by the time they are fully operational they are obsolete, so you just start building a new one.

      You wonder how people designed engineering marvels like the first fission and fusion bombs, Apollo, and the SR-71 back in the day, when they had next to no computing power. Now we have this extraordinary computing power, but we have real problems building interesting things in the real world. The Shuttle made massive use of CFD, CAE, etc., but it's a complete lemon. We keep doing massive simulations of nuclear bombs, but we never actually set any off and really don't even want them anymore. Well, that's not true: the Bush administration is in fact trying to restart development of new nukes, and wants to build one for busting bunkers and caves. If they manage to get it built, not only will the test ban treaty be out the window, but the U.S. will start using them as a matter of routine in conventional wars, and maybe just to take out a suspected nest of terrorists here and there. Maybe all this computing power will help make them into exceptionally good tactical weapons which will get a lot of mileage.
    • But in the US they have real goals for most of the supercomputers. Nuclear weapons research/testing is a popular one. Apparently supercomputers are good enough these days to actually test current stockpiles via simulation. This is useful, given that the US is a signatory on a nuclear test ban, which applies to actual detonations only (you can screw around on computers all you want, just no actual blowing up of weapons).

      Weather modeling is another favourite.

      As I understand it, all the Blue Gene series ar
    • Haha, no kidding, but at least they aren't spending it on totally moronic crap like a flight suit for a plane ride for the president [wikipedia.org]. It'd be really funny to see Koizumi or the emperor pulling a stunt like that, though.

      I live in a prefecture which is Japan's answer to rural Iowa

      Hmmm... Kumamoto? Aomori? Hokkaido? Enquiring minds want to know what part of Japan is their answer to rural Iowa...

      Not that I should talk, I'm from Kagoshima. It's not so different from the American deep south, really, including
  • Columbia (Score:5, Insightful)

    by RobiOne ( 226066 ) on Monday July 25, 2005 @11:26PM (#13162694) Homepage Journal
    Why don't they ever mention the real-world stats of operational supercomputers?
    They keep saying BlueGene/L when it's not even completed (maybe it finally is). There's also /C, which is falling behind, but at least they're not reporting any numbers until it actually works.

    The fastest operational (like anything else matters) supercomputer is Columbia at NASA. And guess what? It's doing a ton of useful work, like helping make sure the Space Shuttle launches without a hitch by computing all the Thermal Protection System problems and various other analyses.

    Look at the number of processors it uses and its performance compared to the others. It's one of the more efficient of the bunch.

    Just wait until they upgrade it..

    Top500 should include different rankings, like efficiency or measurable areas other than projected TFlops. In the end it's not how many you got, but how well you can use them.
    • Re:Columbia (Score:4, Informative)

      by maswan ( 106561 ) <slashdot2.maswan@mw@mw> on Tuesday July 26, 2005 @01:11AM (#13162993) Homepage
      The top500 list only includes existing supercomputers, not future ones. You have to run the benchmark, not guess how fast it will go.

      Now, for a more "realistic" benchmark than HPL Linpack, this has been tried and talked about for quite some time. It is actually a hard problem, because different supercomputers are designed for different usage. HPL is a useful upper bound for realistic calculations over the whole computer, but it is far from the whole truth.

      The BlueGenes out there have done real work: the signal processing for a distributed radio telescope (the one in the Netherlands) and protein folding/molecular dynamics (the US one).

      And while efficiency might be important, remember that if you can get a machine twice as big by going down to 90% of the efficiency for the same price, the smart move is usually, but not always, to buy the larger machine.

      /Mattias Wadenstein - sysadmin at #388 on the list
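The parent's size-versus-efficiency trade-off is just arithmetic; a minimal sketch, using the hypothetical numbers from the comment:

```python
# Hypothetical choice from the comment above: same price either way.
baseline = 1.0 * 1.00   # relative size x efficiency of the smaller machine
bigger = 2.0 * 0.90     # twice the size, but only 90% efficiency

# The bigger machine still delivers more total throughput.
print(bigger / baseline)  # → 1.8
```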

  • That's almost as powerful as the PS3 will be! Amazing.
  • so should it still rhyme with beta, zeta, eta, and theta? Or should it be pronounced like pita bread?
  • by saratchandra ( 847748 ) on Monday July 25, 2005 @11:38PM (#13162733) Homepage
    Recently at the Linux Clusters HPC Conference http://www.linuxclustersinstitute.org/Linux-HPC-Revolution/ [linuxclust...titute.org] , I learnt about LLNL's (Lawrence Livermore National Labs) plans for the next biggest supercomputer.

    From what I recall about Peloton (that's what the presenter called it), they wish to have a 14.8 TF/s scalable unit with a 4x InfiniBand interconnect. This scalable unit by itself is more than half the power of Thunder (ranked 7 in the Top 500) http://top500.org/lists/plists.php?Y=2005&M=06 [top500.org]. They plan to have 16 such scalable units.

    For those interested in the specs: Peloton is 16 SUs with 236.5 TeraFLOP/s, 215 TiB memory, and a 5.0 PB global disk system, with 6,720 SMPs and 48+24 = 72 IBA 4x DDR switches. Power is 4.05 MW.

  • Re: (Score:2, Funny)

    Comment removed based on user account deletion
  • According to Sony's published specs, the PlayStation 3 will top that number easily!
  • Hot Air Popcorn popper.

    The butter pot is sold separately.

  • by heroine ( 1220 ) on Tuesday July 26, 2005 @02:21AM (#13163199) Homepage
    Nowadays the supercomputer contest is just a matter of who can buy the most Opteron PCs and Cisco routers from Newegg and connect them. You might as well buy a few million DVDs from Best Buy and say you have the world's largest hard drive.

    Eventually small countries will connect all the computers of their entire population with distributed clients and call that the world's largest supercomputer.

    This business of entering a command, waiting a minute for zillions of nodes across a slow network to start, and waiting another minute for all the nodes to finish is hardly what supercomputing used to be.

    It would be more interesting to see who does the most work with the least latency or who does the most work with the simplest programming model. Anyone can write a massively parallel program to utilize every Opteron in the world but a computer which can do the same work sequentially seems like a much bigger step forward.
