Japan Wants to Build 10 Petaflop Supercomputer
deepexplorer writes "Japan wants to take the fastest-supercomputer spot back. It plans to develop a supercomputer that can operate at 10 petaflops, or 10 quadrillion calculations per second, which is 73 times faster than Blue Gene. The current fastest supercomputer, the partially finished Blue Gene, is capable of 136.8 teraflops; the target when finished is 360 teraflops."
BlueGene/C will be finished soon (Score:5, Interesting)
Re:Mommy, mommy (Score:0, Interesting)
Re:BlueGene/C will be finished soon (Score:4, Interesting)
Is there a reason why they aren't even close to alphabetical order?
Do the letters stand for something else?
Re:BlueGene/C will be finished soon (Score:4, Interesting)
If you want faster, here ya go. (Score:2, Interesting)
http://www.hq.nasa.gov/hpcc/insights/vol6/superco
Using 'general' processors is cheap but the wrong direction, according to a leading supercomputer expert from Stanford who designed some Cray computers.
http://content.techweb.com/wire/26802955 [techweb.com]
Re:BlueGene/C will be finished soon (Score:3, Interesting)
Re:teaflops (Score:4, Interesting)
Seriously, what do the editors do here? They don't check the writing, they don't check the accuracy of stories, and forget about it if you want them to post a correction to something...
Re:It's all about the $'s (Score:3, Interesting)
BlueGene's PPC chips ARE custom for that line of computer, though VTech's Mac cluster is pretty much off-the-shelf.
Itanium isn't custom; they're not hard to get, there just isn't much demand. I think they're kind of nifty: not competitive for general server use, but possibly OK for supercomputers, with high-availability features not found in Xeon and Opteron.
I'm not sure there's anything special about Opterons now, other than having more HyperTransport links and being specially binned parts that tolerate higher temperatures and consume less wattage than a comparable Athlon64, much like Xeon is to Pentium4. I think the 1xx Opterons are basically the same as Athlon64.
Re:BlueGene/C will be finished soon (Score:4, Interesting)
It's a government makework project (Score:5, Interesting)
I don't really know why we love gigantic computers, though. I live in a prefecture which is Japan's answer to rural Iowa and we built a 1,300 node distributed supercomputer without any idea of a feasible application to run on it -- we ended up computing a few zillion solutions to N-Queens before mothballing the project (I was hoping for enough CPU time to take the world record back from the real supercomputer at the Japanese university that currently holds it, but unfortunately it was not to be).
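The N-Queens grind described above is simple to reproduce on a single machine. Here is a minimal, hypothetical sketch (not the cluster's actual code) of the standard recursive backtracking count:

```python
def count_nqueens(n):
    """Count all solutions to the N-Queens puzzle by backtracking.

    Tracks attacked columns and both diagonal directions as sets,
    placing one queen per row.
    """
    def place(row, cols, diag1, diag2):
        if row == n:
            return 1  # every row holds a queen: one complete solution
        total = 0
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue  # square is attacked, skip it
            total += place(row + 1,
                           cols | {col},
                           diag1 | {row - col},
                           diag2 | {row + col})
        return total

    return place(0, frozenset(), frozenset(), frozenset())

print(count_nqueens(8))  # 92 solutions on a standard chessboard
```

The solution count grows explosively with the board size, which is exactly why a cluster with nothing better to do can sink arbitrary amounts of CPU time into pushing n one square higher.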
Re:To put 10 Petaflops in perspective (Score:5, Interesting)
So this new Japanese supercomputer is running at a whopping 10 brainsecs!!! Imagine, you could simulate about 9 people or 47 slashdotters in that supercomputer (some of the power would be required to manage the simulations).
Seriously though, AI research will go mainstream with the first supercomputer that can process at greater than 1 brainsec.
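Taking the parent's playful numbers at face value, the back-of-the-envelope arithmetic works out. Both figures below are assumptions invented for the joke (roughly 1 petaflop per "brainsec" and ~10% overhead for managing the simulation), not real measurements:

```python
# Assumed, not measured: 1 "brainsec" of processing ~ 1 petaflop,
# so a 10 PFLOPS machine rates 10 brainsecs.
machine_pflops = 10.0
brainsec_pflops = 1.0   # assumption: one human brain-equivalent
overhead = 0.10         # assumption: ~10% lost to running the simulation itself

brainsecs = machine_pflops / brainsec_pflops
people = machine_pflops * (1 - overhead) / brainsec_pflops
print(f"{brainsecs:.0f} brainsecs, ~{people:.0f} simulated people")
```

Running it reproduces the joke's figures: 10 brainsecs, and about 9 simulated people once the overhead is skimmed off.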
Re:It's a government makework project (Score:5, Interesting)
In many respects the national labs are like NASA: high-tech jobs programs for deep thinkers who would be dangerous if they were unemployed, like their counterparts in Russia.
So they build giant computers and hopefully figure out useful code to run on them, though it's not clear they actually have anything useful to run. There are always weather sims and protein foldings to do.
The worst problem is the tyranny of Moore's law. These machines take years to complete, and by the time they are fully operational they are obsolete, so you just start building a new one.
You wonder how people engineered marvels like the first fission and fusion bombs, Apollo, and the SR-71 back when they had next to no computing power. Now we have this extraordinary computing power, but we have real problems building interesting things in the real world. The Shuttle made massive use of CFD, CAE, etc., but it's a complete lemon.
We keep doing massive simulations of nuclear bombs, but we never actually set any off and really don't even want them anymore. Well, that's not entirely true: the Bush administration is in fact trying to restart development of new nukes, and wants to build one for busting bunkers and caves. If they manage to get it built, not only will the test ban treaty be out the window, but the U.S. will start using them as a matter of routine in conventional wars, and maybe just to take out a suspected nest of terrorists here and there. Maybe all this computing power will help make them into exceptionally good tactical weapons that get a lot of mileage.
Definition of supercomputer getting stretched (Score:4, Interesting)
Eventually small countries will connect all the computers of their entire population with distributed clients and call that the world's largest supercomputer.
This business of entering a command, waiting a minute for zillions of nodes across a slow network to start, and waiting another minute for all the nodes to finish is hardly what supercomputing used to be.
It would be more interesting to see who does the most work with the least latency, or who does the most work with the simplest programming model. Anyone can write a massively parallel program to utilize every Opteron in the world, but a computer that can do the same work sequentially seems like a much bigger step forward.