IBM Building 20 Petaflop Computer For the US Gov't
eldavojohn writes "When it's built, 'Sequoia' will outshine every supercomputer on the top 500 list today. The specs on this 96-rack beast are a bit hard to comprehend as it consists of 1.6 million processors and some 1.6TB of memory. That's 1.6 million processors — not cores. Its purpose? Primarily to keep track of nuclear waste & simulate explosions of nuclear munitions, but also for research into astronomy, energy, the human genome, and climate change. Hopefully the government uses this magnificent tool wisely when it gets it in 2012."
and just for old time's sake... (Score:4, Funny)
Can you imagine a Beowulf cluster of those?
Re:and just for old time's sake... (Score:5, Funny)
Can you imagine a Beowulf cluster of those?
No. No I can't. I can't imagine a Beowulf cluster of one of those. Even if Natalie Portman (covered in grits) was in my base killing my overlords like an insensitive clod. Even if Netcraft confirmed it, then it confirmed Netcraft in Soviet Russia. Especially if Cowboy Neal gave me a three-step plan leading to profit, I could not imagine it.
Enough with the meme. Or not, because I must be new here.
Re:and just for old time's sake... (Score:5, Funny)
Re: (Score:2)
You forgot goat.cx
Unfortunately, I can't...
Re: (Score:2)
I misspelled it...my bad
goatse.cx
Re: (Score:3, Informative)
Re: (Score:2)
Re: (Score:3, Funny)
In Soviet Russia a beowulf cluster of those imagines you.
Re: (Score:2)
It's actually going to be serving QuakeLive when it goes live ... finally ...
Re:and just for old time's sake... (Score:5, Interesting)
While you are joking about games (like Quake and Crysis), this computer does sound like a giant graphics card.
It can do 20 Pflops with 1.6 million processors, so 12.5 Gflops per processor, but with 1.6TB of memory, that means it's only got 1 MB per processor.
So it sounds like some kind of giant specialised GPU with local memory.
Re: (Score:2)
Re: (Score:3, Informative)
The summary is wrong. I actually did RTFA, and it said 1.6 petabytes, not 1.6 terabytes.
Re: (Score:3, Informative)
Except of course that the summary was wrong and it's 1.6 PB / 1.6 M CPUs = 1 GB/CPU.
I imagine it could run a few gigs (jobs of short or uncertain duration), though.
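For anyone redoing the napkin math, a quick sketch in Python (using the 1.6 PB figure from TFA alongside the summary's 1.6 TB; none of this is an IBM spec):

    # Per-processor numbers for Sequoia: 20 Pflops, 1.6 million processors,
    # and either 1.6 PB (TFA) or 1.6 TB (the summary) of memory.
    total_flops = 20e15
    n_procs = 1.6e6
    mem_pb = 1.6e15          # bytes, TFA's figure
    mem_tb = 1.6e12          # bytes, the summary's figure

    print(total_flops / n_procs / 1e9)   # 12.5 Gflops per processor
    print(mem_pb / n_procs / 1e9)        # 1.0 GB per processor with the PB figure
    print(mem_tb / n_procs / 1e6)        # 1.0 MB per processor with the TB figure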
Re: (Score:2)
No conspiracy; it has merely faded to yesterday's news, which can be found in the right column.
Mmm... (Score:5, Funny)
Nice rack(s).
OH NOES!!! (Score:4, Funny)
Re:OH NOES!!! (Score:5, Funny)
Re: (Score:3, Informative)
Well, if IBM builds Skynet, then we win the war by saying PWRDNSYS OPTION(*IMMED).
Re: (Score:2)
At 6:20 pm EST there was a SEG FAULT and Skynet must reboot to continue genocide.
I think what you meant to say was:
Operator action required on device SKYNET1A (Cancel Reply Ignore)
Unchecked Function
Reply:__________________________________________________
Oh, yes. (Score:5, Funny)
There are many theories as to what this question might be, and now IBM is building a system that will solve this issue once and for all.
Re:Oh, yes. (Score:5, Funny)
No, no. It's being used to calculate the new national debt.
Re: (Score:2, Funny)
No, no. It's being used to calculate the new national debt.
They're going to need a bigger computer.
Re: (Score:2)
No, it can process over 3 tax returns per day.
Re: (Score:2)
I bet it's cracking AES encryption.
Government wants to crack encrypted files of enemies. Realises that it can build a computer capable of doing it for some billions of dollars. Finds excuse to build such a machine.
Although brute-forcing the entire AES keyspace is still infeasible, brute-forcing every possible password of ten or so typeable characters certainly isn't.
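Back-of-the-envelope sketch, being very generous by assuming one guess per floating-point operation and a 95-symbol printable-ASCII alphabet (real password hashing costs far more than one flop per guess):

    # How long 20 Pflops takes to exhaust printable-ASCII password spaces,
    # assuming (very generously) one guess per flop.
    guesses_per_second = 20e15
    alphabet = 95

    for length in (8, 10, 12, 16):
        keyspace = alphabet ** length
        seconds = keyspace / guesses_per_second
        print(f"{length} chars: {seconds / 86400:.2g} days")

    # 8 chars:  a fraction of a second
    # 10 chars: under an hour
    # 12 chars: ~310 days
    # 16 chars: ~2.5e10 days -- already hopeless
    # The full 2^256 AES keyspace is ~1.2e77 keys: not happening.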
2012? (Score:2, Offtopic)
Re:2012? (Score:5, Funny)
A group of computer scientists build the world's most powerful computer. Let us call it "HyperThought." HyperThought is massively parallel, it contains neural networks, it has teraflop speed, etc. The computer scientists give HyperThought a shakedown run. It easily computes Pi to 10000 places, and factors a 100-digit number. The scientists try to find a difficult question that may stump it. Finally, one scientist exclaims: "I know!" "HyperThought," she asks, "is there a God?" "There is now," replies the computer.
End of the world in 2012 (Score:2, Informative)
Re: (Score:2)
In all seriousness, how much processing power would it take to run a program that designs newer and better processors? I would think that 20 petaflops and a good algorithm would be able to produce a processor that is an improvement over the current generation. Then again, I know next to nothing about processor design, so I could be totally wrong.
Re: (Score:2)
The problem is not the hardware, it's the software. Who's going to write the initial algorithm?
OK, given enough processing power you could do a genetic algorithm for processor design that actually provides useful solutions within a reasonable amount of time, but I have a feeling we're far from that point.
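As a very hand-wavy illustration of what that would even look like (toy chromosome, made-up fitness weights, nothing like a real EDA flow), the skeleton of such a genetic algorithm might be:

    # Toy genetic algorithm for "processor design": each candidate is a tuple
    # (cores, clock_ghz, cache_mb) and fitness is an invented trade-off between
    # throughput, power and cost. Purely illustrative.
    import random

    def random_design():
        return (random.randint(1, 64),
                random.uniform(0.5, 4.0),
                random.choice([1, 2, 4, 8, 16]))

    def fitness(design):
        cores, clock, cache = design
        throughput = cores * clock * (1 + 0.1 * cache)
        power = cores * clock ** 2        # crude dynamic-power proxy
        cost = cores * 2 + cache
        return throughput - 0.5 * power - 0.2 * cost

    def mutate(design):
        cores, clock, cache = design
        return (max(1, cores + random.randint(-4, 4)),
                min(4.0, max(0.5, clock + random.uniform(-0.3, 0.3))),
                cache)

    population = [random_design() for _ in range(100)]
    for generation in range(200):
        population.sort(key=fitness, reverse=True)
        survivors = population[:20]                     # keep the fittest fifth
        population = survivors + [mutate(random.choice(survivors)) for _ in range(80)]

    best = max(population, key=fitness)
    print("best design:", best, "fitness:", fitness(best))

The hard part, as the parent says, is the fitness function: evaluating a real design means simulating it, which is itself where all the compute goes.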
Re: (Score:3, Interesting)
A processor that is more fit would draw less power, compute faster, be cheaper to produce, etc. It could then either have a compatible instruction set or a new one; in the case of a new one, it would have to come up with a way of automatically translating code from the old instruction set, or of targeting a compiler at it.
The case with the new instruction sets sounds really, really interesting. I think the actual hardware de
Re:End of the world in 2012 (Score:4, Informative)
Well, whatever the case, once such a computer is built, someone had better ask it whether or not entropy can be reversed.
Re: (Score:2)
You're right, we have the horsepower.
Now, if only we could produce the software. Y'know, the set of "good algorithms" to produce the layouts and the other set of "good algorithms" to test fitness and ... everything else necessary to automatically produce solutions.
*That* seems to be the hard task at the moment. I don't design processors either, so I don't know what types of issues the current design software has, but to me it seems that this is probably the hurdle they are facing on that front, not processo
at least they admit its true purpose (Score:3, Funny)
Re:at least they admit its true purpose (Score:5, Funny)
Primarily to keep track of nuclear waste
And this can't be done with say, Excel?
Re:at least they admit its true purpose (Score:5, Insightful)
And this can't be done with say, Excel?
Ahh, Excel... the first choice in corporate database management systems.
How many other Slashdotters work at Fortune XXX firms where on paper some executive bean counter says "we use Oracle," but on the ground all databases are done in Excel (along with a smattering of everything else)?
It is a step up from three jobs ago, where at another Fortune XXX the database management system of choice boiled down to an administrative assistant and Lotus's word processing solution. Yes, we used plain English to request that Patti make changes instead of SQL UPDATE statements. Also, our SQL SELECT statements always began with "hey Patti, could you look up...". And yes, all "ORDER BY" stanzas were in fact powered by swear words and performed by cut and paste.
Sadly I am not making any of this up.
Re: (Score:2)
To be fair, there are an awful lot of Access 2 DBs* out there, Excel being a spreadsheet and all, NOT a database.
*oh so useful and forwards compatible
Re: (Score:2)
RTFA, that's the point! They are building this supercomputer to run Excel 2012.
don't smell right (Score:4, Funny)
Re: (Score:2)
Yeah, something's not right here. I came to the same conclusion.
Did someone tell the author that they had that much L1 memory, and they didn't understand the difference?
Re: (Score:2)
1.6 million processors in 96 racks is about 16,700 processors per rack, or roughly 400 processors per U. To me that sounds like an evolution of the Cell processor, and 1 MB per Cell sounds reasonable.
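The density math, assuming standard 42U racks (a guess; IBM hasn't said what the racks look like):

    # Packing density for 1.6M processors in 96 racks of an assumed 42U each.
    processors = 1.6e6
    racks = 96
    per_rack = processors / racks     # ~16,700 processors per rack
    per_u = per_rack / 42             # ~400 processors per U
    print(per_rack, per_u)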
Re: (Score:2)
It's SIMD: the same instructions running over a whole lot of little sets of data. This is the same thing video cards do, and it is a great way to solve (some) problems.
Re: (Score:2, Informative)
Processors, not cores (Score:5, Insightful)
Re: (Score:3, Funny)
Yeah, but how many CPUs is that?
Re: (Score:2)
Mmm, doesn't the difference between a core and a processor have to do with how they are connected?
Re: (Score:2)
1.6M Processors, but only 1.6 TB memory? (Score:2)
I would have expected it to have a bit more memory with that many processors.
Re:1.6M Processors, but only 1.6 TB memory? (Score:5, Informative)
My bet is that this is a typo.
1.6 PB seems more reasonable.
Re:1.6M Processors, but only 1.6 TB memory? (Score:5, Informative)
Re: (Score:2)
"BlueGene/P uses a modified PowerPC 450 processor running at 850 MHz with four cores per chip and as many as 4,096 processors in a rack. The Sequoia system will use 45nm processors with as many as 16 cores per chip running at a significantly faster data rate.
Both BlueGene/P and Sequoia consist of clusters built up from 96 racks of systems. Sequoia will have 1.6 petabytes of memory feeding its 1.6 million cores, but many details of its design have not yet been disclosed."
There we go. It is 100,000 processors, with 16 cores each (yes, a core is a processor, but since the summary went out of its way to make this distinction, we should continue to do so for a fair comparison). Summary is wrong (big surprise there).
Re: (Score:2)
Re:1.6M Processors, but only 1.6 TB memory? (Score:5, Informative)
It is indeed 1.6 petabytes: http://www.eetimes.com/news/design/showArticle.jhtml?articleID=213000489 [eetimes.com]
flops not flop (Score:5, Funny)
flops = floating-point operations per second
flop = Gigli
The article got it mostly right. It mentioned 500-teraflop once, but every other time it spelled flops correctly. Slashdot, on the other hand, fucked up the title, despite the fact that it pretty much just copied it from the article (poorly).
Re: (Score:2)
Re: (Score:2)
So what does it mean when they talk about them Gigliflops?
Re: (Score:2)
No, genius. His point is that "flops" is singular, with the s standing for "second" rather than forming a plural. That is, you say "2-petabyte hard drive" and "500-teraflops machine" because those are the singular forms.
so they can play raytraced quake mods (Score:3, Funny)
Hopefully the government uses this magnificent tool wisely when it gets it in 2012.
Sounds like they are going to port the Quake mods to the ray-traced Q4 engine.
Fixed it for ya (Score:3, Funny)
"...allowing forecasters to create local weather "events" less than one kilometer across, compared with 10 kilometers today and at speeds up to 40 times faster than current systems."
Cheney's weather machine arrives too late?! (Score:2)
Maybe Biden will resume the new tradition of VP as weather-manipulator.
So let's see.... (Score:5, Insightful)
- IBM is building a computer that will be functional in about 3.5 years.
- The power of this computer, in 3.5 years, will outshine every other supercomputer currently running today.
I should hope so! What's the point of taking 3.5 years to build the thing, if it's going to be 3.5 years out of date by the time they build it?
Heck, in 3.5 years, your desktop computer will be 4 times more powerful than anything currently running today, too.
Duuh.
Re: (Score:2)
If you have a look at http://top500.org/lists/2008/11/performance_development [top500.org], it has historically taken more than 6 years to get 10 times the performance (slightly quicker than Moore's law, hrm). Given that the current top is at about 1 PF, going to 20 PF in 3.5 years is quite an achievement.
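One way to put numbers on that comparison is the implied doubling time, assuming smooth exponential growth (the 10x-in-6-years reading of the Top500 chart is the parent's, not mine):

    # Performance-doubling times implied by each growth rate.
    from math import log2

    top500_trend = 6 / log2(10)       # 10x in ~6 years  -> ~1.8-year doubling
    moores_law = 2.0                  # the canonical ~2-year doubling
    sequoia_jump = 3.5 / log2(20)     # 1 PF -> 20 PF in 3.5 years -> ~0.8-year doubling

    print(top500_trend, moores_law, sequoia_jump)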
Re: (Score:2)
Heck, in 3.5 years, your desktop computer will be 4 times more powerful than anything currently running today, too.
For being so picky about the terms in the article, you are quite lax with your own. I seriously doubt my desktop, in 3.5 years, will be able to do ~ 6 petaflops. :) (4x more powerful than "anything" currently running today)
Furthermore, 20 vs. ~1.5 petaflops is a goodly sized jump for 3ish years, isn't it? Computer speed growth has seemed to be slowing lately, with an emphasis being on multiple cores, not faster clock speeds like it was 10 years ago. So being able to get 20x the power of the current super
Re: (Score:2)
It's obvious that I was referring to desktop computers with the "anything running today" wording.
If you're going to be that intentionally disingenuous, why don't you also say that I claimed desktop computers were going to have 4000+ horsepower, since there are industrial earth-moving equipment engines that currently put out over 1000? ...wait a minute...
Why does the government need it? (Score:2)
Why can't we let private industry own the computer and have the government just purchase time on it? I for one would love to have CGI movies rendered in better-than-real time. This way, we the taxpayers don't have to pay for idle time.
Also, I can design a database using SQLite with a web front end for keeping track of uranium or anything else for that matter. As long as it is not measured in individual atoms, it'll run fine on my spare 2.4 GHz single-core Celeron. There is no need to update the database 100M times a
Re: (Score:2)
You could probably do that with an old copy of FileMaker Pro and an Eee PC.
Sheesh. It's inventory management, not rocket science.
They've cracked RSA! (Score:2)
Dan Brown told me so.
MTBF (Score:4, Interesting)
So the real question with an immense cluster like this is: what's the MTBF?
Simon claims that the ENIAC MTBF was 8 hours, although I've seen all kinds of claims on the web ranging from minutes to days.
http://zzsimonb.blogspot.com/2006/06/mtbf-mean-time-between-failure.html [blogspot.com]
I would guess this beast will never be 100% operational at any moment of its existence.
I'm guessing the "cool" part of this won't be the bottomless pile of hardware in one room, but how they maintain this beast. Working around one of the million CPU fans burning out is no big deal, but how do you deal with a higher-level problem, like one of the hundreds of network switches failing?
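For a feel of the numbers, here's the simplest possible model: independent failures and an identical MTBF per node. The 5-year per-node figure and the 100,000-node count (1.6M cores / 16 cores per chip, from elsewhere in the thread) are assumptions, not IBM specs:

    # System-level MTBF when N independent components each have the same MTBF:
    # system_mtbf = component_mtbf / N.
    node_mtbf_hours = 5 * 365 * 24    # assume each node averages 5 years between failures
    nodes = 100_000                   # ~1.6M cores at 16 cores per chip

    system_mtbf_hours = node_mtbf_hours / nodes
    print(system_mtbf_hours * 60, "minutes between failures, somewhere in the machine")
    # ~26 minutes -- which is why checkpointing and hot-sparing matter more
    # than how quickly any individual part gets repaired.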
Re: (Score:3, Informative)
Re: (Score:2)
Hazards of too much processing power (Score:2)
Hopefully the government uses this magnificent tool wisely when it gets it in 2012.
SCENE: The Pentagon, 2012
Science Advisor: "President Whoever-You'll-Be, IBM has completed our 20 petaflop computer. It is awaiting your command."
President Whoever-You'll-Be: "Thank you, Advisor. We can use it to compute the long-term effects of nuclear waste disposal, weather fronts, and... just... just how much processing power is in this?"
SA: *deep sigh* "Over 1.6 million processors and a total of 1.6TB of RAM, sir."
PWYB: "My GOD, Advisor. Do you know what that much power could do? It... it could..."
3 years from now? (Score:2)
"IBM reckons its 20-petaflops capable Sequoia system will outshine every single current system in the Top500 supercomputer rankings"
So the computer will be ready in 2012, and it will outperform computers from 2009?
These multi-year computer construction projects seem very problematic given the pace of change in technology. Memory changes, CPUs change, and the socket specs change — if it takes 3 years to build, it will be obsolete before it's ready. 2012 could be the year that ATI releases 10-petafl
Can it balance the budget? (Score:2)
Maintenance plan? (Score:2)
Still, can you imagine the maintenance plan on this beast?
Can you imagine the power and cooling involved?
Even at only 25W per processor, we are talking 40MW of power for the processors alone.
Much more interesting than the machine itself would be an article on how they plan to keep it up and running.
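The arithmetic, for the curious (the 25W figure is the poster's assumption, not a published spec):

    # Power draw at an assumed 25 W per processor.
    watts_per_proc = 25
    n_procs = 1.6e6
    total_mw = watts_per_proc * n_procs / 1e6
    print(total_mw, "MW for the processors alone")   # 40 MW
    # Even at an embedded-class 5 W each it's still 8 MW, before memory,
    # interconnect, and cooling overhead are added.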
balanced computing: flops = memory words (Score:2)
Wait... (Score:2)
Keep track of nuclear waste?
A freakin pencil and paper wouldn't work for that?
The rest of the duties are cool, more simulation and research and less underground testing...that's fine.
But that initial reason is bogus!
Re: (Score:3, Insightful)
It could also be used to search for "suspicious behaviour" by trawling government databases, credit card companies' databases, credit bureau databases, ChoicePoint's, telecommunication companies' databases, airlines' databases, and those of any other firm that the government bullies into giving access.
Well, that's not as paranoid as you might think. The case against is quite simply the publicity that's been given to this behemoth of a machine, so I really don't think it's too likely in this particular case.
However this is EXACTLY how you go about putting together a machine for intelligence purposes. The key to running an intelligence service is deniability at as many levels as possible, and keeping anyone from seeing the big picture.
So you commission some huge piece of hardware, with a benign-but-comp
Re: (Score:3, Insightful)
I'm sorry, I don't believe it.
I think using a BlueGene for run-of-the-mill data processing would be a horrible waste of money. There's simply no need for things like a parallel filesystem or PB of RAM or low-latency interconnects. You want to "scale out" for distributed processing like you're talking about, not "scale up".
No, I'd bet intelligence gathering is done on Google-like processor farms.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re:Aluminum foil hat. (Score:5, Insightful)
You don't need 20 petaflops to do that; you need a few tens of teraflops, a really, really huge memory, and really, really fast IO. You'd do much better with some of the 1/4TB-memory systems from Sun or IBM plus a huge pile of money spent on SSDs than with a real supercomputer.
The IO interconnect is a huge chunk of cash to sink into a supercomputer, and you just don't need it for that sort of tinfoil-hat application.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
This computer would however be really good at brute-forcing crypto keys...
Not really; 2^N gets big fast. The sun won't output enough energy over its entire lifetime to allow a maximally efficient computer to even count from 0 to 2^256, let alone brute-force a 256-bit key. (From Applied Cryptography, which I don't have in front of me.)
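That argument can be sketched numerically with Landauer's limit at the cosmic background temperature and round numbers for the Sun's output; the constants are standard physics, the framing below is my own rough paraphrase rather than Schneier's exact figures:

    # Minimum energy to merely step a counter through 2^256 states, at the
    # Landauer limit (k*T*ln2 per irreversible bit operation), versus the
    # Sun's total lifetime energy output.
    from math import log

    k = 1.38e-23                 # Boltzmann constant, J/K
    T = 2.7                      # cosmic microwave background temperature, K
    e_per_op = k * T * log(2)    # ~2.6e-23 J per bit flip

    sun_power = 3.8e26                   # solar luminosity, W
    sun_lifetime = 10e9 * 3.15e7         # ~10 billion years, in seconds
    sun_energy = sun_power * sun_lifetime    # ~1.2e44 J

    affordable_ops = sun_energy / e_per_op   # ~4.6e66, i.e. about 2^221
    print(affordable_ops, 2**256 / affordable_ops)
    # 2^256 is ~1.2e77 states: the Sun falls short by a factor of ~2.5e10.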
Re: (Score:2)
Would you rather they set off nukes to study these things? The reason is that it takes a crapload of calculations to map out every reaction between molecules in an area measured in square miles. (And because of a test ban treaty we signed, we can't set off 'real' nukes to test anymore, so we have to simulate it.)
Re: (Score:2)
Re: (Score:2)
And that being said, nuke simulation has little to do with quantum chemistry anyways.
So why did you bring it up? The parent didn't. I don't get what you are saying when you ask a question, relate it to the parent's post, and then say it is irrelevant. You might as well have asked him how tightly the car you drive corners, if you are going to say it is irrelevant anyway.
And, do you realize how much processing power 20 petaflops is? That's insane; I'm having a hard time wrapping my head around it. That is well into the territory of the number of molecules in a small object. There
Re:Why always nuclear simulation? (Score:5, Informative)
The parent said that the computer will be used for "mapping every reaction" between molecules. Presumably, since reactions tend to require quantum mechanical descriptions, I guessed the parent meant that the new computer would allow doing such calculations for all reactions in a rather large area.
I don't get what you are saying when you ask a question, relate it to the parent's post, and then say it is irrelevant.
Just a gedanken experiment to amuse myself, while noting that it actually has nothing to do with simulating nuclear weapons. Don't get too worked up about it.
And, do you realize how much processing power 20 petaflops is?
Yes, it's about 2 orders of magnitude more than the supercomputer I'm using at the moment. A lot, for sure, but still limited to very small system sizes for quantum mechanical calculations. At the moment, even the best methods in practice scale as N^3 or so. With my current 100 TFlops I might do a DFT calculation with O(10000) atoms or so. Two orders of magnitude more CPU power with N^3 scaling gives me roughly a factor of 5 more atoms. 50,000 atoms fit into a box of roughly 10x10x10 nm (depending on the material etc., of course). Still a way to go until I'm able to do "square miles"...
If you want to go into classical molecular dynamics, then you're obviously in much better shape. With the current supercomputer that's maybe around 1E9 atoms, and since MD scales linearly, two orders of magnitude more flops means around 1E11 atoms. Now these fit into a box on the order of 1 µm^3. Again, still quite a way to go to square miles...
In conclusion, atoms are really, really tiny, and in 3 dimensions you can pack a lot of them into a very tiny volume.
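To put that scaling argument in numbers (using the grandparent's 100 TFlops / 10,000-atom DFT and 1e9-atom MD starting points; the ~50 atoms per nm^3 density is my rough solid-state assumption):

    # How far 20 Pflops stretches 100 Tflops, for O(N^3)-scaling DFT and
    # linearly scaling classical MD. Starting points are the parent poster's.
    flops_ratio = 20e15 / 100e12        # 200x more compute

    dft_atoms = 10_000 * flops_ratio ** (1/3)   # N^3 scaling -> ~58,000 atoms
    md_atoms = 1e9 * flops_ratio                # linear scaling -> ~2e11 atoms

    density = 50                                # atoms per nm^3, typical solid
    dft_box = (dft_atoms / density) ** (1/3)    # ~10.5 nm on a side
    md_box = (md_atoms / density) ** (1/3)      # ~1600 nm, i.e. ~1.6 um

    print(dft_atoms, dft_box, "nm")
    print(md_atoms, md_box / 1000, "um")
    # A mile is ~1.6e12 nm, so "square miles" of atoms stays firmly out of reach.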
Also, they so far have not needed to calculate what a nuclear bomb does for each atom (obviously, since it has been nigh impossible), and they probably won't ever need to really. You can study waves and energy effects in great detail, and simulate them accurately, without needing to know where each and every atom goes. This will simply let them be more precise and accurate, as well as speedy.
Yes, that was sort of implied in my previous post. The US nuke labs have been at the forefront of research on numerical methods in topics such as shock propagation (PPM and methods like that) and really, really large FEM simulations. Obviously, the actual nuclear reactions are taken into account probabilistically rather than with a full quantum mechanical treatment (as my monologue above shows, such a treatment for the primary is far beyond any computer in sight). AFAIK they use Monte Carlo neutron transport rather than the classical multigroup diffusion methods that are still largely used for civilian reactor design. That being said, I'm sure they are doing a lot of atomic- and quantum-level simulations as well, for small model systems designed to e.g. extract parameters for continuum simulations and such.
Re: (Score:2)
Re:Why always nuclear simulation? (Score:5, Informative)
New designs aim to maximise yield per unit of mass, enabling you to throw a smaller warhead at a target, which means less chance of interception. It also means a smaller package to maintain, cheaper construction, and more warheads per unit of material.
Re: (Score:2)
Uh, because it's paid for out of the NNSA [energy.gov] budget?
Re: (Score:2)
Why is nuclear explosion simulation always the primary use for this type of computer?
Would you rather they test nuclear explosions for real?
Re: (Score:3, Insightful)
Because funding for military expenditures is much easier to obtain than funding for climate research.
Because we can accurately model (Score:3, Insightful)
nuclear explosions, whereas climate simulations don't have all the variables.
Actually, I think they model the effects of decay in current nuclear weapons. Besides, it's not something I want them to physically test.
Re: (Score:3, Interesting)
Re: (Score:2)
Imagine 1.6 million UAC popups opening when you run a single program...
Re: (Score:2)
Imagine running "top" with 1.6 million CPU lines refreshed every 3 seconds (or Task Manager, or cat /proc/cpuinfo).
Re: (Score:2)
Quad Cores?
Re: (Score:2)
It's not a typo; a weather cock [google.com] is just another name for a weather vane. It refers to the traditional shape of weather vanes.
Re: (Score:2)