Silicon Graphics Supercomputing

SGI Rolls Out "Personal Supercomputers" 303

CWmike writes "They aren't selling personal supercomputers at Best Buy just yet. But that day probably isn't too far off, as the costs continue to fall and supercomputers become easier to use. Silicon Graphics International on Monday released its first so-called personal supercomputer. The new Octane III system is priced from $7,995 with one Xeon 5500 processor. The system can be expanded to an 80-core system with a capacity of up to 960GB of memory. This new supercomputer's peak performance of about 726 GFLOPS won't put it on the Top 500 supercomputer list, but that's not the point of the machine, SGI says. A key feature instead is the system's ease of use."
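A rough sanity check of that ~726 GFLOPS figure (a sketch only; the per-core clock and FLOPs-per-cycle values below are typical Nehalem-era assumptions, not numbers from the article):

    # Assumptions: ~2.27 GHz Xeon 5500 cores and 4 double-precision FLOPs
    # per core per cycle (SSE). Neither figure appears in the summary.
    cores = 80
    clock_ghz = 2.27
    flops_per_cycle = 4
    peak_gflops = cores * clock_ghz * flops_per_cycle
    print(f"peak ~{peak_gflops:.0f} GFLOPS")  # ~726 GFLOPS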
This discussion has been archived. No new comments can be posted.

  • Man... (Score:2, Funny)

    by muckracer ( 1204794 )

    Can you imagine a Beowulf cluster of those? :-)

    • Re: (Score:2, Funny)

      by Anonymous Coward

      Can you imagine a Beowulf cluster of those? :-)

      Yes yes, but does it run Crysis?

      • Re: (Score:2, Insightful)

        by djnforce9 ( 1481137 )

        I used to wonder the same thing about personal supercomputers, to be honest, but I think you'd end up frustrated and disappointed trying to run games on these things.

        Notice how it stated "80 core system". Most games are only designed to use up to two cores, while maybe some use four (the same goes for Folding@home). That leaves at least 95% of the supercomputer's total CPU capacity completely idle (and even if it could technically use all 80 cores, Crysis (or any other modern game) is not THAT dema

      • Re:Man... (Score:4, Funny)

        by kimvette ( 919543 ) on Wednesday September 23, 2009 @01:02PM (#29517677) Homepage Journal

        It not only can run Crysis, but it can run Crysis-on-Vista pretty well. With this supercomputer, maybe 2009 can finally be the year Windows Vista is ready for the desktop!

    • Re: (Score:2, Funny)

      by cashman73 ( 855518 )
      One of these ought to be just enough to be able to run Windows Vista! ;-)
    • Re: (Score:2, Interesting)

      by EvilBudMan ( 588716 )

      Well, I think there may very well be a downside to that. As this stuff gets cheaper, the ability for just about anybody to figure out problems increases, and that problem could be how to make bad stuff like nukes or, worse, a virus writer's dream. Hey, five more years and this will possibly be under $2,000, in the sweet spot. Anyhow, I want one, but maybe they should only let people run them who have passed a basic test on driving a computer.

      • by Trahloc ( 842734 )
        Naw no worries, anyone who buys one gets a free entry into the NSA/FBI/DHS/CIA/INS/ABC/CNN/CBS/PBS/NBC and even Fox databases.
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        Come on, you can't be serious.
        Your average desktop PC is a supercomputer compared to a desktop of, say, 10 years ago.
        Take 10 more years and every PC will be an HPC by today's standards.

        Surely having access to an HPC is not the biggest problem in creating your own nuke, or figuring out any other problem.
        It's not like these fast computers automagically program themselves to solve difficult problems.

        • If someone writes a program, cannot another use it? There is already plenty of software to make both good and bad things. It's all up to the users, as usual.

          And... I didn't say that was the biggest problem, but designing a working nuke without raising suspicion by physically testing it is a problem computing power can address.

          Like, you know, some countries may have them and not have tested them, like possibly Israel. They are probably just using a design that someone else tested, possibly France, that built their

      • That downside isn't just restricted to this computer; it's a symptom of technology advancing faster than human nature.

        As has been said before, both on this site and elsewhere, for the first few thousand years of human existence, the extermination of humanity was well out of reach of everyone. As technology advanced it became possible for a group of people, working together, to develop a technology for mass destruction (the specific tech often referenced is nukes). Eventually, the group of people became sm

        • I think you are on to something. That DOES sound like one of the uses that I was thinking about but didn't want to state specifically here but since you already have:

          Yeah, lab work with it is probably the greatest threat. The software is already out there.

          Can't we have freedom and safety? We can do that but we all have to do it and make sure it's done. Even if those odds are very low, my intuition is this: People are Unpredictable. So I predict something will happen, but what it will be exactly, is unknown

        • Re: (Score:3, Interesting)

          by lgw ( 121541 )

          As has been said before, both on this site and elsewhere, for the first few thousand years of human existence, the extermination of humanity was well out of reach of everyone. As technology advanced it became possible for a group of people, working together, to develop a technology for mass destruction (the specific tech often referenced is nukes). Eventually, the group of people became smaller and smaller (theoretically, larger groups of people won't let each other actually use such weapons).

          The first European explorers to come to the Americas in the late 15th and early 16th centuries killed 90% of all humans in Central America, and 95% of all humans in North America, without even trying. Modern technologies for mass destruction can't compete with the wooden boat.

      • Re: (Score:2, Funny)

        by acsinc ( 741167 )

        Only on slashdot are computer viruses worse than nukes.

        • Re: (Score:2, Insightful)

          by MrNaz ( 730548 ) *

          Because only on Slashdot is it commonly understood that computer viruses can give access to more nukes.

        • by gnick ( 1211984 )

          Actually, that seems perfectly rational in the context given (i.e., some lone wacko developing one at home).

          Threat of a nuke: (Potential damage) * (Ability of wacko to obtain special nuclear material) * (Ability of wacko to use material in a bad way) * (Likelihood of a wacko going through the trouble to jump the hurdles, create, and deploy the bomb) = Pretty low

          Threat of a computer virus: (Potential damage) * (Ability of script-kiddie to assemble a nasty virus) * (Likelihood that some script-kiddie might actu

      • Re:Man... (Score:5, Funny)

        by thePowerOfGrayskull ( 905905 ) <[moc.liamg] [ta] [esidarap.cram]> on Wednesday September 23, 2009 @11:12AM (#29515771) Homepage Journal

        how to make bad stuff like nukes or, worse, a virus writer's dream

        We geeks sure do have our priorities straight.

    • Can you imagine a Beowulf cluster of those? :-)

      Thank you Internet. You are predictable. And I love it.

  • by TechForensics ( 944258 ) on Wednesday September 23, 2009 @09:49AM (#29514717) Homepage Journal

    Wouldn't most people who would NEED a supercomputer be able to build one much more cheaply using a dozen workstations? It's hard to see how this SGI system might be sold (except perhaps as a replacement for an overburdened business-office server).

    • Re: (Score:2, Interesting)

      by NoYob ( 1630681 )

      Wouldn't most people who would NEED a supercomputer be able to build one much more cheaply using a dozen workstations?

      Is there any networked or cabled solution that's as fast as a bus on a motherboard? Having those machines communicate with one another and syncing the computations is a lot of overhead that reduces speed and adds complexity.

      I see computer animation uses for this. I also see math geeks (hobbyists) buying their own to run their current hobby project, and engineering departments using one to run simulations faster and more cheaply.

      It's cheaper than the Apple solution so I see movie editors using this.

      You jus

      • Re: (Score:2, Informative)

        by Anonymous Coward

        The answer to your question is InfiniBand, which is actually what is used in the Octane III systems.

      • by ShadowRangerRIT ( 1301549 ) on Wednesday September 23, 2009 @10:42AM (#29515433)

        Adding to the PP: The overhead and redundant hardware involved in dozens of networked machines would also mean that, to achieve equivalent performance, you'd likely be using twice the power if not more (you might save a little if you rack them with a single PSU for the whole rack, but it's still going to use a substantially greater amount of power).

        My home PC (a state of the art gaming PC as of January 2007), discounting the monitor, uses around 360 kilowatts at peak load (running one CPU and one GPU copy of Folding@Home while copying between the various disks to keep them spun up). Of that, only around 60-70 watts is the CPU, call it an even 80 once you add the memory. The GPU, motherboard, hard disks, and power supply losses eat up a lot of the rest.

        If you need 80 cores' worth of processing power with frequent interprocess communication, you'll need an 80-core machine, or 100-200 cores split across multiple machines. If we assume eight cores per machine and 16 machines, and they have even half the power overhead of my machine, that's an additional 140 watts per box, or an additional draw of 2,240 watts. Over the course of one month, that's roughly 1,600 kilowatt-hours of overhead, or about $250-350 of power. Every month. Over the entire life of the machine (assume 10 years for a corporate or research box), that's around $36,000 (remember, that's on top of the cost of the single-box supercomputer). And that's before you factor in the cost of *cooling* the additional heat produced by the additional machines. (A rough sketch of this arithmetic follows below.)

        Don't get me wrong, there are advantages to the networked supercomputer design (redundancy and failover, the cheaper components mentioned, etc.). But there is also a place for the all-in-one super computer.
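        A back-of-the-envelope sketch of the overhead arithmetic above (the electricity price range is an illustrative assumption, not something the comment states):

            # Assumptions: 16 extra boxes at ~140 W of overhead each, and an
            # electricity price of $0.15-0.22/kWh (illustrative only).
            boxes = 16
            overhead_per_box_w = 140
            hours_per_month = 24 * 30
            extra_kw = boxes * overhead_per_box_w / 1000   # ~2.24 kW
            kwh_per_month = extra_kw * hours_per_month     # ~1,613 kWh
            for price in (0.15, 0.22):
                monthly = kwh_per_month * price
                print(f"${monthly:,.0f}/month, ${monthly * 12 * 10:,.0f} over 10 years")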

        • 360 watts, not kilowatts. While incorrect, I would hope the three orders of magnitude difference would make the typo obvious and ignorable, particularly since it doesn't affect any of the other calculations, but apparently not.
    • Re: (Score:3, Informative)

      by vlm ( 69642 )

      Wouldn't most people who would NEED a supercomputer be able to build one much more cheaply using a dozen workstations?

      This is a simplification, but is more or less correct:

      Xeon FSB width 128 bits by 1.333 GHz equals 170 Gigabits/sec bandwidth between processors.

      Commodity ethernet between commodity workstations, 1 Gigabit/sec bandwidth between processors.

      If your application runs on 1/170th the interprocessor bandwidth, agreed, it would be cheaper. If not, then it's not a relevant comparison.
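      A quick check of those numbers, using the figures as stated in the comment (note that the Xeon 5500 series actually uses QPI rather than a front-side bus, so treat this purely as an illustration of the bandwidth gap):

          # Figures taken from the comment above.
          bus_bits = 128                       # interconnect width in bits
          bus_clock_ghz = 1.333                # effective clock
          bus_gbps = bus_bits * bus_clock_ghz  # ~170.6 Gbit/s
          ethernet_gbps = 1.0                  # commodity gigabit Ethernet
          print(f"ratio ~{bus_gbps / ethernet_gbps:.0f}:1")  # ~171:1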

    • Re: (Score:2, Interesting)

      by symbolset ( 646467 )

      Sure, you could do it with a cluster of workstations. You would need some insane interconnects. OR, you could just buy this pre-configured system from SuperMicro [supermicro.com] with dual quad-core Nehalems and 4 Nvidia Tesla C1060 [wikipedia.org] GPU Cards. That's 960 thread processors @1.3 GHz if you don't overclock, 16GB of DDR3 @ 1.6 GHz on a 512 bit bus, 16 threads of system CPU with up to 96GB of system RAM. It pulls close to 4 TFLOPS, in a desktop machine. You probably could break into the top500 [top500.org] with ten of them with decent i
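      A rough check of the ~4 TFLOPS claim (the per-card figures below are the commonly quoted Tesla C1060 single-precision peak specs, not numbers from the comment, and peak is rarely sustained in practice):

          # Assumed Tesla C1060 specs (peak single precision).
          cards = 4
          sp_per_card = 240        # stream processors per card
          clock_ghz = 1.296        # shader clock
          flops_per_cycle = 3      # MAD + MUL dual issue, peak only
          tflops = cards * sp_per_card * clock_ghz * flops_per_cycle / 1000
          print(f"~{tflops:.1f} TFLOPS peak")  # ~3.7 TFLOPS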

  • I've seen the term 'personal supercomputer' so many times over the past 20 years. It's just baloney marketing. What you have on your desktop RIGHT NOW is more capable than some of the original CDC machines. So what?
    • Since apparently the definition of a supercomputer is a machine capable of 1 gigaflop, SGI was scooped [randomize.com] by Apple 10 years ago!

    • It's like this: just because you say your home PC is a "supercomputer" because it has all the performance of a "supercomputer" doesn't make it one. You need to have a little plastic bar glued to the front wherein is written, in a dazzling Arial font, "SuperComputer". Otherwise, no one will believe you. Oh, just buy it already.

      Regards,

      SGI Marketing and Management.

    • by SuperBanana ( 662181 ) on Wednesday September 23, 2009 @12:04PM (#29516649)

      I've seen the term 'personal supercomputer' so many times over the past 20 years. It's just baloney marketing. What you have on your desktop RIGHT NOW is more capable than some of the original CDC machines. So what?

      What you have on your desktop RIGHT NOW is most likely more powerful than the Cray Y-MP by a factor of three, if you've got a quad-core Core 2; those babies push 1+ GFLOPS.

      It's also 1/50th to 1/100th as capable as this supercomputer (or more; I don't know the relative performance of a current desktop processor versus a current Xeon). Yes, it's relative, and relatively speaking, this is most certainly a supercomputer. In terms of memory, the maximum amount of RAM you can put into a consumer-available motherboard is around 64GB, maybe 128. This has a maximum of 10 times that.

      80 Xeon cores, 1TB of memory, and you call it a "marketing ploy"? And you got modded up "Insightful"? May the hand of metamoderation come down from on high.

  • Picture (Score:5, Informative)

    by TechForensics ( 944258 ) on Wednesday September 23, 2009 @09:54AM (#29514817) Homepage Journal

    Picture here: http://www.ubergizmo.com/tags/octane-3 [ubergizmo.com]

  • by damn_registrars ( 1103043 ) <damn.registrars@gmail.com> on Wednesday September 23, 2009 @09:55AM (#29514821) Homepage Journal
    Who was the idiot who thought it would be a good idea to call this the "Octane III"? It bears almost no resemblance to the SGI Octane systems of the past, which were graphics workstations running IRIX on MIPS processors. I think the only thing that makes them similar is the price range.

    This goes right up there with Honda constantly recycling their product names: Passport [wikipedia.org], Odyssey [wikipedia.org], Pilot [wikipedia.org], and more recently Insight.
  • "They aren't selling personal supercomputers at Best Buy just yet."

    Sure they are; it just depends on what era of supercomputer you are comparing that commodity computer to. A modern desktop machine is insanely fast, with inconceivable amounts of RAM and disk storage, if you think back a couple (several) decades. Best Buy will never sell super-anything; it's not their game. But the computers we take for granted are insanely capable machines measured against the problems tackled in the past by supercomputers.

    Now get off

  • by sunking2 ( 521698 ) on Wednesday September 23, 2009 @10:01AM (#29514909)
    Isn't this basically the failed business model that put them under the first time?
  • This is a supercomputer, not something you would use for normal everyday computing. I think it would cater more to people doing protein folding or other extremely processor-intensive work. Most people wanting to use one probably have access to a university or government supercomputer, so that only leaves the self-employed or small-business research market. I would not be surprised if even more of these appear from other companies, because that market is one of the few that is grow
  • What will a home user do with an 80-core, 1TB RAM system? Ray tracing? Protein folding? Local weather prediction? All things really high on the list for personal computers.

    Still, you'd never need to heat your house again.
    • It's not a PC, it's a workstation. How many home users are going to shell out $8k for a base configuration?
    • What will a home user do with an 80-core, 1TB RAM system? Ray tracing? Protein folding? Local weather prediction? All things really high on the list for personal computers.

      I'm not sure about you, but my most immediate thought would be to simulate an extremely complex neural network, likely using over half the RAM on that task. Combine that with coding an infrastructure for it to learn patterns from a source of input like a webcam, and hilarity ensues.

      If it could get to about the intellect of a small bird, that would be awesome; but even if it is a spectacular failure and nothing valuable is produced, it would still be awesome :)

    • Re:Why? (Score:5, Funny)

      by malevolentjelly ( 1057140 ) on Wednesday September 23, 2009 @10:41AM (#29515423) Journal

      What will a home user do with an 80-core, 1TB RAM system? Ray tracing?

      Sometimes I need a giant mirrored ball as a pick me up when I'm down, or a photo-realistic digital recreation of a bowl of fruit. What's wrong with that?

      Protein folding?

      They're not going to fold themselves.

      Local weather prediction?

      I don't trust the NWS, though. I generally try to run my own weather models at home every morning before leaving for work. I have to do something with these petabytes of NASA satellite data.

  • by Tim4444 ( 1122173 ) on Wednesday September 23, 2009 @10:13AM (#29515057)
    Aw crepe, if these become commonplace M$ might rewrite Windows using dot net, and of course Sun would write a knockoff in Java. By then Linux will have 8 different windowing toolkits necessary for the basic apps and 29 sound systems. Oh well, I guess it's back to 0x7C00...
    • I wonder if you're being modded funny for implying that Java is a knockoff of .NET, because that is pretty funny. Or sad...
  • a supercomputer chassis. Not unlike getting an IBM BlueGene with ONE cell processor on ONE card in ONE unit on ONE rack. I suppose it's still a 'supercomputer' (since nobody's really defined what a supercomputer is). The architecture is there for true multiprocessor multithreading in a highly scalable framework. Way cool!

    Then again, I'm buying up Marvell SheevaPlugs as fast as I can afford 'em. With built-in 1000TX networking and a Kirkwood SoC delivering approximately the same performance as a 1

  • Bottom of the Top500 as of June is about 17 TFLOPS Rmax and 37 TFLOPS Rpeak. So if this is really ~0.7 TFLOPS, well... maybe one of these would make one node of a many-noded supercomputer.

    Interesting to compare to a rack of Apple Xserves [apple.com]. Each Xserve is 8 cores (the same CPUs as the Octane III, it seems), so again about $50k for 80 cores. Looks like SGI is aiming at that segment.

    Can anybody with Xserve experience say how these would compare? I see Apple has something called Xsan too.

  • With that much personal computing power, just put virtualization on there and run as many VMs as you want, to your heart's content! No point in arguing the superiority of various OSes. Just make a VM of one and run it at the same time as VMs of other OSes. If you want, spin up a few VMs and say you have a Beowulf cluster, which puts the whole idea on its head. Have your own virtual datacenter inside a single computer! If you want to be all MBA about it, create your own "cloud" inside. This is like Leg
  • by DynaSoar ( 714234 ) on Wednesday September 23, 2009 @11:16AM (#29515821) Journal

    Here's just a brief search for personal supercomputers of days gone (not too far) by. Most if not all are cheaper than the SGI. Being older they may not stack up spec-wise, and the definition will always be changing anyway. More than one claims to be 'first', and to SGI's credit they only claim it's 'their' first.

    http://tech.slashdot.org/article.pl?sid=08/11/23/068234 [slashdot.org]

    http://www.researchchannel.org/prog/displayevent.aspx?fID=569&rID=4263 [researchchannel.org]

    http://aslab.com/products/workstations/marquisk942.html [aslab.com]

    http://www.reghardware.co.uk/2006/06/07/tyan_unveils_typhoon/ [reghardware.co.uk]

    http://www.hpcwire.com/features/Cray_Unveils_Personal_Supercomputer.html [hpcwire.com]

  • The definition of a super is a system in the top order of magnitude of speed and memory. Since the current record is about two petaflops, a super today would be one hundred teraflops. A one-teraflop computer is the super of a decade ago.
  • Would someone please enlighten me as to how this is better than a cluster of commodity servers? Or a cluster of workstation-class machines? Or a cluster of commodity servers with a workstation-class machine as the head node? I'm not seeing it, and the SGI looks pricey.
