Handheld Supercomputers in 10-15 Years?

An anonymous reader writes "Supercomputers small enough to fit into the palm of your hand are only 10 or 15 years away, according to Professor Michael Zaiser, a researcher at the University of Edinburgh School of Engineering and Electronics. Zaiser has been researching how tiny nanowires — 1000 times thinner than a human hair — behave when manipulated. Apparently such minuscule wires behave differently under pressure, so it has up until now been impossible to arrange them in tiny microprocessors in a production environment. Zaiser says he's figured out how to make them behave uniformly. These "tamed" nanowires could go inside microprocessors that could, in turn, go inside PCs, laptops, mobile phones or even supercomputers. And the smaller the wires, the smaller the chip can be. "If things continue to go the way they have been in the past few decades, then it's 10 years... The human brain is very good at working on microprocessor problems, so I think we are close — 10 years, maybe 15," Zaiser said."
This discussion has been archived. No new comments can be posted.

  • by 140Mandak262Jamuna ( 970587 ) on Monday October 29, 2007 @08:30AM (#21156155) Journal
    Before anyone asks. Also you can imagine a beowulf cluster of these, as well as welcome the overlords.
    • by JK_the_Slacker ( 1175625 ) on Monday October 29, 2007 @08:31AM (#21156167) Homepage
      However, these STILL won't run Vista at full speed.
      • by jollyreaper ( 513215 ) on Monday October 29, 2007 @08:48AM (#21156373)

        However, these STILL won't run Vista at full speed.
        You know what the best way to accelerate Vista is? 9.8 meters per second per second.
        • Re: (Score:3, Insightful)

          by hackstraw ( 262471 )
          You know what the best way to accelerate Vista is? 9.8 meters per second per second.

          Throwing things at the floor makes them go much faster than 9.8 m/s^2.

          With respect to the story at hand: we already have handheld supercomputers.

          The Cray 1 was about 100 MFLOPS. Almost all cell phone and PDA CPUs can outperform that.

          I work with "supercomputers", and all I see in them are new, expensive, unreliable, energy-inefficient versions of laptops and the like.

          In the same spirit, some people in the biz call these things time
          • Re: (Score:2, Insightful)

            by smussman ( 1160103 )

            You know what the best way to accelerate Vista is? 9.8 meters per second per second.

            Throwing things at the floor makes them go much faster than 9.8 m/s^2.
            PHYSICS ALERT!!!!!! Once it leaves your hand (or whatever device you are throwing it with), the computer will only accelerate at 9.8 m/s^2 (neglecting air resistance). Unless you happen to live on a different planet.
          • by pclminion ( 145572 ) on Monday October 29, 2007 @12:29PM (#21158787)

            Throwing things at the floor makes them go much faster than 9.8 m/s^2.

            No it doesn't, at least once the object leaves your hand. Then it's back under the influence of good old gravity, at 9.8 m/s^2, regardless of how fast you may have thrown it.
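
            As a sanity check of that correction, here's a minimal free-fall sketch in Python (assuming Earth surface gravity and neglecting air resistance; the throw only changes the initial speed, never the acceleration):

              g = 9.8  # gravitational acceleration at Earth's surface, m/s^2

              def speed(t, v0=0.0):
                  # Speed t seconds after release, ignoring drag: v = v0 + g*t
                  return v0 + g * t

              # Dropped vs. hurled downward at 10 m/s: the hurled one is always
              # faster, but both gain exactly 9.8 m/s of speed per second.
              for t in (0.0, 0.5, 1.0):
                  print(t, speed(t), speed(t, v0=10.0))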

      • Funny, I was thinking that, but replacing "Vista" with "Linux" as I've found Vista to be a whole lot faster than a comparable Linux on it. By comparable, I mean one that's got the same feature sets and hardware support (not that Linux fully supports all of the hardware on my laptop). Sure, Puppy Linux flies, but it don't mean I can get anywhere with it.

        But, I'm just an overpaid Microsoft shill sent by them to sow FUD, so what do I know?
        • Re: (Score:2, Insightful)

          by dintech ( 998802 )
          I've found Vista to be a whole lot faster than a comparable Linux

          Did someone make you say this to stop a terrorist attack? Bruce Willis had to do the same thing in Die Hard 3 by standing in a black neighborhood with a racist sign around his neck...
        • Comment removed based on user account deletion
    • by Moraelin ( 679338 ) on Monday October 29, 2007 @11:20AM (#21157973) Journal
      Aye, lad, of course you can imagine a beowulf cluster of these. But that's the easy part. Everyone can do that these days. Why my nephew could imagine a beowulf cluster on a good day, and he's a toddler.

      Now try imagining cooling it. That's the real challenge. That's what makes grown up men cry like little girls.

      I mean, look 15 years back in time. That was in 1992. We still had desktop cases without fans (except maybe on the PSU, though even there not on all), CPUs without heatsinks (and in fact, the chip was even encased in a big slab of resin or such, and it made no difference to cooling anyway), and computers could safely run on PSUs whose wattage was a two-digit number. We also still had RAM fast enough that you didn't need a CPU cache (nor had a transistor budget for it, anyway). And we thought that a program that takes a whole floppy is bloated. Etc.

      So I'm going to put on my wizard hat and rub the ol' crystal ball, and tell you how I see computing in the future.

      - seein' as case fans started from none, and now we're at two or more 120mm fans and ducts per case, I see the computer of the future as a cube, whose whole face (or maybe side) is one big 14" fan (yes, inch, not cm) blowing air in and another in the back blowing it out. In fact, it will all be one big square wind tunnel, or an oversized hair dryer.

      You'll also be advised to not put anything more flammable than asbestos behind it, and fence it so your cat or toddler can't get behind the computer and get cooked.

      - a decent power supply will be around 3-4 kilowatts, but Nvidia will recommend 5 kW for their latest graphics card, more if you run a SLI setup.

      - or maybe water cooling will become the standard, and the computer will nicely double as a samovar [wikipedia.org] and espresso machine.

      - heatsinks will be made of pure silver. And ATI will still need something that sounds like a jet fighter at takeoff to keep their GPU at only 90°C.

      - continuing the trend, graphics cards will keep needing increasingly more dedicated power connectors, and increasingly more pins on them. We started at 1 with 4 pins, and now we're at "ATI won't activate this or that function if you don't have 8 pins on the second power connector." I foresee that in 15 years we'll be at 6 power connectors with 16 pins each, just to bring enough current to the graphics card.

      - still no one will have invented a better use of all that silicon than adding yet another core, so given that 15 years is no less than 10 cycles of Moore's Law, you'll have anywhere between 2048 and 4096 cores in your PC. More time will be spent passing messages between those and serializing access to data, in algorithms that were never meant to be massively parallel, than actually computing the useful part. People will still argue that it's the fault of game programmers that they don't split processing 5 NPCs between 2048 CPUs, or for that matter, the fault of compiler makers that they insist on reading the file sequentially instead of each core processing every 2048th line of the file.

      - We'll be up to, oh, maybe DDR9, or maybe some newer standard. It still won't have lower latency in nanoseconds than the old SDR, but people will still buy it based on theoretical burst speed. Even more ridiculously large caches will be needed just to keep all those cores working at all, instead of spending thousands of cycles waiting for the RAM to finally answer. On the bright side, though, we'll have enough of a transistor budget for 2 to 4 gigabytes of cache on the CPU.

      - As that trend continues, eventually the disparity between RAM and CPU will get so high that it will be entirely feasible to skip RAM completely, and run the programs off the hard drive and the CPU's L3 cache. (The disparity between CPU speed and RAM latency is _already_ as big as that between the 8088 in the IBM PC/XT and the hard drive it had.)

      - People will still take the extra power as an invitation to write bloated and slow code. So even th
      • Speaking of scorch marks on the wall behind the computer, Arthur C. Clarke's Venus Prime [amazon.com] had a Steam Cooled Nano-Supercomputer. It looked like one of those aerators you screw on to the end of the faucet on your kitchen sink. And that's what the main character did with it. The water would vaporize as steam, dissipating enormous amounts of heat.
  • by Ckwop ( 707653 ) * on Monday October 29, 2007 @08:32AM (#21156187) Homepage

    Isn't a super-computer a relative term? I mean, I don't know the exact figure, but I would guess that my Dual Core Intel box at home is a good deal faster than a super-computer from the 80s. It is probably hundreds of thousands or perhaps millions of times more powerful than the computers used in the Apollo programme. Surely the measure of what is a super-computer and what isn't must be based upon what the fastest machines in the world are at that time.

    Perhaps what he means is that what we currently do with supercomputers today will be able to be done with low cost computing. I can certainly see that being true. In fifteen years, it may be possible to adequately simulate nuclear weapons tests, climate models, or protein folding on a run-of-the-mill desktop.

    However, the improvements in computing speed will also apply to super-computers. With that extra power you can run more refined models so I can't see how this could obsolete the traditional bulky super-computer.

    In short, I can't really understand the super-computer slant of the article. Why not just talk about general-purpose computing instead?

    Simon

    • by Helios1182 ( 629010 ) on Monday October 29, 2007 @08:35AM (#21156211)
      Talking about general purpose computing doesn't make headlines. That's why.
      • Re: (Score:3, Insightful)

        by Wiseman1024 ( 993899 )
        Mod parent insightful.

        When people don't have news, they make them up. They go and interview anyone who then pulls numbers out of his ass, and thus the "storage technology of the week", "power source of the week", "processing power prediction of the week", etc., is born.

        These articles should be considered spam.
      • by smchris ( 464899 )
        Oh, well. You get the "Gee, whiz" out of the way now. Kids in 20 years will think their handheld is what a handheld should be and always has been.
    • by FudRucker ( 866063 ) on Monday October 29, 2007 @08:42AM (#21156295)
      you can always tell a supercomputer by the big red "S" on its chest...
      • I believe it should be a 'G'... That hackers movie I once saw was so realistic that I now believe every supercomputer just has to be called 'Gibson'. ;P
    • by IndustrialComplex ( 975015 ) on Monday October 29, 2007 @08:42AM (#21156297)
      Because it doesn't grab as much attention. If I told you that in 15 years you would have a faster general-purpose computer, that wouldn't be newsworthy now, would it?

      Here are the measurements of my super computer

      200,000 Libraries of Congress, or 17 great lakes.
      15 Empire state buildings, stacked end to end in a giant circle.
      The power consumption of 3 New York Cities.
      All the potatoes in Idaho.
      Seating for 1.5 747 jumbo jets!
      And enough punchcards to circle the moon!
      • by Red Flayer ( 890720 ) on Monday October 29, 2007 @10:24AM (#21157349) Journal

        200,000 Libraries of Congress, or 17 great lakes.
        Thank you for providing that equivalence. I had no idea that 200,000 LoCs (a measurement of data equal to 20 terabytes) equals 17 GLs (a measurement of liquid volume equal to 2.3 x 10^16 L).

        A little back-of-the-napkin calculation, and we can deduce that if those measurements are equal, then there are 110 bytes per Liter of water.

        This makes sense -- if we freeze that Liter, each byte is approximately equivalent to a 1 cm x 3 cm x 3 cm chunk of ice, which I could easily fit into my mouth -- you might even say it's bite-sized.
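
        For anyone who wants to redo the napkin math, here it is in Python (a sketch using only the constants quoted above; note that with these numbers the division actually lands nearer 10 bytes per liter than 110, which would make each byte a rather less bite-sized ~98 cm^3 of ice):

          TB = 10**12                    # decimal terabyte, in bytes
          data = 200_000 * 20 * TB       # 200,000 LoCs at 20 TB each
          volume = 17 * 2.3e16           # 17 GLs at 2.3e16 L each

          bytes_per_litre = data / volume
          cm3_per_byte = 1000.0 / bytes_per_litre   # 1 L = 1000 cm^3

          print(f"{bytes_per_litre:.1f} bytes/L, {cm3_per_byte:.0f} cm^3 per byte")
          # -> 10.2 bytes/L and ~98 cm^3 per byte with the constants as quoted
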
    • Re: (Score:3, Informative)

      by Anonymous Coward
      More to the point, Supercomputers are not called "Supercomputers" because they are simply faster than other machines. Supercomputers are large-scale vector machines designed for number-crunching capacity. They're great at scientific modeling and simulation, but aren't exactly something all that useful to the average person. (Unless you somehow think that the Cell in the PS3 was the smartest idea ever.)

      Also, like most things in computing, "Supercomputer" is a moving target. Today's supercomputers tend to be
      • You see, the problem with calling a supercomputer "a cluster in a palmtop" is that there's nothing stopping us from stacking a room full of these "palmtop" devices and making an even larger cluster.

        I think the definition of a supercomputer should be changed to something along these lines:
        "A supercomputer is any computer which is considered one of the top-N fastest computers in the world today."

    • by c ( 8461 ) <beauregardcp@gmail.com> on Monday October 29, 2007 @08:54AM (#21156427)
      > Isn't a super-computer a relative term?

      Yup.

      Unless they're talking about something significantly outside the progression we've accepted as Moore's Law. We've come to accept that a super-computer is normally a collection of hundreds of bleeding edge processors. So if they're talking about a handheld ten years from now which is perhaps 1024*(2^(240/18)) times more powerful than a single current bleeding edge CPU, then they could be justified in calling it a super-computer.
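
      Evaluating that expression (a quick sketch; this reads 1024 as the processor count of a big cluster and 240/18 as months over a Moore's-law doubling period, which is how the formula appears to be built):

        cores = 1024      # processors in a bleeding-edge cluster, per the parent
        months = 240      # the numerator as written (20 years in months?)
        doubling = 18     # assumed Moore's-law doubling period, in months

        speedup = cores * 2 ** (months / doubling)
        print(f"~{speedup:,.0f}x a single current CPU")  # roughly 10.6 million x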

      They may also be using super-computer to describe a system fast enough that it doesn't need an upgrade to run whatever Carmack pushes out at the time.

      c.
    • I also wondered a while back how my computer compares to the supercomputers of the past, and found, using this page [wikipedia.org] and a rough conversion to GFLOPS, that my desktop is only about as good as a supercomputer from the 80s.
    • Re: (Score:3, Funny)

      by hey! ( 33014 )

      Isn't a super-computer a relative term?

      No, I think we should insist on a fixed definition of any performance class, which would serve geeks because we could know unambiguously exactly how much computing capacity anybody means when they use a term like "supercomputer". You could even record a conversation and play it back twenty years later, and everybody would know whether we were talking about enough computing power to, say, crack a 56 bit DES key in less than a week.

      It would benefit our colleagues in ma

      • by Eivind ( 15695 )
        Yeah !

        And we could do it like the guys who named the radio spectrum: they divided it into Low, Medium and High Frequency. Simple. Only, there's not really an upper bound on frequency, now is there? The result was inevitable.

        We then got VHF - VERY high frequency.

        Then UHF - ULTRA high frequency.

        Then SHF - SUPER high frequency.

        Then EHF - EXTREMELY high frequency.

        The only thing that prevented us from running into SPHF - Stupendously High Frequency - was the fact that by this time, we were running into IR territory.
    • Perhaps what he means is that what we currently do with supercomputers today will be able to be done with low cost computing. I can certainly see that being true.

      I don't just see that as being "true" .. I see that as "um ... no fucking shit sherlock".

      As you already put it, today's PCs ARE super computers relative to the computing power of 10 - 15 years ago. So of course tomorrow's hand-helds will be super computers relative to todays computing power. It's just the way things have gone up until now with no f
    • Re: (Score:3, Interesting)

      The super computer I worked on in 1970 was a CDC 6400, came out in 1966, kid brother to the 6600 of 1964. They had a memory cycle time of 1 microsecond for 60 bits, and I think 64K words, but I forget exactly. Instructions executed in various times, but the 6600 could pipeline to an extent; call it a 2-3 MHz machine with 512K of core memory.

      $10M or so.

      That was the supercomputer of then, and today you can't buy a computer that slow. I don't know what goes in wristwatches these days, but I bet they are fas
    • The clock speed of the legendary Cray 1 was 80 MHz. With two instructions going per cycle, you could theoretically get 160 MFLOPS. These are laughable speeds by today's standards, but back then it was considered unbelievable.

      • You beat me to it; I was just about to post that. The Cray-1 is probably the first machine to be called a supercomputer, so let's take a look at how long it took desktops and handhelds to catch up. It got 160 MIPS[1] in 1976.

        In 1993/4, desktops caught up with this, with the Pentium and PowerPC systems both pushing past this number (Alphas got there earlier, but they were not exactly mass market).

        The first handheld chips to push this limit were probably the StrongARM family, in 1995; only two years after t
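
        Putting rough numbers on that lag (a sketch; the 1976/1993/1995 dates are the parent's, and the 18-month doubling period is the usual Moore's-law assumption):

          cray_year = 1976                  # Cray-1 reaches ~160 MIPS
          catchups = {"desktop": 1993, "handheld": 1995}

          for device, year in catchups.items():
              lag = year - cray_year
              doublings = lag * 12 / 18     # assuming 18-month doublings
              print(f"{device}: {lag} years, ~{doublings:.0f} doublings behind")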

    • by Lumpy ( 12016 )
      You are 100% correct. We already have handheld supercomputers; some of the current subnotebooks are incredibly powerful. Hell, we even have wrist supercomputers by the 1960s definition.

    • by jimicus ( 737525 )
      Yes and no.

      Traditionally, supercomputers were only used to deal with very specific problems which you'd probably write your own software for. They had a lot of very specific hardware designed from the ground upwards for such problems. An algorithm which will get real benefit out of such a system may well perform surprisingly poorly on your dual core laptop.

      However, the amount of R&D going into x86 and related architecture has meant that the likes of Cray had trouble keeping up - so many of their lates
    • Exactly. My iPod has more computing power than the "supercomputers" of the seventies.


      But there has been a general size trend over the last forty years. It's hard to find a computer these days that you can't pick up. Forty years ago a tiny computer was one that could be put on a desk. (And it generally required two people to get it onto the desk.)

  • Already here (Score:3, Insightful)

    by ktappe ( 747125 ) on Monday October 29, 2007 @08:33AM (#21156199)
    Today's handheld devices ARE the supercomputers of decades past. Things are always getting faster and smaller. If you took a WinCE device or iPhone back 15 years, you'd blow peoples' socks off.
    • Mod parent up; this is exactly what I thought when I read the article.
    • No, they aren't. A supercomputer can perform a teraflop. That's the definition of supercomputer, and it has been since the word was coined by the government in the 1970s in order to define export restrictions. That's what the article is about: a teraflop in the palm of your hand. That's why that NC State professor was able to cluster eight PS3s and call it a supercomputer. Remember that? He would have been laughed off of campus if supercomputer meant "omg whatever is fast this week."

      Nobody cares whose
      • So the Cray-1 wasn't a supercomputer? It was built in 1976 and could only perform around 250 MFLOPS. The Cray X-MP, in 1982, peaked at 800 MFLOPS. You seem to be claiming that 'supercomputer' was defined in the '70s to mean something faster than any machine that existed until the mid '80s, even though it was applied to machines that did exist in the '70s.

        Supercomputers from the 1970s are still supercomputers today

        No, not by your definition. Neither the Cray-1 (1976, 250 MFLOPS) nor the Cray X-MP (1982, 800 MFLOPS) counts as a supercomputer by your definition. In summ

      • A supercomputer can perform a teraflop. That's the definition of supercomputer, and it has been since the word was coined by the government in the 1970s in order to define export restrictions.

        Since ASCI Red was the first computer with teraflop capability you're saying that there was no such thing as a supercomputer before December 1996?

        I don't think so. In the 1970s there was no such thing as a gigaflop computer, much less a teraflop. Perhaps you need to check the timeline [wikipedia.org] for supercomputers on Wikipedia.

        • I remembered the prefix wrongly. It's gigaflop, not teraflop. The first supercomputer was IBM's Stretch project in 1961. Wikipedia's supercomputer articles are full of crap; for example, it cites 800 MFLOPS for the Cray X-MP, when the X-MP could have between two and 192 CPUs, each of which did 230 MFLOPS. Generally speaking, try to refer to something authoritative, like Cray documents on Cray's websites; Wikipedia is not sufficiently vetted [tri-bit.com]. Yes, yes, I know, that comparison to Britannica and all; I do
      • No, they aren't. A supercomputer can perform a teraflop. That's the definition of supercomputer, and it has been since the word was coined by the government in the 1970s in order to define export restrictions.

        Wikipedia [wikipedia.org] doesn't mention that definition. Your claim that the word was coined by the government also disagrees with Wikipedia:

        The term "Super Computing" was first used by New York World newspaper in 1929

        And Wikipedia actually has a source for that. According to Wikipedia's list, 1 TFLOPS wasn't even reached until 1997, so I can't imagine the United States restricting the export of something that wouldn't exist for another 20 years. I know Wikipedia isn't the final, definitive source for all human knowledge, but until you can provide a source for your information, Wikipedia is mo

  • by eldavojohn ( 898314 ) * <eldavojohnNO@SPAMgmail.com> on Monday October 29, 2007 @08:34AM (#21156201) Journal
    10-15 years from always, I'll wake up to my alarm clock, powered by cold fusion. I'll stumble downstairs and get the keys to the hover car from the kitchen and grab my handheld supercomputer. On the way to work, I'll play Duke Nukem Forever as my car flies me along the correct path.
    • by infolib ( 618234 ) on Monday October 29, 2007 @08:44AM (#21156317)
      I had a lecturer who explained that when applying for grants you'd always like the research to have imminent application. On the other hand, if you put the deadline too early, you, or the people who granted the money, might have to face responsibility for the failure. In between there was a sweet spot, which he gauged to be around 15 years or so. Ever since then I've honored him by referring to this phenomenon as the "Flensberg Optimum".
    • Re: (Score:3, Funny)

      by stonecypher ( 118140 )

      I'll wake up to my alarm clock, powered by cold fusion
      Okay.

      the keys to the hover car
      Right.

      grab my handheld supercomputer
      Sure.

      I'll play Duke Nukem Forever
      Whoa, whoa, whoa, what do you think we are, idiots?
  • 10-15 years? (Score:2, Redundant)

    by porcupine8 ( 816071 )
    Isn't "supercomputer" a bit of a relative term? Don't we have supercomputing handhelds today, if you look at the original supercomputers?
    • Re: (Score:3, Informative)

      by maeka ( 518272 )
      A quick google search appears to show modern PDAs [ocforums.com] competing nicely with a mid-80's Cray. [wikipedia.org]
  • Most of today's cellphones are the supercomputers of yesteryear. What's really interesting, though, is what tomorrow's supercomputers will be.
    • Most of today's cellphones are the supercomputers of yesteryear.
      A supercomputer is a legal term meaning a computer which can perform one teraflop or more. There is no cellular phone (yet) which has crossed that threshold, and the very first supercomputer made is still a supercomputer today. Please stop attempting to learn your computer science from Wikipedia, as it's written by people whose knowledge is akin to yours.
  • They will come up with a better name than BrainPal.
  • by wonkavader ( 605434 ) on Monday October 29, 2007 @08:40AM (#21156259)
    We've already had a joke here saying Vista won't run at full speed, but I think there's a kernel of truth there.

    If you can put a supercomputer in your hand, it's not a supercomputer. A week ago, we had an article here on a guy who'd wired several PS3s together and called it a supercomputer. Folks didn't agree with the supercomputer designation, even though he was getting flops that would clearly have been supercomputer speed just five or six years ago. It's not speed that defines a supercomputer, it's speed relative to what's commonly available.

    If we crunch down machines to incredibly small size, then research institutions will buy one 50 times that size. Every time. What will happen is that that tech (if it's not expensive) will drive PC speeds up, perhaps phenomenally; software development tools will make use of the extra speed to make programming easier at the expense of run-time, and we won't see significant speed increases in the user experience. The user will be able to do more, of course, but he'll be complaining "When I speak into the microphone to tell it to write a three page synopsis of this book in its library, it stalls and lags, and sometimes I tell it twice before I get a response, and then it gives me two outputs. This thing is SLOW."
    • If you can put a supercomputer in your hand, it's not a supercomputer. A week ago, we had an article here on a guy who'd wired several PS3s together and called it a supercomputer.

      Yeah, you're just dead wrong, here. That guy is a professor of computer science at North Carolina State; his name is Frank Mueller. And, surprise surprise, he knows comp sci better than you do. "Supercomputer" is a legal term coined in the 1970s by the US government to define export restrictions on computing hardware. It has a c

      • And yet, they've changed it [globalsecurity.org] because they realized that things change way too quickly. Not to mention that you're wrong about what they measure... it's not TFLOPS. It's not even FLOPS at all... it's MTOPS, or "Million Theoretical Operations Per Second". And it was just revised in 2002 [globalsecurity.org], in response to changing technology. A supercomputer is NOT a hard-line definition. According to numerous sources [answers.com], it's a relative definition. HTH, HAND
  • by TrumpetPower! ( 190615 ) <ben@trumpetpower.com> on Monday October 29, 2007 @08:41AM (#21156283) Homepage

    No, really. An iPhone is much more powerful than the Cray-1, and probably significantly more powerful than a Cray X-MP. The iPhone certainly has much more RAM and storage than the typical early Crays; I can’t be bothered right now to find out what kind of MFLOP performance an iPhone has.

    Cheers,

    b&

    • Re: (Score:3, Informative)

      by stonecypher ( 118140 )

      No, really. An iPhone is much more powerful than the Cray-1, and probably significantly more powerful than a Cray X-MP.

      I'm not sure why you believe this. I'll assume you mean the Cray 1A, since the Cray 1 is just a specification; it's a bit like talking about the 386, since the 386 ran at about a dozen different clock speeds. The Cray 1A was the first actual implementation of the Cray 1 spec, and was initially installed at Los Alamos. SCD's Cray 1 was installed about six months later, and ran at 160 mega [ucar.edu]

    • Damnit, I forgot to finish my thought. The Cray 1A is not a supercomputer; it's only 16% of the speed required for that moniker. Supercomputer doesn't mean "fastest computer of its day;" it has a specific numeric meaning. It's a legal term invented by the government to give a basis for export restrictions on computing hardware.

      Do your homework.
    • Except that most supercomputers on the Top 500 list aren't defined as such because of their raw memory, or MFLOPS, etc. Supercomputers are different from the average PC/iPhone/whatever consumer device not quantitatively but qualitatively. Not of degree, but of kind.

      They generally have wider memory buses, lower memory and network latency, etc., designed into them.
      • Except that most supercomputers on the Top 500 list aren't defined as such because of their raw memory, or MFLOPS, etc. Supercomputers are different from the average PC/iPhone/whatever consumer device not quantitatively but qualitatively. Not of degree, but of kind.

        Which makes the appellation "supercomputer" even LESS appropriate for this device, wouldn't you say?

  • Nonsense (Score:5, Funny)

    by 93,000 ( 150453 ) on Monday October 29, 2007 @08:42AM (#21156291)
    I predict that within 100 years computers will be twice as powerful, 10,000 times larger, and so expensive that only the five richest kings of Europe will own them.
  • by jollyreaper ( 513215 ) on Monday October 29, 2007 @08:45AM (#21156321)
    Technically, isn't my cell phone a super-computer by the standards of previous generations? Or is it not a matter of processor horsepower but the size of the bus?

    The analogy I've seen comparing big iron midrange and mainframes vs. PC's is "Yeah, the PC is zippy, but it's like a ninja bike. The big iron is like a dump truck. The midrange isn't going to get up to speed as quickly but it's going to be doing a hell of a lot more for the effort."
    • Technically, isn't my cell phone a super-computer by the standards of previous generations?

      No. Supercomputer is a specific FLOPS threshold established by the government in the 1970s as a basis for export restrictions on hardware. A supercomputer from 1972 is a supercomputer today. It has nothing to do with generational standards; otherwise, stapling 8 PS3s together wouldn't prove anything, and it would be impossible to ever get a supercomputer in one's hand, given that the standards of the day would be

  • by Culture20 ( 968837 ) on Monday October 29, 2007 @08:48AM (#21156377)
    We won't have handheld supercomputers ever. If you have a handheld supercomputer, you can have a cluster of them, or better yet, a desktop sized computer so you're not wasting space with screens, batteries, and casings. Until the input/output problem for tiny devices is solved, handhelds will be PDAs and game devices (maybe doing neat things that today's desktops do, but very few will use them to try to crack the latest encryption algorithm).
  • Maybe you will be able to hold a machine that matches a current supercomputer's power in your hand in ten years, but there is one thing it won't be able to do in your hand - run.

    Extrapolating power consumption over the last ten years would seem to indicate that this "super computer in your hand" would probably be glowing red hot. Before we increase computing power much more we need to get a handle on efficiency.
    • Extrapolating power consumption over the last ten years would seem to indicate that this "super computer in your hand" would probably be glowing red hot.

      Oh, bullshit. There are several laptops on the open market right now that are over the 50% line. The only palmtops that glow red hot will be doing so because they use Sony batteries. Saying things like "extrapolating" without actually doing the math really just makes you look like an asshole. You aren't extrapolating. You're guessing.

      • Yeah... except the law is now restricting anything over 190,000MTOPS, which is about 36 P4's working together [globalsecurity.org] according to Intel. There is NO laptop in the world that is even close to that level of performance. Quit your uninformed bashing.
  • Sounds like Fusion power, but always 10-15 years away!
  • The 1970s called. It wants its hype back.
  • *POOF* (Score:4, Funny)

    by thatskinnyguy ( 1129515 ) on Monday October 29, 2007 @09:20AM (#21156659)
    What was that that just flew by me? Oh yeah! It was the vapor that is this article!
  • What was a supercomputer when I got my first computer [kuro5hin.org] (a Sinclair, 1 MHz w/ 4K memory) is now called a "mobile phone".

    =mcgrew
  • Best handwarmers money can buy. In fact, possibly too good. Oven gloves not included.

    TWW

  • [T]hat will be a huge step for the industry, considering that not so long ago supercomputers filled up enormous rooms or even entire buildings.
    Every freakin' time.
  • My DS is several times more powerful than my old 486SX. (Though it still has the same amount of RAM.)
  • I have always had trouble with people in the tech industry, generally speaking, who refuse to be pedantic about their terms and definitions. While it might technically be true that your desktop computer is as powerful as supercomputers of years past, it does not qualify as a 'supercomputer' for one reason: the purpose of said computer. Supercomputers are designed to tackle certain problems, or be capable of it. Your desktop machine is designed to be a general purpose machine capable of running .... ughh... Wind
  • by suitti ( 447395 ) on Monday October 29, 2007 @10:04AM (#21157117) Homepage
    I recently picked up a Nokia 770. This device came out a couple years ago, say 2005. In 1985, I worked with a CDC Cyber 205 supercomputer. So, this is really 20 years, not 15. I have benchmark results for both, so why not compare?

    The Nokia has 64 MB RAM. The '205 had 16 MB RAM. The Nokia kicks scalar code at about 40 to 100 MIPS. The '205 kicked scalar code at 35 to 70 MIPS. The Nokia has a DSP, which seems to be able to kick about 200 MFLOPS (i could be wrong). The '205 had twin vector pipes with a peak performance of 200 MFLOPS each, but it was rare to get more than 80% of that. My point is that they're comparable. The Nokia came with 192 MB file store, but now has 2.1 GB, and can mount my desktop filesystems over WiFi with better than 1 MB/sec throughput. The '205 had about 1 GB disk, and could mount mag tapes. Both sport native compilers for C, Fortran, etc. The Nokia was about $150. The '205 was about $15,000,000. That's a factor of 100,000 improvement in price/performance. The Nokia runs on batteries and fits in my shirt pocket, with the form factor of my old Palm Pilot. The '205 had a motor-generator power conditioner (the flywheel acts like a battery in power failure) and fit in a large machine room with temperature and humidity carefully controlled.
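
    That factor-of-100,000 figure checks out against the numbers above (a sketch; it takes both machines at roughly 200 MFLOPS, as described):

      # Figures from the comment above: CDC Cyber 205 (1985) vs. Nokia 770 (2005)
      cyber_dollars, cyber_mflops = 15_000_000, 200
      nokia_dollars, nokia_mflops = 150, 200

      ratio = (cyber_dollars / cyber_mflops) / (nokia_dollars / nokia_mflops)
      print(f"price/performance improvement: {ratio:,.0f}x")  # -> 100,000x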

    Would i call the Nokia a supercomputer? No. Supercomputers cost more than a million dollars when they are new. Would i build a beowulf cluster of Nokias? Maybe. With WiFi, one might put together an ad-hoc grid pretty easily. I only have one. But my 4 year old desktop is more than 30 times faster, so it's going to be hard to justify from a pure performance standpoint. Yes, my desktop has better price/performance than the Nokia.

    I've not yet run a SETI@Home unit on the Nokia. It'd be much better than the one i ran on the 486/33...
  • The way I love my computers, a handheld supercomputer is made for incessant fondling.

    What is correct? Both.

    By all rights today's average computers ARE supercomputers; you just have to measure them against the supercomputers of the past.

    By that same token you will NEVER have a handheld supercomputer, because by simply combining a couple of them together you would have an even more powerful computer.

    So the article basically states: in the future you will have more processing power than you have today. Mmm, yeah, that might happen.

  • They'll still be about the same RELATIVE TO desktops, servers, and other machines... handheld devices.

    Hell, my palm pilot way outpowers my CoCo II. Are any of you OLD ENOUGH to remember the CoCo??
  • Several years ago, I benchmarked a Palm Tungsten handheld at over 3 megaflops (double-precision floats), which was around the Linpack performance of a CDC 6600, the first supercomputer designed by Seymour Cray. There are already lots of much smaller cell phones which can beat that.
  • My cell phone (Motorola Q) is better in every respect except screen size than my first computer (486 DX).

    Phone
    CPU: PXA272, 312 MHz
    Memory: 64 MB
    Storage: 64 MB built-in flash + 1 GB mini-SD card

    First comp
    CPU: 486 DX, 33 MHz
    Memory: 8 MB RAM
    Storage: 350 MB HD

    It's not a huge leap to assume today's desktop will be tomorrow's mobile device 20 years down the line.
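
    For scale, the raw ratios in that spec list (a sketch; clock speed is only a crude proxy for performance across such different CPUs):

      specs = {                       # (phone, 486) pairs from the list above
          "clock (MHz)":  (312, 33),
          "memory (MB)":  (64, 8),
          "storage (MB)": (64 + 1024, 350),  # flash + 1 GB mini-SD vs. HDD
      }
      for name, (phone, pc) in specs.items():
          print(f"{name}: {phone / pc:.1f}x")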

  • He's got a pretty bizarre definition of "supercomputer." I've always understood supercomputers to be the fastest, craziest computers currently available. Obviously, this changes over time. I propose that a hand-held computer BY DEFINITION cannot be a "supercomputer." It may be a very, very fast computer. But take thousands of such hand-held "supercomputers" and slap them together, that's a REAL supercomputer. Just like it's always been.

    In 1980, many of our desktop machines would have been considered "supe
