Handheld Supercomputers in 10-15 Years?
An anonymous reader writes "Supercomputers small enough to fit into the palm of your hand are only 10 or 15 years away, according to Professor Michael Zaiser, a researcher at the University of Edinburgh School of Engineering and Electronics.
Zaiser has been researching how tiny nanowires — 1,000 times thinner than a human hair — behave when manipulated. Apparently such minuscule wires behave differently under pressure, so until now it has been impossible to arrange them in tiny microprocessors in a production environment. Zaiser says he's figured out how to make them behave uniformly.
These "tamed" nanowires could go inside microprocessors that could, in turn, go inside PCs, laptops, mobile phones or even supercomputers. And the smaller the wires, the smaller the chip can be.
"If things continue to go the way they have been in the past few decades, then it's 10 years... The human brain is very good at working on microprocessor problems, so I think we are close — 10 years, maybe 15," Zaiser said."
Captain obvious to the rescue! (Score:2, Interesting)
We already have handheld supercomputers (Score:4, Interesting)
No, really. An iPhone is much more powerful than the Cray-1, and probably significantly more powerful than a Cray X-MP. The iPhone certainly has much more RAM and storage than the typical early Crays; I can't be bothered right now to find out what kind of MFLOP performance an iPhone has.
Cheers,
b&
Re:The Not Too Far Future (Score:5, Interesting)
No handheld supercomputers (Score:3, Interesting)
Re:Why supercomputers? (Score:5, Interesting)
Yup.
Unless they're talking about something significantly outside the progression we've accepted as Moore's Law. We've come to accept that a super-computer is normally a collection of hundreds of bleeding edge processors. So if they're talking about a handheld ten years from now which is perhaps 1024*(2^(240/18)) times more powerful than a single current bleeding edge CPU, then they could be justified in calling it a super-computer.
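Spelling out that arithmetic (a quick back-of-the-envelope in Python, taking the 240-month horizon and the classic 18-month doubling period at face value):

    # Back-of-the-envelope Moore's Law scaling: how much faster than a
    # single current bleeding-edge CPU would such a handheld be?
    processors = 1024      # CPUs in a "classic" supercomputer cluster
    months = 240           # the horizon used above
    doubling = 18          # months per doubling, per Moore's Law
    speedup = processors * 2 ** (months / doubling)
    print(f"{speedup:,.0f}x")  # roughly 10,600,000x a single CPU

Call it ten million times a single 2007 CPU, give or take.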
They may also be using super-computer to describe a system fast enough that it doesn't need an upgrade to run whatever Carmack pushes out at the time.
c.
Re:Why supercomputers? (Score:2, Interesting)
It was an amazing computer for its time (in some ways it still is), but computers quickly matched and then surpassed its performance, all on a single chip.
Re:Why supercomputers? (Score:3, Interesting)
$10M or so.
That was the supercomputer of its day, and today you can't buy a computer that slow. I don't know what goes in wristwatches these days, but I bet they are faster.
As for 1980-85, those very early PCs were faster in MHz but didn't do as much per instruction, and didn't have quite that much memory, but they were surely close.
Yeah, this clown will have a handheld 2007 supercomputer in 2022. Big deal. So will everybody. It will be your cell phone / iPod replacement, and they will be as ordinary as wristwatches used to be before they fell out of fashion. But there will be faster computers, probably not handheld, and they will be the supercomputers of that day.
handhelds vs supercomputer benchmarks (Score:3, Interesting)
The Nokia has 64 MB RAM. The '205 had 16 MB RAM. The Nokia kicks scalar code at about 40 to 100 MIPS. The '205 kicked scalar code at 35 to 70 MIPS. The Nokia has a DSP, which seems to be able to kick about 200 MFLOPS (I could be wrong). The '205 had twin vector pipes with a peak performance of 200 MFLOPS each, but it was rare to get more than 80% of that. My point is that they're comparable.

The Nokia came with 192 MB file store, but now has 2.1 GB, and can mount my desktop filesystems over WiFi with better than 1 MB/sec throughput. The '205 had about 1 GB of disk, and could mount mag tapes. Both sport native compilers for C, Fortran, etc.

The Nokia was about $150. The '205 was about $15,000,000. That's a factor of 100,000 improvement in price/performance. The Nokia runs on batteries and fits in my shirt pocket, with the form factor of my old Palm Pilot. The '205 had a motor-generator power conditioner (the flywheel acts like a battery in a power failure) and filled a large machine room with carefully controlled temperature and humidity.
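To put a number on that price/performance claim (a trivial Python sketch; the dollar figures are the ones quoted above, and raw performance is taken as roughly comparable):

    # Since the two machines trade blows on raw performance, the
    # price/performance improvement is essentially just the price ratio.
    nokia_price = 150          # USD, as quoted above
    price_205 = 15_000_000     # USD, as quoted above
    print(f"{price_205 / nokia_price:,.0f}x")  # -> 100,000x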
Would I call the Nokia a supercomputer? No. Supercomputers cost more than a million dollars when they are new. Would I build a Beowulf cluster of Nokias? Maybe. With WiFi, one might put together an ad-hoc grid pretty easily. I only have one, though. And my 4-year-old desktop is more than 30 times faster, so it's going to be hard to justify from a pure performance standpoint. Yes, my desktop has better price/performance than the Nokia.
I've not yet run a SETI@Home unit on the Nokia. It'd be much better than the one I ran on the 486/33...
Aye, but that's the easy part (Score:5, Interesting)
Now try imagining cooling it. That's the real challenge. That's what makes grown-up men cry like little girls.
I mean, look 15 years back in time. That was 1992. We still had desktop cases without fans (except maybe on the PSU, and even there not on all of them), CPUs without heatsinks (in fact, the chip might even be encased in a big slab of resin, and it made no difference to cooling anyway), and computers could safely run on PSUs whose wattage was a two-digit number. We also still had RAM fast enough that you didn't need a CPU cache (nor had the transistor budget for one, anyway). And we thought that a program that took up a whole floppy was bloated. Etc.
So I'm going to put on my wizard hat and rub the ol' crystal ball, and tell you how I see computing in the future.
- seein' as case fans started from none, and now we're at two or more 120mm fans and ducts per case, I see the computer of the future as a cube whose whole face (or maybe side) is one big 14" fan (yes, inches, not centimeters) blowing air in, and another in the back blowing it out. In fact, it will all be one big square wind tunnel, or an oversized hair dryer.
You'll also be advised not to put anything more flammable than asbestos behind it, and to fence it off so your cat or toddler can't get behind the computer and get cooked.
- a decent power supply will be around 3-4 kilowatts, but Nvidia will recommend 5 kW for their latest graphics card, more if you run an SLI setup.
- or maybe water cooling will become the standard, and the computer will nicely double as a samovar [wikipedia.org] and espresso machine.
- heatsinks will be made of pure silver. And ATI will still need something that sounds like a jet fighter at takeoff to keep their GPU at only 90°C.
- continuing the trend, graphics cards will keep needing increasingly more dedicated power connectors, and increasingly more pins on them. We started at 1 with 4 pins, and now we're at "ATI won't activate this or that function if you don't have 8 pins on the second power connector." I foresee that in 15 years we'll be at 6 power connectors with 16 pins each, just to bring enough current to the graphics card.
- still no one will have invented a better use of all that silicon than adding yet another core, so given that 15 years is no less than 10 cycles of Moore's Law, you'll have anywhere between 2048 and 4096 cores in your PC. More time will be spent passing messages between them and serializing access to data, in algorithms that were never meant to be massively parallel, than actually computing the useful part (see the sketch after this list). People will still argue that it's the fault of game programmers that they don't split processing 5 NPCs between 2048 CPUs, or for that matter, the fault of compiler makers that they insist on reading the file sequentially instead of each core processing every 2048th line of the file.
- We'll be up to, oh, maybe DDR9, or maybe some newer standard. It still won't have lower latency in nanoseconds than the old SDR, but people will still buy it based on theoretical burst speed. Even more ridiculously large caches will be needed just to keep all those cores working at all, instead of spending thousands of cycles waiting for the RAM to finally answer. On the bright side, though, we'll have enough of a transistor budget for 2 to 4 gigabytes of cache on the CPU.
- As that trend continues, eventually the disparity between RAM and CPU will get so high that it will be entirely feasible to skip RAM completely, and run the programs off the hard drive and the CPU's L3 cache. (The disparity between CPU speed and RAM latency is _already_ as big as that between the 8088 in the IBM PC/XT and the hard drive it had.)
- People will still take the extra power as an invitation to write bloated and slow code. So even th
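The message-passing point above deserves a number. Here's a minimal Python sketch (Amdahl's law, with a made-up 5% serial fraction purely for illustration):

    # Amdahl's law: overall speedup is capped by the serial fraction,
    # no matter how many cores you throw at the problem.
    def speedup(cores, serial=0.05):  # the 5% figure is an assumption
        return 1 / (serial + (1 - serial) / cores)

    for cores in (2, 64, 2048, 4096):
        print(f"{cores:5d} cores -> {speedup(cores):5.1f}x")
    # 2048 cores: ~19.8x. 4096 cores: ~19.9x. All those extra cores
    # buy almost nothing once the serial part dominates.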
Re:Am I missing something? (Score:2, Interesting)