DARPA Looks Beyond Moore's Law 217
ddtstudio writes "DARPA (the folks who brought you the Internet) is, according to eWeek, looking more than ten years down the road to when, they say, chip makers are going to need totally new chip fabrication technologies. Quantum gates? Indium phosphide? Let's keep in mind that Moore's Law was more an observation than a predictive law of nature, even though people treat it as one."
The Diamond Age (Score:3, Informative)
Re:Stacked chips (Sloooowwww) (Score:2, Informative)
b) If you are talking about stacking dice (that is, literally stacking chips inside the package), then the distance the information would have to travel when going through the "vertical interconnects" would be thousands or tens of thousands of times greater than the distance of any on-chip interconnection. Which means the communication between layers of stacked chips would be thousands of times slower. Not very good for microprocessors.
Working link! (Score:0, Informative)
No, not just a wild guess (Score:5, Informative)
Please note that the observation was well enough educated that it includes the fact that its validity is limited in time frame, and that before it becomes completely obsolete the multiplying factor will change, as it already has a couple of times.
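To make the "multiplying factor" concrete, here is a minimal sketch (mine, not from the essay) of the observation as an exponential with a doubling period that has itself changed over time. The starting count of 2300 is the Intel 4004's transistor count; the doubling periods are just the commonly quoted one-to-two-year figures.

```python
def transistor_count(years_elapsed, initial=2300, doubling_period_years=2.0):
    """Project a transistor count after `years_elapsed` years, assuming a
    fixed doubling period. Changing `doubling_period_years` is exactly the
    'multiplying factor will change' part of the observation."""
    return initial * 2 ** (years_elapsed / doubling_period_years)

# Same decade, different doubling periods, very different outcomes:
print(round(transistor_count(10, doubling_period_years=2.0)))  # 73600
print(round(transistor_count(10, doubling_period_years=1.5)))
```

The point of the sketch: the "law" is one free parameter away from being a different curve entirely, which is why treating it as a law of nature misreads it.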
In order to understand Moore's Law one must read his entire essay, not just have some vague idea of one portion of it.
Just as being able to quote "E=mc^2" in no way implies you have the slightest understanding of the Special Theory of Relativity.
KFG
Re:Moore law will be no more (Score:3, Informative)
Not really. The functionality offered by software has pretty much flatlined (with the major exception being "media", e.g. MP3, MPEG, DivX, etc.). HOWEVER, the bloat and overhead of software continue to keep pace with (and often surpass) the speed of hardware. This trend has no end in sight (mo features, mo features, mo features. Look at those scaled miniature window/icons sitting in my dock updating in real time, oooh, aaaah. Look at that 3D rotating desktop). Not meaning to pick on Apple here (I own several myself), but they are at the vanguard of eye-candy code bloat, with Microsoft quickly trying to catch up.
Re:Qubit (Score:1, Informative)
That said, with the potential applications of quantum computing in cryptography (especially brute-force cracking and decryption), it's unlikely that anything close to the bleeding edge is in the public eye.
Moore's law is already ending (Score:4, Informative)
Historically, designing CPUs for higher-level purposes, other than simply designing them to execute traditional assembly language, has been deemed a failure. This is because generic hardware advanced so quickly that the custom processors were outdated as soon as they were finished. Witness Wirth's Lilith, which was soon outperformed by an off-the-shelf 32-bit CPU from National Semiconductor (remember them?). The Lisp machine is a higher profile example.
But now things are not so clear. Ericsson designed a processor to run their Erlang concurrent-functional programming language, a language they use to develop high-end, high-availability applications. The FPGA prototype outperformed the highly-optimized emulator that had been in use up to that point by a factor of 30. This was with the FPGA at a clock speed of ~20MHz and the emulator running on an UltraSPARC at ~500MHz. And remember, this was with an FPGA prototype, one that didn't even include branch prediction. Power dissipation was on the order of a watt or two.
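The clock-speed gap makes that result even more striking. Using only the numbers in the post (30x overall speedup, ~20MHz FPGA, ~500MHz UltraSPARC), the per-clock-cycle advantage works out to roughly:

```python
# Illustrative arithmetic only; the figures come from the post above.
fpga_clock_mhz = 20
sparc_clock_mhz = 500
overall_speedup = 30

# The FPGA did 30x the work while getting 25x fewer clock edges,
# so per cycle it accomplished about 750x as much as the emulator.
per_cycle_advantage = overall_speedup * (sparc_clock_mhz / fpga_clock_mhz)
print(per_cycle_advantage)  # 750.0
```

That three-orders-of-magnitude-per-cycle figure is what makes the language-specific-hardware argument interesting again.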
Quite likely, we're going to start seeing more of this approach. Figure out what it is that you actually want to *do*, then design for that. Don't design for an overly general case. For example, 90% of desktop CPU use could get by without floating point math, especially if there were some key fixed point instructions in the integer unit. But every Pentium 4 and Athlon not only includes 80-bit floating point units, but massive FP vector processing units as well. (Not to mention outmoded MMX instructions that are almost completely ignored.)
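As a sketch of what "key fixed point instructions in the integer unit" could look like, here is a software model of Q16.16 fixed-point multiply. This is hypothetical and not any real ISA extension; it just shows that one integer multiply plus one shift can stand in for a floating-point multiply for most desktop-scale precision needs.

```python
# Q16.16 fixed point: a real number x is stored as round(x * 2**16)
# in an ordinary integer, so the integer ALU does all the work.
FRAC_BITS = 16
ONE = 1 << FRAC_BITS

def to_fixed(x: float) -> int:
    return int(round(x * ONE))

def fixed_mul(a: int, b: int) -> int:
    # Integer multiply, then shift the extra scale factor back out.
    # This is the kind of single "key instruction" the post imagines.
    return (a * b) >> FRAC_BITS

def to_float(x: int) -> float:
    return x / ONE

a, b = to_fixed(3.25), to_fixed(2.5)
print(to_float(fixed_mul(a, b)))  # 8.125
```

With sixteen fractional bits you get about 1/65536 resolution, which covers GUI geometry, audio mixing, and most of that "90% of desktop CPU use" without touching an FPU.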
DARPA & Quantum Computing (Score:3, Informative)
A lot of posters seem to think that DARPA, the US military, or the US government is a unified thing. It's not. Each part often has its own agenda. Research is very frequently driven by those agendas.
However, DARPA often CYAs when it comes to research too. If you come up with a whacky idea that might just work, they often will fund it even though it is in competition with another they have. The reason being that they then can see which whacky idea actually works. Often none do. Or one does. Or another that seemed like a sure thing doesn't.
Long story short, if quantum computing doesn't turn out to be all that, they've covered their techno @$$3$.
Re:Stacked chips (Sloooowwww) (Score:2, Informative)
False, there is just one active layer of single crystalline silicon that contains the devices. The remaining layers are interconnects.
b) If you are talking about stacking dice (that is, literally stacking chips inside the package), then the distance the information would have to travel when going through the "vertical interconnects" would be thousands or tens of thousands of times greater than the distance of any on-chip interconnection.
How, why? The lateral extent of any die is usually bigger than its height. In fact the distance would be much shorter. Active layers would be separated by less than 100 micrometers.
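A quick back-of-the-envelope check, using assumed but typical dimensions (a ~10 mm die edge, a ~100 um gap between stacked active layers):

```python
# Assumed illustrative dimensions, not measurements of any specific part.
die_width_um = 10_000   # lateral extent of a typical die, ~10 mm
layer_gap_um = 100      # vertical hop between stacked active layers

# A worst-case lateral wire is ~100x longer than the vertical hop,
# so stacking shortens the path rather than lengthening it.
print(die_width_um / layer_gap_um)  # 100.0
```

So the "thousands of times slower" claim has the ratio pointing the wrong way.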
IBM thinks so (Score:5, Informative)
Moore's Law is not a "law" (Score:4, Informative)
"Moore's Law" is no more a "law" in the sense of physics (or anything else for that matter), than any other basic observation made by a scientist or physicist.
Oddly, you'd have a hard time believing it wasn't a Law of Nature by the apocalyptic cries from the technology industry when "Moore's Law" falls behind - spouting that something *has* to be done immediately for Moore's Law to continue, lest the nuclear reaction in the Sun cease. Or something.
When it was coined by the *press* in 1965, only a small fraction of what we now know about the physics of integrated circuits and semiconductors was known. So, looking back, it's easy to see that the exponential trend in density would continue as long as the knowledge and ability to manipulate materials increased exponentially.
Yes, it is rather surprising that Moore's observation has held true as long as it has. And this isn't to say that the growth trend won't continue, but it will certainly level off for periods while materials or manufacturing research comes up with some new knowledge to advance the industry.
As the article indicates, things are likely headed for a plateau, possibly toward the end of this decade or start of the next. And at that point, Moore's observation will simply no longer be true or appropriate.
Let the cries of armageddon begin as "Moore's Law" is finally recognized as an observation that will eventually be outlived.
For a little "Moore" background, see http://www.intel.com/research/silicon/mooreslaw.h
Re:Stacked chips (Sloooowwww) (Score:3, Informative)
And right on b) -- the distance between 2 stacked dice is much shorter than between 2 side-by-side. But this is totally irrelevant, mostly for the reasons previous posters gave.
First, by area-IO I mean input/output (IO) drivers or receivers that can be placed anywhere in an area, rather than only around the circumference (peripheral IO). We have area-IO at the package level (such as BGA, or Ball-Grid-Array, and FCBGA, or Flip-Chip Ball-Grid-Array [best for area-IO, and expensive]) and area-IO at the die level. Do we connect the dice before or after packaging?
Either way presents problems, such as (for pre-packaging connections):
How do you electrically connect 2 area-IO dice? Usually, a die has little square landing pads, only about 50um square, spaced every 200-250um or so on center in 2-D arrays of up to 70x70 and more. To be able to do anything with these tightly packed little signals, we drop special tiny drops of metal that stick to the pads, and press this up against a package substrate (ceramic), which includes routes to space those signals out more, like every 1.0mm or so. Even this is expensive and hard to mount to a PCB, since it's hard to ensure both things (package and PCB) are perfectly flat so that all balls connect.
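Sanity-checking the numbers in that paragraph (these are just the post's own figures, plugged into arithmetic):

```python
# Figures from the post: up to a 70x70 pad array at a 200-250um pitch.
pads_per_side = 70
pitch_um = 250

print(pads_per_side * pads_per_side)    # 4900 connections to land
print(pads_per_side * pitch_um / 1000)  # 17.5 mm span at the coarse pitch
```

Thousands of connections packed into a centimeter-or-two square is why the substrate has to fan them out to a ~1.0mm ball pitch before anything can be soldered to a board.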
In fact, we rely on the package (often including an internal metal "stiffener") to keep the die nice and flat, which helps avoid de-lamination (layers peeling apart). Two dice pressed next to each other would require some space between them to make the connection (i.e., some bumps for the connection, and valleys for no connect areas), and this and the elasticity of the electrical connection medium would leave enough play to let the dice warp all over the place.
It'd be even harder to tell which ball(s) aren't connected. We do this now by confirming that the PCB is OK (usually pretty easy, so it makes a good reference), making the chip send specially-controllable data out (and take data in on inputs), then checking what's right and wrong by measuring at the board level. If my board is another chip, how do I know which one I am debugging? This debugging (we call it mfg testing) happens to all chips, not just some samples. If it didn't, failure rates would go up to unacceptable levels (like 20-50% or more).
Testability is hard whether you stack dice before or after packaging. Design is a bizzotch too, since you can't very well even model one whole chip at a time (and how the circuit performs depends on process, voltage, and temperature), much less two chips stacked with an insulator and a hunk of 1000+ very short, very small, very fragile, very noise- and crosstalk-susceptible wires between them. One local hot spot at X,Y on die A can mess up operation at x,y on die B, and we'd never be able to practically predict that.
Most importantly of all, part of the reason chip design even works at all, and that we can churn them out for pennies each (after massive design and capital outlay for a fab), is that we can simplify the design dramatically by making assumptions, modelling the target device in isolation, verifying it in isolation, and then being able to safely assume this (truly wrong) assumption of isolation is close enough to true that the part will work in the system. Singly packaged dice are relatively infinitely insulated from everything except the I/O we carefully design. Stacked dice would not be -- they would interact strongly with each other in unpredictable ways.
Re:The Diamond Age (Score:5, Informative)
Since diamonds have a much higher thermal conductivity (i.e., they can take the heat), they'd make better chips than silicon if only they were more affordable. Industrial diamonds are expected to make the whole industry's prices fall drastically by increasing supply and breaking the De Beers cartel.
More about the De Beers cartel:
Page 1 [theatlantic.com] Page 2 [theatlantic.com] Page 3 [theatlantic.com]
Everything2 link [everything2.com]
Personally I think these are awesome feats of engineering, and a way to give your significant other a stone without feeling morally, and literally, bankrupt.
Re:It's bigger than Moore's law (Score:2, Informative)
Re:The Diamond Age (Score:1, Informative)
On top of that, the established jewelry diamond houses etch their name/logo into the diamond at a microscopic level.
So unless your girlfriend likes the glow-in-the-dark effect and/or doesn't care whether the diamond comes from an established house (taking into account that the glow-in-the-dark effect has indeed been overcome, as some Russians claimed), you'd still have to be the morally and financially bankrupt person to stop your girlfriend from having a fit*, since monetary value is the major factor (after aesthetics) in a diamond-fitted jewel.
( * http://ask.slashdot.org/article.pl?sid=02/08/13/2
Re:Enough with "moore's law" (Score:1, Informative)