DARPA Looks Beyond Moore's Law
ddtstudio writes "DARPA (the folks who brought you the Internet) is, according to eWeek, looking more than ten years down the road, to when, they say, chip makers will need totally new chip fabrication technologies. Quantum gates? Indium phosphide? Let's keep in mind that Moore's Law was more an observation than a predictive law of nature, despite people treating it as one."
what next... (Score:5, Funny)
Re:what next... (Score:3, Insightful)
Re:what next... (Score:1)
Re:what next... (Score:2)
In the 80s, DARPA had "an internet". Only after Gore led Congress to fund widespread adoption was "the Internet" created.
(Yes, there was a time long ago when people actually imagined that multiple non-connected "internets" could exist!)
Wooh there cowboy. (Score:1)
Re:Wooh there cowboy. (Score:4, Insightful)
Stacked chips (Score:4, Interesting)
they already are (Score:1)
Re:they already are (Score:2)
Re:Stacked chips (Score:3, Funny)
For a minute there I misread and thought your subject line was "Stacked chicks"! Then I realized you were just talking about some computer stuff. Dang!
GMD
Re:Stacked chips (Score:2)
a very very small Willie :-)
Diamonds are no longer a girl's best friend (Score:2, Interesting)
I'm just worried about what my wife will say when the diamond in my machine is bigger than the one on her finger...
-B
Re:Stacked chips (Sloooowwww) (Score:2, Informative)
b) If you are talking about stacking dice (that is, literally stacking chips inside the package), then the distance the information would have to travel when going through the "vertical interconnects" would be thousands or tens of thousands of times greater than the distance of any on-chip interconnection. Which means the communication between layers of stacked chips would be thousands of times slower. Not very good for microprocessors.
Re:Stacked chips (Sloooowwww) (Score:1, Funny)
(sorry, I just like the name)
Re:Stacked chips (Sloooowwww) (Score:5, Insightful)
But also thousands or hundreds of thousands of times smaller than going outside the package, which would make it ideal for multi-processors, array processors, or large local caches.
Re:Stacked chips (Sloooowwww) (Score:2, Informative)
False; there is just one active layer of single-crystal silicon that contains the devices. The remaining layers are interconnects.
b) If you are talking about stacking dice (that is, literally stacking chips inside the package), then the distance the information would have to travel when going through the "vertical interconnects" would be thousands or tens of thousands of times greater than the distance of any on-chip interconnection.
How, why? the lat
Re:Stacked chips (Sloooowwww) (Score:3, Informative)
And right on b) -- the distance between two stacked dice is much shorter than between two side-by-side. But this is totally irrelevant, mostly due to previous posters
Re:Stacked chips (Sloooowwww) (Score:2)
Some observations:
Given that, I'm sure they could figure out a way to make the distance between any two points on two 1 cm^2 wafers less than 0.5 cm, say by making the interconnects gold studs a micron or so high all over the surface of the wafer, and aligning them face-to-face.
Re:Stacked chips (Sloooowwww) (Score:3, Insightful)
But how do you get "micron-high" little gold studs to stick to the die in exactly the right places? How do you make sure each gold stud is exactly the same height (can't have a short one anywhere, even by a femtometer)? Then, how do you physically/mechanically line them up exactly and keep them together perfectly for long periods of time across fairly wide vibration and temperature ranges? How do you prevent the dice from warping if each stud isn't 100% identical (such as if you
Re:Stacked chips (Score:2)
Not really. Modern CMOS design fabricates everything from a single silicon wafer, using a large number of photoresist layers to create different regions of silicon and tossing several layers of metal interconnects on top. All the transistors are on one level, placing an upper limit on the number of gates within a given physical area. What would be exceedingly cool, though, is the ability to stack arbitrary layers of silicon on top, providing the capability to multiply the number of gates in the same footprint.
Re:Stacked chips (Score:2, Funny)
That's an easy one! Between each wafer, you place a delightful creme filling. The filling would be of such consistency that no matter how hot the wafers are allowed to get, the creme does not melt.
I propose we call this new technology Overall Reduction of Exothermic Output, or OREO for short.
IBM thinks so (Score:5, Informative)
Re:Stacked chips (Score:2)
Re:Stacked chips (Score:2)
I've heard that this may be possible, somehow utilizing little channels through the inside of the chip that would carry liquid nitrogen. I think before fab technology approaches that point, however, we may have better technologies.
Reality distortion field (Score:3, Funny)
It's called (Score:1, Funny)
The Bush Method is so simple, it's amazing no one thought of it before 2000. All you have to do is take the thing about reality you want to distort and state that it has changed, whether or not it has. The amazing thing is, if you say it enough times publicly, it actually becomes true.
The Bush
Re:It's called (Score:4, Insightful)
Why do you give Bush the credit? This shit is Marketing 101 and Politics 102.
Enough with "moore's law" (Score:5, Insightful)
No, not just a wild guess (Score:5, Informative)
Please note that the observation was educated enough to include the fact that its validity would be limited in time, and that before it became completely obsolete the multiplying factor would change, as it already has a couple of times.
In order to understand Moore's Law one must read his entire essay, not just have some vague idea of one portion of it.
Just as being able to quote "E=mc^2" in no way implies you have the slightest understanding of the Special Theory of Relativity.
KFG
Re:No, not just a wild guess (Score:2)
Re:No, not just a wild guess (Score:2)
Re:Enough with "moore's law" (Score:2)
Though it's more than a "wild guess", you do have it right when you mention that it has no basis in physical reality.
I don't think I'd blame IP so much as marketing, though. The major player in the field, Intel, holds most of the relevant IP.
So why has Moore's Law worked for so long?
Because Intel schedules its product roadmap around it.
Re:Enough with "moore's law" (Score:2)
Our markets have been totally manipulated by these made-up notions like Moore's Law and now that the game is up, people are acting shocked when the problem is obvious.
I was very impressed with this article for putting the time
Re:Enough with "moore's law" (Score:2)
. . . then it'll soon be time to sell this company and start another.
Re:Enough with "moore's law" (Score:3)
It's not a physical law, an observation, or even a wild guess. This was Intel's Gordon Moore. It was a marketing plan.
Not a guess ! (Score:3, Insightful)
The manufacturers try to strike a balance between a high R&D investment (with rapid advances in technology) and keeping the technology in production long enough to generate a good return on that investment. Moore's Law represents the 'sweet spot' that manufacturers had settled on.
While it's held quite well in recent decades, there's no guarantee it will continue to hold. If they hit a technological wall, or economic conditions cause a drop in investment
Re:Enough with "moore's law" (Score:3, Insightful)
GaAs (Score:1)
Moore law will be no more (Score:2, Insightful)
Re:Moore law will be no more (Score:3, Informative)
Not really. The functionality offered by software has pretty much flatlined (with the major exception being "media", e.g. mp3, mpeg, divx, etc). HOWEVER, the bloat and overhead of software continues to keep pace with (and often surpasses) the speed of hardware. This trend has
Re:Moore law will be no more (Score:2)
Those spinny flashy bits of eye candy in OS X make it significantly easier for me to use. Therefore OS X is not bloated. You may disagree, but you have the option to go command-line.
Now, MS Office on the other hand...
Re:Moore law will be no more (Score:2)
I'd vote for more efficient software personally, but that's also because I'm a pack rat who can't let go of any of my old hardware.
Moore will fail (Score:1)
The Diamond Age (Score:3, Informative)
Re:The Diamond Age (Score:5, Informative)
Since diamonds have a much higher thermal conductivity (i.e., they can take the heat), they'd make better chips than silicon, if only they were more affordable. Industrial diamonds are expected to make the whole industry's prices fall drastically by increasing supply and breaking the De Beers cartel.
More about the De Beers cartel:
Page 1 [theatlantic.com] Page 2 [theatlantic.com] Page 3 [theatlantic.com]
Everything2 link [everything2.com]
Personally I think these are awesome feats of engineering, and a way to give your significant other a stone without feeling morally, and literally, bankrupt.
Darpa brought us the internet? (Score:1, Funny)
More about Moore (Score:2, Funny)
Therefore I propose Moore's Law 2: "Anyone mentioning his name in a discussion about semiconductors, CPUs, or transistors has lost the discussion."
Re:More about Moore (Score:2)
Re:More about Moore (Score:2)
Firstly, one really can't have a meaningful discussion of the semiconductor industry without understanding Moore/Moore's "Law".
Secondly, and I would have thought people would understand this by now: Moore's "Law" does not cover CPU speed!! It merely relates the density of transistors on silicon with respect to time. It does NOT attempt to take into account various computer architecture advances
What about diamonds? (Score:5, Interesting)
Besides, I've been hearing about the death of Moore's Law for the last ten years.
Re:What about diamonds? (Score:2)
It's a popular filler topic for industry journalists who have nothing better to report on. They'll just point out that the latest processor from AMD/Intel/etc is "reaching the limits of current technology" and then proceed, ipso facto, to "Moore's Law could be dead in a few years".
Re:What about diamonds? (Score:2)
The facts seem to show that we will have even faster development of processors than Moore's law states.
The idea behind this is based on the rate of technological development in human history. If you were to graph the rate of technological advancement over recorded history, you would see a long line of incremental yet minimal growth, punctuated by the last 100-150 years, where the increase is almost geometric.
Re:What about diamonds? (Score:2)
The funny thing (to me at least) is that very few people have fully digested the implications of exponential progress. They're in for a rude awakening over the next couple decades.
Re:What about diamonds? (Score:2)
To call Kurzweil insane indicates you either haven't really read his or others' ideas on this subject, or you have a bad case of cognitive dissonance.
It's not a pie in the sky prediction of the future so much as it's extrapolating based on thousands of years of observation of the rate of technological change.
So why do you have problem with this observed exponential trendline? It won't go away (unless humans go away).
I hereby predict (Score:2, Troll)
Example:
10 GOTU 4o
30 Re+URN; G0SUB 42
40 Print "Welcom to Windoes!":PRINGT "JUS KIFFING! HAHAHA!"
43 RUN
50 REM Copyright SCO(TM)(R)(C) 2012, NOT! HAHAHAHA
6o GOt0 14.3
Hey, it's a joke! Relax - no angry human brains will be used either!
umm (Score:4, Insightful)
Let's not and say we did.*
Seriously, I doubt that many people think Moore's Law is on an equal footing with, say, gravity and quantum mechanics. Still, an observation that has held more or less for nearly 40 years is worth considering as a very valuable guideline. Let's keep this in mind as well.
(*Why do vacuous comments like this make it into slashdot stories?)
Re:It's bigger than Moore's law (Score:2, Informative)
Paradigm shift (Score:4, Funny)
See, if we get everybody to take xanax or zoloft, there's no limit to how fast computers will appear to be working.
better yet (Score:2)
Let's just kill everyone, then our computers will seem infinitely fast! Dude, if you're gonna dream, Dream Big!
Maybe they simply need to... (Score:1)
I bet that's what it really is, anyway.
In keeping with the unique branding style . . . (Score:2)
Stefan "It's finally out!" [sjgames.com] Jones
Re:In keeping with the unique branding style . . . (Score:2)
Re:In keeping with the unique branding style . . . (Score:2)
In related news... (Score:1)
Clockwork's Corollary to Moore's Law (Score:4, Funny)
Qubit (Score:1)
Their web site [darpa.mil] talks a little bit about DARPA's quantum computing projects, but the page seems to be a little outdated. Anyone know if they're pursuing this as well?
DARPA & Quantum Computing (Score:3, Informative)
A lot of posters seem to think that DARPA, the US military, or the US government is a unified thing. It's not. Each part often has its own agenda. Research is very frequently driven by those agendas.
However, DARPA often CYAs when it comes to research too. If you come up with a wacky idea that might just work, they often will fund it even though it is in competition with another they have. The reason being that they then can see which wacky idea actually works. Often none do. Or one does. Or no
Re:DARPA & Quantum Computing (Score:2)
By observation rules (Score:2)
I hate Moore's Law (Score:3, Insightful)
Parallel Computing (Score:2, Interesting)
All these other things they are talking about are vaporware. Parallel computing is here and in use now.
Re:Parallel Computing (Score:3, Interesting)
Incidentally, there is also a limit to how fast your parallel computer will get... it's called the bus. If you can't build high-speed interconnects, or if your software isn't designed well (not as easy as it sounds), you won't get the speedup.
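To put a rough number on that, there's Amdahl's law (my gloss, not the parent's exact point): if a fraction s of the work is stuck serialized on the bus, no number of processors buys you more than a 1/s speedup. A minimal sketch in C:

    /* Amdahl's law: a serial fraction s caps the speedup of n processors */
    double amdahl(double s, int n)
    {
        return 1.0 / (s + (1.0 - s) / n);
    }
    /* e.g. amdahl(0.10, 64) is about 8.8x, and can never exceed 1/0.10 = 10x */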
Re:Parallel Computing (Score:3, Interesting)
Two major answers occur to me:
Answer one is that we figure out how to automatically decompose problems into independently solvable threads.. a quite difficult problem.
Answer two is that we build special-purpose parallel processors to handle parallelizable tasks.
parallel computing not the ultimate answer (Score:2)
That's pointless. Why would I prefer 8 chips? Wouldn't it make more sense to make a die that's 8 times as big? Then, at the same feature size (0.18 or whatever), you get the same number of total transistors in both systems, and the same area dedicated to CPU per rig, but less slow (i.e., FSB) interconnect.
Re:size isn't the answer either (Score:2)
That's certainly true, but there's a lot more wiggle room in terms of increasing effective die size. It isn't usually foreign material per se that is the problem; more frequently, it's defects in the crystal structure of the silicon. There are a number of research groups looking into eradicating these problems
Re:size isn't the answer either (Score:2)
Moore's law is already ending (Score:4, Informative)
Historically, designing CPUs for higher-level purposes, other than simply designing them to execute traditional assembly language, has been deemed a failure. This is because generic hardware advanced so quickly that the custom processors were outdated as soon as they were finished. Witness Wirth's Lilith, which was soon outperformed by an off-the-shelf 32-bit CPU from National Semiconductor (remember them?). The Lisp machine is a higher profile example.
But now things are not so clear. Ericsson designed a processor to run their Erlang concurrent-functional programming language, a language they use to develop high-end, high-availability applications. The FPGA prototype was outperforming the highly optimized emulator that they had been using up to that point by a factor of 30. This was with the FPGA at a clock speed of ~20MHz, and the emulator running on an UltraSPARC at ~500MHz. And remember, this was with an FPGA prototype, one that didn't even include branch prediction. Power dissipation was on the order of a watt or two.
Quite likely, we're going to start seeing more of this approach. Figure out what it is that you actually want to *do*, then design for that. Don't design for an overly general case. For example, 90% of desktop CPU use could get by without floating point math, especially if there were some key fixed point instructions in the integer unit. But every Pentium 4 and Athlon not only includes 80-bit floating point units, but massive FP vector processing units as well. (Not to mention outmoded MMX instructions that are almost completely ignored.)
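To illustrate the fixed-point idea (my own rough sketch, nothing from the article): a 16.16 fixed-point multiply needs only the integer unit, no FPU at all:

    #include <stdint.h>

    typedef int32_t fix16;               /* 16 integer bits, 16 fraction bits */
    #define FIX16_ONE (1 << 16)

    /* multiply two 16.16 values using only integer hardware */
    static fix16 fix16_mul(fix16 a, fix16 b)
    {
        return (fix16)(((int64_t)a * b) >> 16);
    }
    /* e.g. fix16_mul(3 * FIX16_ONE / 2, FIX16_ONE / 4) is 0.375 in 16.16 */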
Re:Moore's law is already ending (Score:2)
LOL! You MUST be trolling. Seriously! I'll bite anyway, though. How many people, with their computer:
1) play audio/video
2) edit audio/video
3) he
Floating Point (Score:2)
Re:Moore's law is already ending (Score:5, Insightful)
Well, except for games.
And anything that uses 3D.
And audio/video playback and work.
And image editing.
And some spreadsheets.
What's that leave, web surfing and word processing? No, even the web surfing is going to use the FPU as soon as you hit a Flash or Java applet.
Re:Moore's law is already ending (Score:4, Interesting)
Games and 3D make heavy use of FPU, but it's interesting to note that as time goes on, more and more of the heavy lifting FP work is being offloaded to the graphics processor.
Given a few more generations, most of the FPU work in today's games may actually be executed in the GPU.
Of course, this doesn't actually change anything, since tomorrow's games will just put that much more load on the CPU for physics processing and such!
Video codecs are essentially all integer based. Audio codecs often use the FPU, but they really don't need to - fixed point implementations tend to be just as fast.
The vast bulk of image editing work tends to be integer-based, or easily convertible to integer-based.
Spreadsheet math calculations aren't really performance-related in any sense. 99.9% (remember, your statistics may be made up on the spot, but mine are based on sound scientific handwaving!) of the time a spreadsheet spends is in fiddling with the GUI, which is primarily an integer operation activity.
That said, the parent poster's point sort of goes both ways. It's true that the FPU is heavily underutilized by most things outside of games, so it's not an unreasonable idea to strip it out and let the FPU be emulated in software or microcode or whatnot.
However, that won't necessarily really help. Modern CPU cores are better able to manage their power routing than previous ones, so having an FPU on there doesn't necessarily cause any trouble. The CPU may be able to disconnect power to the FPU when it's not in use, thus making the whole thermal issue something of a moot point in this respect. If it doesn't generate heat, it's just a matter of wasted silicon - and silicon's becoming quite cheap!
In fact, the FPU is an example of good design in CPU's, really. It's not too hard to fit a lot of computation units on one CPU core these days, hence having multiple ALU and FPU computation units being driven by complicated pipelining and SIMD engines. The difficulty is making efficient use of them - note the trouble getting SMP to work efficiently, and the whole idea of hyperthreading. While the FPU may get fairly low utilization, it is fantastically faster at floating point computation than the integer cores are, and putting a couple on a chip is thus generally a pretty good idea.
Re:Moore's law is already ending (Score:2)
Yes. And it should be a double-precision FPU. Trying to cram a physics engine onto the PS2, which has only 32-bit floating point, is not a happy experience.
I'm looking forward to... (Score:2, Funny)
Moore's Law is not a "law" (Score:4, Informative)
"Moore's Law" is no more a "law" in the sense of physics (or anything else for that matter), than any other basic observation made by a scientist or physicist.
Oddly, you'd have a hard time believing it isn't a Law of Nature, given the apocalyptic cries from the technology industry whenever "Moore's Law" falls behind: cries that something *has* to be done immediately for Moore's Law to continue, lest the nuclear reaction in the Sun cease. Or something.
When it was coined by the *press* in 1965, only a small fraction of what we now know about the physics of integrated circuits and semiconductors was known. So, looking back, it's easy to see that the exponential trend in density would continue as long as the knowledge and ability to manipulate materials increased exponentially.
Yes, it is rather surprising that Moore's observation has held true as long as it has. And this isn't to say that the growth trend won't continue, but it will certainly level off for periods while materials or manufacturing research comes up with some new knowledge to advance the industry.
As the article indicates, things are likely headed for a plateau, possibly toward the end of this decade or start of the next. And at that point, Moore's observation will simply no longer be true or appropriate.
Let the cries of armageddon begin as "Moore's Law" is finally recognized as an observation that will eventually be outlived.
For a little "Moore" background, see http://www.intel.com/research/silicon/mooreslaw.h
Re:Moore's Law is not a "law" (Score:2)
So?
Name change? (Score:2)
Shouldn't their acronym read FBI instead of DARPA, then?
Moore's Law in design (Score:2)
Let's keep in mind that Moore's Law was more an observation than a predictive law of nature, despite people treating it as one.
Not entirely. The folks designing FooCorp's next generation of, e.g., chip fabs generally use Moore's Law to tell them where the competition will be by the time the fab is built: FooCorp needs to be competitive at that point in the future. Then the folks designing, e.g., PDAs use Moore's Law to tell them what processor power, memory capacity, etc. will be available to them by the time the product ships.
Get rid of C! (Score:5, Interesting)
Think about this: Why is video graphics hardware so much faster than CPU's? You might say that it is because the video card is specifically designed for one task... however, these days, that isn't really true. Modern video cards allow you to write small -- but arbitrary -- programs which are run on every vertex or every pixel as they are being rendered. They aren't quite as flexible as the CPU, but they are getting close; the newest cards allow for branching and control flow, and they are only getting more flexible. So, why are they so much faster? There are a lot of reasons, but a big one is that they can do lots of things at the same time. The card can easily process many vertices or pixels in parallel.
Now, getting back to C... A program in C is supposed to be executed in order. A good compiler can break that rule in some cases, but it is harder than you would think. Take this simple example:
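A plausible reconstruction of the snippet (the function and variable names are my invention):

    /* produce out[] by adding one to each element of in[] */
    void add_one(int *out, int *in, int n)
    {
        for (int i = 0; i < n; i++)
            out[i] = in[i] + 1;
    }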
This is just a piece of C code which takes a list of numbers and produces another list by adding one to each number.
Now, even with current, mostly-serial CPU's, the fastest way to perform this loop is to process several numbers at once, so that the CPU can work on incrementing some of the numbers while it waits for the next ones to load from RAM. For highly-parallel CPU's (such as many currently in development), you would want, even more so, to work on several numbers simultaneously.
Unfortunately, because of the way C is designed, the compiler can not apply such optimizations! The problem is, the compiler does not know if the "out" list overlaps with the "in" list. If it does, then the compiler has to do the assignments one at a time to ensure proper execution. Imagine the following code that calls the function, for example:
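Again a reconstruction (same made-up names); the point is just that "out" overlaps "in":

    int buf[100];
    /* ... fill buf ... */
    add_one(buf + 1, buf, 99);   /* out[] and in[] overlap, shifted by one */

Here every store to out[i] clobbers the in[i+1] that the next iteration reads, so the loop really does have to run one element at a time.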
Of course, using the function in such a way would not be very useful, but the compiler has to allow for it. This problem is called "aliasing".
ISO C99 provides for a "restrict" keyword which can help prevent this problem, but few people understand it, even fewer use it, and those who do use it usually don't use it everywhere (using it everywhere would be too much work). It's not a very good solution anyway -- more of a "hack" if you ask me.
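For reference, the restrict version of the hypothetical loop above would look like this:

    /* 'restrict' is the programmer's promise that out[] and in[] never
       overlap, so the compiler may load and store several elements at once */
    void add_one(int * restrict out, int * restrict in, int n)
    {
        for (int i = 0; i < n; i++)
            out[i] = in[i] + 1;
    }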
Anyway, to sum it up, C generally requires the CPU to do things in sequence. As a result, CPU manufacturers are forced to make CPU's that do one thing at a time really, really fast, rather than lots of things at the same time. And, so, since it is so much harder to design a fast CPU, we end up with slower CPU's... and we hit the limits of "Moore's Law" far earlier than we should.
In contrast, functional languages (such as ML, Haskell, OCaml, and, to a lesser extent, LISP), due to the way they work, have no concept of "aliasing". And, contrary to what many experienced C programmers would expect, functional languages can be amazingly fast despite being rather high-level. Functional languages are simply easier to optimize. Unfortunately, experienced C/C++/Java/whatever programmers tend to balk at functional languages at first, as learning them can be like learning to program all over again...
So, yeah. I recommend you guys give one a try.
Re:Get rid of C! (Score:2)
There's a damn good reason almost nobody cares about this (and the ones who do care already care): for the vast majority of what people do every day, none of that matters.
You want to create Yet Another Functional Language? Hey, great, I'd hate to be the guy
Re:Get rid of C! (Score:2)
Re:Get rid of C! (Score:2)
Re:Get rid of C! (Score:2)
Moore's law/observation (Score:2)
The only direct effect is that the cost for a chip is halved every 18 months (assuming cost ~ die area). A side effect is that smaller transistors can be run at higher clocks than larger transistors, and/or dissipate less heat.
It is up to processor architects to turn those extra transistors into actual performance.
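As a back-of-the-envelope check of that direct effect (my arithmetic, not the poster's):

    #include <math.h>
    #include <stdio.h>

    /* relative cost per transistor, assuming it halves every 18 months */
    int main(void)
    {
        for (int months = 0; months <= 72; months += 18)
            printf("after %2d months: %6.2f%% of today's cost\n",
                   months, 100.0 * pow(0.5, months / 18.0));
        return 0;
    }

Four halvings (six years) and you're down to about 6% of today's cost per transistor.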
Use your eyes (Score:2)
Moore's Law is scientific (Score:2)
We can apply it to other industries and see the same effects.
For instance, before deregulation, long-distance phone calls were expensive. Today, they are dropping in price, while QoS and coverage are expanding.
The software industry, due to almost complete government non-interference, is able to take software to completely new levels every two to three years.
Oh great... (Score:2)
>:-(
The end of Moore's Law (Score:2)
We may hit a wall before that. Power dissipation may limit device density before atom size or fabrication technology does. In that case, memory devices, which use less power per unit area, will continue to be fabbed at smaller scales, while busier parts (CPUs, graphics engines) will not progress as much.
Ther
Re:The end of Moore's Law (Score:2)
Re:Dear Al Gore (Score:2)
That joke was dumb three years ago.
Sincerely,
The rest of the world,
Re:Dear Al Gore (Score:2)
I hope your brokerage doesn't get whacked by the feds.
Sincerely,
Penpal
P.S. send my best to the Schwabs.