Forty Years of Moore's Law 225
kjh1 writes "CNET is running a great article on how the past 40 years of integrated chip design and growth have followed [Gordon] Moore's law. The article also discusses how long Moore's law may remain pertinent, as well as new technologies like carbon nanotube transistors, silicon nanowire transistors, molecular crossbars, phase change materials and spintronics. My favorite data point has to be this: in 1965, chips contained about 60 distinct devices; Intel's latest Itanium chip has 1.7 billion transistors!"
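A quick back-of-the-envelope check of that data point (a Python sketch; treating the 1.7-billion-transistor Itanium as a 2005 part, per the article's 40-year framing, is an assumption):

```python
import math

devices_1965 = 60
transistors_2005 = 1.7e9   # Itanium figure quoted in the summary
years = 2005 - 1965

# How many doublings separate the two chips, and how often
# would the count have had to double to get there?
doublings = math.log2(transistors_2005 / devices_1965)
months_per_doubling = years * 12 / doublings
print(f"{doublings:.1f} doublings, one every {months_per_doubling:.1f} months")
```

That works out to roughly 25 doublings over 40 years, or one every ~19 months, remarkably close to the commonly cited 18-month figure.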
Re:Keeping Count (Score:2, Interesting)
Re:Kinda obvious.... (Score:4, Interesting)
Moore's original observation, that transistor density doubles every 18 months, will obviously cease to apply once it becomes impossible to make transistors. But as long as that feedback loop continues to churn, it continues to make sense to talk about Moore's law.
Self fulfilling (Score:5, Interesting)
Also, for the record as a physicist, quantum computers won't remove the need for conventional computers in most areas - a big thing is (as I understand it) that they're not programmable, and have to be built to a certain specification. Therefore, classical computers will always have their use.
It definitely has less than 300 - 400 years. (Score:5, Interesting)
Graphs???? (Score:4, Interesting)
Bugs (Score:5, Interesting)
And don't give me any crap about software somehow being inherently harder to keep bug-free. I develop both, and there really is little difference when it comes to complexity.
Sure, software performs more complex tasks, but when you add the parallelism of hardware, as well as timing issues, temperature and manufacturing issues, clock distribution, leakage and crosstalk, hardware is definitely a pretty good match.
The simple truth is that vastly more testing goes into hardware than into most software (software in Mars rovers and lunar landers would be an exception). And I bet that there are better design methods and safety guards too.
Re:Keeping Count (Score:1, Interesting)
P.S. I grilled 3 meals last weekend. On charcoal. Real flames. Not cityfied propane flames.
Beef, it's what's for dinner, because there's no such thing as a chicken knife.
Law of Accelerating Returns (Score:5, Interesting)
Basically, it has been observed that any evolutionary process (including technology) will progress exponentially as it builds on past progress, with barely perceptible slow-down/speed-up "S-curves" as paradigm shifts occur.
Moore's Law is certainly an important component of this trend, as it relates to computing power and eventual AI/IA accelerating to Singularity [singinst.org] in ~25 years, but there are many others in parallel: storage space, networking bandwidth, # of internet nodes, transportation speed, etc.
One thing that certainly ISN'T keeping pace with our technology is our old evolutionary psychology; hopefully we can fix [hedweb.com] some of the more disgusting aspects of human nature before it's too late [gmu.edu].
Re:Moore's Law is probably being exceeded at... (Score:3, Interesting)
Re:Graphs???? (Score:2, Interesting)
http://www20.tomshardware.com/cpu/20041220/index.
Now to explain why you're asking the wrong question: Moore's observation says nothing directly about performance. He merely suggested that the complexity of ICs doubles every 18 months or so. In general, this has nothing to do with a comparable trend in CPU clock speeds, nor in CPU performance.
On tom's charts, the most recent CPUs are about 50% faster in raw dhry-/whet-stone tests than my CPU which I bought two years ago. Other tests, which rely less on raw CPU performance, show an even smaller difference.
At some point in the past, performance of commodity hardware might have indeed doubled every year and a half. For the past 2-3 years, that's certainly untrue.
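The gap the parent is describing is easy to quantify (a Python sketch; the 50% figure comes from the parent's reading of the Tom's Hardware charts, and the 18-month doubling period is the conventional assumption):

```python
# If performance really doubled every 18 months, two years (24 months)
# of progress would compound to:
expected_speedup = 2 ** (24 / 18)   # ~2.5x

# What the parent actually observed in the dhry-/whetstone numbers:
observed_speedup = 1.5              # ~50% faster over two years

print(f"expected {expected_speedup:.2f}x, observed {observed_speedup:.2f}x")
```

So even granting the (incorrect) reading of Moore's law as a performance law, recent commodity CPUs fall well short of the curve.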
Re:Don't hold your breath... (Score:4, Interesting)
And what about Nvidia? Their last product jump, from the 5900 to the 6800, was absolutely amazing. A very clear 100% increase in performance. I'd be very surprised to see Nvidia match that leap sooner than 4Q 2006.
Heard of it... (Score:3, Interesting)
Re:When was the last time Moore's law was correct? (Score:2, Interesting)
All it says is that the number of transistors you can fit in a fixed area doubles roughly every 18 months (or, expressed another way, the area of a transistor is halving every 18 months.)
Making transistors smaller does tend to mean you can run circuits faster because you can switch state faster (which, in turn, also reduces the dynamic component of your power consumption), but it's not just a simple linear relationship between size and speed.
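The area-vs-linear-size distinction above is worth making concrete (a Python sketch; the 90 nm starting node is just an illustrative choice, not from the post):

```python
import math

# Transistor AREA halving every 18 months means LINEAR dimensions
# shrink by only 1/sqrt(2) ~ 0.71 per generation.
shrink = 1 / math.sqrt(2)

feature_nm = 90.0   # hypothetical starting process node
for gen in range(4):
    print(f"generation {gen}: {feature_nm:.0f} nm")
    feature_nm *= shrink
```

Two generations of area-halving take a hypothetical 90 nm process to roughly 45 nm, which is why published process-node numbers step down by about 0.7x rather than halving each time.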
Another Good Quotable (Score:4, Interesting)
From Popular Mechanics, March 1949:
"...computers in the future may have only 1000 vacuum tubes and perhaps weigh only 1 1/2 tons."
Re:For your information... (Score:3, Interesting)
Going back to your original post, the evidence that faster hardware means human-level and then more-than-human AI is as strong as it can be at this stage. We haven't found anything odd in the human brain that can't be simulated (and some parts have already been simulated). We've found that individual neurons work in a rather simple way. We've found that the brain is not a mysterious everything-connected-to-everything device, but a modular, rather crude and tolerant device. We've also made significant progress in brain scanning. All this leads to the conclusion that in the relatively near future (2-3 decades) it will be possible to simulate the human brain in silicon. Add a few more years and we might even simulate a brain that works.
This alone leads to more-than-human AI as "an inevitable consequence of continued development of computer hardware". Your comment about the "past 50 years" is rather idiotic, because 1) computers basically started 50 years ago and 2) we know for certain that today's computers are very slow compared with a human brain. As for the brilliant techniques, Moravec comments [transhumanist.com] on that. There are, indeed, many techniques that are impractical below a certain speed (as a matter of fact, most techniques are that way).
It appears to me that you simply have a negative outlook towards technology (not 100% negative, mind you), and so you attempt to fit reality into your narrow beliefs (see your last sentence about "utility gained"). For some irrational reason you don't want progress to work. Well, this is clearly a problem, but one we can't do anything about right now. Maybe your brain is low on dopamine or something.
In any case, there is basically nothing useful that simple negativism such as yours can bring to the discussion. "This won't work" is simply useless, especially when others have reasons to believe that it will. I can't tell you to read up, because you claim you already read enough (it didn't do you much good, though), but maybe you can try improving your outlook on life. Ask your doctor for some anti-depressants. I've also read today that semen can act as one [newscientist.com]. Then you might be able to consider our future prospects without your preconceived pessimism.