
Forty Years of Moore's Law

kjh1 writes "CNET is running a great article on how the past 40 years of integrated chip design and growth has followed [Gordon] Moore's law. The article also discusses how long Moore's law may remain pertinent, as well as new technologies like carbon nanotube transistors, silicon nanowire transistors, molecular crossbars, phase change materials and spintronics. My favorite data point has to be this: in 1965, chips contained about 60 distinct devices; Intel's latest Itanium chip has 1.7 billion transistors!"
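A quick back-of-the-envelope check of that growth, using only the two figures quoted above (about 60 devices in 1965 and 1.7 billion transistors in 2005), gives an implied doubling time quite close to the commonly quoted 18-24 months. A minimal sketch of the arithmetic:

```python
import math

# Only inputs: the two data points quoted in the submission.
devices_1965 = 60
transistors_2005 = 1.7e9
years = 2005 - 1965

doublings = math.log2(transistors_2005 / devices_1965)  # ~24.8 doublings
months_per_doubling = years * 12 / doublings             # ~19.4 months

print(f"{doublings:.1f} doublings in {years} years")
print(f"implied doubling time: {months_per_doubling:.1f} months")
```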
This discussion has been archived. No new comments can be posted.

  • Re:Keeping Count (Score:2, Interesting)

    by Ruediger ( 777619 ) on Tuesday April 05, 2005 @07:40PM (#12149216)
    I still find it amazing that they managed to fit 1.7 billion transistors in a chip.
  • Re:Kinda obvious.... (Score:4, Interesting)

    by fm6 ( 162816 ) on Tuesday April 05, 2005 @07:53PM (#12149318) Homepage Journal
    Strictly speaking, you're right. But Moore's law, despite the name, isn't a law of nature. It's an observation about the progress of the chip industry. And that progress is motivated by a simple feedback loop: other industries put ICs into their products, which motivates the IC industry to retool to make better, cheaper ICs, which motivates other industries to put ICs into their products...

    Moore's original observation, that transistor density doubles every 18 months, will obviously cease to apply once it becomes impossible to make transistors. But as long as that feedback loop continues to churn, it continues to make sense to talk about Moore's law.

  • Self fulfilling (Score:5, Interesting)

    by Bifurcati ( 699683 ) on Tuesday April 05, 2005 @07:56PM (#12149340) Homepage
    One can't help but wonder whether there's a self-fulfilling element to these sorts of prophecies - do computer manufacturers feel pressure to adhere to Moore's law? Is it a challenge to keep up? Or is it really just chance?

    Also, for the record, as a physicist: quantum computers won't remove the need for conventional computers in most areas - a big limitation (as I understand it) is that they're not freely programmable and have to be built to a certain specification. Therefore, classical computers will always have their use.

  • by highfreq2 ( 575192 ) on Tuesday April 05, 2005 @07:58PM (#12149358)
    Somewhere around there the number of transistors in a chip becomes equal to the number of atoms in the known universe.
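    Taking that extrapolation literally: assuming a commonly cited rough figure of 10^80 atoms in the observable universe (an assumption, not from the comment) and an unbroken 18-month doubling, the crossover point is a few centuries out. A sketch:

    ```python
    import math

    # How long until one chip's transistor count would match the atoms in the
    # observable universe, if 18-month doubling continued forever?
    transistors_now = 1.7e9     # the 2005 Itanium figure quoted in the article
    atoms_in_universe = 1e80    # rough, commonly cited estimate (assumption)
    doubling_months = 18

    doublings_needed = math.log2(atoms_in_universe / transistors_now)  # ~235
    years_needed = doublings_needed * doubling_months / 12             # ~350

    print(f"{doublings_needed:.0f} more doublings, roughly {years_needed:.0f} years out")
    ```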
  • Graphs???? (Score:4, Interesting)

    by King-Raz ( 51985 ) on Tuesday April 05, 2005 @07:58PM (#12149359)
    Has anyone got any pretty graphs of the performance of particular CPUs against time? It would be cool to have some sort of visual representation of the validity of Moore's law.
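    In lieu of a link, here is a minimal sketch of how such a graph is usually drawn: plot transistor counts (the quantity Moore's law actually describes) against year on a logarithmic axis, where exponential growth shows up as a straight line. The counts below are approximate, widely cited figures for a few Intel parts, not data from the article:

    ```python
    import matplotlib.pyplot as plt

    # Approximate transistor counts (illustrative figures only).
    chips = {
        "4004 (1971)":          2_300,
        "8086 (1978)":         29_000,
        "80386 (1985)":       275_000,
        "Pentium (1993)":   3_100_000,
        "Pentium 4 (2000)": 42_000_000,
        "Itanium 2 (2004)": 592_000_000,
    }

    years = [1971, 1978, 1985, 1993, 2000, 2004]
    counts = list(chips.values())

    plt.semilogy(years, counts, "o-")   # log axis: exponential growth -> straight line
    for year, (name, count) in zip(years, chips.items()):
        plt.annotate(name.split(" (")[0], (year, count))
    plt.xlabel("Year of introduction")
    plt.ylabel("Transistors per chip (log scale)")
    plt.title("Transistor count vs. time")
    plt.show()
    ```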
  • Bugs (Score:5, Interesting)

    by sicking ( 589500 ) on Tuesday April 05, 2005 @08:01PM (#12149381)
    What amazes me the most is the number of bugs a device with 1.7 billion transistors has compared to the number of bugs in, say, Windows XP, GIMP or Firefox.

    And don't give me any crap about software somehow being inherently harder to keep bug-free. I develop both, and there really is little difference when it comes to complexity.

    Sure, software performs more complex tasks, but when you add in the parallelism of hardware, as well as timing issues, temperature and manufacturing issues, clock distribution, leakage and crosstalk, hardware is definitely a pretty good match.

    The simple truth is that vastly more testing goes into hardware than into most software (software in Mars rovers and lunar landers would be an exception). And I bet that there are better design methods and safety guards too.
  • Re:Keeping Count (Score:1, Interesting)

    by CmdrTostado ( 653672 ) on Tuesday April 05, 2005 @08:08PM (#12149434) Journal
    Actually, the blacks were the group who perfected the art of barbeque. They took the leftover pieces that their masters didn't want to mess with, such as the ribs, and learned how to slow cook them to perfection.

    P.S. I grilled 3 meals last weekend. On charcoal. Real flames. Not cityfied propane flames.

    Beef, it's what's for dinner, because there's no such thing as a chicken knife.
  • by Saeger ( 456549 ) <`farrellj' `at' `gmail.com'> on Tuesday April 05, 2005 @08:14PM (#12149477) Homepage
    Few people realize that Moore's Law is just one component of an even greater overall exponential trend which has been called The Law of Accelerating Returns [kurzweilai.net] (by Ray Kurzweil).

    Basically, it has been observed that any evolutionary process (including technology) will progress exponentially as it builds on past progress, with barely perceptible slow-down/speed-up "S-curves" as paradigm shifts occur.

    Moore's Law is certainly an important component of this trend, as it relates to computing power and eventual AI/IA accelerating to Singularity [singinst.org] in ~25 years, but there are many others in parallel: storage space, networking bandwidth, # of internet nodes, transportation speed, etc.

    One thing that certainly ISN'T keeping pace with our technology is our old evolutionary psychology; hopefully we can fix [hedweb.com] some of the more disgusting aspects of human nature before it's too late [gmu.edu].

  • by product byproduct ( 628318 ) on Tuesday April 05, 2005 @08:32PM (#12149591)
    Here's a GPU performance graph [plasma-online.de] which illustrates his point.
  • Re:Graphs???? (Score:2, Interesting)

    by CurbyKirby ( 306431 ) on Tuesday April 05, 2005 @08:56PM (#12149746) Homepage
    First, to answer your question: yes, Tom's Hardware recently updated their CPU benchmark tests to include over 100 CPUs from the last ten years. Starts here (graphs come later):

    http://www20.tomshardware.com/cpu/20041220/index.html [tomshardware.com]

    Now to explain why you're asking the wrong question: Moore's observation says nothing directly about performance. He merely suggested that the complexity of ICs doubles every 18 months or so. In general, this implies no comparable trend in CPU clock speeds, nor in CPU performance.

    On Tom's charts, the most recent CPUs are about 50% faster in raw Dhrystone/Whetstone tests than my CPU, which I bought two years ago. Other tests, which rely less on raw CPU performance, show an even smaller difference.

    At some point in the past, performance of commodity hardware might have indeed doubled every year and a half. For the past 2-3 years, that's certainly untrue.
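    To put a number on that gap: the poster's rough figure of 50% over two years implies a performance doubling time of about 41 months, well short of an 18-month pace. A small sketch of the calculation (the 50%/24-month inputs are just the estimate above):

    ```python
    import math

    # Implied doubling time from a 1.5x gain over 24 months.
    growth_factor = 1.5
    months_elapsed = 24

    implied_doubling = months_elapsed * math.log(2) / math.log(growth_factor)
    print(f"implied performance doubling time: {implied_doubling:.0f} months")  # ~41
    ```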
  • by Doppler00 ( 534739 ) on Tuesday April 05, 2005 @09:08PM (#12149803) Homepage Journal
    This is a good point. I have money saved up just waiting to buy the latest greatest thing... yet it's not here. My 3.0GHz P4, which I bought in January 2004, is within 20% of the speed of any of Intel's current offerings (within the same class: desktop/consumer). And even when the dual core parts are released, I'm not confident that they will provide a doubling of performance.

    And what about Nvidia? Their last product jump, from the 5900 to the 6800, was absolutely amazing - a very clear 100% increase in performance. I'd be very surprised to see Nvidia match that leap before 4Q 2006.
  • Heard of it... (Score:3, Interesting)

    by Goonie ( 8651 ) <robert DOT merkel AT benambra DOT org> on Tuesday April 05, 2005 @09:31PM (#12149987) Homepage
    ...but think it's bunk. There is absolutely no evidence to suggest that more-than-human AI is an inevitable consequence of continued development of computer hardware. The last 50 years of faster computers haven't helped much so far. Nor am I aware of some brilliant AI technique that will be made possible by much faster conventional computers. Technological progress generally happens in fits and starts, with radical jumps and long periods of slow, gradual improvement in between. The chip industry is possibly an exception; but, frankly, I suspect that if you could come up with a "utility gained" measure, it would grow a lot more slowly than chip density.
  • by garethw ( 584688 ) on Tuesday April 05, 2005 @09:35PM (#12150013)
    But that's not what Moore's Law says.

    All it says is that the number of transistors you can fit in a fixed area doubles roughly every 18 months (or, expressed another way, the area of a transistor halves every 18 months).

    Making transistors smaller does tend to mean you can run circuits faster because they can switch state faster (which, in turn, also reduces the dynamic component of your power consumption), but it's not just a simple linear relationship between size and speed.
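    A small sketch of the geometry behind that halving: if transistor area halves each generation, linear dimensions shrink by 1/sqrt(2), about 0.7x, which is roughly the step between successive process nodes (the node numbers below are illustrative round figures, not from the comment):

    ```python
    import math

    # Area halves -> linear dimensions scale by 1/sqrt(2) ~ 0.71 per generation.
    linear_scale = math.sqrt(0.5)
    print(f"linear shrink per generation: {linear_scale:.2f}x")

    node_nm = 180.0  # illustrative starting node
    for _ in range(4):
        node_nm *= linear_scale
        print(f"{node_nm:.0f} nm")  # ~127, 90, 64, 45 nm
    ```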
  • by Jhyrryl ( 208418 ) on Tuesday April 05, 2005 @09:44PM (#12150074)
    My favorite data point has to be this: in 1965, chips contained about 60 distinct devices; Intel's latest Itanium chip has 1.7 billion transistors!"

    From Popular Mechanics, March 1949:

    "...computers in the future may have only 1000 vacuum tubes and perhaps weigh only 1 1/2 tons."

  • by danila ( 69889 ) on Wednesday April 06, 2005 @08:14PM (#12160724) Homepage
    Sorry for the unwarranted conclusion, but the second part of my claim may still be valid. That you have worked in a particular field (AI) doesn't automatically make you qualified to make claims about developments in this field more than a decade in the future.

    Going back to your original post, the evidence that faster hardware means human-level and then more-than-human AI is as strong as it can be at this stage. We haven't found anything odd in the human brain that can't be simulated (and have already simulated some parts). We have found that individual neurons work in a rather simple way. We have found that the brain is not a mysterious everything-connected-to-everything device, but a modular, rather crude and tolerant one. We have also made significant progress in brain scanning. All this leads to the conclusion that in the relatively near future (2-3 decades) it will be possible to simulate the human brain in silicon. Add a few more years and we might even simulate a brain that works.

    This alone leads to more-than-human AI as "an inevitable consequence of continued development of computer hardware". Your comment about the "past 50 years" is rather idiotic, because 1) computers basically only started 50 years ago and 2) we know for certain that today's computers are very slow compared with a human brain. As for the brilliant techniques, Moravec comments [transhumanist.com] on that. There are, indeed, many techniques that are impractical below a certain speed (as a matter of fact, most techniques are that way).

    It appears to me that you simply have a negative outlook towards technology (not 100% negative, mind you), and so you attempt to fit reality into your narrow beliefs (see your last sentence about "utility gained"). For some irrational reason you don't want progress to work. Well, this is clearly a problem, but one we can't do anything about right now. Maybe your brain is low on dopamine or something.

    In any case, there is basically nothing useful that simple negativism such as yours can bring to the discussion. "This won't work" is simply useless, especially when others have reasons to believe that it will. I can't tell you to read up, because you claim you have already read enough (it didn't do you much good, though), but maybe you can try improving your outlook on life. Ask your doctor for some anti-depressants. I've also read today that semen can act as one [newscientist.com]. Then you might be able to consider our future prospects without your preconceived pessimism.
