Does Moore's Law Help or Hinder the PC Industry?
An anonymous reader writes to mention two analysts recently examined Moore's Law and its effect on the computer industry. "One of the things both men did agree on was that Moore's Law is, and has been, an undeniable driving force in the computer industry for close to four decades now. They also agreed that it is plagued by misunderstanding. 'Moore's Law is frequently misquoted, and frequently misrepresented,' noted Gammage. While most people believe it means that you double the speed and the power of processors every 18 to 24 months, that notion is in fact wrong, Gammage said. 'Moore's Law is all about the density...the density of those transistors, and not what we choose to do with it.'"
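The "doubling every 18 to 24 months" claim is easy to sanity-check with a little arithmetic. A minimal sketch (my own illustration, not from the article; the starting figures are the Intel 4004's commonly cited ~2,300 transistors in 1971):

```python
def projected_transistors(initial: int, years: float,
                          doubling_period_years: float = 2.0) -> int:
    """Project a transistor count after `years`, assuming it doubles
    every `doubling_period_years` (the density reading of Moore's Law)."""
    return int(initial * 2 ** (years / doubling_period_years))

# 36 years of doubling every 2 years, starting from ~2,300 transistors:
print(projected_transistors(2300, 36))  # 602931200
```

That lands around 600 million transistors by 2007, which is in the right ballpark for high-end CPUs of the era; note the projection says nothing about clock speed, which is exactly Gammage's point.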
Both (Score:5, Insightful)
The drum beat of progress pushes development to its limits, but at the same time it hinders some forms of research, and real-world tests of computation theory, for all but the few chip makers currently dominating the market.
I'm gonna vote for hurts - big time (Score:5, Insightful)
Moore's Observation (Score:5, Insightful)
No significances. (Score:5, Insightful)
In My Opinion, It Isn't a Law (Score:5, Insightful)
Moore (or Mead, for that matter) didn't get up one day and declare that the number of transistors on a square centimeter of space will double every 18 to 24 months. Nor did he prove in any way that it has always been this way and will always be this way.
He made observations, and those observations happen to have held true for a relatively long time in the world of computers. Does that make them a law? Definitely not! At some point, the wave-particle duality of small particles will either stop us dead in our tracks or (in the case of quantum computers) propel us forward much faster than ever thought.
Why debate whether a well-made observation helps or hinders the industry when it's the industry doing it to itself?!
Re:Moore's Observation (Score:2, Insightful)
I suppose the laymen need something like this to rally around, though.
Sure, double the number of transistors! But did that do anything useful? Did you gain enough performance to offset the complexity you just created? In the drive to "keep up with Moore's Law", are we better off? Are the processors now "better", or simply faster to make up for how fat they have become?
Definitely Both (Score:2, Insightful)
The Real Story (Score:4, Insightful)
Re:No significances. (Score:5, Insightful)
Here come the pedants (Score:2, Insightful)
Re:Moore's Observation (Score:5, Insightful)
No one does anything in an effort to prove Moore correct... they do it for their own benefit. Intel does it to stay ahead of their competition and keep selling more processors. If they chose to stop adding transistors, they could pretty much count on losing the race to AMD, and likely becoming obsolete in a very short time.
I agree that more transistors != better... however, it is indeed the easiest, least complex way to increase performance. Changing the architecture of the chip, negotiating with software developers to support it, etc., is far more complex than adding more transistors.
Re:Moore's Observation (Score:2, Insightful)
Seriously.
Moore's "law" doesn't mean squat. It's not like gravity. It's more like noticing that I've never had a car accident.
Then, one day, I will, and the "Law of Magical Excellent Driving" that I've been asserting guides my car around like an invisible hand will have been violated. Oh noes! How could this have happened?! How did this law, which had protected my safety for all those years, suddenly fail to apply?
Yeah. Right.
Re:I'm gonna vote for hurts - big time (Score:3, Insightful)
Re:Both (Score:4, Insightful)
We're also not paying US$800 for an 80387 math co-processor (it only did floating point), like a friend of mine did in the '80s. That would be about US$1,600 in today's dollars.
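That roughly-doubled nominal price is consistent with ordinary compound inflation. A quick hedged sketch (the ~3.5% rate and ~20-year span are my own illustrative assumptions, not figures from the comment):

```python
def inflate(amount: float, annual_rate: float, years: int) -> float:
    """Compound a nominal amount forward by `annual_rate` for `years`."""
    return amount * (1 + annual_rate) ** years

# US$800 in the mid-'80s at ~3.5% average annual inflation for ~20 years:
print(round(inflate(800, 0.035, 20)))  # 1592, close to the US$1,600 figure
```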
Moore's Law is a crappy measurement (Score:4, Insightful)
Take the Itanic, for example, or the P4, or Windows ME/Vista.
Why we need faster computers (Score:3, Insightful)
Despite this, there have been complaints from the PC industry that Vista isn't enough of a resource hog to force people to buy new hardware.
Computers have become cheaper. I once paid $6000 for a high-end PC to run Softimage|3D. The machine after that was $2000. The machine after that was $600.
Re:I'm gonna vote for hurts - big time (Score:2, Insightful)
Have you ever seen people confused about which package to get for which Linux distro? They don't know if they should get the one for Fedora, Ubuntu, Knoppix, Gentoo or Debian, and then they have to decide i386, x86_64, ppc, or whatever else there is.
Yes, most developers would have no problem, and most users wouldn't care once everything was working, it's just getting things into a working state that would suck when underlying architectures are changing every few years.
Instruction set != architecture (Score:3, Insightful)
I'm just going to refer you to my comment made earlier today when discussing a "new, better" processor architecture. Because there's always someone who thinks we are somehow "hindered" by the fact that we can still run 30-year old software unmodified on new hardware.
See here [slashdot.org].
Re:Moore's Observation (Score:5, Insightful)
[1] There is no real upper bound on the number of transistors you can fit on a chip, just the number you can for a given investment.
Re:I'm gonna vote for hurts - big time (Score:3, Insightful)
Re:Here come the pedants (Score:1, Insightful)
And cue the moderators who mod those folks up thereby encouraging pedantic asshat behavior on Slashdot.