Does Moore's Law Help or Hinder the PC Industry?

An anonymous reader writes to mention two analysts recently examined Moore's Law and its effect on the computer industry. "One of the things both men did agree on was that Moore's Law is, and has been, an undeniable driving force in the computer industry for close to four decades now. They also agreed that it is plagued by misunderstanding. 'Moore's Law is frequently misquoted, and frequently misrepresented,' noted Gammage. While most people believe it means that you double the speed and the power of processors every 18 to 24 months, that notion is in fact wrong, Gammage said. 'Moore's Law is all about the density...the density of those transistors, and not what we choose to do with it.'"
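The density-doubling reading that Gammage describes can be sketched numerically. This is a minimal illustration only; the 24-month doubling period and the Intel 4004 baseline (~2,300 transistors in 1971) are assumptions for the example, not figures from the article.

```python
# A minimal sketch of Moore's Law read as a density/count doubling,
# projected from an assumed baseline of ~2,300 transistors in 1971.

def projected_transistors(base_count, base_year, year, doubling_months=24):
    """Project transistor count assuming a fixed doubling period."""
    months = (year - base_year) * 12
    return base_count * 2 ** (months / doubling_months)

# Ten years at a 24-month doubling period is five doublings: a 32x increase.
print(projected_transistors(2300, 1971, 1981))  # 2300 * 32 = 73600.0
```

Note the sketch says nothing about clock speed or performance; as Gammage points out, the law is about density, not what we choose to do with it.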
This discussion has been archived. No new comments can be posted.

  • Both (Score:5, Insightful)

    by DaAdder ( 124139 ) on Wednesday April 25, 2007 @12:46PM (#18872105) Homepage
    I suppose it does both.

    The drum beat of progress pushes development to its limits, but at the same time it hinders some forms of research, and real-world tests of computation theory, for all save the few chip makers currently dominating the market.
  • by $RANDOMLUSER ( 804576 ) on Wednesday April 25, 2007 @12:48PM (#18872129)
    If only because it keeps us tied to the x86 instruction set. If we didn't have the luxury of increasing the transistor count by an order of magnitude every few years, we'd have to rely on better processor design.
  • by Edward Ka-Spel ( 779129 ) on Wednesday April 25, 2007 @12:48PM (#18872135)
    It's not a law, it's an observation.
  • No significance. (Score:5, Insightful)

    by Frosty Piss ( 770223 ) on Wednesday April 25, 2007 @12:48PM (#18872139)
    "Moore's Law" is not a real law. In reality, it is not relevant at all. It's kind of a cute thing to mention, but when it gets down to the real world engineering, it has no significances.
  • by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Wednesday April 25, 2007 @12:54PM (#18872211) Journal
    I always viewed this as an observation or rule of thumb, not a law.

    Moore (or Mead, for that matter) didn't get up one day and declare that the number of transistors on a square centimeter of silicon will double every 18 to 24 months. Nor did he prove in any way that it has always been this way and always will be.

    He made observations and these observations happen to have held true for a relatively long time in the world of computers. Does that make them a law? Definitely not! At some point, the duality that small particles suffer will either stop us dead in our tracks or (in the case of quantum computers) propel us forward much faster than ever thought.

    Why debate whether a well-made observation helps or hinders the industry when it's the industry doing it to itself?!
  • by Ngarrang ( 1023425 ) on Wednesday April 25, 2007 @12:55PM (#18872239) Journal
    Agreed. Why the industry chooses to measure itself against what some may consider just an anecdotal observation I will never understand.

    I suppose the laymen need something like this to rally around, though.

    Sure, double the number of transistors! But, did that do anything useful? Did you gain enough performance to offset the complexity you just created? In the drive to "keep up with Moore's Law", are we better off? Are the processors now "better", or simply faster to make up for how fat they have become?
  • Definitely Both (Score:2, Insightful)

    by john_is_war ( 310751 ) <jvines.gmail@com> on Wednesday April 25, 2007 @12:57PM (#18872263)
    With companies driving to increase transistor density by shrinking process size, the rate at which we can keep doing so reliably is slowing. With each decrease in process size, serious power-leakage issues arise. This is where multi-core processors come in: they are the future because of the cap on single-core clock speeds. Hopefully this will spur an improvement in microprocessor architecture.
  • The Real Story (Score:4, Insightful)

    by tomkost ( 944194 ) on Wednesday April 25, 2007 @12:57PM (#18872267)
    The real story is that Moore's law describes the basic goal of the semiconductor industry. Perhaps there are better goals, but they tend to get swallowed up in the quest for smaller transistors.

    The other real story is Gates's law: I will use up those extra transistors faster than you can create them. My hardware OEMs need a bloated OS that will drive new HW replacement cycles.

    I also seem to remember Moore's law being quoted as a doubling every year; now I see some saying 18-24 months, so I think the rule is in fact slowing down. We are pushing into territory where it takes a lot of effort and innovation to get a small increase in density. Even so, Moore's law has always been a favorite of mine! Tom
  • by truthsearch ( 249536 ) on Wednesday April 25, 2007 @12:58PM (#18872273) Homepage Journal
    While you are correct, it has value as being accurate foresight. So the question is, was it just an observation or did it become a self-fulfilling prophecy? If it was a self-fulfilling prophecy then what other judgements can we make now that may drive technology in the future?
  • by Dara Hazeghi ( 1076823 ) on Wednesday April 25, 2007 @01:02PM (#18872329) Homepage
    Cue all the pedantic asshats who absolutely have to point out that Moore's Law really isn't a Law... it's an observation.
  • by jhfry ( 829244 ) on Wednesday April 25, 2007 @01:03PM (#18872341)
    If you think that Intel or AMD double the number of transistors in an effort to keep up with Moore's law, then you know nothing about business.

    No one does anything in an effort to prove Moore correct... they do it for their own benefit. Intel does it to stay ahead of their competition and continue to keep selling more processors. If they chose to stop adding transistors they could pretty much count on losing the race to AMD, and likely becoming obsolete in a very short time.

    I agree that more transistors != better... however, it is indeed the simplest way to increase performance. Changing the architecture of the chip, negotiating with software developers to support it, etc., is far more complex than adding more transistors.
  • by vux984 ( 928602 ) on Wednesday April 25, 2007 @01:04PM (#18872353)
    Mod parent up.
    Seriously.

    Moore's "law" doesn't mean squat. Its not like gravity. Its more like noticing that I've never had a car accident.

    Then, one day, I will, and the "Law of Magical Excellent Driving" that I've been asserting as an invisible hand guiding my car around will have been violated. Oh noes! How could this have happened?! How did this law, which protected my safety all those years, suddenly fail to apply? ...

    Yeah. Right.
  • by $RANDOMLUSER ( 804576 ) on Wednesday April 25, 2007 @01:07PM (#18872381)
    Game designers do it all the time. Compiler writers do it all the time. For 99.5% of the programmers out there, the underlying architecture is a black box; they only use the capabilities of the high-level language they happen to be using. But the final performance and capabilities of the system as a whole depend on that underlying architecture, which has been a single-accumulator, short-on-registers, byzantine instruction set (must. take. deep. breaths...) anachronism for far too long.
  • Re:Both (Score:4, Insightful)

    by ElectricRook ( 264648 ) on Wednesday April 25, 2007 @01:08PM (#18872411)

    We're also not paying US$800 for an 80387 math co-processor (it only did floating point), like a friend of mine did in the '80s. That would be about US$1,600 in today's dollars.

  • by Ant P. ( 974313 ) on Wednesday April 25, 2007 @01:09PM (#18872423)
    ...Like GHz or lines of code.

    Take the Itanic for example, or the P4, or Windows ME/Vista.
  • by Animats ( 122034 ) on Wednesday April 25, 2007 @01:23PM (#18872623) Homepage
    • Windows. The speed of Windows halves every two years.
    • Spyware, adware, and malware. Extra CPU power is needed to run all those spam engines in the background.
    • Spam filtering. Running Bayesian filters on all the incoming mail isn't cheap.
    • Virus detection. That gets harder all the time, since signature based detection stopped working.
    • Games. Gamers expect an ever-larger number of characters on-screen, all moving well. That really uses resources.

    Despite this, there have been complaints from the PC industry that Vista isn't enough of a resource hog to force people to buy new hardware.

    Computers have become cheaper. I once paid $6000 for a high-end PC to run Softimage|3D. The machine after that was $2000. The machine after that was $600.

  • by Chosen Reject ( 842143 ) on Wednesday April 25, 2007 @01:33PM (#18872749)
    And for 99.9% of users the underlying architecture is a black box. But for 100% of applications the underlying architecture is important, and if the application doesn't run, the user gets upset. It doesn't matter if the application only needs to be recompiled; even if the developers gave away free recompiles to people who had previously purchased the software, it would require users to know which architecture they have (already difficult for most people) and to make sure they get the right one.

    Have you ever seen people confused about which package to get for which Linux distro? They don't know if they should get the one for Fedora, Ubuntu, Knoppix, Gentoo or Debian, and then they have to decide i386, x86_64, ppc, or whatever else there is.

    Yes, most developers would have no problem, and most users wouldn't care once everything was working; it's just getting things into a working state that would suck when underlying architectures change every few years.
  • by Criffer ( 842645 ) on Wednesday April 25, 2007 @01:40PM (#18872833)

    If only because it keeps us tied to the x86 instruction set. If we didn't have the luxury of increasing the transistor count by an order of magnitude every few years, we'd have to rely on better processor design.

    I'm just going to refer you to my comment made earlier today when discussing a "new, better" processor architecture. Because there's always someone who thinks we are somehow "hindered" by the fact that we can still run 30-year old software unmodified on new hardware.

    See here [slashdot.org].
  • by TheRaven64 ( 641858 ) on Wednesday April 25, 2007 @01:42PM (#18872877) Journal
    It takes roughly 3-5 years to design a modern CPU. At the start of the process, you need to know how many transistors you will have to play with. If you guess too few, you can do some tricks like adding more cache, but you are likely to have a slower chip than you wanted. If you guess too many, you end up with a more expensive chip[1]. Moore's 'law' is a pretty good first approximation of how many you will have at the end of the design process. A company that can't make this prediction accurately is not going to remain competitive for long.


    [1] There is no real upper bound on the number of transistors you can fit on a chip, just on the number you can afford for a given investment.
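    The planning problem TheRaven64 describes can be sketched in a few lines. The starting budget, the 4-year lead time, and the 24-month doubling period below are hypothetical numbers for illustration, not real foundry data:

```python
# Sketch of the transistor-budget guess a design team makes at project start:
# extrapolate what will be manufacturable at tape-out, 3-5 years out.
# All figures here are illustrative assumptions.

def budget_at_tapeout(transistors_today, years_to_tapeout, doubling_months=24):
    """Extrapolate today's transistor budget forward to tape-out."""
    doublings = years_to_tapeout * 12 / doubling_months
    return transistors_today * 2 ** doublings

# 100M transistors today, taping out 4 years later: two doublings -> 400M.
print(int(budget_at_tapeout(100e6, 4)))  # 400000000
```

    Guessing the doubling period wrong by even a few months compounds over a 4-year design cycle, which is the point of the comment: the prediction itself is a competitive necessity.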

  • by sgt scrub ( 869860 ) <[saintium] [at] [yahoo.com]> on Wednesday April 25, 2007 @01:52PM (#18873003)

    we'd have to rely on better processor design.
    Not to mention we'd have to rely on better software design. The way Moore's law affects software, by allowing it to bloat, is anti-technology.
  • by Anonymous Coward on Wednesday April 25, 2007 @02:14PM (#18873291)
    Cue all the pedantic asshats who absolutely have to point out that Moore's Law really isn't a Law... it's an observation.

    And cue the moderators who mod those folks up thereby encouraging pedantic asshat behavior on Slashdot.
