Does Moore's Law Help or Hinder the PC Industry?

An anonymous reader writes to mention that two analysts recently examined Moore's Law and its effect on the computer industry. "One of the things both men did agree on was that Moore's Law is, and has been, an undeniable driving force in the computer industry for close to four decades now. They also agreed that it is plagued by misunderstanding. 'Moore's Law is frequently misquoted, and frequently misrepresented,' noted Gammage. While most people believe it means that you double the speed and the power of processors every 18 to 24 months, that notion is in fact wrong, Gammage said. 'Moore's Law is all about the density...the density of those transistors, and not what we choose to do with it.'"
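
Gammage's distinction, density rather than speed, is easy to restate numerically: the law is a statement about transistor count doubling on a fixed cadence, nothing more. A minimal sketch under the usual two-year rounding (the 1971-era starting count of roughly 2,300 transistors is an illustrative figure, not something from the article):

```python
# Moore's Law as a density statement: the transistor count doubles every
# fixed period; it says nothing about clock speed or what the transistors
# are used for.

def transistor_count(start_count, years, doubling_period_years=2.0):
    """Project a transistor count `years` after the starting point,
    assuming one doubling every `doubling_period_years` (the 18-24 month
    figure is commonly rounded to two years)."""
    return start_count * 2 ** (years / doubling_period_years)

# Illustrative starting point: roughly 2,300 transistors on a 1971-era chip.
for year in (1971, 1981, 1991, 2001, 2007):
    print(year, int(transistor_count(2300, year - 1971)))
```
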
Comments:
  • by titten ( 792394 ) on Wednesday April 25, 2007 @12:48PM (#18872149)
    Is Moore's Law helping or hindering the PC industry? I don't think it could hinder it... Do you think we'd have even more powerful computers without it? Or higher transistor density, if you like?
  • Efficiency (Score:4, Interesting)

    by Nerdfest ( 867930 ) on Wednesday April 25, 2007 @12:51PM (#18872171)
    It certainly seems to have had an effect on people's attention to writing efficient code. Mind you, it is more expensive to write efficient code than to throw more processor power at things...
  • by bronzey214 ( 997574 ) <[jason.rippel] [at] [gmail.com]> on Wednesday April 25, 2007 @12:55PM (#18872223) Journal
    Sure, being tied to the x86 architecture hurts, but it's nice to have a pretty base standard as far as architectures go and not have to learn different assembly languages, data flows, etc. for each new generation of computers.
  • Why? (Score:4, Interesting)

    by malsdavis ( 542216 ) on Wednesday April 25, 2007 @12:59PM (#18872289)
    Why do computers in general need to get any faster these days?

    Ten years ago I wouldn't have believed I would ever ask such a question, but I have been asking it recently as my retired parents are looking to buy a computer for the web, writing letters and emails. I've told them specifically "DO NOT BUY VISTA" (why on earth would anyone want that ugly memory hog?), so I just can't think of a single reason why they would need even a medium-spec machine.

    Personally, I like my games, so "the faster the better" will probably always be key. But for the vast majority of people what is the point of a high-spec machine?

    Surely a decent anti-spyware program is a much better choice.

  • Cost of fabs... (Score:3, Interesting)

    by kebes ( 861706 ) on Wednesday April 25, 2007 @01:02PM (#18872327) Journal

    "...Every 24 months, you're doubling the number of transistors, doubling the capacity," he said. "But if you think about the process you're going through--they're taking a wafer, they put some devices on it, they cut it up and sell it to you--the cost of doing that is not doubling every 18 to 24 months."
    Is he claiming that the cost of doing lithography on wafers doesn't increase? That's crazy talk! The cost of building and running fabs is in fact also growing exponentially. According to Rock's Law [wikipedia.org], the cost of building a chip-making plant doubles every four years and is already in the multi-billion-dollar range.

    In fact, there's a lot of debate over whether Moore's Law will break down due to fundamental physical barriers, or whether we will first hit an economic wall: no bank will be willing (or able?) to fund the fantastically expensive fabs that the new process technologies require.
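
Taking the Rock's Law point in the comment above together with the article's claim, the two growth rates can be compared directly: per-transistor cost still falls even while the absolute fab bill doubles. A rough sketch under assumed doubling periods of two years (Moore) and four years (Rock), with normalized starting values rather than real dollar figures:

```python
# Back-of-the-envelope: if transistor density doubles every ~2 years (Moore)
# while fab cost doubles every ~4 years (Rock), the cost *per transistor*
# keeps falling even though each new fab is exponentially more expensive.

def doubling(start, years, period):
    return start * 2 ** (years / period)

for years in range(0, 21, 4):
    density = doubling(1.0, years, period=2.0)   # normalized transistor count
    fab_cost = doubling(1.0, years, period=4.0)  # normalized fab cost
    print(f"year {years:2d}: fab cost x{fab_cost:6.1f}, "
          f"cost per transistor x{fab_cost / density:.3f}")
```

Both sides of the argument survive the arithmetic: the analyst's "cost is not doubling every 18 to 24 months" holds per transistor, while the economic-wall worry is about the absolute, exponentially growing cost of each new fab generation.
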
  • by Applekid ( 993327 ) on Wednesday April 25, 2007 @01:30PM (#18872709)
    What we do with the transistors? Run software of course. Enter Wirth's Law [wikipedia.org]:

    "Software is decelerating faster than hardware is accelerating."
  • by TheLink ( 130905 ) on Wednesday April 25, 2007 @01:48PM (#18872973) Journal
    So, are there any recent benchmarks of how the latest T2 stuff does vs. recent x86 machines in popular server apps like _real_world_ web servers and databases?

    AFAIK, it was slower than x86 the day it was launched, and when Intel's "Core 2" stuff came out it got crushed in performance/watt.
  • Re:Efficiency (Score:3, Interesting)

    by Kjella ( 173770 ) on Wednesday April 25, 2007 @02:55PM (#18873943) Homepage
    It certainly seems to have had an effect on people's attention to writing efficient code. Mind you, it is more expensive to write efficient code than to throw more processor power at things...

    Well, you can have software that's feature-rich, stable, cheap, fast, or resource-efficient; pick any two (yes, you still only get two). Let faster processors handle speed and GB sticks of memory handle resource efficiency, and let coders concentrate on the other three. The margin between "this will be too slow no matter what we do" and "it's so fast no one cares" is usually very slim (unless you're talking about major changes like using smarter algorithms or pushing heavy processing out of a loop; in other words, a smarter design, not assembly hacking).
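
The "pushing heavy processing out of a loop" kind of change described in the comment above is worth making concrete. A minimal, hypothetical sketch (the function names and the percentile-rank task are made up for illustration): the same result computed with an invariant sort inside the loop versus hoisted out of it.

```python
# Illustrative only: the expensive step (sorting the reference list) is
# invariant across the loop, so hoisting it out turns N sorts into one.
# A design-level fix, not low-level tuning.

def percentile_ranks_slow(samples, reference):
    ranks = []
    for x in samples:
        ordered = sorted(reference)              # re-sorted on every iteration
        ranks.append(sum(v <= x for v in ordered) / len(ordered))
    return ranks

def percentile_ranks_fast(samples, reference):
    ordered = sorted(reference)                  # sorted once, outside the loop
    n = len(ordered)
    return [sum(v <= x for v in ordered) / n for x in samples]
```
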
  • by Doctor Memory ( 6336 ) on Wednesday April 25, 2007 @03:48PM (#18874751)

    there's always someone who thinks we are somehow "hindered" by the fact that we can still run 30-year-old software unmodified on new hardware.
    We are, because it's just a new implementation of a crappy architecture. Apple showed that it's quite feasible to run old software on new hardware, even new hardware that had almost nothing in common with the old hardware. Intel provides x86 compatibility on Itanium, so there's no reason why we can't all move to a new processor and take our old software with us. It's just that nobody's coming out with any new processors for PC-class machines.

    I'd say the ability to run 30-year-old software unmodified on a modern processor shows just how little progress we've actually made...
  • by ex-geek ( 847495 ) on Wednesday April 25, 2007 @04:01PM (#18874945)

    Windows Vista on modern hardware boots no faster than 98 did on hardware that was modern for its era.

    Boot time is constrained by hard drive seek times, not CPU throughput. Today's hard drives have only marginally better seek times than hard drives from 1998. PCs haven't improved much in terms of latency at all.

    But few developers seem to be aware of this, which is probably one of the reasons many types of apps start even more slowly than they used to. Many apps abuse the filesystem as a database. My system currently has more than 600,000 files on it. In 98 I would have had maybe 2,000, and back then most of those files were my user files, rather than files for apps, configs, and caches.
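
The latency argument in the last comment is easy to sanity-check with rough numbers: if startup touches thousands of small files scattered across the disk, seeks dominate, and a modest seek-time improvement is swamped by a ten-fold increase in files touched. A back-of-the-envelope sketch with illustrative (not measured) file counts and seek times:

```python
# Rough model: startup cost ~ (files touched) * (average seek time).
# Sequential bandwidth improved enormously between 1998 and 2007;
# average seek time barely moved.

def seek_seconds(files_touched, avg_seek_ms):
    return files_touched * avg_seek_ms / 1000.0

# Illustrative numbers only, not measurements.
print("1998-era drive (~12 ms seek),  2,000 files:",
      seek_seconds(2000, 12), "s of pure seeking")
print("2007-era drive (~9 ms seek),  20,000 files:",
      seek_seconds(20000, 9), "s of pure seeking")
```
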
