Does Moore's Law Help or Hinder the PC Industry?
An anonymous reader writes to mention that two analysts recently examined Moore's Law and its effect on the computer industry. "One of the things both men did agree on was that Moore's Law is, and has been, an undeniable driving force in the computer industry for close to four decades now. They also agreed that it is plagued by misunderstanding. 'Moore's Law is frequently misquoted, and frequently misrepresented,' noted Gammage. While most people believe it means that you double the speed and the power of processors every 18 to 24 months, that notion is in fact wrong, Gammage said. 'Moore's Law is all about the density...the density of those transistors, and not what we choose to do with it.'"
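Since the density reading of the law is the whole point of the article, here is a minimal sketch of what that reading actually predicts: transistor counts compounding on a fixed doubling period, with no claim about clock speed at all. The 8086 baseline and the 24-month cadence are illustrative assumptions, not figures from the article.

```python
# Minimal sketch of the density reading of Moore's Law: transistor counts
# double on a fixed cadence; nothing here says anything about clock speed.
# The 8086 baseline (~29,000 transistors, 1978) and the 24-month doubling
# period are illustrative assumptions, not figures from the article.

def transistors(initial: int, years: float, doubling_months: float = 24) -> int:
    """Projected transistor count after `years`, doubling every `doubling_months`."""
    return int(initial * 2 ** (years * 12 / doubling_months))

for year in (1978, 1988, 1998, 2008):
    print(year, transistors(29_000, year - 1978))
# 1978: 29,000 / 1988: 928,000 / 1998: 29,696,000 / 2008: 950,272,000
```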
Both (Score:5, Insightful)
The drum beat of progress pushes development to its limits, but at the same time hinders some forms of research or real-world tests of computation theory, for all but the few chip makers currently dominating the market.
Re:Both (Score:4, Insightful)
We're also not paying US$800 for an 80387 math co-processor (it only did floating point), like a friend of mine did in the '80s. That would be about US$1,600 in today's dollars.
Re: (Score:3, Funny)
*ducks*
Re: (Score:2)
A car, discounting accident or outright neglect*, can be expected to last in excess of ten years today. Sure, you'll have probably replaced a few parts by then, but it will be pretty much the same car.
While you can get the same longevity from computers, a four year old machine is considered ancient and no longer capable of keeping up. Yes, there are still many decade old machines out there, but I'd guess them to b
I'm gonna vote for hurts - big time (Score:5, Insightful)
Re: (Score:2, Interesting)
Re: (Score:3, Insightful)
Re: (Score:2, Insightful)
Have you ever
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Interesting)
AFAIK, it was slower than x86 the day it was launched, and when Intel's "Core 2" stuff came out it got crushed in performance/watt.
Re: (Score:2)
As long as most of the programs I use are not massively multithreaded, I would use three or four cores, leaving the other 28 unused.
This Sun thing was a bright idea - too bad a dual-proc, dual-core Opteron was equivalent in overall performance while finishing individual tasks faster (from what I remember, the Opteron ran some 4 tasks at a time, finishing them in tens of milliseconds, while the Sun
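The idle-core complaint two comments up is Amdahl's law in miniature. A hedged sketch below; the 50% parallel fraction is just an assumption for illustration:

```python
# Amdahl's law: speedup is capped by the serial fraction of a program,
# so piling on cores quickly stops helping. Illustrative numbers only.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup for a program whose parallel part scales perfectly."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

for cores in (1, 2, 4, 8, 32):
    print(cores, round(amdahl_speedup(0.5, cores), 2))
# 1: 1.0 / 2: 1.33 / 4: 1.6 / 8: 1.78 / 32: 1.94 -- under 2x even on 32 cores
```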
Instruction set != architecture (Score:3, Insightful)
I'm just going to refer you to my comment made earlier today when discussing a "new, better" processor architecture. Because there's always someone who thinks we are somehow "hindered" by the fact that we can still run 30-year-old software unmodified on new hardware.
See here [slashdot.org].
Re: (Score:3, Informative)
Were it not for the opcode fetches to register-dance (because only certain registers can do certain things), or having to use memory to store intermediate results (because there aren't enough registers), or stack-based parameter passing (not enough registers), or, again, the single accumulator (more opcode fetches and more register dancing), you might have a point. But what you're suggesting (in the rest of your post) is that having 1
Re: (Score:2)
Just some older games that I at least go back to once in a while and beat again.
X-COM: UFO Enemy Unknown/Terror from the Deep
Baldur's Gate (and others in the family)
Syndicate Plus
Imperium Galactica
Starcraft
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Interesting)
there's always someone who thinks we are somehow "hindered" by the fact that we can still run 30-year-old software unmodified on new hardware.
We are, because it's just a new implementation of a crappy architecture. Apple showed that it's quite feasible to run old software on new hardware, even new hardware that had almost nothing in common with the old hardware. Intel provides x86 compatibility on Itanium; there's no reason why we can't all move to a new processor and take our old software with us. It's just that nobody's coming out with any new processors for PC-class machines.
I'd say the ability to run 30-year-old software unmodified on a m
Re: (Score:2)
Cool! Oh wait, you can't.
(Oh I know, you can run an emulator just like you can on Windows or Linux, but it's hardly the same)
But can we? (Score:3, Informative)
You're going to be *far* better off running 30-year-old software under emulation, where these things can be faked.
Re: (Score:3, Insightful)
Re: (Score:2)
Will it run Firefox faster? Make Quicken more useful? At some point Moore's law will push the price down to next to nothing.
The universe provides us with different mice (Score:2)
Re: (Score:2)
That is what will really kill the industry as we know it: when nobody needs to buy a new version of Office because Office or OpenOffice is good enough, nobody needs a faster computer, just a cheaper, lower-margin one, and nobody needs a better smartphone because they all do enough.
If PCs are powerful enough, each die shrink will just make them cheaper and cheaper until the case and power supply are the most expensive parts.
Re: (Score:2)
Besides, I'm not going to be happy until we have Star Trek voice recognition and "secretary-level" AI
Re: (Score:2)
Moore's Observation (Score:5, Insightful)
Re: (Score:2, Insightful)
I suppose the laymen need something like this to rally around, though.
Sure, double the number of transistors! But did that do anything useful? Did you gain enough performance to offset the complexity you just created? In the drive to "keep up with Moore's Law", are we better off? Are the processors now "better", or simply faster to make up for how fat they have become?
Re:Moore's Observation (Score:5, Insightful)
No one does anything in an effort to prove Moore correct... they do it for their own benefit. Intel does it to stay ahead of their competition and keep selling more processors. If they chose to stop adding transistors, they could pretty much count on losing the race to AMD, and likely becoming obsolete in a very short time.
I agree that more transistors != better... however, it is indeed the easiest, least complex way to increase performance. Changing the architecture of the chip, negotiating with software developers to support it, etc., is far more complex than adding more transistors.
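One rule of thumb behind the "more transistors != better" point is Pollack's Rule: single-core performance tends to grow only as the square root of the added complexity. A hedged sketch of that rule, not anything claimed in this thread:

```python
# Pollack's Rule (a rough empirical observation, not a law): single-core
# performance scales about as the square root of the transistor budget.
import math

def pollack_performance(transistor_ratio: float) -> float:
    """Approximate single-core performance ratio for a transistor-budget ratio."""
    return math.sqrt(transistor_ratio)

print(round(pollack_performance(2.0), 2))  # doubling transistors buys ~1.41x
print(round(pollack_performance(4.0), 2))  # quadrupling buys only ~2x
```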
Re: (Score:2)
In light of this, is the media guilty of over-hyping the concept? In other words, would we even be beating this subject into the ground if it were not for reporters heralding the "defeat of Moore's Law for 10 more years!" or some other similar headline?
Re:Moore's Observation (Score:5, Insightful)
[1] There is no real upper bound on the number of transistors you can fit on a chip, just the number you can for a given investment.
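A minimal sketch of the economic point in [1]: what bounds transistor counts is cost per good die, not a physical ceiling. Every number below is a made-up assumption, purely for illustration:

```python
# What limits transistors per chip is the investment, not a hard ceiling:
# cost per transistor = wafer cost spread over the good dice it yields.
# All figures below are hypothetical, for illustration only.

def cost_per_transistor(wafer_cost: float, dice_per_wafer: int,
                        yield_fraction: float, transistors_per_die: float) -> float:
    good_dice = dice_per_wafer * yield_fraction
    return wafer_cost / (good_dice * transistors_per_die)

# Hypothetical: $5,000 wafer, 400 candidate dice, 70% yield, 100M transistors/die
print(cost_per_transistor(5_000, 400, 0.70, 100e6))  # ~1.8e-07 dollars each
```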
Re: (Score:2)
Re: (Score:2, Insightful)
Seriously.
Moore's "law" doesn't mean squat. It's not like gravity. It's more like noticing that I've never had a car accident.
Then, one day, I will, and the "Law of Magical Excellent Driving" that I've been asserting has been an invisible hand guiding my car around will be violated. Oh noes! How could this have happened?! How did this law which had protected my safety for all those years suddenly fail to apply?
Yeah. Right.
Has the existence of Moore's law changed anything? (Score:2)
Re: (Score:2)
You can't ignore the fact that you would be a different driver today if your "Law of Magical Excellent Driving" had not upheld itself. If you were regularly involved in accidents, that would affect your opinions of buy
Well... (Score:2)
I'm right (Score:2)
Here [kurzweilai.net] is some background reading for you.
No significances. (Score:5, Insightful)
Re:No significances. (Score:5, Insightful)
Re: (Score:2)
That's a 30% linear scale reduction, which is something that any engineer would be happy to pursue for the next version of their equipment.
Ask them to make it 50% smaller scale in the next
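The arithmetic behind the parent's 30% figure, as a quick sketch: a 30% linear shrink is exactly what one Moore's Law density doubling works out to.

```python
# A 0.7x linear shrink halves the area of each device, i.e. one density
# doubling -- which is why ~30% per generation is the recurring target.

linear_shrink = 0.7              # 30% linear scale reduction
area_ratio = linear_shrink ** 2
print(round(area_ratio, 2))      # 0.49: each transistor takes about half the area
print(round(1 / area_ratio, 2))  # 2.04: roughly twice the transistors per mm^2
```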
Re: (Score:2)
On the level of individual engineers, I doubt there are many of them out there kept up at night worrying about how their progress relates to the general progress curve over the past couple decades. Simply put, we've all got better things to worry about.
Moore's Law is more a descrip
Re: (Score:2)
Answer to the question? (Score:3, Interesting)
It could be... (Score:4, Funny)
Efficiency (Score:4, Interesting)
Re: (Score:2)
Case in point: A business application my boss recently bought. Client/server app, with most of the intelligence in the client. They recommended a 4 GHz Pentium 4 at a minimum. Did such a thing even exist?
Re: (Score:3, Interesting)
Well, you can have software that's feature-rich, stable, cheap, fast, or resource-efficient: pick any two (yes, you still only get two). Let faster processors handle speed and GB sticks of memory handle resource efficiency, and let coders concentrate on the other three. The margin between "this will be too slow it doesn't matter what we d
Murphy's law... (Score:4, Funny)
Murphy tells us that more bugs will be found on release day than any day previous. That your laptop will work fine until the very minute your presentation is scheduled to begin. And that backup generators are unnecessary unless you don't have them.
Who cares about Moore's law... it's just prophecy from some Nostradamus wannabe.
density of transistors? (Score:3, Funny)
Re: (Score:2, Funny)
My second law of 'density' states that the PR intelligence quotient is randomly modulated by Schroedinger's cat in the next room, and is only measurable when not actually listening to it.
Re: (Score:3, Funny)
In My Opinion, It Isn't a Law (Score:5, Insightful)
Moore (or Mead, for that matter) didn't get up one day and declare that the number of transistors on a square centimeter of silicon will double every 18 to 24 months. Nor did he prove in any way that it has always been this way and will always be this way.
He made observations, and these observations happen to have held true for a relatively long time in the world of computers. Does that make them a law? Definitely not! At some point, the quantum nature of small particles will either stop us dead in our tracks or (in the case of quantum computers) propel us forward much faster than ever thought.
Why debate whether a well-made observation helps or hinders the industry when it's the industry doing it to itself?!
Re: (Score:2)
Because it sells ad space on a web page which has been slashdotted. Duh.
Re: (Score:2)
Because clearly, clearly, if this law is a problem, we should repeal it or, nay, make a NEW law that keeps the industry safe. Yes. Clearly we need government intervention here to keep the number of transistors over time to a safe minimum. Terrorists. This will protect the industry (terrorists!).
Once they're done with this, they can pass a law to prevent gunpowder from igniting unless the u
two issues with the "it's bad" camp (Score:2)
I don't understand this perspective - especially on the enterprise side. Did the applications you were running suddenly slow down because a new CPU came out? Then why lament the rate of progress?
One valid argument is the frustration of having to upgrade hardware to get acceptable performance on
Better Summary (Score:5, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Sort of... (Score:2)
It seems like the ultimate limiting factor is in packaging and testing - you'll be spending a certain amount for a fully-tested chip in a plastic package, no matter what the actual chip is. That price will have more to
Definitely Both (Score:2, Insightful)
The Real Story (Score:4, Insightful)
Re: (Score:2)
hides crappy software (Score:2)
Re: (Score:2)
Why? (Score:4, Interesting)
Ten years ago I wouldn't believe I would ever ask such a question but I have been asking it recently as my retired parents are looking to buy a computer for the web, writing letters and emails. I've told them specifically "DO NOT BUY VISTA" (why on earth would anyone want that ugly memory-hog?), so I just can't think of a single reason why they need even one of the medium-spec machines.
Personally, I like my games, so "the faster the better" will probably always be key. But for the vast majority of people what is the point of a high-spec machine?
Surely a decent anti-spyware program is a much better choice.
Re: (Score:2)
Yet inevitably I eventually encounter a situation where my computer is having trouble keeping up, and I'm reminded that, yes, I would indeed like to have a faster computer, and I'd be willing to pay for it (up to some level, obviously).
These "I want more speed" situations don't come up that frequently, but they do come up. And I can think of millions of ways that having arbitrarily more computing power could be put to
Re: (Score:2)
Ten years ago I wouldn't believe I would ever ask such a question but I have been asking it recently as my retired parents are looking to buy a computer for the web, writing letters and emails. I've told them specifically "DO NOT BUY VISTA" (why on earth would anyone want that ugly memory-hog?), so I just can't think of a single reason why they need even one of the medium-spec machines.
People have been asking this question since there have been PCs. The (N-1)th (and usually the (N-2)th) generation of PCs alw
Re: (Score:2)
It seems to me that the upgrade cycle is an extremely artificial one with Microsoft and PC Vendors both working together to force people to upgrade when they really do not need or even particularly want to.
Take Windows Vista: it is basically a clone of Windows XP (with a few freeware apps built in), yet somehow (I've yet to figure out why) it consumes so much memory that a new computer is virtually required for anyone
Re: (Score:2)
Yet even today I can point you at a few real business applications which could really benefit from more power. I have no doubt whatsoever that in a few years time, they'll be OK on anything you're likely to run at them, but another troop of applications will have come along which require more power.
Re: (Score:2)
Medicine, physics, engineering, and AI all benefit from increasing computer power. There are probably numerous other fields that benefit secondarily, but those are probably the most important. Protein folding and cellular simulation will ultimately do more for medicine than anything in previous history, probably the same with physics and engineering. Nanotechnology will require massive computing power to design and test.
Computers would not g
Cost of fabs... (Score:3, Interesting)
In fact, there's a lot of debate over whether Moore's Law will break down due to fundamental barriers in the physics, or whether we will first hit an economic wall: no bank will be willing (or able?) to fund the fantastically expensive construction of the new technologies.
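A hedged sketch of that economic wall, using Rock's Law (fab construction cost doubles roughly every four years); the $3B starting figure is an illustrative assumption, not data from the thread:

```python
# Rock's Law: the cost of a semiconductor fab doubles about every four
# years. The starting cost below is a made-up illustration.

def fab_cost(initial_cost_billion: float, years: float,
             doubling_years: float = 4) -> float:
    return initial_cost_billion * 2 ** (years / doubling_years)

# If a fab costs $3B today, the projection 20 years out:
print(fab_cost(3.0, 20))  # 96.0 -- $96B, hard for any bank to finance
```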
Re: (Score:2)
Re: (Score:2, Funny)
Rock's Law??? Tablizer's Law: The number of tech "laws" doubles every 2 years.
Here come the pedants (Score:2, Insightful)
Re: (Score:2)
misunderstanding is rampant everywhere (Score:2)
Re: (Score:2)
Re: (Score:2)
Performance (Score:2)
I have tried to find out, but didn't get a clear enough answer from what is publicly available on the internet.
Moore's Law is a crappy measurement (Score:4, Insightful)
Take the Itanic, for example, or the P4, or Windows ME/Vista.
Re: (Score:2)
Obviously (Score:2)
Computers only 12 months old with a _gigabyte_(!) of RAM are not robust enough to run a full install of Vista with all the bells and whistles, for example.
--
BMO
Yeah, mod me troll, bitch. I've got more karma than you've got mod points.
Why we need faster computers (Score:3, Insightful)
Despite this, there have been complaints from the PC industry that Vista isn't enough of a resource hog to force people to buy new hardware.
Computers have become cheaper. I once paid $6000 for a high-end PC to run Softimage|3D. The machine after that was $2000. The machine after that was $600.
Re: (Score:2)
Even -today- I still know people who have shiny brand new WinXP CDs that they won't use, preferring 98 or ME with all their problems (and mostly unsupp
help or hinder ? (Score:2)
Simple answers from an old Guru (Score:5, Interesting)
"Software is decelerating faster than hardware is accelerating."
Re: (Score:2)
That's no Law... (Score:2)
Moore's law co-opted by the suits (Score:2)
However, the paid talking head pundits grab it and start talking about it and dissecting it and taking it literally. It's not a topic for geeks any more, it's not funny, and it's stupid to be discussing it in an article.
I propose a real law. A legal law.
Definitely helps me. (Score:2)
Mainly hinder, but both! (Score:2)
It's a monster hindrance for mainstream computing. Having all this processing power available to you, coupled with cheap memory, means you can be as lazy as you want when you write software. I do systems integration work for a large company, and the
And it has a bad effect on the CPI... (Score:2)
Hurts - big time (Score:2)
When excessive amounts of memory and processor speeds allow you to release software which by any stretch of the imagination is "bloa
Re: Hurts - big time (Score:2)
There are two kinds of sites in the world -- those who give you explicitly what you request -- and those who attempt to feed you much more than that.
I can cite dozens of sites which only give you explicitly what one requests -- many coll
Wrong Question (Score:2)
Up until a few years ago, more performance and memory resulted in a distinct return on investment. Right now, most machines are "good enough" for present apps. I predict a shift to system on a chip designs driving small reasonably powerful systems like the OLPC.
The problem is the industry adapting to this new model.
I dislike how people think there are no rules (Score:2)
Here's a more interesting question: (Score:4, Funny)
Discuss.
Re: (Score:2)
Re: (Score:3, Interesting)
Boot time is constrained by hard drive seek times, not CPU throughput. Today's hard drives have only marginally better seek times than drives from 1998. PCs haven't improved much in terms of latency at all.
But few developers seem to be aware of this, which is probably one of the reasons many types of apps start even slower than they used to. Many apps abuse the filesystem as a database. My system has
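A back-of-envelope sketch of the parent's point (drive timings are rough assumptions for a circa-2007 desktop disk, not measurements): when startup means opening hundreds of small files, seeks dominate and CPU speed barely matters.

```python
# Startup cost model: each small file costs a seek; transfer time is
# negligible by comparison. Timings are rough assumptions only.

SEEK_MS = 9.0          # avg seek + rotational latency, barely better than 1998
READ_MB_PER_S = 60.0   # sequential throughput, vastly better than 1998

def startup_time_ms(n_files: int, avg_file_kb: float) -> float:
    seek = n_files * SEEK_MS
    transfer = (n_files * avg_file_kb / 1024) / READ_MB_PER_S * 1000
    return seek + transfer

# 500 small config/resource files of 8 KB each:
print(round(startup_time_ms(500, 8)))  # ~4565 ms, of which 4500 ms is seeking
```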