Forget Moore's Law?

Roland Piquepaille writes "On a day when CNET News.com releases a story titled "Moore's Law to roll on for another decade," it's refreshing to look at another view. Michael S. Malone says we should forget Moore's law, not because it isn't true, but mainly because it has become dangerous. "An extraordinary announcement was made a couple of months ago, one that may mark a turning point in the high-tech story. It was a statement by Eric Schmidt, CEO of Google. His words were both simple and devastating: when asked how the 64-bit Itanium, the new megaprocessor from Intel and Hewlett-Packard, would affect Google, Mr. Schmidt replied that it wouldn't. Google had no intention of buying the superchip. Rather, he said, the company intends to build its future servers with smaller, cheaper processors." Check this column for other statements by Marc Andreessen or Gordon Moore himself. If you have time, read the long Red Herring article for other interesting thoughts."
  • Upgrading Good... (Score:4, Insightful)

    by LordYUK ( 552359 ) <jeffwright821@noSPAm.gmail.com> on Tuesday February 11, 2003 @09:39AM (#5278858)
    ... But maybe Google is more attuned to the mindset of "if it ain't broke, don't fix it"?

    Of course, in true /. fashion, I didn't read the article...
  • Misapprehensions (Score:5, Insightful)

    by shilly ( 142940 ) on Tuesday February 11, 2003 @09:41AM (#5278864)
    For sure, Google might not need the latest processors...but other people might. Mainframes don't have fantastic computing power either -- 'cos they don't need it. But for those of us who are busy doing things like digital video, the idea that we have reached some sort of computing nirvana where we have more power than we need is laughable. Just because your favourite word processor is responsive doesn't mean you're happy with the performance of all your other apps.
  • by betanerd ( 547811 ) <segatech&email,com> on Tuesday February 11, 2003 @09:43AM (#5278873) Homepage
    Why is it called Moore's Law and not Moore's Theorem? Doesn't "Law" imply that it could be applied to all situations in all times and still be true? Or am I reading way too much into this?
  • by g4dget ( 579145 ) on Tuesday February 11, 2003 @09:46AM (#5278884)
    Assume, for a moment, that we had processors with 16-bit address spaces. Would it be cost-effective to replace our desktop workstations with tens of thousands of such processors, each with 64 KB of memory? I don't think so.

    Well, it's not much different with 32-bit address spaces. It's easy in tasks like speech recognition or video processing to use more than 4 GB of memory in a single process. Trying to squeeze that into a 32-bit address space is a major hassle. And it's also soon going to be more expensive than getting a 64-bit processor.

    The Itanium and Opteron are way overpriced in my opinion. But 64-bit is going to arrive--it has to.
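    A minimal sketch of the address-space arithmetic in the comment above (Python; the struct-based pointer-width check is an illustrative assumption, not something from the comment):

        import struct

        # Width of a native pointer on this machine, in bits (4 bytes -> 32, 8 bytes -> 64).
        pointer_bits = struct.calcsize("P") * 8

        # Maximum bytes a flat address space of that width can address.
        max_bytes = 2 ** pointer_bits

        print(f"{pointer_bits}-bit pointers -> {max_bytes / 2**30:.0f} GiB addressable")
        # 16-bit: 64 KiB, 32-bit: 4 GiB (hence the squeeze described above),
        # 64-bit: roughly 17 billion GiB.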

  • by Lukano ( 50323 ) on Tuesday February 11, 2003 @09:48AM (#5278898)
    I've run into similar situations with clients of mine when trying to figure out which solution would be best for their new servers, etc.

    Time and time again, it always comes down to:

    Buy them small and cheap, put them all together, and that way if one dies, it's a hell of a lot easier and less expensive to replace/repair/forget.

    So Google's got the right idea, they're just confirming it for the rest of us! :)
  • by ergo98 ( 9391 ) on Tuesday February 11, 2003 @09:48AM (#5278899) Homepage Journal
    Google is of the philosophy of using large clusters of basically desktop computers rather than mega-servers, and we've seen this trend for years; it hardly spells the end of Moore's Law. Google is taking just as much advantage of Moore's Law as anyone: they're just buying at a sweet point. While the CEO might forebodingly proclaim their separation from those new CPUs, in reality I'd bet it's highly likely they'll be running 64-bit processors once the pricing hits the sweet spot.

    This is all so obtuse anyways. These articles proclaim that Moore's Law is some crazy obsession, when in reality Moore's Law is more of a marketing law than a technical law: if you don't appreciably increase computing power year over year, no new killer apps will appear (because the market isn't there) to encourage owners of older computers to upgrade.
  • by Des Herriott ( 6508 ) on Tuesday February 11, 2003 @09:49AM (#5278905)
    It's not even a theorem (which is what I assume you meant) - that would imply that some kind of mathematical proof exists. Unless you meant "theory"?

    "Moore's Theory" or "Moore's Rule of Thumb" would be the best name for it, but "Moore's Law" sounds a bit catchier. Which, I think, is really all there is to it.
  • by e8johan ( 605347 ) on Tuesday February 11, 2003 @09:50AM (#5278917) Homepage Journal

    This is where FPGAs and other reconfigurable hardware will enter. There are already transparent solutions that convert C code to both machine code and hardware (i.e. a bitstream to download into an FPGA on a PCI card).

    When discussing video and audio editing, you must realize that the cause of the huge performance need is not the complexity of the task, but the lack of parallel work in a modern PC. As a matter of fact, smaller computing units, perhaps thousands of CUs inside a CPU chip, would give you better performance when editing video (if the code were adapted to take advantage of them) than a superchip from Intel.

    If you want to test parallelism, bring together a set of Linux boxes and run MOSIX. It works great!
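    A minimal sketch of the frame-parallel idea described above, using Python's multiprocessing pool as a stand-in for "many small computing units" or MOSIX nodes; the brightness filter and frame data are made up for illustration:

        from multiprocessing import Pool

        def filter_frame(frame):
            # Stand-in per-frame operation: scale every sample, clamp to 8 bits.
            return [min(255, int(sample * 1.2)) for sample in frame]

        if __name__ == "__main__":
            # 1,000 fake "frames", each a flat list of pixel samples.
            frames = [[(i + j) % 256 for j in range(64)] for i in range(1000)]

            # Frames are independent, so they can be spread across as many
            # cheap processors (or cluster nodes) as are available.
            with Pool(processes=4) as pool:
                processed = pool.map(filter_frame, frames)

            print(len(processed), "frames processed")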

  • by Anonymous Coward on Tuesday February 11, 2003 @09:52AM (#5278930)
    I mean the guy was involved in Netscape.

    He hit the lottery. He was a lucky stiff. I wish I was that lucky.

    But that's all it was. And I don't begrudge him for it. But I don't take his advice.

    As for Google: figure it out yourself.

    Google isn't driving the tech market. What's driving it are new applications like video processing that, guess what... need much faster processors than we've got now.

    So while Google might not need faster processors, new applications do.

    And I say that loving Google, but it's not cutting edge in terms of hardware. They have some good search algorithms.
  • by stiggle ( 649614 ) on Tuesday February 11, 2003 @09:55AM (#5278947)
    64-bit has been here for a while; it's called the Alpha processor, and it works very nicely.

    Why stay stuck in the Intel world? There's more to computers than what you buy from Dell.

  • Because (Score:3, Insightful)

    by tkrotchko ( 124118 ) on Tuesday February 11, 2003 @09:55AM (#5278951) Homepage
    The expectation that computing power will (essentially) double every 18 months drives business planning at chip makers, fab makers, software developers, everything in the tech industry. In other words, it becomes a self-fulfilling prophesy.

    I'm not doing it real justice, but Google (ironic, eh?) the effects of Moore's law for a much better explanation.
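    A quick back-of-the-envelope check of the 18-month doubling figure cited above (a Python sketch; 18 months is the popularly quoted period rather than Moore's original wording):

        months = 120                      # one decade
        doubling_period = 18              # months per doubling, as cited above
        growth = 2 ** (months / doubling_period)
        print(f"x{growth:.0f} in {months // 12} years")   # roughly a 100-fold increase
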
  • by praetorian_x ( 610780 ) on Tuesday February 11, 2003 @09:56AM (#5278955)

    "The rules of this business are changing fast," Mr. Andreessen says, vehemently poking at his tuna salad. "When we come out of this downturn, high tech is going to look entirely different."
    *gag* Off topic, but has *anyone* become as much of a caricature of themselves as Andreessen?

    This business is changing fast? Look entirely different? Thanks for the tip Marc.

    Cheers,
    prat
  • by rillian ( 12328 ) on Tuesday February 11, 2003 @09:57AM (#5278958) Homepage

    Google had no intention of buying the superchip. Rather, he said, the company intends to build its future servers with smaller, cheaper processors.

    How is this not Moore's law? Maybe not in the strict sense of number of transistors per CPU, but it's exactly that increase in high-end chips that makes mid-range chips "smaller, cheaper" and still able to keep up with requirements.

    That's the essence of Moore's law. Pretending it isn't is just headline-writing manipulation, and it's stupid.

  • by Hays ( 409837 ) on Tuesday February 11, 2003 @09:58AM (#5278965)
    They're not saying they don't want faster processors with larger address spaces -- who wouldn't? They're simply saying that the price/performance ratio is likely to be poor, and they have engineered a good solution using cheaper hardware.

    Naturally there are many more problems which cannot be parallelized and are not so easily engineered away. Google's statement is no great turning point in computing. Faster processors will continue to be in demand, as they tend to offer better price/performance ratios eventually, even for server-farm situations.

  • by LinuxXPHybrid ( 648686 ) on Tuesday February 11, 2003 @09:58AM (#5278968) Journal
    > For sure, Google might not need the latest processors...but other people might.

    I agree. Also, the article's conclusion that big companies have no future because Google has no intention of investing in new technology is premature. Google is a great company, a great technology company, but it is just one of many. Google probably does not represent the very edge of cutting-edge technology, either. Stuff like molecular dynamics simulation requires more computing power; I'm sure that people who work in such areas can't wait to hear Intel, AMD, and Sun announce faster processors, 64-bit, and more scalability.
  • Mushy writing (Score:5, Insightful)

    by icantblvitsnotbutter ( 472010 ) on Tuesday February 11, 2003 @10:00AM (#5278982)
    I don't know, but am I the only one who found Malone's writing to be mushy? He wanders around, talking about how Moore's Law applies to the burst Web bubble, that Intel isn't surviving because of an inability to follow its founder's law, and yet that we shouldn't be enslaved by this "law".

    In fact, the whole article is based around Moore's Law still applying, despite being "unhealthy". Well, duh. I think he had a point to make somewhere, but lost it on the way to the deadline. Personally, I would have appreciated more concrete reasons why Google's bucking the trend is so interesting (to him).

    He did bring up one very interesting point, but didn't explore it enough to my taste. Where is reliability in the equation? What happens if you keep all three factors the same, and use the cost savings in the technology to address failure points?

    Google ran into bum hard drives, and yet the solution was simply to change brands? The people who are trying to address that very need would seem to be a perfect fit for a story about why Moore's Law isn't the end-all be-all answer.
  • by Shimbo ( 100005 ) on Tuesday February 11, 2003 @10:02AM (#5278993)
    It's a prediction that has held pretty true. It's a good benchmark, but it is not a true Law.

    The majority of laws are empirical in nature. Even Newton's laws of motion don't come from the theory; rather, they are axioms that underlie it.
  • by Jack William Bell ( 84469 ) on Tuesday February 11, 2003 @10:02AM (#5278999) Homepage Journal
    The problem is that cheaper processors don't make much money -- there isn't the markup on commodity parts that there is on the high end. The big chip companies are used to charging through the nose for their latest and greatest and they use much of that money to pay for the R & D, but the rest is profit.

    However profit on the low end stuff is very slight because you are competing with chip fabs that don't spend time and money on R & D; buying the rights to older technology instead. (We are talking commodity margins now, not what the market will bear.) So if the market for the latest and greatest collapses the entire landscape changes.

    Should that occur, my prediction is that R & D will change from designing faster chips to getting better yields from the fabs. Because, at commodity margins, it will be all about lowering production costs.

    However, I think it is still more likely that, Google aside, there will remain a market for the high end large enough to continue to support Intel and AMD as they duke it out on the technological edge. At least for a while.
  • by drix ( 4602 ) on Tuesday February 11, 2003 @10:04AM (#5279008) Homepage
    Right, thank you, glad someone else got that. No one is saying that Google has abandoned Itanium and 64-bit-ness for good. Read that question in the context of the article, and what Schmidt is really being asked is how the arrival of Itanium will affect Google. And of course the answer is that it won't, since as we all know Google has chosen the route of 10000 (or whatever) cheap Linux-based Pentium boxes in place of, well, an E10000 (or ten). But that sure doesn't mean Google is swearing off 64-bit for good--just that it has no intention of buying the "superchip." But bet your ass that when Itanium becomes more readily available and cheap, a la the P4 today, when Itanium has turned from "superchip" to "standardchip," Google will be buying them just as voraciously as everyone else. So these doomsday prognostications that Malone flings about don't seem that foreboding to me--Itanium will sell well, just not as long as it's considered a high-end niche item. But that never lasts long anyway. One year ago's high-end niche processor comes standard on every PC at CompUSA today.
  • by binaryDigit ( 557647 ) on Tuesday February 11, 2003 @10:05AM (#5279017)
    For their application, having clusters of "smaller" machines makes sense. Let's compare this to eBay.

    The data Google deals with is not real-time. They churn on some data and produce indices. A request comes in to a server; that server could potentially have its own copy of the indices and can access a farm of servers that hold the actual data. The fact that the data and indices live on farms is no big deal, as there is no synchronization requirement between them. If server A serves up some info but is 15 minutes behind server Z, that's OK. This is a textbook application for distributed, non-stateful server farms.

    Now eBay: ALL their servers (well, the non-listing ones) HAVE to be going after a single or synchronized data source. Everybody MUST have the same view of an auction, and all requests coming in have to be matched up. The "easiest" way to do this is by going against a single data repository (single in the sense that the data for any given auction must reside in one place; different auctions can live on different servers, of course). All this information needs to be kept up on a real-time basis. So eBay also has the issue of transactionally updating data in real time. Thus their computing needs are significantly different from those of Google.
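    A minimal sketch of the contrast drawn above, with hypothetical data and names (Python): stale-tolerant index lookups can hit any replica, while an auction bid has to go through a single authoritative record under a lock.

        import random
        import threading

        # Search side: several index replicas that may lag each other by minutes.
        index_replicas = [
            {"moore's law": ["example.com/a", "example.com/b"]},
            {"moore's law": ["example.com/a"]},          # slightly stale copy
        ]

        def search(term):
            # Any replica will do; a stale answer is acceptable.
            return random.choice(index_replicas).get(term, [])

        # Auction side: one authoritative record, updated under a lock so every
        # client sees the same current high bid.
        auction = {"high_bid": 100}
        auction_lock = threading.Lock()

        def place_bid(amount):
            with auction_lock:
                if amount > auction["high_bid"]:
                    auction["high_bid"] = amount
                    return True
                return False

        print(search("moore's law"))
        print(place_bid(120), auction["high_bid"])
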
  • by Zog The Undeniable ( 632031 ) on Tuesday February 11, 2003 @10:10AM (#5279051)
    Is it possible that chip manufacturers feel they have to deliver new products in accordance with ML but not exceed it? Apparently Intel have had 8GHz P4s running (cooled by liquid nitrogen, but you had to do this to get fairly modest overclocks not so long ago).

    I fully expect this to get modded down, but I still think chip manufacturers are deliberately drip-feeding us incremental speeds to maximise profits. There's not much evidence of a paradigm shift on the horizon; Hammer is an important step but it's still a similar manufacturing process. As a (probably flawed) analogy, if processing power became as important to national security as aircraft manufacture in WWII, look how fast progress could be made!

  • by iion_tichy ( 643234 ) on Tuesday February 11, 2003 @10:12AM (#5279058)
    Whether you use a superchip or several low-cost chips, the computing power at your disposal still grows exponentially, I guess. So no refutation of Moore's law.
  • what moore said.. (Score:5, Insightful)

    by qoncept ( 599709 ) on Tuesday February 11, 2003 @10:12AM (#5279063) Homepage
    I think people are missing the point of Moore's law. When he said he thought transistors would double every 2 years, that's what he thought would happen. That's not a rule set that anyone has to follow (which, as far as I can figure, is the only way it could be "dangerous": people might try to increase the number of transistors to meet it rather than do whatever else might be a better idea). It's not something he thought would always be the rule, forever, no matter what. The fact that he's been right for 35 years already means he was more right than he could have imagined.
  • Re:But why? (Score:3, Insightful)

    by micromoog ( 206608 ) on Tuesday February 11, 2003 @10:14AM (#5279076)
    Image editing has been around for many years now, and there's still a much smaller percentage of people doing that than basic email/word processing. Video editing will always be a smaller percentage still.

    Believe it or not, there's a large number of people that don't see their computer as a toy, and really only want it to do things they need (like write letters). Just because the power's there doesn't mean a ton of people will suddenly become independent filmmakers (no matter what Apple's ads tell you).

  • by Lumpy ( 12016 ) on Tuesday February 11, 2003 @10:14AM (#5279077) Homepage
    Software over the past 20 years has gotten bigger, not better. We don't do anything different from what I was able to do in 1993. And it doesn't affect just Windows and commercial apps. Linux and its flotilla of apps are all affected. GNOME and KDE are bigger and not better. They do not do the desktop thing any better than they did 5 years ago. Sure, small features have finally been fixed, but at the cost of adding 100 eye-candy options for every fix. Mozilla is almost as big as IE, OpenOffice is still much larger than it needs to be, and X Windows hasn't been on a diet for years.

    Granted, it is much MUCH worse on the Windows side. Kiplinger's TaxCut is 11 megabytes in size for the executable... FOR WHAT? Eye candy and other useless features that don't make it better... only bigger.

    Too many apps and projects add things for the sake of adding them... to look "pretty" or just for silly reasons.

    I personally still believe that programmers should be forced to run and program on systems that are 1/2 to 1/3rd of what is typically used. This will force the programmers to optimize or find better ways to make that app or feature work.

    It sounds like Google is tired of getting bigger and badder only to watch it become no faster than what they had only 6 months ago, after the software and programmers slow it down.

    Remember, everyone... X Windows and a good window manager in Linux RAN VERY WELL on a 486 with 16 megs of RAM and a decent video card. Today there is no chance in hell you can get anything but Blackbox and a really old release of X to run on that hardware (luckily the Linux kernel is scalable and it happily runs all the way back to the 386).

  • by Anonymous Coward on Tuesday February 11, 2003 @10:19AM (#5279100)
    Your lead and the Red Herring story have for some reason missed the point and are misleading. There is no objection whatsoever to faster, more powerful processors. The problem is the high power bills.
  • Oh Really? (Score:2, Insightful)

    by plasticmillion ( 649623 ) <matthew@allpeers.com> on Tuesday February 11, 2003 @10:23AM (#5279126) Homepage
    This article is certainly thought-provoking, and it is always worthwhile to challenge conventional wisdom once in a while. Nonetheless, I can't shake the feeling that this is a lot of sound and fury about nothing. As many others have pointed out, Google's case may not be typical, and in my long career in the computer industry I seem to remember countless similar statements that ended up as more of an embarrassment to the speaker than anything remotely prescient (anyone remember Bill Gates's claim that no one would EVER need more than 640K of RAM?).

    I use a PC of what would have been unimaginable power a few short years ago, and it is still woefully inadequate for many of my purposes. I still spend a lot of my programming time optimizing code that I could leave in its original, elegant but inefficient state if computers were faster. And in the field of artificial intelligence, computers are finally starting to do useful things, but are sorely hampered by insufficient processing power (try a few huge matrix decompositions -- or a backgammon rollout! -- and you'll see what I mean).

    Perhaps the most insightful comment in the article is the observation that no one has ever won betting against Moore's Law. I'm betting it'll be around another 10 years with change. Email me if you're taking...

  • by jacquesm ( 154384 ) <j AT ww DOT com> on Tuesday February 11, 2003 @10:24AM (#5279131) Homepage
    With all respect for Moore's law (and even if it is called a law, it's no such thing, since it approaches infinity really rapidly and that's a physical impossibility): killer apps and hardware have very little to do with each other. While hardware can enable programmers to make 'better' software, the basic philosophy does not change a lot, with the exception of gaming.


    Computers are productivity tools, and a 'Google'-like application would have been perfectly possible 15 years ago; the programmers would have had to work a little bit harder to achieve the same results. Nowadays you can afford to be reasonably lazy. It's only an economics thing, where cost of development and cost of hardware balance at an optimum.


    In that light, if Google were developed 15 years ago it would use 286s, and if it were developed 15 years from now it would use whatever is in vogue and at the economically right price point for that time.

  • by Dr. Spork ( 142693 ) on Tuesday February 11, 2003 @10:31AM (#5279188)
    I think you're exactly right, and I find it incomprehensible that the author of an article on Moore's law does not even know how it goes. It has always been an index of performance per unit of cost, and of how this ratio changes with time. The author seems to think it's all about how chips get faster and faster, and that's an oversimplification we don't even need to make for a schoolchild.

    Google are taking advantage of cheap, high-performing chips, exactly the things predicted by Gordon Moore.

  • by jj_johny ( 626460 ) on Tuesday February 11, 2003 @10:35AM (#5279225)
    Here is the real deal about Moore's law and what it means. If you don't take Moore's law into account, it will eventually change the dynamics of your industry and cause great problems for most companies.

    Example 1 - Intel - This company continues to pump out faster and faster processors. They can't stop making new processors, or AMD or someone else will. The cost of making each processor goes up, but the premium for new, faster processors continues to drop as fewer people need the absolute high end. So if you look at Intel's business 5 years ago, they always had a healthy margin at the high end. That is no longer the case, and if you extrapolate out a few years, it is tough to imagine that Intel will be the same company it is today.

    Example 2 - Sun - These guys always did a great job of providing tools to companies that needed the absolute fastest machines to make it work. Unfortunately, Moore's law caught up and made their systems a luxury compared to lots of other manufacturers.

    The basic problem that all these companies have is that Moore's Law eventually changes every business into a low end commodity business.

    You can't stop the future. You can only simulate it by stopping progress

  • no need for speed (Score:4, Insightful)

    by MikeFM ( 12491 ) on Tuesday February 11, 2003 @10:35AM (#5279229) Homepage Journal
    Seriously, at this point most people don't need 1 THz CPUs. What most people need is cheaper, smaller, more energy-efficient, cooler CPUs. You can buy 1 GHz CPUs now for the cost of going to dinner. If you could get THOSE down to $1 each so they could be used in embedded apps from clothing to toasters, you would be giving engineers, designers, and inventors a lot to work with. You'd see a lot more innovation in the business at that price point. Once powerful computing had spread into every device we use, THEN new demand for high-end processors would grow. The desktop has penetrated modern life - so it's dead - time to adjust to the embedded world.
  • by ergo98 ( 9391 ) on Tuesday February 11, 2003 @10:35AM (#5279231) Homepage Journal
    Killer apps and hardware have very little to do with each other. While hardware can enable programmers to make 'better' software the basic philosophy does not change a lot, with the exception of gaming.

    Killer apps and hardware have everything to do with each other. Could I browse the Internet on an Atari ST? Perhaps I could do lynx-like browsing (and did); however, the Atari ST didn't even have the processor capacity to decompress a JPEG in less than a minute (I recall running a command-line utility to decompress those sample JPEGs hosted on the local BBS to ooh and ahh over the graphical prowess). Now we play MP3s and multitask without even thinking about it, and we wouldn't accept anything less. As I mentioned in another post, I believe the next big killer app that will drive the hardware (and networking) industry is digital video: when every grandma wants to watch 60-minute videos of her grandchild over the "Interweeb," suddenly there will be a massive demand for the bandwidth and the computation power (I've yet to see a computer that can compress DV video in real time).
  • King Canute (Score:2, Insightful)

    by lugumbashi ( 321346 ) on Tuesday February 11, 2003 @10:36AM (#5279238)
    You can no more "Forget Moore's Law" than you can roll back history. It is driven by competition. It would be commercial suicide for AMD or Intel to decide enough was enough and declare, "there you go that ought to be fast enough for you".


    In any case, the article shows a fundamental misunderstanding of the industry and its driving forces. The principal driving force is to lower costs, and this is the chief effect of Moore's law. The focus is not on building supercomputers but super-cheap computers. Of course this has the effect of lowering the cost of supercomputers as well. The anecdote from Google is a perfect example of the benefits of Moore's law, not a sign of it becoming redundant or dangerous.


    Some of the biggest changes are seen in the embedded world - e.g. mobile phones. Intel's vision is of putting radios on every chip.

  • by Futurepower(R) ( 558542 ) on Tuesday February 11, 2003 @10:39AM (#5279264) Homepage

    "His words were both simple and devastating: when asked how the 64-bit Itanium, the new megaprocessor from Intel and Hewlett-Packard, would affect Google, Mr. Schmidt replied that it wouldn't. Google had no intention of buying the superchip. Rather, he said, the company intends to build its future servers with smaller, cheaper processors."

    The parent comment is correct, but the entire issue is confused. In a few years, the Itanium will be the cheapest processor available, and Google will be using it.
  • Why the Debate? (Score:2, Insightful)

    by rhkaloge ( 208983 ) on Tuesday February 11, 2003 @10:43AM (#5279307)
    Why do people even give ink to "Will Moore's Law Hold Up?" debates? I always thought of it as a neat novelty to open PowerPoint presentations with. I somehow doubt that Intel has a mandate to "keep up with Moore's Law" or anything. It's really only a "Law" when applied in retrospect anyway.
  • by Arcturax ( 454188 ) on Tuesday February 11, 2003 @10:57AM (#5279413)
    64 bit will arrive, but the point of the article is that it may not arrive as fast as Moore said it would.

    Right now, we are at the point where it's just a waste to build bigger and bigger hammers when you can get 100 smaller hammers to do more than a few bigger hammers, and do it more quickly, cheaply, and efficiently.

    Parallel computing is really coming of age now for consumers and small businesses. While in the past only a big megacorp or the government could afford a Cray-class machine, now you can build equivalent power (maybe not up to today's supercomputers, but certainly equivalent to ones 10 years ago, which is still pretty significant) in your basement with a few Power Macs/PCs, some network cable, and open source software for clustering.

    So it makes more sense for Google to invest in a load of current technology and use it in the most efficient way possible than to spend money on expensive and untested (in the "real world") hardware.

    After all, just take a look at what Apple's done with the Xserve. Affordable, small, efficient clustering capability for business. Two CPUs per machine, and you can Beowulf them easily. Add in the new X-Raid and you have yourself a powerful cluster that probably (even at Apple's prices) will cost a lot less than a bunch of spanking-new Itanium machines.

    64 bit will arrive (Probably when Apple introduces it ;), but it will just take a bit longer since we can get a lot out of what we already have.
  • Missing the point (Score:2, Insightful)

    by raduf ( 307723 ) on Tuesday February 11, 2003 @11:14AM (#5279571)

    I've read most of the comments so far and they don't seem to get the point of the article.
    The point is that, except for a limited class of applications (multimedia, games), most of the things you can do on tomorrow's computers you can do on today's. And it's becoming increasingly expensive to follow Moore's law while it's less and less necessary.

    Price, and maybe other factors (versatility? miniaturisation? power consumption?), will probably become much more important. Imagine dirt-cheap wireless chips and P3-like microprocessors. Think about the applications for a moment.
    With the right protocols in place it could mean unlimited bandwidth anytime, anywhere, just for starters.

    When the world is 100% computerized, it won't be because of supercomputers. It'll be because of cheap small chips and smart software.

  • by Noren ( 605012 ) on Tuesday February 11, 2003 @11:31AM (#5279732)
    This argument is ignoring a major point. Sure, home PCs, web servers, search engines, and databases may all get fast enough that further computational speed is irrelevant.

    But when computers are used for crunching numbers we still want machines to be as fast as possible. Supercomputers still exist today. Countries and companies are still spending millions to build parallel machines thousands of times faster than home PCs. They're doing this because the current crop of processors is not fast enough for what they want to calculate.

    Current computational modeling of the weather, a nuclear explosion, the way a protein folds, a chemical reaction, or any of a large number of other important real-world phenomena is limited by current computational speed. Faster computers will aid these fields tremendously. More power is almost always better in mathematical modeling- I don't expect we'll ever get to the point where we have as much computational power as we want.

  • by hey! ( 33014 ) on Tuesday February 11, 2003 @11:35AM (#5279758) Homepage Journal
    Moore's law is not quite what most people think. If I'm not mistaken, it isn't that processor power will double every eighteen months, but that transistor density will double. Processor speed doubling is a side effect of this.

    I think there will always be a market for the fastest chips possible. However, there are other ways for this trend to take us rather than powerful CPU chips. These would include lower power, lower size, higher system integration, and lower cost.

    The EPIA series mini-ITX boards are an example of this. Once the VIA processors get powerful enough to decode DVDs well, it is very likely that they won't need to get more powerful for most consumer applications. However, if you look at the board itself (e.g. here [viavpsd.com]), you'll see that the component count is still pretty high; power consumption, while small, still requires a substantial power supply in the case or a large brick.

    When something like this can be put together, capable of DVD decoding, having no external parts other than memory (and maybe not even that), and the whole thing runs on two AAA batteries, then you'd really have something. Stir Bluetooth (or more likely its successors) into the mix and you have ubiquitous computing, capable of adapting to its environment and adapting the environment to suit human needs.

  • by crawling_chaos ( 23007 ) on Tuesday February 11, 2003 @11:50AM (#5279881) Homepage
    Any problem that requires a big working set can benefit from running on big iron. If you can't sub-divide the memory usage, you'll spend a lot of time whipping memory requests out over very slow links. Cray has a bunch of data on this. The short of it is that it's all about memory latency. The X1 series is built to have extremely low latency.

    That's not to say that every complex problem needs a supercomputer. That's why Cray also sells Intel clusters. Right tool for the right job and all of that.

  • by skeedlelee ( 610319 ) on Tuesday February 11, 2003 @12:38PM (#5280348)
    I'm about to rant a little here.

    IIRC, there was an article a while back (discussed here) that reviewed Moore's law as Moore used it over a number of years, and found that Moore himself seemed to redefine it every couple of years. It's a marketing term which describes the general phenomenon of faster computers getting cheaper in a regular way.

    You're right, though: Moore never really talked about doubling of 'processor power'; he discussed things in terms of devices such as transistors. Trouble is, sometimes RAM was included in the 'device' total, sometimes not... it's easy to fudge a bit during the slowdowns and speedups if you change how the thing is defined.

    Top it off with the fact that the whole thing was eventually cast in terms of the cost-optimal solution. Given the degree to which the size of the market for computers has changed, I'd say this is a very difficult thing to define. As everyone is likely to point out, commodity desktop PCs have a very different optimum from massive single-system-image computers. Of course, if you consider that a calculator is a computer, be they $1 cheapos or the latest graphing programmable whoopdeedoo, they are all computers. There are so many markets for computers now, each with its own optimum, that it's pretty artificial to talk about Moore's law at all. I've never seen anyone plot out Moore's law with a bunch of branches. Further, cost-optimal becomes pretty subjective in all these markets when there are so many variables. Finally, there are points where Moore's law breaks down... the number of devices in cheapo calculators probably hasn't changed much in the last few years, but the price changes. Moore's law doesn't really allow for this sort of behavior: that there is a maximum necessary power for a certain kind of device, and if it doesn't have to do anything else, the complexity levels off and the price goes down. This may well happen at some point in the commodity sector. It is possible that the number of features in a conventional desktop will level off at some point. Hell, with a $200 WalMart cheapo PC, maybe we're there now...

    Intuitively, everyone applies Moore's law to desktops, but there's no particular reason to do so. Considering its history, the massive mainframe-style computer is probably the best application of it, but this is seldom done. Mainframes these days can be as complex as you're willing to pay for, which pretty much means that there is a cost-optimal solution for a given problem, not just for fabrication, which is what Moore was talking about. Seems like we have turned a corner; it's time to redefine Moore's law yet again.
  • by HiThere ( 15173 ) <charleshixsn@@@earthlink...net> on Tuesday February 11, 2003 @12:49PM (#5280451)
    The mainframes basically stopped at 32 bits. There were models that went to 128 bits, and CDC liked 60 bits, but the workhorse (IBM 360, etc.) never went beyond 32 bits.

    Perhaps the next step instead of being towards larger computers will be towards smaller ones. Moore's law remains just as important, but the application changes. Instead of building faster computers, you build smaller, cheaper ones. The desktops will remain important for decades as the repository of printers, large hard disks, etc. And the palmtops/wristtops/fingernailTops/embedded will communicate with them for archiving, etc.

    This means that networking is becoming more important. This means that clusters need to be more integrated. I conceive of future powerful computers as a net of nets, and at the bottom of each net is a tightly integrated cluster of CPUs, each more powerful than the current crop. These are going to need a lot of on-chip RAM, and RAM-attached caches, because their access to large RAM will be slow and mediated through gatekeepers. There will probably be multi-ported RAM whiteboards, where multiple CPUs can share their current thoughts, etc.

    For this scenario to work, computers will need to be able to take their programs in a sort of pseudo-code and rewrite it into a parallelized form. There will, of course, be frequent bottlenecks, etc., so there will be lots of wasted cycles, but some of them can be used on other processes with lower priority. And at least each cluster will have one CPU that spends most of its time scheduling, etc.

    Consider the ratio between gray matter and white matter. I've been told that most of the computation is done in the gray matter, and the white matter acts as a communications link. This may not be true (it was an old source), but it is a good model of the problem. So to make this work, the individual processors need to get smaller and cheaper. But that's one of the choices that Moore's law offers!

    So this is, in fact, an encouraging trend. But it does mean that the high-end CPUs will tend to be short-term solutions to problems: faster at any particular scale of the technology, but too expensive for most problems, and not developing fast enough to stay ahead of their smaller brethren, because they are too expensive to be used in a wasteful manner.

    Perhaps the "final" generation will implement these longer-word-length CPUs, at least in places. And it would clearly use specialized hardware for the signal switchers, just as the video cards use specialized hardware, though they didn't at first. But the first versions will be built with cheap components, and the specialized hardware will only come along later, after the designs have stabilized.

  • by lamz ( 60321 ) on Tuesday February 11, 2003 @01:14PM (#5280672) Homepage Journal
    This article is full of overblown rhetoric. It goofily applies Moore's Law to too many other things, like Dot-coms. Note that at no point in the article is Moore's Law clearly stated -- it would spoil too many of the article's conclusions.

    That said, I remember the first time I noticed that technology was 'good enough' and didn't need to double ever again: with the introduction of CDs, and later, CD-quality sound cards. Most people are not physically capable of hearing improvements if the sampling rate of CDs is increased, so we don't need to bother. Certainly, people tried, and the home-theatre-style multi-channel stuff is an improvement over plain stereo CDs, but it is an insignificant improvement when compared to CDs over older mono formats. Similarly, the latest SoundBlaster cards represent an insignificant improvement over the early beeps of computers and video games. (Dogs and dolphins might wish that audio reproduction were improved, but they don't have credit cards.)

    Back in the early 80s, when most bulletin board access was by 300 baud modem, paging of long messages was optional, since most people can read that fast. Of course, we need faster modems for longer files and applications, but as soon as, say, HD-quality video and sound can be streamed at real-time speeds, then bandwidth will be 'enough.'

  • by oconnorcjo ( 242077 ) on Tuesday February 11, 2003 @01:43PM (#5280946) Journal
    I dream of the day when motherboard manufacturers sell cheap 4-CPU boards and AMD/Intel sell low-power/low-heat processors (something akin to Transmeta). Yeah, the quad Xeon exists, but Intel wants you to pay through the nose for those (and they don't run cool). I would love to have 4 (900 MHz "Barton") Athlon MP CPUs in a box that ran cool and reliably. It may not even run as fast as one Intel P4 3.06 GHz HT for many applications, but from what I have seen of SMP machines, they run much SMOOTHER. When SMP machines are dished out a lot of work, it does not affect the responsiveness of the whole system. Instead of having one servant who is on super-steroids and is the best at everything but can really only do one thing at a time, I would rather have four servants (who even get in the way of each other at times) who can't do as much but can all be doing different things at once.
  • Re:clustering (Score:3, Insightful)

    by SpikeSpiff ( 598510 ) on Tuesday February 11, 2003 @02:30PM (#5281412) Journal
    The question is how important is what you're doing?

    If Google screws up 1 in 1000 requests, I wouldn't even notice. Refresh and on my way.

    Citibank trades roughly $1 Trillion in currency a day. If they had 5 9's accuracy, they would be misplacing $10,000,000 a day. In that environment, commodity machines are unacceptable.

    And it scales down: paychecks? billing records? The video check-out at Blockbuster?
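    A quick check of the arithmetic above (a Python sketch; the $1 trillion/day figure is the poster's):

        daily_volume = 1_000_000_000_000               # $1 trillion in currency traded per day
        accuracy = 0.99999                             # "five nines"
        misplaced = daily_volume * (1 - accuracy)
        print(f"${misplaced:,.0f} misplaced per day")  # -> $10,000,000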
