Upgrades Technology

Can Our Computers Continue To Get Smaller and More Powerful? 151

aarondubrow (1866212) writes: In a [note, paywalled] review article in this week's issue of the journal Nature (described in a National Science Foundation press release), Igor Markov of the University of Michigan/Google reviews limiting factors in the development of computing systems to help determine what is achievable, in principle and in practice, using today's and emerging technologies. "Understanding these important limits," says Markov, "will help us to bet on the right new techniques and technologies." Ars Technica does a great job of expanding on the various limitations that Markov describes, and the ways in which engineering can push back against them.
This discussion has been archived. No new comments can be posted.

  • Obvious (Score:5, Insightful)

    by Russ1642 ( 1087959 ) on Thursday August 14, 2014 @04:48PM (#47673385)

    Yes. Next question please.

  • by mythosaz ( 572040 ) on Thursday August 14, 2014 @04:52PM (#47673411)

    Even if the electronics fail to get much smaller, there's plenty of room to be had in batteries, screens, and the physical casings of our handheld devices.

  • by Anonymous Coward on Thursday August 14, 2014 @04:58PM (#47673447)

    We're running up against physical limitations but "3d" possibilities will take our 2d processes and literally add computing volume in a new dimension.

    So of course it's going to continue; the only question is the rate of progress relative to its cost and benefit.

  • Re:Obvious (Score:3, Insightful)

    by bobbied ( 2522392 ) on Thursday August 14, 2014 @05:02PM (#47673479)

    Actually, the answer is no and that is obvious. Eventually we are going to run into limits driven by the size of atoms (and are in fact already there).

    Once you get a logic gate under a few atoms wide, there is no more room to make things smaller. No more room to make them work on less power. We will have reached the physical limits, at least in the realm of our current lithographic doping processes. We are just about there.

    This is not to say there won't be continued advances. They are going to get more and more stuff onto each die for quite some time and manufacturing costs will continue to decline as yields go up. It's just that we are about at the limits of lowering the power consumption of the CPU and chipsets.
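The "size of atoms" limit above is easy to sanity-check. A back-of-envelope sketch (my own, not the commenter's), assuming silicon's lattice constant of roughly 0.543 nm:

```python
# Rough estimate of how many silicon lattice cells span a gate of a given
# length. The lattice constant (~0.543 nm) is a standard figure; the node
# sizes below are just illustrative examples.
SI_LATTICE_NM = 0.543

def atoms_across(gate_length_nm):
    """Approximate number of silicon lattice cells spanning a gate."""
    return gate_length_nm / SI_LATTICE_NM

for node_nm in (90, 45, 22, 7):
    print(f"{node_nm:>3} nm gate ~ {atoms_across(node_nm):.0f} lattice cells wide")
```

At a 7 nm feature size the gate is only on the order of a dozen lattice cells across, which is why "a few atoms wide" is treated as a hard floor.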

  • by vux984 ( 928602 ) on Thursday August 14, 2014 @05:40PM (#47673555)

    three decades in the industry and I've never seen performance measured or stated in MHz

    Erm... from the 80286 through the Pentium 3 CPU clockspeed was pretty much THE proxy stat for "PC performance".

  • by uCallHimDrJ0NES ( 2546640 ) on Thursday August 14, 2014 @05:42PM (#47673563)

    Next you'll be telling me they'll let us run unsigned code on processors capable of doing so. You need to get onboard, citizens. All fast processing is to occur in monitored silos. Slow processing can be delegated to the personal level, but only with crippled processors that cannot run code that hasn't yet been registered with the authorities and digitally signed. You kids ask the wrong questions. Ungood.

  • Re:Obvious (Score:5, Insightful)

    by ShanghaiBill ( 739463 ) on Thursday August 14, 2014 @05:47PM (#47673599)

    Did our jets get faster and lighter and cheaper?

    The fastest air breathing aircraft was the SR-71, which went into production in 1962, based on technology from the 1950s. So for at least half a century, jets did not get faster. Aircraft improved enormously between 1903 and 1960. Then the rate of improvements fell off a cliff. That is why Sci-Fi from that era often extrapolated the improvements into flying cars, and fast space travel, but far fewer predicted things like the Internet or Wikipedia.

    What's after atoms?

    Silicon lithography will hit its limits after a few more iterations. But nano-assembly techniques may allow silicon transistors to be even smaller. After that we may be able to move to carbon nanotube transistors, based on spintronics to lower the heat dissipation. There is still plenty of room at the bottom.

  • Re:Obvious (Score:5, Insightful)

    by bobbied ( 2522392 ) on Thursday August 14, 2014 @06:00PM (#47673709)

    If you read my comment.... I'm saying that we are very close to hitting the physical limits. In the past, the limits were set by the manufacturing process, but now we are becoming limited by the material itself, the size of silicon atoms.

    There is basically only one way to reduce the current/power consumption of a device: make it smaller. A smaller logic gate takes less energy to switch states. We are rapidly approaching the size limits of the actual logic gates and are now producing gates measured in hundreds of atoms wide. You are not going to get much smaller than a few hundred atoms, which means the primary means of reducing power consumption is reaching its physical limits. Producing gates that small also requires some seriously exacting lithography and doping processes, and we are just coming up the yield curve on some of these, so there is improvement still to come, but we are *almost* there now.

    There are still possible power reducing technologies which remain to be fully developed, but they are theoretically not going to get us all that much more, or we'd have already been pushing them harder. So basic silicon technology is going to hit the physical limits of the material pretty soon.
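The "smaller gate, less switching energy" argument in this comment is classical Dennard scaling. A minimal sketch, with normalized (illustrative, not measured) numbers: shrink every linear dimension by a factor s, and dynamic switching power P = C * V^2 * f falls as 1/s^2 while power density stays flat.

```python
# Idealized Dennard scaling: capacitance and voltage shrink by s, clock
# frequency rises by s, so dynamic power C * V^2 * f drops by s^2.
# All quantities are in normalized units for illustration.

def dennard_scale(C, V, f, s):
    """Return (C', V', f', P') after an ideal Dennard shrink by factor s."""
    C2, V2, f2 = C / s, V / s, f * s
    P2 = C2 * V2 ** 2 * f2
    return C2, V2, f2, P2

C, V, f = 1.0, 1.0, 1.0              # normalized starting point
P = C * V ** 2 * f
_, _, _, P2 = dennard_scale(C, V, f, s=2.0)
print(P2 / P)                        # 1/s^2 = 0.25
```

Once the gate can no longer shrink, s is pinned at 1 and this free power reduction disappears, which is exactly the comment's point.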

  • Re:Obvious (Score:4, Insightful)

    by Beck_Neard ( 3612467 ) on Thursday August 14, 2014 @06:03PM (#47673721)

    We're eventually going to hit limits, but there's no reason to think that that limit is a logic gate a few atoms wide. There's isentropic computing, spintronics, neuromorphic computing, and further down the road, stuff like quantum computing.

  • by vux984 ( 928602 ) on Thursday August 14, 2014 @06:55PM (#47674059)

    Marketing and sales to ignorant consumers don't count.

    Originally it was useful enough. Marketing and sales perpetuated it long after it wasn't anymore.

    The "MHz Myth" has been time and again a subject in many a PC magazines

    Only once the truth had become myth. The MHz "myth" only existed because it was sufficiently useful and accurate to compare Intel CPUs by MHz within a generation, and even, within limits, from generation to generation for some 8 generations.

    It wasn't really until Pentium 4 that MHz lost its usefulness. The Pentium 4 clocked at 1.4GHz was only about as fast as a P3 1000 or something; and AMD's Athlon XP series came out and for the first time in a decade MHz was next to useless. Prior to that, however, it was a very useful proxy for performance.

    More meaningful benchmarks have existed long before that era (e.g. Whetstone from early 70s) and many were (e.g. Dhrystone in mid 80s) used all through the rise of the microprocessor (8080, 6502, etc.)

    Sure they did. But for about a decade or so, if you wanted a PC, CPU + MHz was nearly all you really needed to know.
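The P4-vs-P3 comparison above comes down to a simple model: perceived performance is roughly clock * instructions-per-cycle. A minimal sketch, where the IPC figures are rough illustrative assumptions rather than published benchmark numbers:

```python
# Why MHz worked as a proxy within a microarchitecture but broke across
# them: performance ~ clock * IPC. The IPC values below are assumed for
# illustration (the P4's long pipeline traded IPC for clock speed).

def relative_perf(clock_mhz, ipc):
    """Crude performance estimate: clock rate times instructions per cycle."""
    return clock_mhz * ipc

p3_1000 = relative_perf(1000, 1.0)   # baseline IPC (assumed)
p4_1400 = relative_perf(1400, 0.7)   # 40% more MHz, lower IPC (assumed)
print(p4_1400 / p3_1000)             # ~0.98: more MHz, about the same speed
```

Within one generation IPC is roughly constant, so MHz alone ranks chips correctly; across generations with different pipelines, it doesn't.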

  • Re:Obvious (Score:5, Insightful)

    by dnavid ( 2842431 ) on Thursday August 14, 2014 @08:35PM (#47674757)

    Silicon lithography will hit its limits after a few more iterations. But nano-assembly techniques may allow silicon transistors to be even smaller. After that we may be able to move to carbon nanotube transistors, based on spintronics to lower the heat dissipation. There is still plenty of room at the bottom.

    The point of the article and the article it references is that it's easy to say stuff like that, but also mostly irrelevant to practical computing, because in the history of modern computing it's never been absolute physical limits that caused major changes to how computing is implemented. Just because there's room at the bottom doesn't mean it's room we can use. We *may* be able to use nano-assemblers for silicon and *may* be able to use carbon nanotube transistors, but unless that gets translated into someone working on actual practical implementations of those technologies, they will apply as much to the average consumer as the SR-71 being discussed in this thread means to the average commercial air traveler. In other words, exactly zero.

    When I was in college people were already talking about the exotic technologies we would have to migrate to in order to achieve better performance, and that was the late eighties. In the twenty-plus years since then, we're still basically using silicon CMOS. Granted, the fabrication technologies and gate technologies have radically improved, but the fundamental manufacturing technology is still the same. It's been the same because there are hundreds of billions of dollars of cumulative technological infrastructure and innovation behind silicon lithography. For these other "room at the bottom" technologies to be meaningful, and not just SR-71s, they need to be able to reach the same point as silicon lithography, with its multi-decade head start and approaching-a-trillion-dollar learning curve. It's not enough to just work in theory, or even as a one-off in practice. If it can't work at the scale and scope of silicon lithography, it's just an SR-71: a cool museum piece of advanced technology almost no one will ever see, touch, use, or directly benefit from.

    It isn't trivially obvious there exists a technology commercializable in the next few decades that can replace silicon lithography. Anyone who thinks that's obvious doesn't understand the practical realities of scaling these technologies.

"I have not the slightest confidence in 'spiritual manifestations.'" -- Robert G. Ingersoll