
'Reversible' Computers More Energy Efficient

Posted by Hemos
from the faster-but-does-it-work dept.
James Clark writes "As Congress continues work on a federal energy bill, a group of University of Florida researchers is working to implement a radical idea for making computers more energy efficient -- as well as smaller and faster." Reversible computing rears its head again.
  • Vaporware? (Score:5, Interesting)

    by Carnildo (712617) on Tuesday November 11, 2003 @03:30PM (#7447443) Homepage Journal
    Has anyone ever built even a very simple reversible computer? Or is this like quantum computers: all theory, no practice?
    • Re:Vaporware? (Score:5, Informative)

      by nestler (201193) on Tuesday November 11, 2003 @03:35PM (#7447517)
      This is more practical than quantum computers because it is much easier to build and can be used for general purpose things other than search and factoring.

      The idea is to (down at the gate level) keep everything reversible. For example, current OR gates are not reversible (given a true output you can't definitively tell what either input was individually). If you have two outputs on the gate instead of one, you make the gate reversible. However, since you are just using it for OR, you are free to ignore the second bit you added on to make it reversible.

      The bit doesn't help your computation in the sense of the answer you are looking for, but it can make things more energy efficient at the gate level.
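      The embedding being described can be sketched concretely. Strictly, a two-output reversible OR on two inputs can't exist (three inputs map to output 1, but only two output patterns start with 1); the standard textbook construction uses a three-bit Toffoli gate. Nothing in this sketch comes from the article:

      ```python
      # Sketch: embedding OR in a reversible gate (standard textbook
      # construction, not from the article). A Toffoli gate maps
      # (a, b, c) -> (a, b, c XOR (a AND b)) and is its own inverse.
      # Applying it to negated inputs with c = 1 computes OR reversibly;
      # the extra output bits are the "garbage" kept for reversibility.

      def toffoli(a, b, c):
          return a, b, c ^ (a & b)

      def reversible_or(a, b):
          # OR(a, b) = NOT(AND(NOT a, NOT b)); seeding c with 1 makes
          # the third wire come out holding a OR b.
          x, y, z = toffoli(a ^ 1, b ^ 1, 1)
          return x, y, z   # z == a | b; x, y retain enough info to invert

      # Every distinct input maps to a distinct output (a bijection),
      # so no information -- and in principle no energy -- is lost.
      outputs = {reversible_or(a, b) for a in (0, 1) for b in (0, 1)}
      assert len(outputs) == 4
      for a in (0, 1):
          for b in (0, 1):
              assert reversible_or(a, b)[2] == (a | b)
      ```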

      • by autopr0n (534291) on Tuesday November 11, 2003 @03:48PM (#7447679) Homepage Journal
        I mean, say you have a CMOS OR gate. If both of the inputs are high, then the NMOS transistors will close and the PMOS transistors will open. Energy is lost only when electrons 'leak through' when the gate changes (and of course, electrons that leak through but don't affect the computation, which I guess happens all the time). How would reversing the computation affect this? Maybe if you were using plain PMOS or something...
        • Well, there isn't exactly a how. Thermodynamics guarantees us that non-reversible gates will use energy. It doesn't guarantee that it's possible to build a reversible one that doesn't; it does guarantee that any gate which doesn't lose energy is reversible.
          • Huh? (Score:3, Interesting)

            by autopr0n (534291)
            Thermodynamics also says that you lose non-heat energy in reversible systems as well. If you throw a ball into the air, you lose some energy to wind resistance, to converting chemical energy in your arm into mechanical energy, etc.
            • Re:Huh? (Score:4, Insightful)

              by Aardpig (622459) on Tuesday November 11, 2003 @04:20PM (#7447990)

              Thermodynamics also says that you lose non-heat energy in reversible systems as well. If you throw a ball into the air, you lose some energy to wind resistance, to converting chemical energy in your arm into mechanical energy, etc.

              Sure, but mechanical losses can always be recovered and put back into a system. Heat losses can't, which is the point of the second law of thermodynamics.

        • by randyest (589159) on Tuesday November 11, 2003 @03:55PM (#7447757) Homepage
          Energy is always lost (leakage current) because the gate is not a perfect insulator. The smaller the gates, the more the leakage. This is called static power.

          Energy is also lost during switching, as the charge needed to switch is moved around. This is called dynamic power.

          Reversible computing endeavors to reduce/eliminate dynamic power. It does nothing for static power. A long time ago, dynamic power was dominant and static power was negligible. Now, gates are so small that static power is approaching the same order of magnitude as dynamic.

          So, even though they're only addressing about 1/2 of the problem, it would be great to have the magnitude of that big problem halved.
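          The static/dynamic split above can be put in rough numbers with the standard first-order CMOS power model (the parameter values below are invented for illustration, not figures from the article):

          ```python
          # First-order CMOS power model (standard textbook formulas;
          # all numbers below are illustrative, not from the article).
          # Dynamic power: P_dyn = alpha * C * V^2 * f   (charge moved per switch)
          # Static power:  P_stat = V * I_leak           (leakage through the gate)

          def dynamic_power(alpha, c_farads, v_volts, f_hz):
              return alpha * c_farads * v_volts**2 * f_hz

          def static_power(v_volts, i_leak_amps):
              return v_volts * i_leak_amps

          # Hypothetical chip: 100 nF total switched capacitance, 1.2 V,
          # 2 GHz, 10% average activity, 5 A of aggregate leakage current.
          p_dyn = dynamic_power(0.1, 100e-9, 1.2, 2e9)   # dissipated by switching
          p_stat = static_power(1.2, 5.0)                # dissipated constantly
          # Reversible/adiabatic techniques attack p_dyn only; p_stat remains.
          ```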
        • by nestler (201193)
          I don't know the exact details of how it is more efficient. It was explained to me once in a quantum computation course (where among other things they were using equations to relate energy to information).

          So I don't know how to explain in terms of currents and transistors, but it is similar to what mikee is saying in this thread (that thermodynamic laws say that destroying information will always consume energy).

          The reason quantum computation guys tend to know about this area is because all logical oper

      • Re:Vaporware? (Score:5, Informative)

        by randyest (589159) on Tuesday November 11, 2003 @03:51PM (#7447722) Homepage
        I think you completely misunderstood the article, though in your defense it didn't do a very good job of explaining. The idea is not to be able to reverse logical operations -- that is of little value to anyone. Rather, they're trying to make the electrical changes (the energy transfer) reversible. That's a fundamentally different thing. A decent analogy, mentioned in the article, is:

        The concept is somewhat analogous to hybrid cars now on the market that take the energy generated during braking and recycle it into electricity used to power the car.

        So, the logical realm is no different here. Physically and electrically, there is a big difference from existing computers. Now, when a bit changes from 1->0, the voltage (accumulated charge) is simply shorted to ground (via a resistive path that dissipates heat). That energy is lost. In a reversible computer, that charge would be stored in the electrical equivalent of a spring or flywheel in a mechanical system. So, the next time it needs to go 0->1, the energy is sitting there, ready to be re-used (stored in the spring's compression or the flywheel's rotation).

        I assume these electrical "springs or flywheels" need to be physically close to the transistors they're storing energy for. If all transistors' storage were common, the heat loss (and time delay) to get the energy back to where it's needed would defeat the entire purpose.

        In the article, they mention that current prototypes use oscillators to store the energy (which are more like a flywheel than a spring, to continue the mechanical analogy), but the efficiency is not quite good enough to be called "reversible". Too much energy is lost in storing and un-storing the energy. The current work is focused on improving the efficiency of storing and un-storing energy from state changes.
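        The efficiency issue with storing and un-storing energy can be made quantitative with the standard adiabatic-charging estimate (a sketch with illustrative numbers, not figures from the article):

        ```python
        # Why slow "un-storing" helps (standard adiabatic-charging estimate,
        # not from the article). Charging a node capacitance C to voltage V
        # abruptly dissipates E = 0.5*C*V^2 in the switch resistance, no
        # matter how small R is. Ramping the supply over a time T >> R*C
        # instead dissipates roughly E = (R*C/T) * C*V^2, which -> 0 as T grows.

        def abrupt_loss(c, v):
            return 0.5 * c * v**2

        def adiabatic_loss(c, v, r, t):
            return (r * c / t) * c * v**2

        C, V, R = 1e-15, 1.0, 1e4            # 1 fF node, 1 V, 10 kOhm (illustrative)
        e_fast = abrupt_loss(C, V)           # conventional switching loss
        e_slow = adiabatic_loss(C, V, R, 1e-9)   # ramped over 1 ns
        # Here R*C = 1e-11 s, so the slow ramp loses ~50x less energy per
        # transition -- at the cost of a much slower transition. That is the
        # energy/time trade-off at the heart of reversible circuits.
        ```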

        However, as a chip designer, I know that oscillators are usually (1) much much bigger than simple logic gates and (2) much more difficult to design with (it's analog design stuff, really). So, my concerns are (1) how much bigger will dice need to be to use this system (linear increase in die size equals exponential increase in manufacturing cost) and (2) how much longer is it going to take to close a design with all those little analog cells all over the place.

        I don't even want to think about the implications for STA (static timing analysis) or LVS (layout versus schematic verification) -- it makes my head hurt. :)
        • Re:Vaporware? (Score:5, Informative)

          by cgb8176 (685935) on Tuesday November 11, 2003 @04:50PM (#7448282)

          I think you completely misunderstood the article, though in your defense it didn't do a very good job of explaining. The idea is not to be able to reverse logical operations -- that is of little value to anyone. Rather, they're trying to make the electrical changes (the energy transfer) reversible. That's a fundamentally different thing.

          Actually, you are wrong, in that the two things are very intimately related. I will assume that, as a chip designer, you are aware of what AND, OR, and NOT gates are, and that NAND is an example of a universal gate. NAND, however, is not reversible; you cannot in general determine the inputs by looking at the output.

          The Fredkin Gate is an example of a reversible gate. As it happens, it is impossible to do universal reversible computing with two-input gates. The Fredkin Gate (a controlled swap: two inputs, two outputs, and a control wire that passes through) has the property that it is

          reversible (Fredkin inverts Fredkin), and

          it has the same number of non-zero outputs as it does non-zero inputs.

          To achieve reversible computing, you need reversible gates. Furthermore, with reversible gates, you can perform any computation with an arbitrarily small amount of energy; the catch is that you need more time (see adiabatic circuits, Carnot engines).
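          Both properties of the Fredkin gate are easy to verify by brute force (a small sketch; the gate definition is the standard controlled-swap, not anything specific to the article):

          ```python
          # The Fredkin gate (controlled swap): if the control bit is 1,
          # swap the other two wires; otherwise pass them through unchanged.
          from itertools import product

          def fredkin(ctrl, a, b):
              return (ctrl, b, a) if ctrl else (ctrl, a, b)

          for bits in product((0, 1), repeat=3):
              # Self-inverse: applying Fredkin twice restores the input
              # ("Fredkin inverts Fredkin").
              assert fredkin(*fredkin(*bits)) == bits
              # Conservative: the number of 1s is preserved -- the gate
              # never destroys a bit, it only routes them.
              assert sum(fredkin(*bits)) == sum(bits)
          ```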

          • by Chris Burke (6130) on Tuesday November 11, 2003 @05:45PM (#7448746) Homepage
            Furthermore, with reversible gates, you can perform any computation with an arbitrarily small amount of energy; the catch is that you need more time (see adiabatic circuits, Carnot engines).

            Hey, thanks for the keywords. Google turned up lots of nice stuff.

            Though that catch is rather a big one. According to the links, as E->0, T->infinity, which I don't like one bit. Arbitrarily low power, but arbitrarily lengthy computation.

            So I've now got my own low-power logic idea. I call it the "apathetic circuit", and it works by not doing the computation at all. Same zero energy/infinite time tradeoff, but with the advantage that the basic "meh" gate can be arbitrarily small even to the point of zero area! :)
        • Re:Vaporware? (Score:5, Insightful)

          by Bingo Foo (179380) on Tuesday November 11, 2003 @04:53PM (#7448310)
          Sorry, but reversible computing is about having N distinct outputs for N distinct inputs in any logical operation. Think thermodynamics and statistical mechanics, where reversibility is intimately coupled with "no production of entropy," which means "no loss of information."

          It is at the information theory and logic level of description where reversible computing must be implemented.

      • The bit doesn't help your computation in the sense of the answer you are looking for, but it can make things more energy efficient at the gate level.

        If you know how it is more energy efficient, please share. The article doesn't contain anything that actually suggests "power savings" to me.

        So you have a reversible OR gate. So then you can make an Un-OR gate. Okay, so you can recover what you had before the OR. Big deal. How have you saved power? You've just done another computation through a gate tha
    • Re:Vaporware? (Score:3, Informative)

      by stoolpigeon (454276)
      The answers to your questions are in the article - here:

      Frank, who first worked on reversible computing as a doctoral student at the Massachusetts Institute of Technology, heads UF's Reversible & Quantum Computing Research Group. Among other recent publications and presentations, he presented three papers dealing with topics related to reversible computing this summer, including "Reversible Computing: Quantum Computing's Practical Cousin" at a conference in Stony Brook, N.Y.

      and here:

      Frank currentl
    • From the article... (Score:3, Informative)

      by TamMan2000 (578899)
      While he was at MIT, Frank worked on a team that built several simple prototypes of reversible chips.

      It has at least gotten to the chip level so far...
      • It has at least gotten to the chip level so far...

        I have a prototype of a Holly Hop Drive. It doesn't quite work yet, but still ... it's a prototype :)

        All I need now is funding...
    • Homer: I know. And this perpetual-motion machine she made today is a joke! It just keeps going faster and faster. In this house, we obey the laws of thermodynamics!
    • Universiteit Gent [rug.ac.be] has some pictures of reversible logic gates, including a four-bit adder composed out of Feynman's "NOT, the CONTROLLED NOT, and the CONTROLLED CONTROLLED NOT" reversible logic gates, and some other circuits they've built.

      They also have links to other sites about reversible logic and reversible computing, such as Ralph Merkle's Reversible Computing page [zyvex.com] (from Xerox).

      Also note the bottom of the page: there's a vacancy in the research group, [ugent.be] for all those just aching for a chance to work

  • Imagine all the cool things you could do with the heat from your computer, instead of directing it back into the system... *gasp-heartfailure*
    • I, for one, am all for this. Providing machines with a way to reclaim their own heat energy for power is the only way to ensure we don't all end up jacked into a computer-generated Matrix so they can suck up ours.
  • Reversing entropy? (Score:3, Interesting)

    by oGMo (379) on Tuesday November 11, 2003 @03:31PM (#7447459)

    IANAP, but this sounds like trying to reverse entropy as much as possible to me. Won't it take more energy to do a reverse computation than you'll save? Where does the lost energy from that go?

    • by Carnildo (712617)
      There are two theories about energy usage during computation. One is that moving and transforming data requires energy. The other is that the only operation that requires energy is destroying data. Reversible computing subscribes to the second theory, so a reversible computer would not actually use energy to do computations (apart from the inevitable inefficiencies). Since there is no net energy usage, the net entropy neither increases nor decreases, and the second law of thermodynamics doesn't apply.
      • I've long been curious about this idea of destroying data inside a computer. It seems to me that before and after any opcode, your typical computer contains exactly the same amount of data. A few bits have changed from 0 to 1 or vice-versa. But the number of bits is the same, and each contains only one bit of data, so the amount of data is the same.

        So far, when I've questioned people about this, I always get a response that amounts to saying "Boy, you must be a real idiot if you don't understand this."
        • I'm not an expert, but when one talks about entropy, they mean randomness.

          A computer freshly turned on has essentially random "data" in its memory; when you write a specific value into the memory it reduces the randomness of the system; this decreases the entropy.

          I guess, after memory has been used and freed, it has (to anything not the original using program) some approximation of "random" bit patterns, so the next program which uses it makes it less random again. In real operating systems, the kernel it
        • IANAnElectrical Engineer but I think in the context of Reversible computing they are talking about physically, on the lowest hardware level, setting a 1 to a 0. A physical bit that is set to 1 has 5 volts potential across it (or 3 in some cases). To set it to 0, that voltage has to go somewhere. Currently, the circuit gets shorted out and phzzzt, the voltage gets converted to heat and we're back to the 0.

          Of course, in other contexts, 'destroying data' can mean other things. Like magnetizing a hard drive,

        • It may have the same number of bits, but are those bits actually telling you anything? For instance, you can take 700 MB worth of audio data off of a CD and compress it to about 400 MB without losing any information. In that sense you really only had 400 MB of information in the first place, you were just expressing it inefficiently. Interestingly, ordered data can be well compressed, but disordered data cannot be compressed well. So more entropy actually is more information.

          Disclaimer: IANACS
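          The ordered-vs-random point is easy to demonstrate (zlib here stands in for "a good compressor"; the exact sizes depend on the compressor and are illustrative):

          ```python
          # Ordered data compresses well; random (high-entropy) data does not --
          # in the Shannon sense, the random bytes carry more information each.
          import os, zlib

          ordered = b"abcd" * 25_000      # 100 KB of highly ordered bytes
          random_ = os.urandom(100_000)   # 100 KB of (nearly) incompressible bytes

          len_ordered = len(zlib.compress(ordered))
          len_random = len(zlib.compress(random_))

          # The ordered data shrinks to a tiny fraction of its size; the
          # random data barely shrinks at all (it may even grow slightly).
          assert len_ordered < 1_000
          assert len_random > 99_000
          ```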
        • by plastik55 (218435)
          It's not strictly "data" as you're accustomed to thinking about it, as a bunch of ones and zeroes, but "information" in the sense of information theory -- more or less another word for entropy.

          The actual measurement of entropy has to do with counting the possible states that a system could be in. A computer containing a list of numbers in its memory could be in any of a large number of states depending on what you know about the list of numbers and the contents of the rest of its memory. If you instruct it to
        • by kfg (145172)
          You are not a real idiot. You have asked a deep question. Pity those around you choose to cover their own ignorance by being arrogant.

          Me, I like to display my ignorance, so here I go:

          The computer is just a big abacus really, a physical model of the data. When you shake up an abacus it still has just as many beads on it, only their state has changed, and the now random ordering of the beads can still be read as representing a number. What has been lost is meaning.

          Is "2" data? (bearing in mind that we're t
  • by cant_get_a_good_nick (172131) on Tuesday November 11, 2003 @03:31PM (#7447462)
    Wouldn't "regenerative", like the regenerative braking on most electrics/hybrids, have been a better term?
    • by Anonymous Coward
      This is an area where information theory and physics meet. To minimize heat you must minimize entropy production and that means you must in fact make your computations reversible, as much as possible.
  • I have to admit that I'm no chip designer, but I have to wonder why this hasn't been done before. What are the problems with this technique?

    It sounds good, but what's the catch?
    • by pclminion (145572) on Tuesday November 11, 2003 @03:36PM (#7447533)
      Because it seemed totally pointless. It was a theoretical curiosity.

      People started looking at reversibility in earnest when quantum computing came on the scene. A quantum computer HAS to be reversible in order to function. That made it a very important field of study.

      We only recently realized that reversible circuits are also more energy efficient. So basically, we didn't do it before because we didn't know. There is no "catch."

      • Although it should have been obvious all along, and probably was if anyone cared. It follows directly from thermodynamics, although the result is a little odd; in essence, there's no theoretical lower bound on how much energy it takes to compute; it's forgetting that takes energy. Ergo, in theory, a computer that never loses any data (i.e., is reversible) doesn't necessarily use any energy.
    • Reversible computing is severely limited in terms of normal processor operations. This means that operations such as modular multiplication start to build up a lot of data, since you need to 'remember' the two numbers multiplied in order to undo the operation.

      Consider multiplying two numbers, a and b. So a * b = c. Now to undo the operation you only need c and either a or b. So with normal multiplication (or addition, etc) you have two inputs as such and you need to remember two outputs. This gets worse
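      A sketch of that bookkeeping (illustrative, not from the article; for modular multiplication, the remembered factor must be coprime to the modulus for the inverse to exist):

      ```python
      # A reversible modular multiply must carry one input along with the
      # product -- the product alone cannot be inverted.
      from math import gcd

      def rev_modmul(a, x, n):
          # (a, x) -> (a, a*x mod n); invertible only when gcd(a, n) == 1
          assert gcd(a, n) == 1
          return a, (a * x) % n

      def rev_modmul_inv(a, y, n):
          # Undo using the remembered input a: x = y * a^-1 mod n
          # (pow(a, -1, n) computes the modular inverse; Python 3.8+)
          return a, (y * pow(a, -1, n)) % n

      a, x, n = 7, 13, 60
      _, y = rev_modmul(a, x, n)
      assert rev_modmul_inv(a, y, n) == (a, x)   # the operation is undone
      ```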
  • by burgburgburg (574866) <splisken06NO@SPAMemail.com> on Tuesday November 11, 2003 @03:32PM (#7447473)
    fry eggs [hex-tech.co.uk] if this sort of thing becomes the norm?

    Insensitive clod!

  • by smd4985 (203677) on Tuesday November 11, 2003 @03:32PM (#7447477) Homepage
    your computer could spit out: "these CPU cycles made of 75% post-CPU-consumed waste" :)
  • by Nom du Keyboard (633989) on Tuesday November 11, 2003 @03:33PM (#7447478)
    In fact, unless reversible computing is achieved, computer chips are expected to reach their maximum performance capabilities within the next three decades

    Boy, that's something to worry about today. I'll just have to find a spot to insert it on my Worry List. Maybe I can drop Global Warming to make space.

  • by winkydink (650484) * <sv.dude@gmail.com> on Tuesday November 11, 2003 @03:33PM (#7447482) Homepage Journal
    Here I thought it was an Intel box that, when turned inside out, became a Mac.

    Sigh.

  • A Perpetual Computing Machine!

    Turn it on and it generates cycles from microscopic springs and pulleys, we call "Springons" that can recover the computing power expended, sending the cpu "Wheel" into another revolution.

    --

    funny how all these machines require a battery or plug.... :-)
  • Energy efficient, got you. But as an Electrical Engineer, and a seasoned network engineer, I've never heard the term before. And I'm pretty damn well read.
  • by Anonymous Coward
    Should be "Reversible computing rears its butt again"
  • .....it also could boost their speed, because these chips are becoming so fast that the heat they generate limits the speed at which they can operate without overheating and malfunctioning.

    Bah, this idea is nothing new. From the two SGI's with two 20in displays, two macs and five displays attached to them, my tiny little first apartment had more than enough heat production to warm things up. :-)

  • The idea here is that when you use any 2->1 gate, such as an AND gate, you lose one bit of information. Since information is physical, you have to dissipate the corresponding energy somewhere, usually as heat. If instead of a 2->1 gate you used 2->2 gates, where one bit is the AND product and the other is enough information to reverse the operation, you aren't discarding any information and thus aren't dissipating any heat.
    • Re:Theory (Score:5, Interesting)

      by TeknoHog (164938) on Tuesday November 11, 2003 @04:39PM (#7448193) Homepage Journal
      Entropy S = k ln W
      (k = Boltzmann's constant, W = number of states)
      Information (in bits) I = log_2 W = ln W / ln 2
      Hence S = kI ln 2, or roughly S = kI.

      Heat Q = ST ≈ kTI.

      Let's say we destroy 100 gigabits of information at a temperature of 300 K. Since k = 1.38E-23 J/K, this means a heat of about 4E-10 Joules. Which is not very much, and does not really contribute to the heat produced by CPUs.

      In fact, I think this is the way to find a theoretical minimum for the heat produced from information processing. We can try and make more efficient processors to get closer, just like we can increase the efficiency of engines to get closer to the thermodynamic limit.
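      The arithmetic above is quick to check (a sketch using the parent's numbers; the exact Landauer bound carries the factor of ln 2 the rough version drops):

      ```python
      # Landauer-limit estimate for erasing 100 gigabits at room temperature.
      from math import log

      k = 1.38e-23      # Boltzmann's constant, J/K
      T = 300.0         # temperature, K
      bits = 100e9      # 100 gigabits of information destroyed

      q_exact = k * T * bits * log(2)   # exact bound, ~2.9e-10 J
      q_rough = k * T * bits            # the "roughly S = kI" shortcut, ~4.1e-10 J

      # Either way, well under a nanojoule -- many orders of magnitude below
      # what a real CPU dissipates, so current chips sit nowhere near the
      # thermodynamic floor for information processing.
      ```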

  • I was so used to cooking breakfast on the top surface of my Apple Cube [wired.com]. I'll miss this if the energy gets recycled elsewhere, and I'll likely have to go buy a Foreman grill to make up for the loss of this nifty cooking appliance.
  • Finally a hardware device that can decrypt backwards writing.

    Now if only they'll invent Transposed Computing that can hardware decrypt Rot13.
  • I wish my computer was reversible in the sense that I could press the rewind button and everything I did would happen in reverse.

    I'm telling it as a joke, but I've always wondered why no chip designer has ever built this. It should be possible to log every instruction that passes through the CPU and play them back in reverse order. Imagine how cool that would be!
  • It seems to me that what they are saying is that most of the heat comes from memory elements being discharged. What about the energy spent flipping a CMOS gate? Isn't that where most of the power is lost? I mean, even CPUs that have only very small on-die caches still generate a lot of heat.
    • That's what I thought as well - that during the switching of a CMOS gate from one state to the other, the positive supply and ground are momentarily shorted (well, there's still a lot of resistance there, but compared to normal, quite a lot of current flows).

  • Thermodynamics 101 (Score:5, Informative)

    by majid (306017) on Tuesday November 11, 2003 @03:42PM (#7447595) Homepage
    You get the most energy efficiency from a machine when it works in a thermodynamically reversible way; for instance, the most efficient heat engine possible is one that uses a Carnot cycle. Most real-world engines use different, less efficient cycles like the Otto or Stirling cycle because they yield higher speeds or torque.

    Losing the ability to reverse computations means increasing entropy and thus lower efficiency. Interestingly, there is a whole class of functional programming methods that is intrinsically reversible (because evaluating expressions without side effects is reversible).

    The best explanation of the issues involved is in Richard Feynman's "Lectures on Computation", which shows how thermodynamics constrains what is ultimately possible with a computer.
    • by iggymanz (596061) on Tuesday November 11, 2003 @03:48PM (#7447678)
      Funny, this topic has popped up again after I saw it featured in Scientific American over 10 years ago.... The real problem is that no one wants to halve the number of useful gates on a chip in order to build all the extra circuitry required to reduce (of course, not eliminate; entropy will still increase, though at a lessened rate) the thermodynamic cost of "forgetting" data.

      I attack instead the basic premise, that there is a shortage of energy, or that we must accept lower standard of life or lower capability in our machinery. What we DO need to do is get smarter about where we get our energy - instead of adding to net heat budget and pollution budget of earth getting really serious about solar energy (which might just mean making hydrocarbon fuel out of plant & suitable waste materials)
      • by Theaetetus (590071)
        I attack instead the basic premise, that there is a shortage of energy, or that we must accept lower standard of life or lower capability in our machinery. What we DO need to do is get smarter about where we get our energy - instead of adding to net heat budget and pollution budget of earth getting really serious about solar energy (which might just mean making hydrocarbon fuel out of plant & suitable waste materials)

        Um, that's not the basic premise. The basic premise is that with each bit of informat

      • by greg_barton (5551) *
        I attack instead the basic premise, that there is a shortage of energy, or that we must accept lower standard of life or lower capability in our machinery.

        Then you are attacking a straw man, at least with respect to the article. A reversible chip would be no less capable. In fact, in the long run, it would be more capable. Less energy dissipated as heat means we can continue to use the same materials far into the future with faster and faster chips. As it stands, silicon will become unusable once the hea
  • This computer uses 100% recycled electrons. No electrons were destroyed or harmed by this computer.
  • From the article:
    "In theory, these oscillators could recapture most of the energy expended in a calculation and reuse it other calculations."
    What the hell does this mean?
    4(0100) + 3(0011) = 7(0111)
    Ok, now, let's take that 0111 and use the bits for the answer to 7+8.
    Is that really what they're saying?

    "The concept is somewhat analogous to hybrid cars now on the market that take the energy generated during braking and recycle it into electricity used to power the car."
    Ummm, no. The car analogy would work if w
    • "The concept is somewhat analogous to hybrid cars now on the market that take the energy generated during braking and recycle it into electricity used to power the car."
      Ummm, no. The car analogy would work if we captured the waste heat thrown off, and converted it back to electricity. The concept here is that we don't waste the heat to begin with. This would be analogous to driving back to point A in reverse and reclaiming the fuel.
      How could this possibly work?

      It's a little esoteric, but stay with me..

  • ..collaboration with this guy [timecube.com] would be productive.
  • We already have reversible robots [transformers.com]. Why not reversible computers?

    "Apple Toast-Or! From G5 Power, to nice warm toast, back to G5 Power again!"

  • In fact, unless reversible computing is achieved, computer chips are expected to reach their maximum performance capabilities within the next three decades, effectively halting the rapid advances in speed that have driven the information technology revolution, Frank said. "Reversible computing is absolutely the only possible way to beat this limit," he said.

    Anyone who is this emphatic about his own technology, and that it is the "only possible way" is trying to pump stock prices, plain and simple.

    All

  • I commented on this University of Florida news release a week ago on my blog [weblogs.com]. Not only will you see more references and details than in the news release, but you'll also read comments by Michael Frank, the UF assistant professor behind this research effort.
  • by MrLint (519792) on Tuesday November 11, 2003 @03:53PM (#7447745) Journal
    Imagine a computer that ran on heat and got colder the more you used it... then I could play video games and have ice-cold beer.

    Oh, that's not what they mean by reversible? Damn.
  • An ARM processor running at a few hundred MHz uses around 1 W and does not even get warm to the touch -- no heatsink required. The same goes for MIPS, SH4, etc. Most of the heat in an x86 is caused by trying to make an inherently inefficient architecture work faster. Intel, quite a long time ago, demoed an ARM core running at over 1 GHz. Making a 2 or 3 GHz ARM is not impossible; it's just that the ARM niche is portable devices etc., and they have not been put into the x86-dominated space

    I remember a thread where peopl

  • For more info... (Score:3, Informative)

    by Niten (201835) on Tuesday November 11, 2003 @03:59PM (#7447798)

    You can find more information about Dr. Frank's research on his homepage [ufl.edu].

  • Stirling engine (Score:5, Interesting)

    by bs_02_06_02 (670476) on Tuesday November 11, 2003 @04:01PM (#7447813)
    Isn't it just easier to use the excess heat to power a Stirling engine to recapture waste energy?
    Maybe the Stirling idea is going too far.
    How about a more efficient circuit? It's been a while since college, but isn't excess heat a sign that the circuit is inefficient?
    While it's not completely frivolous research, it's not the first avenue I would approach when looking at this problem. It seems more difficult and time-consuming to add in circuitry to re-use the energy to perform other actions inside of a CPU. It seems like you'd have a better chance at compounding the problem, rather than helping it.
    However, make the circuit more efficient, you'll generate less heat. That would be my first goal. What kind of efficiency do they get with today's CPUs?

    With this reversible thinking, I have an idea. I need a little help from the anti-SUV crowd... wouldn't all gasoline engines be better off with really big flywheels?
  • by Cordath (581672) on Tuesday November 11, 2003 @04:05PM (#7447855)
    Asynchronous Logic (i.e. no clock) has many of the same benefits, as well as potentially increasing the speed of processors significantly.

    A rather large portion of the heat generated by a processor comes just from the clock signal propagating to every bloody logic gate in the mess, including the parts not in use. With asynchronous logic, if a part isn't in use, it gets no current. Of course, clock signals have been used for the last half century for a reason: they time signals so that you don't have 3 digits of a number showing up before the rest, etc. With asynchronous logic you have to worry about path lengths down to the picometer, since you no longer have the clock acting as a traffic warden. The biggest holdup for asynchronous logic has been the immense design difficulty involved, but that is changing as new design tools are developed.

    Anyways, the big reason why asynchronous logic is going to arrive on the processor scene long before reversible logic is that it already has. Intel and other manufacturers are already incorporating asynchronous logic into their designs, and plan to increase the amount used as time goes by. The different manufacturing techniques required are slowly being phased in. Reversible computing, on the other hand, has virtually no chance of showing up within the decade.

    My point is that the article linked made no allowance for the increasing use of asynchronous logic. It's going to have a significant impact on heat dissipation in the next few years.
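    To make the "no clock" idea concrete, here's a toy simulation (my own illustration, not from the article) of the Muller C-element, the basic primitive asynchronous designs use in place of a clock: its output changes only when both inputs agree, which is how downstream logic "waits" until all of its data has actually arrived.

```python
def c_element(a, b, prev):
    """Muller C-element: the output follows the inputs only when they
    agree; otherwise it holds its previous state."""
    return a if a == b else prev

# Inputs agree -> output follows them.
assert c_element(1, 1, 0) == 1
assert c_element(0, 0, 1) == 0
# Inputs disagree -> output holds its old value, so downstream logic
# effectively stalls until both signals have arrived -- no clock needed.
assert c_element(1, 0, 0) == 0
assert c_element(0, 1, 1) == 1
```

    Chains of these elements form the request/acknowledge handshakes that replace the global clock in asynchronous pipelines.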
    • Asynchronous blows for non-trivial computation.

      You are correct, the clock signal needs to get stronger/faster as speed increases.

      But try designing a whole motherboard using asynchronous design...it would be VERY hard.

      Hence nobody has (that I am aware of).

      Clocked is much simpler...

      Another benefit of asynchronous would be speed: instead of something taking 1/3 of a clock cycle and having to wait, it just finishes when it's done.
      • by Yobgod Ababua (68687) on Tuesday November 11, 2003 @05:17PM (#7448512)
        "Asynchronous blows for non-trivial computation."

        Please cite your references or evidence to this statement if you wish to be taken seriously.

        Several companies are currently working on complex and high-performance designs using asynchronous techniques. It's true that it is currently more difficult, predominantly because current design tools are all geared towards generating and testing "standard" clocked logic, but it is being done and it does not by any stretch "blow".

        It will be quite some time before all of the components on a motherboard are asynchronous, but groups -have- designed processors, memory controllers, and other components in asynchronous.

        For but the briefest of examples... check out this article [eetimes.com] or this article [technologyreview.com]. No, it isn't the answer to everything... but it's much farther along than you seem to realize.

  • by John Sokol (109591) on Tuesday November 11, 2003 @04:16PM (#7447939) Homepage Journal
    This book by Richard Feynman (Feynman Lectures on Computation) is based on a series of lectures given at Caltech in the mid 1980s.

    In it he discusses Reversible Computation and the Thermodynamics of Computing and quantum computing.

    As usual, Feynman was way ahead of his time.

    I highly recommend this book.

    The basic idea is that heat is only generated when information is destroyed. So don't destroy information when performing computations.

    How this relates to something actually practical is hard to say, but it didn't strike me as something that would apply to silicon very easily.

    John
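    To make "don't destroy information" concrete, here's a toy sketch (my own illustration, not from Feynman's book) of the Toffoli gate, the classic reversible logic gate. It never erases a bit: it's its own inverse, so any computation built from it can be run backwards, and with one input preset it computes AND while carrying its inputs along as the "garbage" bits the earlier posters mentioned.

```python
def toffoli(a, b, c):
    # Reversible controlled-controlled-NOT: flips c only when
    # both control bits a and b are set; a and b pass through.
    return a, b, c ^ (a & b)

# The gate is its own inverse: applying it twice recovers the inputs,
# so no information is ever destroyed.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert toffoli(*toffoli(a, b, c)) == (a, b, c)

# AND as a by-product: with c preset to 0, the third output is a AND b,
# while the pass-through bits keep the whole operation invertible.
_, _, and_ab = toffoli(1, 1, 0)
assert and_ab == 1
```

    An ordinary AND gate maps two input bits to one output bit, so it must erase information (and, per Landauer's argument, dissipate heat); the extra outputs here are the price of avoiding that.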

    • This book by Richard Feynman is [from the ] 1980s. In it he discusses Reversible Computation and the Thermodynamics of Computing and quantum computing.

      As usual, Feynman was way ahead of his time.


      Reversible computing had been proposed twenty years earlier by an IBM engineer and was widely recognized as an important idea, so one can hardly credit Feynman for this one.

      There has been steady research on reversible computation over the last ten years or so. In fact the best paper award at one of the major CS con
  • Size penalty (Score:4, Insightful)

    by toybuilder (161045) on Tuesday November 11, 2003 @04:24PM (#7448034)
    The problem with reversibility is that, for any given semiconductor process, it effectively doubles the number of gates that need to be built on the chip, and manufacturers are currently more interested in cramming more features into the chip than in making it more efficient.

    It might be theoretically possible to build smaller and faster chips by reducing the energy/thermal issues, but I suspect most companies are not willing to take that leap of faith.

    I bet the first place we'll see reversible gates used in a full-fledged MCU/CPU will be a mobile/handheld processor running a reversible version of an older (fewer-gate) core on the latest processes...
    • Re:Size penalty (Score:3, Insightful)

      by foniksonik (573572)
      What about elegant design? From reading the summary it sounds like they want to do more than just add more gates to 'reverse' the computations... they want to use new design methods such as oscillators and springs to capture and hold the energy as potential which would then be reused when needed in an alternate process.

      The point is not to over-engineer this but to intelligently engineer it. It will take more R&D time but will hopefully gain enough to justify the expense.

  • by John.P.Jones (601028) on Tuesday November 11, 2003 @04:36PM (#7448168)
    If you are interested in reversible circuits, read what Feynman had to say about them in his lectures on computation.

    While they may be helpful for certain things, especially quantum computers (but that is a whole different story), there is a snag: they are deterministic. Great CS people like Rabin have taught us the value of probabilistic Turing machines, and today we use them as the basis for determining what is computationally efficient (BPP; see Michael Sipser's intro to computation and complexity). Every once in a while you have to take a non-reversible step to pick a random number (as well as throw away garbage you don't want to store any more), and this negates the thermodynamic advantages of reversible computing.

    No Free Lunch

  • by stienman (51024) <[adavis] [at] [ubasics.com]> on Tuesday November 11, 2003 @09:47PM (#7450356) Homepage Journal
    You want to cut back on the 100W of heat being released by today's processors?

    100W?

    I piss 100W when I get up in the morning.

    100W will cost you $79 [US] a year if you run it hard and constant, every second, 24/7/365 (at $0.09 per kWh).
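    The $79 figure checks out; here's the arithmetic spelled out (the rate is the poster's, the script is just my own check):

```python
watts = 100
hours_per_year = 24 * 365                      # 8760 hours, run 24/7/365
kwh_per_year = watts / 1000 * hours_per_year   # 876 kWh per year
cost = kwh_per_year * 0.09                     # at $0.09 per kWh
print(round(cost))  # 79
```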

    In the US, each average family has more power, more cheaply than some cities in other parts of the world.

    Furthermore, the energy is still going to be released as heat at some point. Where else does it go??? Sure, you might be able to switch a given transistor 3-4 times with the same energy, but once the voltage and current drop, the transistor no longer switches. Furthermore, the chips are already being run at 1.x volts, which is barely enough to account for the voltage drop anyway. To get enough energy back after a transistor you'd have to put in a greater initial voltage, wasting more heat.

    Furthermore, more transistors means more complexity, more electricity, and more speed problems. I'm sure there's some savings, but once you add everything up it simply isn't worth it for mainstream desktop processors.

    It may be worthwhile in battery operated, low speed, high efficiency processors, but it'll be a long time before a wall is hit that only this technology can help with.

    The reality is that this guy's patent is running out, and he's shopping it around to see if he can eke anything out of it.

    -Adam

Two is not equal to three, even for large values of two.
