'Reversible' Computers More Energy Efficient
James Clark writes "As Congress continues work on a federal energy bill, a group of University of Florida researchers is working to implement a radical idea for making computers more energy efficient -- as well as smaller and faster." Reversible computing rears its head again.
Re:Vaporware? (Score:5, Informative)
The idea is to (down at the gate level) keep everything reversible. For example, current OR gates are not reversible (given a true output you can't definitively tell what either input was individually). If you add extra outputs that carry the inputs through alongside the OR result, the gate becomes reversible. However, since you are just using it for OR, you are free to ignore the extra bits you added on to make it reversible.
The bit doesn't help your computation in the sense of the answer you are looking for, but it can make things more energy efficient at the gate level.
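The construction above can be sketched in a few lines of Python. One caveat worth noting: a strict two-in/two-out reversible OR is impossible (OR sends three of the four input pairs to 1, so no single extra bit can tell them apart), which is why the standard trick is a three-wire, Toffoli-style gate with an ancilla input. This is an illustrative sketch, not the circuit from the article:

```python
from itertools import product

def reversible_or(a, b, c):
    """Three-wire reversible OR: (a, b, c) -> (a, b, c XOR (a OR b)).

    The inputs pass through unchanged; with the ancilla c = 0 the
    third wire carries a OR b, and the first two wires are the
    "extra" bits you are free to ignore."""
    return (a, b, c ^ (a | b))

# The map is its own inverse (applying it twice restores the state),
# so it is a bijection on all 8 states -- i.e. reversible.
for state in product((0, 1), repeat=3):
    assert reversible_or(*reversible_or(*state)) == state

# With the ancilla cleared, the third output is the ordinary OR.
for a, b in product((0, 1), repeat=2):
    assert reversible_or(a, b, 0)[2] == (a | b)
```

Because the inputs ride through to the outputs, nothing is erased, which is exactly the property the energy argument below depends on.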
Re:Vaporware? (Score:3, Informative)
Frank, who first worked on reversible computing as a doctoral student at the Massachusetts Institute of Technology, heads UF's Reversible & Quantum Computing Research Group. Among other recent publications and presentations, he presented three papers dealing with topics related to reversible computing this summer, including "Reversible Computing: Quantum Computing's Practical Cousin" at a conference in Stony Brook, N.Y.
and here:
Frank currently is trying to persuade major chipmakers to direct more of their research-and-development resources toward reversible technologies.
Re:Sounds good, but... (Score:5, Informative)
People started looking at reversibility in earnest when quantum computing came on the scene. A quantum computer HAS to be reversible in order to function. That made it a very important field of study.
We only recently realized that reversible circuits are also more energy efficient. So basically, we didn't do it before because we didn't know. There is no "catch."
From the article... (Score:3, Informative)
It has at least gotten to the chip level so far...
Re:Reversing entropy? (Score:2, Informative)
Re:WTF is reversable computing? (Score:1, Informative)
If I tell you that (x && y) == 0, can you tell me what x and y are? No: it could be (0, 0) or (0, 1) or (1, 0). Therefore, the operation AND is not reversible.
A reversible computer always performs operations that can be uncomputed. Given the outputs, you can reconstruct the inputs. This means, for one thing, that a reversible computer has no concept of boolean AND. Or OR, for that matter. NOT is reversible, though.
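The irreversibility of AND can be checked mechanically by counting preimages; a minimal sketch:

```python
from itertools import product

def preimages(gate, output):
    """All input pairs that a two-input gate maps to the given output."""
    return [(x, y) for x, y in product((0, 1), repeat=2) if gate(x, y) == output]

AND = lambda x, y: x & y

# Three distinct inputs collapse onto the same output, so AND cannot be undone.
assert preimages(AND, 0) == [(0, 0), (0, 1), (1, 0)]

# NOT, by contrast, is a bijection on {0, 1}: each output has exactly one preimage.
NOT = lambda x: 1 - x
assert {NOT(0), NOT(1)} == {0, 1}
```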
Thermodynamics 101 (Score:5, Informative)
Losing the ability to reverse computations means increasing entropy and thus lower efficiency. Interestingly, there is a whole class of functional programming methods that is intrinsically reversible (because evaluating expressions without side effects is reversible).
The best explanation of the issues involved is in Richard Feynman's "Lectures on Computation", which shows how thermodynamics constrains what is ultimately possible with a computer.
Read the Feynman book (Score:4, Informative)
He has a great deal of info about how reversible computers work and why they save energy.
Re:Cool (Score:3, Informative)
What's going on here is a circuit implementation detail. In a normal chip, when you have a bit set to 1 and a bit set to 0 and you flip them both, the bit set to 0 is charged with fresh energy from the power supply and the energy in the bit set to 1 is converted to heat. In this proposed system, the charges would be moved from the 1 to the 0 with no loss and no additional draw on the power supply. Less work, same informational content.
Re:What about cars? (Score:1, Informative)
The energy lost is through exhaust heat, water heating, and friction.
Re:What about cars? (Score:5, Informative)
You mean, the power companies are going to force Intel to make their chips more wasteful, causing progress to halt and people to buy fewer Intel chips? Yeah, sure.
I mean, there's paranoia, and there's paranoia.
Come on, wake up. I won't claim that kind of thing never happens but by and large capitalism is too powerful; Intel isn't going to act against its own best interests for any mere money the power companies can throw at it, because it won't be worth it. Growth is worth more than mere money to Intel. (If you don't understand why, go learn about business; the explanation is too complicated for a Slashdot posting.)
The power company is made of people like you and me; far too busy to hover over various scientific journals and swoop around like super-villains repressing "dangerous" information.
Re:What about cars? (Score:3, Informative)
Re:Vaporware? (Score:5, Informative)
The concept is somewhat analogous to hybrid cars now on the market that take the energy generated during braking and recycle it into electricity used to power the car.
So, the logical realm is no different here. Physically, and electrically, there is a big difference from existing computers. Now, when a bit changes from 1->0, the voltage (accumulated charge) is simply shorted to ground (via a resistive path that dissipates heat). That energy is lost. In a reversible computer, that charge would be stored, in the electrical equivalent of a spring or flywheel in a mechanical system. So, next time it needs to go 0->1, the energy is sitting there, ready to be re-used (stored in the spring's compression or the flywheel's rotation).
I assume these electrical "springs or flywheels" need to be physically close to the transistors they're storing energy for. If all transistors shared common storage, the heat loss (and time delay) to get the energy back to where it's needed would defeat the entire purpose.
In the article, they mention that current prototypes use oscillators to store the energy (which are more like a flywheel than a spring, to continue the mechanical analogy), but the efficiency is not quite good enough to be called "reversible". Too much energy is lost in storing and un-storing the energy. The current work is focused on improving the efficiency of storing and un-storing energy from state changes.
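The time/energy trade-off behind this "adiabatic" switching can be put in rough numbers. The component values below are illustrative assumptions, not figures from the article:

```python
# Rough energy accounting for slow (resonant/adiabatic) switching versus
# abrupt switching of a single logic node. Values are assumed, not measured.
C = 1e-15   # node capacitance, farads (~1 fF, a plausible gate load)
R = 1e3     # series resistance of the charging path, ohms
V = 1.0     # supply voltage, volts

# Abruptly charging C through R dissipates C*V^2/2 in the resistor,
# no matter how small R is made.
E_abrupt = 0.5 * C * V**2

# Ramping the supply slowly over a time T >> RC dissipates only about
# (R*C/T) * C * V^2 -- arbitrarily little energy, at the cost of time.
def E_adiabatic(T):
    return (R * C / T) * C * V**2

for T in (1e-12, 1e-10, 1e-8):
    print(f"T = {T:.0e} s: {E_adiabatic(T):.2e} J vs abrupt {E_abrupt:.2e} J")
```

The slower the ramp, the less heat, which is why the efficiency of the storing/un-storing step, not the logic itself, is the current bottleneck.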
However, as a chip designer, I know that oscillators are usually (1) much much bigger than simple logic gates and (2) much more difficult to design with (it's analog design stuff, really). So, my concerns are (1) how much bigger will dice need to be to use this system (linear increase in die size equals exponential increase in manufacturing cost) and (2) how much longer is it going to take to close a design with all those little analog cells all over the place.
I don't even want to think about the implications for STA (static timing analysis) or LVS (layout versus schematic verification) -- it makes my head hurt.
Re:HOW does it make it more efficient? (Score:3, Informative)
Re:HOW does it make it more efficient? (Score:5, Informative)
Energy is also lost during switching, as the charge needed to switch is moved around. This is called dynamic power.
Reversible computing endeavors to reduce/eliminate dynamic power. It does nothing for static power. A long time ago, dynamic power was dominant and static power was negligible. Now, gates are so small, static power is approaching the same order of magnitude as dynamic.
So, even though they're only addressing about half of the problem, it would be great to have the magnitude of that big problem halved.
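The dynamic-power term being targeted follows the standard CMOS back-of-envelope formula P = alpha * C * V^2 * f. The figures below are illustrative assumptions, not measured chip data:

```python
# Back-of-envelope CMOS dynamic power, P = alpha * C * V^2 * f.
# All values here are assumed for illustration only.
alpha = 0.1      # activity factor: fraction of nodes switching each cycle
C = 1e-9         # total switched capacitance, farads
V = 1.2          # supply voltage, volts
f = 2e9          # clock frequency, Hz

P_dynamic = alpha * C * V**2 * f
print(f"dynamic power ~ {P_dynamic:.3f} W")
```

The quadratic dependence on V is why supply voltages keep dropping; reversible/adiabatic techniques attack the same term from a different direction, by recovering the charge instead of dumping it.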
Basic Problems with Reversible Computing (Score:2, Informative)
Consider multiplying two numbers, a and b. So a * b = c. Now to undo the operation you only need c and either a or b (assuming the one you keep is nonzero). So with normal multiplication (or addition, etc.) you have two inputs, and you need to remember two outputs. This gets worse with modular multiplication (depending on the exact setup), since you may need to remember a, b, and c to undo the operation.
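The bookkeeping can be sketched for the modular case. As an assumption for this sketch, take gcd(a, N) = 1, which is exactly the situation Shor-style modular arithmetic arranges; then carrying a through makes the multiply invertible:

```python
from math import gcd

def mod_mul(a, x, N):
    """Reversible modular multiply: (a, x) -> (a, a*x mod N).

    Invertible as long as gcd(a, N) == 1; a is carried through
    unchanged so the operation can be undone."""
    assert gcd(a, N) == 1
    return a, (a * x) % N

def mod_mul_inverse(a, y, N):
    """Undo mod_mul using the modular inverse of a (Python 3.8+ pow)."""
    return a, (pow(a, -1, N) * y) % N

a, x, N = 7, 4, 15
a2, y = mod_mul(a, x, N)
assert mod_mul_inverse(a2, y, N) == (a, x)
```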
When you think of standard computer operations, most of them are lossy. The problem with reversible computing is coming up with algorithms that are reversible and still useful. This is the case with quantum computers -- quantum operations are not allowed to lose info, so they are reversible. The most famous quantum algorithm, Shor's Algorithm, will factor very large integers quite easily on a quantum computer. It is actually a probabilistic algorithm, and quite complicated (and interesting). Although the entire operation is not reversible (and hence not all quantum), the key components are indeed reversible. Other than Shor's Algorithm, there are not a whole lot of algorithms for quantum computers, because they are reversible by nature and, as such, limiting to work with.
I agree with the author of the article that more research should be done on reversible chips, algorithms, etc. However, I feel that people should understand the limitations inherent in such a system.
For more info... (Score:3, Informative)
You can find more information about Dr. Frank's research on his homepage [ufl.edu].
Re:Thermodynamics 101 (Score:3, Informative)
Um, that's not the basic premise. The basic premise is that with each bit of information lost, that bit is converted to heat. More bits lost = more heat = limit on how fast a processor can be due to temperature-caused failures. Removing that problem results in much faster theoretical limits on processors.
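The heat-per-bit figure here is Landauer's limit, k_B * T * ln 2 per erased bit. A quick calculation puts it in perspective; the "real gate" energy below is a ballpark assumption, not a measured value:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

# Landauer's limit: erasing one bit must dissipate at least k_B * T * ln 2.
E_landauer = k_B * T * math.log(2)
print(f"minimum energy per erased bit: {E_landauer:.2e} J")

# Assumed ballpark switching energy for a real gate, for scale only;
# practical logic dissipates orders of magnitude more than the limit.
E_gate = 1e-15
print(f"real gates waste roughly {E_gate / E_landauer:.0e}x the minimum")
```

The gap between the two numbers is the headroom reversible designs are chasing: the limit only applies to bits you actually erase.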
-T
don't know exact details (Score:3, Informative)
So I don't know how to explain in terms of currents and transistors, but it is similar to what mikee is saying in this thread (that thermodynamic laws say that destroying information will always consume energy).
The reason quantum computation guys tend to know about this area is because all logical operations on a quantum computer (except for the measurement at the end) are reversible operations.
Feynman Lectures on Computation (Score:5, Informative)
In it he discusses Reversible Computation and the Thermodynamics of Computing and quantum computing.
As usual, Feynman was way ahead of his time.
I highly recommend this book.
The basic idea is heat is only generated when information is destroyed. So don't destroy information when performing computations.
How this relates to something actually practical is hard to say, but it didn't strike me as something that would apply to silicon very easily.
John
Re:WTF is reversible computing? (Score:3, Informative)
Actually, you can add extra outputs (carrying the inputs through) to any binary logic gate in order to make it reversible; most reversible computing designs focus on that, and the logic circuits themselves ignore the secondary outputs...
Re:Resistance == Heat (Score:2, Informative)
So it looks as though, if we could get rid of the resistance, we could have essentially "perpetual" computing, much like we have essentially "perpetual" current flow in a superconducting ring... except this time with switches directing the electron flow amongst many parallel channels.
I think you are onto the superconducting computer.
I read the main article and was kinda confused about the use of "resonators" to store energy with much less loss. I design a lot of switching power supplies, and I use those techniques a lot to boost the efficiency, as well as reduce stress, on my power supply components. By doing resonant designs, I can use stray capacitances to my advantage, storing their energy in inductors during switching intervals, then re-introduce the stored energy back into the circuit at the proper time to make some really cool power converters.. ( pun intended ).
But here's the problem.. my frequencies are determined by the laws of physics and are either sinusoidal or sinusoidal derivatives. Data is not. I would find it hard to store energy in some sort of inductor, as the energy will bounce back at me in a given time... and if I am not prepared to route the energy in a constructive way when it comes back at me, it's wasted; the only thing it does then is expend its energy heating up and stressing my switch. I have looked at enough core dumps to know data is not periodic.
It doesn't look like an easy thing to do to try to recover energy from the edges of many switching lines that are all switching at asynchronous times. I would have to know a lot more about this before I could really generate a cogent comment.
The Fredkin Gate (Score:1, Informative)
Re:Vaporware? (Score:5, Informative)
Actually, you are wrong, in that the two things are very intimately related. I will assume that, as a chip designer, you are aware of what AND, OR, and NOT gates are, and that NAND is an example of a universal gate. NAND, however, is not reversible; you cannot in general determine the inputs by looking at the output.
The Fredkin Gate is an example of a reversible gate. As it happens, it is impossible to do universal reversible computing with two-input gates. The Fredkin Gate (a controlled swap: two data inputs, two data outputs, and a control wire that passes through unchanged) has the property that it is
reversible (Fredkin inverts Fredkin), and
it has the same number of non-zero outputs as it does non-zero inputs.
To achieve reversible computing, you need reversible gates. Furthermore, with reversible gates, you can perform any computation with an arbitrarily small amount of energy; the catch is that you need more time (see adiabatic circuits, Carnot engines).
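The claimed properties of the Fredkin gate can be verified by brute force over all eight input states; a minimal sketch:

```python
from itertools import product

def fredkin(c, x, y):
    """Fredkin gate (controlled swap): swap x and y when control c is 1."""
    return (c, y, x) if c else (c, x, y)

states = list(product((0, 1), repeat=3))

# Reversible: the gate is its own inverse ("Fredkin inverts Fredkin").
for s in states:
    assert fredkin(*fredkin(*s)) == s

# Conservative: the number of 1s on the wires is preserved.
for s in states:
    assert sum(fredkin(*s)) == sum(s)

# Universal building block: with y tied to 0, the last output is b when
# a = 1 and 0 otherwise -- i.e. it computes a AND b.
for a, b in product((0, 1), repeat=2):
    assert fredkin(a, b, 0)[2] == (a & b)
```

Since AND plus NOT (and fan-out, which Fredkin also provides) suffices for any boolean function, this one conservative, self-inverse gate is enough to build a reversible computer.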