'Reversible' Computers More Energy Efficient 330
James Clark writes "As Congress continues work on a federal energy bill, a group of University of Florida researchers is working to implement a radical idea for making computers more energy efficient -- as well as smaller and faster." Reversible computing rears its head again.
"Reversible" a bad name? (Score:5, Insightful)
Sounds good, but... (Score:2, Insightful)
It sounds good, but what's the catch?
Re:Thermodynamics 101 (Score:5, Insightful)
I attack instead the basic premise that there is a shortage of energy, or that we must accept a lower standard of living or lower capability in our machinery. What we DO need to do is get smarter about where we get our energy -- instead of adding to the net heat and pollution budget of the Earth, get really serious about solar energy (which might just mean making hydrocarbon fuel out of plants and suitable waste materials).
Re:"Reversible" a bad name? (Score:2, Insightful)
Re:Cool (Score:2, Insightful)
Actually, you are just spewing, at least kind of. As long as the Second Law of Thermodynamics holds true, there is no "conservation of information" law in this universe.
What's really happening here is a lot simpler. In a digital computer, information is stored as a series of energy states. A bit is either 1 or 0, with a 1 meaning that a circuit is energized, while a 0 means that the circuit is not energized. The important thing is that both energized and non-energized circuits hold exactly the same amount of information.
The only thing that this article is talking about is storing the energy from the energized bits in an "energy cache" once the 1 has been switched back to 0, so it can then be used to power other bits. It's really not a very radical idea at all. The only semi-radical thought here is that it would be worthwhile to recover this energy, and that chip manufacturers would benefit from investing in this research.
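The "energy cache" idea above is essentially adiabatic switching. As a back-of-envelope sketch (not the article's actual numbers -- the capacitance, voltage, resistance, and ramp time below are all illustrative placeholders), you can compare the energy a conventional CMOS gate dumps as heat per bit-flip with what a slowly ramped, energy-recovering gate dissipates:

```python
# Back-of-envelope comparison of conventional vs. adiabatic switching.
# All component values are assumed for illustration, not from the article.

C = 1e-15      # node capacitance: 1 fF (assumed)
V = 1.0        # supply voltage: 1 V (assumed)
R = 10e3       # effective channel resistance: 10 kOhm (assumed)
T = 10e-9      # adiabatic supply ramp time: 10 ns (assumed)

# Conventional switching: the 1/2 C V^2 stored on the node is dumped as
# heat every time the bit is discharged back to 0.
e_conventional = 0.5 * C * V**2

# Adiabatic switching: ramping the supply over a time T >> RC dissipates
# only about (RC/T) * C V^2; the rest of the charge is steered back to
# the supply -- the "energy cache" -- instead of becoming heat.
e_adiabatic = (R * C / T) * C * V**2

print(f"conventional: {e_conventional:.2e} J per transition")
print(f"adiabatic:    {e_adiabatic:.2e} J per transition")
print(f"recovery factor: {e_conventional / e_adiabatic:.0f}x")
```

With these placeholder values the slow ramp dissipates about 500 times less per transition, which is the whole pitch: the longer you take, the less you pay in heat.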
Asynchronous Logic will be here first. (Score:5, Insightful)
A rather large portion of the heat generated by a processor comes just from the clock signal propagating to every bloody logic gate in the mess, including the parts not in use. With asynchronous logic, if a part isn't in use, it gets no current. Of course, clock signals have been used for the last half century for a reason: they time signals so that you don't have 3 digits of a number showing up before the rest, etc. With asynchronous logic you have to worry about path lengths down to the picometer, since you don't have the clock to act like a traffic warden. The biggest holdup to asynchronous logic has been the immense design difficulty involved, but that is changing as new design tools are developed.
Anyways, the big reason why asynchronous logic is going to arrive on the processor scene long before reversible logic is that it already has. Intel and other manufacturers are already incorporating asynchronous logic into their designs, and plan to increase the amount used as time goes by. The different manufacturing techniques required are slowly being phased in. Reversible computing, on the other hand, has virtually no chance of showing up within the decade.
My point is that the linked article made no allowance for the increasing use of asynchronous logic. It's going to have a significant impact on heat dissipation in the next few years.
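The clockless scheme described above can be sketched as a request/acknowledge handshake: a stage does work (and draws current) only when a request actually reaches it. This toy Python model is illustrative only -- the `Stage` class and event model are made up, not any real async design tool -- but it shows the two core pieces: the Muller C-element that async circuits use to synchronize, and stages that sit completely idle until data arrives:

```python
# Toy model of asynchronous (clockless) logic. Everything here is an
# illustrative sketch, not a real hardware description.

def c_element(a, b, prev):
    """Muller C-element: output follows the inputs only when they agree,
    otherwise it holds its previous value. This is the basic async
    synchronization primitive that replaces the clock edge."""
    return a if a == b else prev

class Stage:
    """A pipeline stage that only computes when explicitly requested --
    idle stages see no activity at all, unlike clocked logic where the
    clock tree toggles every gate on every cycle."""
    def __init__(self, func, name):
        self.func, self.name, self.work_done = func, name, 0

    def handle_request(self, data):
        self.work_done += 1          # work happens only on a request
        return self.func(data)       # returning acts as the "acknowledge"

# C-element behavior: agree -> follow, disagree -> hold previous output.
print(c_element(1, 1, 0), c_element(0, 1, 1), c_element(0, 0, 1))

inc = Stage(lambda x: x + 1, "inc")
dbl = Stage(lambda x: x * 2, "dbl")

# Data ripples through as fast as each stage finishes; nothing waits
# for a clock edge, and each stage did exactly as much work as needed.
out = dbl.handle_request(inc.handle_request(3))
print(out)                           # (3 + 1) * 2 = 8
```

This is also the speed argument made elsewhere in the thread: a result propagates the moment it is ready rather than waiting out the rest of a clock period.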
Re:Huh? (Score:4, Insightful)
Thermodynamics also says that you lose non-heat energy in reversible systems as well. If you throw a ball into the air, you lose some energy to wind resistance, to converting chemical energy in your arm into mechanical energy, etc.
Sure, but mechanical losses can always be recovered and put back into a system. Heat losses can't, which is the point of the second law of thermodynamics.
Size penalty (Score:4, Insightful)
It might be theoretically possible to build smaller and faster chips by reducing the energy/thermal issues, but I suspect most companies are not willing to take that leap of faith.
I bet the first place we'll see reversible gates used in a full-fledged MCU/CPU would be a mobile/handheld processor running a reversible version of an older (fewer gates) core on the latest processes...
Reversable versus Probabilistic Computation (Score:3, Insightful)
While they may be helpful for certain things, especially quantum computers (but that is a whole different story), there is a snag: they are deterministic. Great CS people like Rabin have taught us the value of probabilistic Turing machines, and today we use them as the basis for defining what is computationally efficient (BPP; see Michael Sipser's intro to computation and complexity). Every once in a while you have to take a non-reversible step to pick a random number (as well as to throw away garbage you don't want to store any more), and this negates the thermodynamic advantages of reversible computing.
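The "garbage" the comment mentions is easy to see with a concrete reversible gate. A minimal sketch, using the standard Toffoli (controlled-controlled-NOT) gate: it computes AND reversibly by carrying its inputs along as extra outputs, and the irreversible, heat-paying step only arrives when you finally throw those extras away:

```python
# Sketch of reversibility via the Toffoli gate (a standard reversible
# primitive, shown here as a plain Python truth-table demo).

def toffoli(a, b, c):
    """Reversible 3-bit gate: flips the target c iff a and b are both 1."""
    return (a, b, c ^ (a & b))

# With c = 0 the third output is a AND b, but the inputs are preserved,
# so the mapping is a bijection: applying the gate twice undoes it.
for a in (0, 1):
    for b in (0, 1):
        out = toffoli(a, b, 0)
        assert toffoli(*out) == (a, b, 0)   # runs backwards exactly
        print((a, b, 0), "->", out)

# Plain AND, by contrast, is many-to-one -- (0, 1) and (1, 0) both give
# 0 -- so discarding the carried-along inputs erases information, and
# per Landauer that erasure costs at least kT ln 2 of free energy.
```

Picking a fresh random bit has the same flavor in reverse: it is a one-to-many step, so it cannot be part of a bijective circuit either, which is the snag the comment is pointing at.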
No Free Lunch
Re:Asynchronous Logic will be here first. (Score:3, Insightful)
You are correct, the clock signal needs to get stronger/faster as speed increases.
But try designing a whole motherboard using asynchronous design...it would be VERY hard.
Hence nobody has (that I am aware of).
Clocked is much simpler...
Another benefit of asynchronous design would be speed: instead of an operation that takes 1/3 of a clock cycle having to wait for the next edge, it just finishes when it's done.
Re:Vaporware? (Score:5, Insightful)
It is at the information theory and logic level of description where reversible computing must be implemented.
Re:Reversing entropy? (Score:3, Insightful)
Me, I like to display my ignorance, so here I go:
The computer is just a big abacus really, a physical model of the data. When you shake up an abacus it still has just as many beads on it, only their state has changed, and the now random ordering of the beads can still be read as representing a number. What has been lost is meaning.
Is "2" data? (bearing in mind that we're talking about the logical number here and not the numeral that serves as its physical model)
No. It's just a number. Unless the number relates to a logical model (the number of quarts of milk in my refrigerator) it isn't data.
So the state of your all shook up (uh uh huh, uh uh huh, yeah, yeah) abacus is still a physical representation of a number, but it isn't data.
When we run code we keep just as many "beads on our abacus," only their state changes, but data is both the physical state and what it means in the logical model. So if we lose meaning we have lost data, or if we lose the proper state to reflect that meaning we have lost data.
So to "destroy" data means to dispose of the physical model and/or its meaning in the logical model.
When we change the state of a computer we certainly don't change the total amount of data, but we certainly change its state and that state's relationship to the logical model.
Take Linux in a Nutshell off your shelf (a physical representation of data) and burn it. Have you not destroyed data? Now fill the same space on the shelf with a gardening book.
You are now still in possession of the same amount of data as you had before, but both its state and meaning have changed. You aren't going to be able to use that book to look up a vi command because the gardening book doesn't contain that meaning.
You destroyed that data.
KFG
Re:Size penalty (Score:3, Insightful)
The point is not to over-engineer for this but to engineer intelligently. It will take more R&D time but will hopefully gain enough to justify the expense.
Re:Vaporware? (Score:2, Insightful)