
'Reversible' Computers More Energy Efficient 330

James Clark writes "As Congress continues work on a federal energy bill, a group of University of Florida researchers is working to implement a radical idea for making computers more energy efficient -- as well as smaller and faster." Reversible computing rears its head again.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by cant_get_a_good_nick ( 172131 ) on Tuesday November 11, 2003 @04:31PM (#7447462)
    Wouldn't "regenerative", like the regenerative braking on most electric/hybrid cars, have been a better term?
  • by Tin Foil Hat ( 705308 ) on Tuesday November 11, 2003 @04:32PM (#7447470)
    I have to admit that I'm no chip designer, but I have to wonder why this hasn't been done before. What are the problems with this technique?

    It sounds good, but what's the catch?
  • by iggymanz ( 596061 ) on Tuesday November 11, 2003 @04:48PM (#7447678)
    Funny this topic has popped up again after I saw it featured in Scientific American over 10 years ago... the real problem is that no one wants to halve the number of useful gates on a chip in order to build all the extra circuitry required to reduce (of course not eliminate; entropy will still increase, though at a lessened rate) the thermodynamic cost of "forgetting" data.

    I attack instead the basic premise: that there is a shortage of energy, or that we must accept a lower standard of living or lower capability in our machinery. What we DO need to do is get smarter about where we get our energy. Instead of adding to the net heat and pollution budgets of the Earth, we should get really serious about solar energy (which might just mean making hydrocarbon fuel out of plants and suitable waste materials).
  • by Anonymous Coward on Tuesday November 11, 2003 @04:48PM (#7447682)
    This is an area where information theory and physics meet. To minimize heat you must minimize entropy production, and that means you must make your computations as reversible as possible.
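The thermodynamic floor this comment alludes to is Landauer's principle: erasing one bit costs at least kT ln 2 of heat, which is exactly what reversible logic tries to avoid paying. A minimal sketch of the arithmetic (the erasure rate below is an illustrative assumption, not a figure from the article):

```python
from math import log

# Landauer's principle: erasing one bit of information dissipates at
# least k_B * T * ln(2) of heat. Reversible logic avoids this floor by
# never erasing bits in the first place.
K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temp_kelvin: float) -> float:
    """Minimum energy (joules) to erase one bit at a given temperature."""
    return K_B * temp_kelvin * log(2)

room = landauer_limit(300.0)
print(f"Landauer limit at 300 K: {room:.3e} J/bit")  # ~2.87e-21 J

# A hypothetical chip erasing 1e18 bits per second would dissipate,
# at the very minimum:
print(f"Floor for 1e18 erasures/s: {room * 1e18 * 1e3:.3f} mW")
```

Real 2003-era chips dissipated many orders of magnitude more than this floor per operation, which is why the near-term gains come from charge recovery rather than from approaching the Landauer limit itself.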
  • Re:Cool (Score:2, Insightful)

    by hchaos ( 683337 ) on Tuesday November 11, 2003 @04:54PM (#7447748)

    Yet more evidence that information is in fact a quantifiable property. We're starting to see hints that information and energy are flip sides of the same coin.

    I'm not just spewing. There are serious theoretical problems associated with how information "disappears" when it falls into a black hole. Fortunately, you get the information back again from Hawking radiation, as the hole converts mass into energy. From a theoretical standpoint it's really starting to look like "information == energy," or to put it more precisely, there is a specific equivalence between information and energy like the equivalence between matter and energy.


    Actually, you are just spewing, at least kind of. As long as the Second Law of Thermodynamics holds true, there is no "conservation of information" law in this universe.

    What's really happening here is a lot more simple. In a digital computer, information is stored as a series of energy states. A bit is either 1 or 0, with a 1 meaning that a circuit is energized, while a 0 means that the circuit is not energized. The important thing is that both energized and non-energized circuits hold exactly the same amount of information.

    The only thing that this article is talking about is storing the energy from the energized bits in an "energy cache" once the 1 has been switched back to 0, so it can then be used to power other bits. It's really not a very radical idea at all. The only semi-radical thought here is that it would be worthwhile to recover this energy, and that chip manufacturers would benefit from investing in this research.
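The "energy cache" idea described above is the core of adiabatic (charge-recovery) logic: charging a node abruptly through a resistive path always burns (1/2)CV^2, but ramping the supply slowly dissipates only a fraction proportional to RC over the ramp time. A rough sketch, with all component values being illustrative assumptions:

```python
# Why slow ("adiabatic") charging wastes less energy than conventional
# abrupt switching. All numbers are illustrative assumptions.
C = 1e-15   # node capacitance, farads (assumed ~1 fF)
R = 1e3     # series resistance of the charging path, ohms (assumed)
V = 1.0     # supply voltage, volts

# Conventional CMOS: charging a capacitor abruptly through a resistor
# dissipates (1/2) C V^2 in the resistor, regardless of R.
e_conventional = 0.5 * C * V**2

def e_adiabatic(ramp_time: float) -> float:
    """Approximate dissipation when the supply ramps over ramp_time >> RC."""
    return (R * C / ramp_time) * C * V**2

for t in (1e-10, 1e-9, 1e-8):
    print(f"ramp {t:.0e} s: {e_adiabatic(t):.2e} J vs {e_conventional:.2e} J")
```

The catch, as the parent notes, is that the recovered charge has to go somewhere useful (a resonant supply, for instance), and the slow ramps trade switching speed for efficiency.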
  • by Cordath ( 581672 ) on Tuesday November 11, 2003 @05:05PM (#7447855)
    Asynchronous Logic (i.e. no clock) has many of the same benefits, as well as potentially increasing the speed of processors significantly.

    A rather large portion of the heat generated by a processor comes just from the clock signal propagating to every bloody logic gate in the mess, including the parts not in use. With asynchronous logic, if a part isn't in use, it gets no current. Of course, clock signals have been used for the last half century for a reason: they time signals so that you don't have 3 digits of a number showing up before the rest, etc. With asynchronous logic you have to worry about path lengths down to the picometer, since you no longer have the clock to act like a traffic warden. The biggest holdup to asynchronous logic has been the immense design difficulty involved, but that is changing as new design tools are developed.

    Anyway, the big reason why asynchronous logic is going to arrive on the processor scene long before reversible logic is that it already has. Intel and other manufacturers are already incorporating asynchronous logic into their designs, and plan to increase the amount used as time goes by. The different manufacturing techniques required are slowly being phased in. Reversible computing, on the other hand, has virtually no chance of showing up within the decade.

    My point is that the linked article made no allowance for the increasing use of asynchronous logic. It's going to have a significant impact on heat dissipation in the next few years.
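The classic building block of the clockless designs this comment describes is the Muller C-element, which replaces the clock's "traffic warden" role with local handshaking: it only changes state when all its inputs agree. A toy behavioral model (not tied to any particular chip design):

```python
class CElement:
    """Muller C-element, the basic state-holding gate of asynchronous
    (clockless) logic. The output switches to 1 only when both inputs
    are 1, switches to 0 only when both are 0, and otherwise holds its
    previous value -- no clock required."""

    def __init__(self, initial: int = 0):
        self.out = initial

    def step(self, a: int, b: int) -> int:
        if a == 1 and b == 1:
            self.out = 1
        elif a == 0 and b == 0:
            self.out = 0
        # inputs disagree: hold the previous state
        return self.out

c = CElement()
print([c.step(a, b) for a, b in [(1, 0), (1, 1), (0, 1), (0, 0)]])
# -> [0, 1, 1, 0]: output changes only once both inputs have arrived
```

Chains of C-elements let each pipeline stage signal "done" to the next, which is why an idle block genuinely draws no switching current.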
  • Re:Huh? (Score:4, Insightful)

    by Aardpig ( 622459 ) on Tuesday November 11, 2003 @05:20PM (#7447990)

    Thermodynamics also says that you lose non-heat energy in reversible systems as well. If you throw a ball into the air, you lose some energy to wind resistance, to converting chemical energy in your arm into mechanical energy, etc.

    Sure, but mechanical losses can always be recovered and put back into a system. Heat losses can't, which is the point of the second law of thermodynamics.

  • Size penalty (Score:4, Insightful)

    by toybuilder ( 161045 ) on Tuesday November 11, 2003 @05:24PM (#7448034)
    The problem with reversibility is that, for any given semiconductor process, it effectively doubles the number of gates that need to be built on the chip, and manufacturers are currently more interested in cramming more features into the chip than in making it more efficient.

    It might be theoretically possible to build smaller and faster chips by reducing the energy/thermal issues, but I suspect most companies are not willing to take that leap of faith.

    I bet the first place we'll see reversible gates used in a full-fledged MCU/CPU will be a mobile/handheld processor running a reversible version of an older (fewer gates) core built on the latest processes...
  • by John.P.Jones ( 601028 ) on Tuesday November 11, 2003 @05:36PM (#7448168)
    If you are interested in reversible circuits, read what Feynman had to say about them in his lectures on computation.

    While they may be helpful for certain things, especially quantum computers (but that is a whole different story), there is a snag. They are deterministic; great CS people like Rabin have taught us the value of probabilistic Turing machines, and today we use them as the basis of determining what is computationally efficient (BPP; see Michael Sipser's introduction to computation and complexity). Every once in a while you have to take a non-reversible step to pick a random number (as well as throw away garbage you don't want to store any more), and this negates the thermodynamic advantages of reversible computing.

    No Free Lunch

  • by mrtroy ( 640746 ) on Tuesday November 11, 2003 @05:42PM (#7448220)
    Asynchronous blows for non-trivial computation.

    You are correct, the clock signal needs to get stronger/faster as speed increases.

    But try designing a whole motherboard using asynchronous design...it would be VERY hard.

    Hence why nobody has (that I am aware of)

    Clocked is much simpler...

    Another benefit of asynchronous design would be speed: instead of something that takes 1/3 of a clock cycle having to wait out the full cycle, it just finishes when it's done.
  • Re:Vaporware? (Score:5, Insightful)

    by Bingo Foo ( 179380 ) on Tuesday November 11, 2003 @05:53PM (#7448310)
    Sorry, but reversible computing is about having N distinct outputs for N distinct inputs in any logical operation. Think thermodynamics and statistical mechanics, where reversibility is intimately coupled with "no production of entropy," which means "no loss of information."

    It is at the information theory and logic level of description where reversible computing must be implemented.
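The "N distinct outputs for N distinct inputs" requirement means every reversible gate computes a bijection. The standard example is the Toffoli (controlled-controlled-NOT) gate, which is universal for reversible Boolean logic; a quick check of both properties:

```python
from itertools import product

def toffoli(a: int, b: int, c: int):
    """Toffoli gate (CCNOT): flips the target bit c iff both control
    bits a and b are 1. Universal for reversible Boolean logic."""
    return (a, b, c ^ (a & b))

inputs = list(product((0, 1), repeat=3))
outputs = [toffoli(*x) for x in inputs]

# Bijective: 8 distinct inputs map to 8 distinct outputs...
assert len(set(outputs)) == len(inputs)
# ...and the gate is its own inverse, so no information is ever erased.
assert all(toffoli(*toffoli(*x)) == x for x in inputs)
print("Toffoli gate is reversible")
```

Contrast this with an ordinary AND gate, which maps 4 inputs to 2 outputs: the 2 missing output states are exactly the lost information that must leave the chip as heat.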

  • by kfg ( 145172 ) on Tuesday November 11, 2003 @06:20PM (#7448543)
    You are not a real idiot. You have asked a deep question. Pity those around you choose to cover their own ignorance by being arrogant.

    Me, I like to display my ignorance, so here I go:

    The computer is just a big abacus really, a physical model of the data. When you shake up an abacus it still has just as many beads on it, only their state has changed, and the now random ordering of the beads can still be read as representing a number. What has been lost is meaning.

    Is "2" data? (bearing in mind that we're talking about the logical number here and not the numeral that serves as its physical model)

    No. It's just a number. Unless the number relates to a logical model (the number of quarts of milk in my refrigerator) it isn't data.

    So the state of your all-shook-up (uh uh huh, uh uh huh, yeah, yeah) abacus is still a physical representation of a number, but it isn't data.

    When we run code we keep just as many "beads on our abacus"; only their state changes. But data is the physical state plus what it means in the logical model. So if we lose meaning we have lost data, and if we lose the proper state to reflect that meaning we have lost data.

    So to "destroy" data means to dispose of the physical model and/or its meaning in the logical model.

    When we change the state of a computer we certainly don't change the total amount of data, but we certainly change its state and that state's relationship to the logical model.

    Take Linux in a Nutshell off your shelf (a physical representation of data) and burn it. Have you not destroyed data? Now fill the same space on the shelf with a gardening book.

    You are now still in possession of the same amount of data as you had before, but both its state and meaning have changed. You aren't going to be able to use that book to look up a vi command because the gardening book doesn't contain that meaning.

    You destroyed that data.

    KFG
  • Re:Size penalty (Score:3, Insightful)

    by foniksonik ( 573572 ) on Tuesday November 11, 2003 @06:31PM (#7448632) Homepage Journal
    What about elegant design? From reading the summary it sounds like they want to do more than just add more gates to 'reverse' the computations... they want to use new design methods such as oscillators and springs to capture and hold the energy as potential which would then be reused when needed in an alternate process.

    The point is not to over-engineer for this but to intelligently engineer. It will take more R&D time but will hopefully gain enough to justify the expense.

  • Re:Vaporware? (Score:2, Insightful)

    by tho 1234 ( 709100 ) on Tuesday November 11, 2003 @07:11PM (#7448937)
    Are you really a chip designer?

    1. In CMOS technology (or any other logic type used in the last 20 years) there is absolutely no resistive path to ground (except for gate leakage). Two complementary (the C in CMOS) PMOS and NMOS transistors are used to eliminate the need for any resistive branch.

    2. Voltage is not "accumulated charge"; it's a difference in potential energy. Changing voltage levels does not in itself cause any power to be lost, and a logic level 0 certainly isn't produced by shorting Vcc to ground. Power is consumed in a logic circuit precisely because it stores energy: the capacitance of the transistor causes charge to be stored and later released to ground whenever the voltage level changes. Power consumption in chips has been reduced over the last 30 years by reducing the amount of stored energy, by making transistors smaller and reducing capacitance.

    Technically, you can create an oscillator by adding an inductor to the circuit, but that would increase complexity/cost with little benefit in itself. I am not familiar with reversible computing, but I would expect it would need a substantial change in logic structure to extract the stored energy. Also, without a change in logic structure, it would simply be a process improvement, not an entirely new branch of computing.

    Yes, analog design is more involved than digital design, but the layout and composition of transistors, capacitors, interconnects, etc. on any digital logic circuit are analog in themselves. The only reason you don't have to deal with them is that they are generated automatically with CAD tools. If resonators were found to be a viable way to decrease power consumption, they could easily be added to your CAD tools, making their design just as simple as what you're used to. However, I'm sure it's not as simple as adding a resonator to the circuit, and it most likely requires an entirely new method of computing to extract the stored energy.
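The point above, that CMOS power goes into charging and discharging capacitance rather than into resistive paths, is usually summarized by the dynamic-power formula P = alpha * C * Vdd^2 * f. A sketch with illustrative (assumed) figures for a 2003-era CPU:

```python
def dynamic_power(alpha: float, c_switched: float, vdd: float, freq: float) -> float:
    """Dynamic CMOS power: P = alpha * C * Vdd^2 * f, where alpha is the
    activity factor (fraction of total capacitance switched per cycle)."""
    return alpha * c_switched * vdd**2 * freq

# Illustrative (assumed) figures: 10% activity, 1 nF total switched
# capacitance, 1.2 V supply, 2 GHz clock.
p = dynamic_power(alpha=0.1, c_switched=1e-9, vdd=1.2, freq=2e9)
print(f"{p:.3f} W")

# The quadratic Vdd term is why voltage scaling dominated power savings:
# halving Vdd cuts dynamic power by roughly 4x.
assert abs(dynamic_power(0.1, 1e-9, 0.6, 2e9) - p / 4) < 1e-12
```

Charge-recovery logic attacks the same C * Vdd^2 term from a different angle: instead of dumping the stored charge to ground each cycle, it tries to steer most of it back into the supply.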
