'Reversible' Computers More Energy Efficient

James Clark writes "As Congress continues work on a federal energy bill, a group of University of Florida researchers is working to implement a radical idea for making computers more energy efficient -- as well as smaller and faster." Reversible computing rears its head again.

Comments Filter:
  • Vaporware? (Score:5, Interesting)

    by Carnildo ( 712617 ) on Tuesday November 11, 2003 @04:30PM (#7447443) Homepage Journal
    Has anyone ever built even a very simple reversible computer? Or is this like quantum computers: all theory, no practice?
  • Reversing entropy? (Score:3, Interesting)

    by oGMo ( 379 ) on Tuesday November 11, 2003 @04:31PM (#7447459)

    IANAP, but this sounds like trying to reverse entropy as much as possible to me. Won't it take more energy to do a reverse computation than you'll save? Where does the lost energy from that go?

  • Cool (Score:1, Interesting)

    by pclminion ( 145572 ) on Tuesday November 11, 2003 @04:33PM (#7447489)
    Yet more evidence that information is in fact a quantifiable property. We're starting to see hints that information and energy are flip sides of the same coin.

    I'm not just spewing. There are serious theoretical problems associated with how information "disappears" when it falls into a black hole. Fortunately, you get the information back again from Hawking radiation, as the hole converts mass into energy. From a theoretical standpoint it's really starting to look like "information == energy," or to put it more precisely, there is a specific equivalence between information and energy like the equivalence between matter and energy.

    We've already got space == time, matter == energy, why not also information == energy? There are startling parallels between Shannon's information theory and the theory of thermodynamics. There is some mysterious shit going on here.

    Another boost to my pet theory of the universe: everything is equal to everything else, and we delude ourselves into perceiving imaginary distinctions between things.

  • by Anonymous Coward on Tuesday November 11, 2003 @04:46PM (#7447655)
    I don't know about AMD, but at least Via and Transmeta, and I think Intel, are already producing processors that can handle most any PC application, including playing high-resolution video, while running on around ten watts of power. That's not a terribly significant amount of power, even compared to fluorescent lighting.
  • by autopr0n ( 534291 ) on Tuesday November 11, 2003 @04:48PM (#7447679) Homepage Journal
    I mean, say you have a CMOS OR gate. If both of the inputs are high, then the NMOS transistors will close and the PMOS transistors will open. Energy is lost only when electrons 'leak through' when the gate changes (and of course, electrons that leak through but don't affect the computation, which I guess happens all the time). How would reversing the computation affect this? Maybe if you were using plain PMOS or something...
  • by Dlugar ( 124619 ) on Tuesday November 11, 2003 @04:54PM (#7447747) Homepage

    Universiteit Gent [rug.ac.be] has some pictures of reversible logic gates, including a four-bit adder built from Feynman's "NOT, the CONTROLLED NOT, and the CONTROLLED CONTROLLED NOT" reversible logic gates, and some other circuits they've built.

    They also have links to other sites about reversible logic and reversible computing, such as Ralph Merkle's Reversible Computing page [zyvex.com] (from Xerox).

    Also note the bottom of the page: there's a vacancy in the research group, [ugent.be] for all those just aching for a chance to work on reversible computing! (Looks like you'll have to speak Dutch, though.) ;-)


    Dlugar
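
    For anyone unfamiliar with those gates, here is a minimal Python sketch (mine, not taken from the linked pages) of the CONTROLLED CONTROLLED NOT, also known as the Toffoli gate. Applying it twice restores the original inputs, which is what makes it reversible:

        def toffoli(a, b, c):
            """CONTROLLED CONTROLLED NOT: flip c only when both controls a and b are 1."""
            return a, b, c ^ (a & b)

        # The gate is its own inverse, so no information is ever destroyed.
        for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
            once = toffoli(*bits)
            assert toffoli(*once) == bits
            print(bits, "->", once)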
  • Huh? (Score:3, Interesting)

    by autopr0n ( 534291 ) on Tuesday November 11, 2003 @04:58PM (#7447786) Homepage Journal
    Thermodynamics says that you lose non-heat energy in reversible systems as well. If you throw a ball into the air, you lose some energy to wind resistance, to converting chemical energy in your arm into mechanical energy, etc.
  • Stirling engine (Score:5, Interesting)

    by bs_02_06_02 ( 670476 ) on Tuesday November 11, 2003 @05:01PM (#7447813)
    Isn't it just easier to use the excess heat to power a Stirling engine to recapture waste energy?
    Maybe the Stirling idea is going too far.
    How about a more efficient circuit? It's been a while since college, but isn't excess heat a sign that the circuit is inefficient?
    While it's not completely frivolous research, it's not the first avenue I would pursue when looking at this problem. It seems more difficult and time-consuming to add circuitry to reuse the energy for other actions inside a CPU. It seems like you'd have a better chance of compounding the problem than of helping it.
    However, if you make the circuit more efficient, you'll generate less heat. That would be my first goal. What kind of efficiency do they get with today's CPUs?

    With this reversible thinking, I have an idea. I need a little help from the anti-SUV crowd... wouldn't all gasoline engines be better off with really big flywheels?
  • by greg_barton ( 5551 ) * <greg_barton@yaho ... minus herbivore> on Tuesday November 11, 2003 @05:10PM (#7447900) Homepage Journal
    I attack instead the basic premise, that there is a shortage of energy, or that we must accept lower standard of life or lower capability in our machinery.

    Then you are attacking a straw man, at least with respect to the article. A reversible chip would be no less capable. In fact, in the long run, it would be more capable. Less energy dissipated as heat means we can continue to use the same materials far into the future with faster and faster chips. As it stands, silicon will become unusable once the heat dissipation reaches the melting point of silicon. (Far sooner, actually.)

    Now, I'd love to see chips made from artificial diamond, but I think reversible chips will come sooner.
  • Re:Theory (Score:5, Interesting)

    by TeknoHog ( 164938 ) on Tuesday November 11, 2003 @05:39PM (#7448193) Homepage Journal
    Entropy S = k ln W
    (k = Boltzmann's constant, W = number of states)
    Information (in bits) I = log_2 W = ln W / ln 2
    Hence S = kI ln 2, or roughly S = 0.7 kI.

    Heat Q = TS = kTI ln 2.

    Let's say we destroy 100 gigabits of information at a temperature of 300 K. Since k = 1.38E-23 J/K, this works out to about 3E-10 joules of heat. Which is not very much, and does not really contribute to the heat produced by CPUs.

    In fact, I think this is the way to find a theoretical minimum for the heat produced by information processing. We can try to make more efficient processors to get closer, just as we can increase the efficiency of engines to get closer to the thermodynamic limit.
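
    (For reference, a few lines of Python reproducing the parent's estimate with Landauer's bound of kT ln 2 per erased bit; the 100-gigabit and 300 K figures are from the parent comment.)

        import math

        k = 1.38e-23       # Boltzmann's constant, J/K
        T = 300.0          # temperature, K
        bits = 100e9       # 100 gigabits of destroyed information

        heat = k * T * math.log(2) * bits
        print(f"Minimum heat from erasing 100 Gbit at 300 K: {heat:.1e} J")
        # ~2.9e-10 J -- negligible next to the tens of joules a CPU dissipates each second.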

  • by Anonymous Coward on Tuesday November 11, 2003 @05:39PM (#7448201)
    Reading the article (nice and short, it was!) reminded me of the way the Cray-1 was designed: All the logic signals had both true and complement forms. This was necessary to drive the twisted-pair interconnect if the signals went off-module, and also had the advantage that the power supply mostly saw a DC load instead of a wildly-varying load depending on what was happening in the CPU. Thus, the power-supply filtering could be a lot smaller than it otherwise would have had to be, which was good, because it drew a LOT of power!

    I realize that the point here is to not draw a lot of power, but somehow the two things seem related...
  • is this possible? (Score:2, Interesting)

    by Major_Small ( 720272 ) on Tuesday November 11, 2003 @05:44PM (#7448249) Journal
    I don't know much about processors, but is it possible to send all the unused electricity to a capacitor somewhere nearby on the motherboard, and then draw some of the power for something else (a smaller chip?) from that capacitor?

    What I'm thinking is that the CPU does billions of calculations per second, but some other chips don't run as fast and don't need as much power, so they could take what's left over from the CPU and other chips and need only some outside energy.

    Is that possible? Like I said, I don't know much about electrical engineering, so I don't even know if it's practical to map a ground pin to a capacitor...

  • Re:Stirling engine (Score:2, Interesting)

    by zymano ( 581466 ) on Tuesday November 11, 2003 @05:46PM (#7448259)
    1. Stirling engines are too bulky. While they are efficient heat engines, they are not practical. They were used once in a project by Detroit but didn't produce much power. Power is key; it's what everyone wants.

    2. Circuits are probably inefficient compared to reversible logic, but the heat comes from clock speed, thinner interconnects, and poor insulators (dielectrics at present sizes).

    This is the reason for the push for spintronic transistors. Google that.
  • by plastik55 ( 218435 ) on Tuesday November 11, 2003 @05:54PM (#7448327) Homepage
    It's not strictly "data" as you're accustomed to thinking about it, as a bunch of ones and zeroes, but "information" in the sense of information theory--more or less another word for entropy.

    The actual measurement of entropy has to do with counting the possible states that a system could be in. A computer containing a list of numbers in its memory could be in any of a large number of states depending on what you know about the list of numbers and the contents of the rest of its memory. If you instruct it to go replace the list of numbers with its sum, the number of states it could be in afterwards is decreased. So its entropy has gone down.

    Thermodynamics says that the only way to reduce a system's entropy is to expend energy on it. So if you worked out a way to juggle the bits in the list of numbers around so that you would get the sum, but in such a way that you could back out the operation afterwards to recover the original list, and do it without overwriting any of the other information on the system, then you could do the operation without reducing the entropy of the computer, and wouldn't be forced to expend energy.

    Now, a lot of confusion comes from the fact that Shannon decided to call mutual entropy "information" when he was working out coding theory. The concept has a lot of parallels to how we ordinarily think of information, but the correspondence isn't exact, and trying to think of mutual entropy as "information" informally will lead you to a lot of wrong conclusions. It's one of those all-too-common instances where picking a commonly understood name to stand for a subtle concept can do more harm than good.
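
    As a toy Python sketch of the idea described above (illustrative only, not a real reversible machine): replacing a list with its bare sum destroys information, but keeping prefix sums preserves enough state to back the operation out exactly.

        def to_prefix_sums(values):
            """'Reversible' sum: the last prefix sum is the total, and nothing is lost."""
            totals, acc = [], 0
            for v in values:
                acc += v
                totals.append(acc)
            return totals

        def undo(totals):
            """Recover the original list from the prefix sums."""
            return [t - p for t, p in zip(totals, [0] + totals[:-1])]

        data = [3, 1, 4, 1, 5]
        sums = to_prefix_sums(data)
        assert sums[-1] == sum(data)   # we still get the answer...
        assert undo(sums) == data      # ...and can recover the original list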
  • by Yobgod Ababua ( 68687 ) on Tuesday November 11, 2003 @06:17PM (#7448512)
    "Asynchronous blows for non-trivial computation."

    Please cite references or evidence for this statement if you wish to be taken seriously.

    Several companies are currently working on complex and high-performance designs using asynchronous techniques. It's true that it is currently more difficult, predominantly because current design tools are all geared towards generating and testing "standard" clocked logic, but it is being done and it does not by any stretch "blow".

    It will be quite some time before all of the components on a motherboard are asynchronous, but groups -have- designed processors, memory controllers, and other components asynchronously.

    For but the briefest of examples... check out this article [eetimes.com] or this article [technologyreview.com]. No, it isn't the answer to everything... but it's much farther along than you seem to realize.

  • by Alomex ( 148003 ) on Tuesday November 11, 2003 @07:02PM (#7448881) Homepage
    This book by Richard Feynman is from the 1980s. In it he discusses reversible computation, the thermodynamics of computing, and quantum computing.

    As usual, Feynman was way ahead of his time.


    Reversible computing had been proposed twenty years earlier by an IBM engineer and was widely recognized as an important idea, so one can hardly credit Feynman for this one.

    There has been steady research on reversible computation over the last ten years or so. In fact the best paper award at one of the major CS conferences in 1993 was for a reversible computing paper.
  • by MillionthMonkey ( 240664 ) on Tuesday November 11, 2003 @07:08PM (#7448918)
    I second this. Feynman's lectures on computation are at a very fundamental level, so they are impractical for day-to-day use, but the theory is solid. Thousands of years from now, computers will undoubtedly have changed a lot, but the principles in this book will still apply to them, since they merely describe how the laws of physics affect any computational system.

    Here is an interesting excerpt on pages 149-150 that explains Maxwell's demon in terms of reversible computing:

    The demon has a very simple task. Set into the partition is a flap, which he can open and shut at will. He looks in one half of the box (say, the left) and waits until he sees a fast-moving molecule approaching the flap. When he does, he opens the flap momentarily, letting the molecule through into the right side, and then shuts the flap again. Similarly, if the demon sees a slow-moving molecule approaching from the right side of the flap, he lets that through into the side the fast one came from. After a period of such activity, our little friend will have separated the fast- and slow-moving molecules into the two compartments. In other words, he will have separated the hot from the cold, and hence created a temperature difference between the two sides of the box. This means that the entropy of the system has decreased, in clear violation of the Second Law!


    This seeming paradox, as I have said, caused tremendous controversy among physicists. The Second Law of Thermodynamics is a well-established principle in physics, and if Maxwell's demon appears to be able to violate it, there is probably something fishy about him. Since Maxwell came up with his idea in 1867, many people have tried to spot the flaw in his argument. Somehow, somewhere, in the process of looking for molecules of a given type and letting them through the flap, there had to be some entropy generated.

    Until recently, it was generally accepted that this entropy arose as a result of the demon's measurement of the position of the molecules. This did not seem unreasonable. For example, one way in which the demon could detect fast-moving molecules would be to shine a demonic torch at them; but such a process would involve dispersing at least one photon, which would cost energy. More generally, before looking at a particular molecule, the demon could not know whether it was moving left or right. Upon observing it, however this was done, his uncertainty, and hence entropy, would have reduced by half, surely accompanied by the corresponding generation of entropy in the environment.

    In fact, and surprisingly, Bennett has shown that Maxwell's demon can actually make its measurements with zero energy expenditure, providing it follows certain rules for recording and erasing whatever information it obtains. The demon must be in a standard state of some kind before measurement, which we will call S: this is the state of uncertainty. After it measures the direction of motion of a molecule, it enters one of two other states- say L for "left-moving", or R for "right-moving". It overwrites the S with whichever is appropriate. Bennett has demonstrated that this procedure can be performed for no energy cost. The cost comes in the next step, which is the erasure of the L or R to reset the demon in the S state in preparation for the next measurement. This realization, that it is the erasure of information, and not measurement, that is the source of entropy generation in the computational process, was a major breakthrough in the study of reversible computation.

  • Re:Vaporware? (Score:3, Interesting)

    by randyest ( 589159 ) on Tuesday November 11, 2003 @09:55PM (#7450048) Homepage
    0) Yes, for the last 10 years. 1) There is leakage through the substrate, and that path is resistive -- though you're correct that the majority of the current flow is not really to "ground" (which, BTW, is a relative thing); rather, it is used in charging/discharging parasitic capacitors -- the relevant fact is that switching dissipates power in the form of heat. Sometimes simplifications are necessary to get a point across. If you want to be a pedant, we can note that there are no 1's or 0's, whip out our detailed SPICE models to explain an AND gate, and worry that there are no "holes", just electrons, though we assume there are in our models, etc. Were this "EEdot", I might have been more precise, but probably not, since it doesn't matter for the understanding of this issue. 2) You can't have a potential difference without some charge accumulation -- chicken, meet egg. This is semantic.

    Bah, I can't keep up that formatting. :). As for inductors: they're really hard to make out of silicon. I'd love to hear about how you'd implement that without using way more wiring tracks than it's worth.

    You're close on that last point -- the reason I don't have to deal with analog design (too much) is not that my CAD tools do it, but rather that we can (safely) simplify our models of logic gates down from the complex analog circuits they are and treat them as digital logic. Accounting for crosstalk, transition times, wire delays, signal integrity, etc. outside of the primary design flow allows us to maintain this gross simplification throughout most of the design flow. EDA tools that can handle a new idea usually follow initial real-world implementations by at least a few years. Check out the hierarchical design tools available over the last 5 years, then look at the hierarchical tapeouts for the last 5 years. The tools were (and are, largely) crap. But the chips work (each of mine included).

    One of my favorite design tools is Perl. :)
  • Re:Vaporware? (Score:3, Interesting)

    by onomatomania ( 598947 ) on Tuesday November 11, 2003 @10:06PM (#7450130)
    1- In CMOS technology (or any other logic type used in the last 20 years) there is absolutely no resistive path to ground. (except for gate leakage) Two complementary (the C in cmos) PMOS and NMOS transistors are used to eliminate the need for any resistive branch.

    Yeah, no shit, Sherlock. Just because there are no explicit resistors drawn in the circuit doesn't mean that the stored charge isn't dumped to ground through a resistive path. When the NMOS transistors turn on, they effectively short the stored charge in the load capacitance to ground through the ON resistance of the device. And similarly, when the PMOS transistors turn on, they charge the load capacitance through the supply rails in an analogous manner.

    So just because there aren't explicit resistors (thanks to complementary logic) doesn't mean that the charge isn't effectively being supplied to a temporary store and then dumped to ground through resistive paths, which is what the original poster was saying.
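
    To put rough numbers on that (the capacitance and voltage below are assumed, illustrative values, not from the thread): the charge dumped through those resistive paths costs on the order of C*V^2 per full charge/discharge cycle of a node, many orders of magnitude above the Landauer minimum for erasing one bit.

        import math

        C = 1e-15                  # assumed node capacitance: 1 fF
        V = 1.2                    # assumed supply voltage, V
        k, T = 1.38e-23, 300.0     # Boltzmann's constant, room temperature

        switching = C * V * V              # energy dissipated charging + discharging the node
        landauer = k * T * math.log(2)     # minimum cost of erasing one bit

        print(f"switching energy per node cycle: {switching:.1e} J")
        print(f"Landauer minimum per erased bit: {landauer:.1e} J")
        print(f"ratio: {switching / landauer:.0f}x")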
  • by stienman ( 51024 ) <adavis@@@ubasics...com> on Tuesday November 11, 2003 @10:47PM (#7450356) Homepage Journal
    You want to cut back on the 100W of heat being released by today's processors?

    100W?

    I piss 100W when I get up in the morning.

    100W will cost you $79 [US] a year if you run it hard and constant every second, 24/7/365. ($0.09 per kWh)

    In the US, the average family has more power, more cheaply, than some cities in other parts of the world.

    Furthermore, the energy is still going to be released as heat at some point. Where else does it go? Sure, you might be able to switch a given transistor 3-4 times with the same energy, but once it drops in voltage and current, the transistor no longer switches. Furthermore, the chips are already being run at 1.x volts, which is barely enough to account for the voltage drop anyway. To get enough energy back after a transistor, you'd have to put in a greater initial voltage, wasting more heat.

    Furthermore, more transistors means more complexity, more electricity, and more speed problems. I'm sure there's some savings, but once you add everything up it simply isn't worth it for mainstream desktop processors.

    It may be worthwhile in battery operated, low speed, high efficiency processors, but it'll be a long time before a wall is hit that only this technology can help with.

    The reality is that this guy's patent is running out, and he's shopping it around to see if he can eke anything out of it.

    -Adam
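
    (The "$79 a year" figure above checks out; here is the arithmetic as a few lines of Python, using the parent's $0.09/kWh rate.)

        watts = 100.0
        hours_per_year = 24 * 365      # run hard and constant, 24/7/365
        price_per_kwh = 0.09           # $/kWh, as in the parent

        kwh = watts / 1000.0 * hours_per_year
        print(f"{kwh:.0f} kWh/year -> ${kwh * price_per_kwh:.2f}/year")
        # 876 kWh/year -> $78.84/year, i.e. roughly $79.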
