'Reversible' Computers More Energy Efficient
James Clark writes "As Congress continues work on a federal energy bill, a group of University of Florida researchers is working to implement a radical idea for making computers more energy efficient -- as well as smaller and faster." Reversible computing rears its head again.
Vaporware? (Score:5, Interesting)
Reversing entropy? (Score:3, Interesting)
IANAP, but this sounds like trying to reverse entropy as much as possible to me. Won't it take more energy to do a reverse computation than you'll save? Where does the lost energy from that go?
Cool (Score:1, Interesting)
I'm not just spewing. There are serious theoretical problems associated with how information "disappears" when it falls into a black hole. Fortunately, you get the information back again from Hawking radiation, as the hole converts mass into energy. From a theoretical standpoint it's really starting to look like "information == energy," or to put it more precisely, there is a specific equivalence between information and energy like the equivalence between matter and energy.
We've already got space == time, matter == energy, why not also information == energy? There are startling parallels between Shannon's information theory and the theory of thermodynamics. There is some mysterious shit going on here.
Another boost to my pet theory of the universe: everything is equal to everything else, and we delude ourselves into perceiving imaginary distinctions between things.
Is PC power use really a big issue at this time? (Score:1, Interesting)
HOW does it make it more efficent? (Score:4, Interesting)
Photographs of "a very simple reversible computer" (Score:3, Interesting)
Universiteit Gent [rug.ac.be] has some pictures of reversible logic gates, including a four-bit adder composed out of Feynman's "NOT, the CONTROLLED NOT, and the CONTROLLED CONTROLLED NOT" reversible logic gates, and some other circuits they've built.
They also have links to other sites about reversible logic and reversible computing, such as Ralph Merkle's Reversible Computing page [zyvex.com] (from Xerox).
Also note the bottom of the page: there's a vacancy in the research group, [ugent.be] for all those just aching for a chance to work on reversible computing! (Looks like you'll have to speak Dutch, though.) ;-)
Dlugar
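The three gates named above (NOT, CONTROLLED NOT, CONTROLLED CONTROLLED NOT) can be sketched as permutations of bit-tuples; this is a toy illustration, not code from the linked page. The key property is that each gate is its own inverse, so applying it never throws information away:

```python
# Feynman's reversible gates as pure functions on bits (illustrative sketch)

def not_gate(a):
    return (1 - a,)

def cnot(a, b):
    # CONTROLLED NOT: flips b iff a is 1
    return (a, b ^ a)

def ccnot(a, b, c):
    # CONTROLLED CONTROLLED NOT (Toffoli): flips c iff a and b are both 1
    return (a, b, c ^ (a & b))

# Each gate undoes itself, so no input information is ever lost:
for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert ccnot(*ccnot(*bits)) == bits
for bits in [(a, b) for a in (0, 1) for b in (0, 1)]:
    assert cnot(*cnot(*bits)) == bits
```

Since CCNOT with c = 0 leaves a AND b in the third slot, it can simulate ordinary irreversible logic while remaining invertible, which is why adders can be built from it.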
Huh? (Score:3, Interesting)
Stirling engine (Score:5, Interesting)
Maybe the Stirling idea is going too far.
How about a more efficient circuit? It's been a while since college, but isn't excess heat a sign that the circuit is inefficient?
While it's not completely frivolous research, it's not the first avenue I would take when looking at this problem. Adding circuitry to re-use the energy for other actions inside a CPU seems difficult and time-consuming, and you'd have a good chance of compounding the problem rather than helping it.
However, if you make the circuit more efficient, you'll generate less heat. That would be my first goal. What kind of efficiency do they get with today's CPUs?
With this reversible thinking, I have an idea. I need a little help from the anti-SUV crowd... wouldn't all gasoline engines be better off with really big flywheels?
Re:Thermodynamics 101 (Score:3, Interesting)
Then you are attacking a straw man, at least with respect to the article. A reversible chip would be no less capable. In fact, in the long run, it would be more capable. Less heat dissipated means we can continue to use the same materials far into the future with faster and faster chips. As it stands, silicon will become unusable once the heat dissipation reaches the melting point of silicon. (Far sooner, actually.)
Now, I'd love to see chips made from artificial diamond, but I think reversible chips will come sooner.
Re:Theory (Score:5, Interesting)
(k = Boltzmann's constant, W = number of states)
Information (in bits) I = log_2 W = ln W / ln 2
Hence S = kI ln 2, or roughly S = kI.
Heat Q = ST = kTI.
Let's say we destroy 100 gigabits of information at a temperature of 300 K. Since k = 1.38E-23 J/K, this works out to a heat of about 3E-10 joules. Which is not very much, and does not really contribute to the heat produced by CPUs.
In fact, I think this is the way to find a theoretical minimum for the heat produced from information processing. We can try and make more efficient processors to get closer, just like we can increase the efficiency of engines to get closer to the thermodynamic limit.
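The bound the parent is computing is straightforward to check numerically (a sketch; the constant is standard, the 100-gigabit figure is the parent's):

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0          # temperature, K
bits = 100e9       # 100 gigabits of destroyed information

# Landauer bound: erasing one bit dissipates at least k*T*ln(2) joules,
# so destroying I bits at temperature T costs at least k*T*ln(2)*I.
heat = k * T * math.log(2) * bits
print(f"{heat:.1e} J")  # on the order of 1e-10 J
```

Compared with the tens of watts a CPU dissipates every second, this theoretical floor is around eleven orders of magnitude smaller, which is the parent's point: today's chips are nowhere near the Landauer limit.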
Didn't Cray already sort of do this? (Score:1, Interesting)
I realize that the point here is to not draw a lot of power, but somehow the two things seem related...
is this possible? (Score:2, Interesting)
what I'm thinking is that the CPU does billions of calculations/second, but some other chips don't run as fast and don't need as much power, so they can take what's left over from the CPU and other chips and use some outside energy.
is that possible? like i said, I don't know much about electrical engineering, so I don't even know if it's practical to map a ground pin to a capacitor...
Re:Stirling engine (Score:2, Interesting)
2. Compared to reversible logic, today's circuits are probably inefficient, but the heat comes from clock speed, thinner interconnects, and poor insulators (dielectrics, at present feature sizes).
This is the reason for the push for spintronic transistors. Google that.
Re:Reversing entropy? (Score:3, Interesting)
The actual measurement of entropy has to do with counting the possible states that a system could be in. A computer containing a list of numbers in its memory could be in any of a large number of states depending on what you know about the list of numbers and the contents of the rest of its memory. If you instruct it to go replace the list of numbers with its sum, the number of states it could be in afterwards is decreased. So its entropy has gone down.
Thermodynamics says that the only way to reduce a system's entropy is to expend energy on it. So if you worked out a way to juggle the bits in the list of numbers around so that you would get the sum, but in such a way that you could back out the operation afterwards to recover the original list, and do it without overwriting any of the other information on the system, then you could do the operation without reducing the entropy of the computer, and wouldn't be forced to expend energy.
Now, a lot of confusion comes from the fact that Shannon decided to call mutual entropy "information" when he was working out coding theory. The concept has a lot of parallels to how we ordinarily think of information, but the correspondence isn't exact, and trying to think of mutual entropy as "information" informally will lead you to a lot of wrong conclusions. It's one of those all-too-common instances where picking a commonly understood name to stand for a subtle concept can do more harm than good.
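The summing example two paragraphs up can be made concrete with a toy sketch (names are illustrative, not from the article). Replacing a list with its bare sum is many-to-one and thus irreversible, but replacing it with running totals keeps the sum while remaining invertible:

```python
def destructive_sum(xs):
    # Irreversible: many different input lists map to the same output,
    # so the original list cannot be recovered afterwards.
    return [sum(xs)]

def reversible_sum(xs):
    # Reversible: replace each element with a running total.
    # The last entry is the sum, and nothing has been forgotten.
    out, total = [], 0
    for x in xs:
        total += x
        out.append(total)
    return out

def unsum(ys):
    # Back out the operation to recover the original list.
    return [y - prev for y, prev in zip(ys, [0] + ys[:-1])]

xs = [3, 1, 4, 1, 5]
ys = reversible_sum(xs)
assert ys[-1] == sum(xs)   # we still get the sum...
assert unsum(ys) == xs     # ...and can recover the input
```

Because `reversible_sum` is a bijection on lists of a given length, it doesn't shrink the set of possible machine states the way `destructive_sum` does, which is exactly the entropy argument the parent makes.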
Re:Asynchronous Logic will be here first. (Score:4, Interesting)
Please cite your references or evidence to this statement if you wish to be taken seriously.
Several companies are currently working on complex and high-performance designs using asynchronous techniques. It's true that it is currently more difficult, predominantly because current design tools are all geared towards generating and testing "standard" clocked logic, but it is being done and it does not by any stretch "blow".
It will be quite some time before all of the components on a motherboard are asynchronous, but groups -have- designed processors, memory controllers, and other components asynchronously.
For but the briefest of examples... check out this article [eetimes.com] or this article [technologyreview.com]. No, it isn't the answer to everything... but it's much farther along than you seem to realize.
Re:Feynman Lectures on Computation (Score:3, Interesting)
As usual, Feynman was way ahead of his time.
Reversible computing had been proposed twenty years earlier by an IBM engineer and widely recognized as an important idea, so one can hardly credit Feynman for this one.
There has been steady research on reversible computation over the last ten years or so. In fact the best paper award at one of the major CS conferences in 1993 was for a reversible computing paper.
Re:Read the Feynman book (Score:3, Interesting)
Here is an interesting excerpt on pages 149-150 that explains Maxwell's demon in terms of reversible computing:
Re:Vaporware? (Score:3, Interesting)
Bah, I can't keep up that formatting.
You're close on that last point -- the reason I don't have to deal with analog design (too much) is not that my cad tools do it, rather that we can (safely) simplify our models of logic gates down from the complex analog circuits they are and treat them as digital logic. Accounting for crosstalk, transition times, wire delays, signal integrity, etc. outside of the primary design flow allows us to maintain this gross simplification throughout most of the design flow. EDA tools that can handle a new idea usually follow initial real-world implementations by at least a few years. Check out the hierarchical design tools available over the last 5 years, then look at the hierarchical tapeouts for the last 5 years. The tools were (and are, largely) crap. But the chips work (each of mine included).
One of my favorite design tools is Perl.
Re:Vaporware? (Score:3, Interesting)
Yeah, no shit sherlock. Just because there are no explicit resistors drawn in the circuit doesn't mean that the stored charge isn't dumped to ground through a resistive path. When the NMOS gates turn on, they're effectively shorting the stored charge in the load capacitance to ground through the ON resistance of the gate. And similarly, when the PMOS gates turn on, they charge the load capacitance through the supply rails in an analogous manner.
So just because there aren't explicit resistors (thanks to complementary logic) doesn't mean that the charge isn't effectively being supplied to a temporary store and then dumped to ground through resistive paths, which is what the original poster was saying.
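That charge/dump cycle is why conventional CMOS dynamic power is usually estimated as P ≈ α·C·V²·f, independent of the resistance the charge flows through. A back-of-envelope sketch, with all figures purely illustrative:

```python
# Conventional CMOS: each switching cycle charges the load capacitance
# through the PMOS ON-resistance and dumps it to ground through the NMOS,
# dissipating C*V^2 per full cycle regardless of the resistance value.
C = 20e-9    # total switched capacitance, farads (illustrative)
V = 1.2      # supply voltage, volts (illustrative)
f = 2e9      # clock frequency, Hz (illustrative)
alpha = 0.1  # activity factor: fraction of capacitance switching per cycle

power = alpha * C * V**2 * f
print(f"{power:.1f} W")  # -> 5.8 W with these illustrative numbers
```

The resistance drops out of the energy-per-cycle figure entirely; it only sets how fast the dissipation happens. Reversible (adiabatic) designs attack the C·V² term itself by recovering stored charge instead of dumping it.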
You want to save *how* much electricity??? (Score:3, Interesting)
100W?
I piss 100W when I get up in the morning.
100W will cost you $79 [US] a year if you run it hard and constant every second 24/7/365. ($0.09 per KWH)
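The parent's dollar figure is easy to verify at the quoted rate (a quick sketch):

```python
watts = 100.0
rate = 0.09        # dollars per kWh, as quoted above
hours = 24 * 365   # run hard and constant, every second of the year

cost = watts / 1000 * hours * rate
print(f"${cost:.2f} per year")  # -> $78.84 per year
```

876 kWh a year at nine cents each comes to $78.84, which rounds to the $79 in the comment.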
In the US, the average family has more power, more cheaply, than some entire cities in other parts of the world.
Furthermore, the energy is still going to be released as heat at some point. Where else does it go? Sure, you might be able to switch a given transistor 3-4 times with the same energy, but once it drops in voltage and current, the transistor no longer switches. The chips are already being run at 1.x volts, which is barely enough to account for the voltage drop anyway. To get enough energy back after a transistor you'd have to put in a greater initial voltage, wasting more heat.
On top of that, more transistors means more complexity, more electricity, and more speed problems. I'm sure there's some savings, but once you add everything up it simply isn't worth it for mainstream desktop processors.
It may be worthwhile in battery operated, low speed, high efficiency processors, but it'll be a long time before a wall is hit that only this technology can help with.
The reality is that this guy's patent is running out, and he's shopping it around to see if he can eke anything out of it.
-Adam