The Not-So-Cool Future 155
markmcb writes "Researchers at Purdue University and several other universities are looking to start work on a major problem standing in the way of future chip design: heat. The team is proposing a new center to consolidate efforts in finding solutions for the problem that is expected to become a reality within the next 15 years as future chips are expected to produce around 10 times as much heat as today's chips. The new center would work to develop circuits that consume less electricity and couple them with micro cooling devices."
diamond cooling (Score:3, Informative)
Diamond conducts heat about five times better than copper and could thus be used for passive cooling.
Re:Photonic chips? (Score:5, Informative)
Minimum Energy Requirements of Information Transfer and Computing [aeiveos.com]
Re:diamond cooling (Score:2, Informative)
see here [geek.com] for more info.
(This was reported extensively at the time)
Re:diamond cooling (Score:5, Informative)
Diamond's superior thermal, optical, and chemical-resistance properties make it attractive for future microprocessors... but unfortunately it is more difficult to make it work as a semiconductor, which is why silicon has always been the substrate of choice.
It's very interesting research, and we'll see where it goes. For more info, this C&E News article is good, [acs.org] or check here, [ferret.com.au] or here [geek.com] and there's a bit here. [wikipedia.org]
Re:1kW?! (Score:4, Informative)
Current chips generate about 50-100 watts of heat per square centimeter.
"But in the future, say 15 to 20 years from now, the heat generation will likely be much more than that, especially in so-called hot spots, where several kilowatts of heat per square centimeter may be generated over very small regions of the chip..."
Let's not confuse power with power density. When the article says "10 times the heat" they mean kW/cm^2, not kW. Chips of the future will generate a few kW/cm^2 of heat in their hottest spots, but they will still be supplied from conventional 200W power supplies that run off of normal 120V power lines. It's the dissipation of so much heat in such a small area that is the issue, not the raw amount of energy being consumed.
So, again, it's not that the processor will draw 1 kW of power (it may draw considerably less), but rather that its hottest spots will need to dissipate ~1 kW/cm^2 (i.e.: 1000 joules of heat per second per square centimeter).
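To make the power-vs-power-density distinction concrete, here's a minimal sketch; the wattages and areas are illustrative assumptions, not figures from the article:

```python
# Sketch: total power draw vs. power density (illustrative numbers)

def power_density_w_per_cm2(power_watts, area_cm2):
    """Heat flux through an area, in W/cm^2."""
    return power_watts / area_cm2

# A 100 W chip spread evenly over a 1 cm^2 die averages 100 W/cm^2...
avg = power_density_w_per_cm2(100.0, 1.0)

# ...but a hot spot dissipating just 10 W in 0.01 cm^2 hits 1 kW/cm^2,
# even though the chip's total draw never changes.
hotspot = power_density_w_per_cm2(10.0, 0.01)

print(avg, hotspot)  # 100.0 1000.0
```

The point: the hot spot reaches ten times the average flux while contributing only a tenth of the total power, which is why the cooling problem is local, not a power-supply problem.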
Re:Why is heat reclamation not worth it? (Score:3, Informative)
AFAIK, it really is an engineering issue. Converting a temperature gradient to electricity works great when you have huge temperature gradients (like in nuclear reactors, coal plants, steam engines, etc.), but is not so useful in a computer tower. Firstly, the whole point of putting fins on a chip is to spread the heat out quickly, so that it doesn't build up and make the chip too hot (i.e. melt it and stuff). So for our chips to work, we can't run them any hotter than 60C (or maybe 100C or whatever). The gradient between 60C and room temperature, over a few centimeters, is not that great (imagine putting a paddle wheel above your CPU, and letting the current of up-flowing air turn it... now imagine how much useful work that puny paddle wheel is really going to do). If you actually built a device to extract that energy, it wouldn't be worth it. It would take 1000 years (or whatever) of running it before the electricity savings would offset the cost of having built that little device.
So even though in principle you're right, in practice (from an engineering perspective) there's no economic advantage to doing this.
Another fun fact is that it takes about ~7 years of using a solar panel before the energy savings offset the production cost. So solar panels that burn out before this mark are actually *worse* for the environment than getting electricity from coal (or wherever)... (because producing a solar panel also pollutes the environment). Solar power is only going to be viable if panels become either 1. cheaper, 2. longer-lasting, or 3. more efficient than they are now (all of the above would be great).
Lastly, thermodynamics guarantees that in the winter, in a cold place, it's impossible to waste electricity (if you have a thermostated heating system). Basically any inefficiency in your home (be it from your vacuum cleaner or computer) ends up as heat, which makes the house warmer, and makes the thermostat's job a little easier. In the summer, however, it really is wasted energy.
Re:Why is heat reclamation not worth it? (Score:4, Informative)
eta = (Thot - Tcold)/Thot.
using absolute temperatures (Kelvin or Rankine)
So assuming the limit is Thot = 60C = 333 K and Tcold = 25C (average room temp) = 298 K, the maximum efficiency would be about 10%. Assuming further that 100W is lost by the chip alone, only 10W would be potentially recoverable. Unfortunately it gets worse: the Carnot cycle is theoretical and no real Carnot engine could ever be produced. There are some very efficient cycles available (Stirling and Rankine come to mind), however none can exceed the Carnot efficiency.
It also gets worse as you make the engine smaller. Consider the tolerance of pistons or turbines. Suppose you must leave 1mm of gap between surfaces. For large engines this is no problem, but as the machines become smaller, the minimum gap becomes a greater percentage of the total area.
Machines to extract energy from such a small source at such a low temperature difference have significant theoretical inefficiencies before you even get to the practical ones. This does not mean that you can't recover any of the "wasted heat" but only that you've pretty much gotten all the useful work out of it that you can and recovering the rest would be very impractical.
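The parent's arithmetic is easy to check; a quick sketch of the Carnot bound, using the same assumed temperatures (60C chip, 25C room) and the same assumed 100W of chip dissipation:

```python
# Carnot efficiency: eta = (Thot - Tcold) / Thot, absolute temperatures in K

def carnot_efficiency(t_hot_k, t_cold_k):
    """Upper bound on the fraction of heat convertible to work."""
    return (t_hot_k - t_cold_k) / t_hot_k

eta = carnot_efficiency(333.0, 298.0)  # 60 C chip, 25 C room
recoverable_w = eta * 100.0            # of an assumed 100 W dissipated

print(round(eta, 3), round(recoverable_w, 1))  # 0.105 10.5
```

So even a perfect (unbuildable) Carnot engine gets you roughly 10W back out of 100W; any real cycle recovers less.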
Have you ever eaten a lobster? Did you suck the meat from the legs?
Re:Why is heat reclamation not worth it? (Score:3, Informative)
Yeah, but if it's over a few nanometers it's pretty big. If we built a generator on that scale it might be worthwhile...
"Another fun-fact is that it takes about ~7 years of using a solar-panel before the energy savings offset the production cost."
Where do you get this from? I keep seeing that argument over and over again, but I can't seem to find any data to back it up.
A little googling found this:
http://www.thecomma.co.uk/globalism/ [thecomma.co.uk]
"Lastly, thermodynamics guarantees that in the winter, in a cold place, it's impossible to waste electricity"
I call BS. Most home heating is not resistive heating, but comes from heat pumps, which are thermodynamically guaranteed to deliver more heat per watt than resistive heating. Heat pumps operate like air conditioners in reverse, pumping heat from the outside into the inside. The electricity a heat pump draws only goes to moving already-existing heat, so they can enjoy effective thermodynamic "efficiencies" of greater than 100% (which aren't real efficiencies, because they don't take into account the heat drawn from the environment, and so are called Coefficients Of Performance).
A little googling provides this informative link:
http://energyoutlet.com/res/heatpump/efficiency.h
In summary, that means that of the "wasted energy" E_wasted, the net waste in winter is only (1 - 1/COP_hp)*E_wasted, since a heat pump could have delivered the same heat using just E_wasted/COP_hp of electricity.
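A minimal sketch of that bookkeeping. Note the true waste can't exceed E_wasted itself: the wasted joules still heat the house, but a heat pump could have delivered the same heat for E_wasted/COP joules of electricity, so the net loss fraction is (1 - 1/COP). The COP value here is a typical assumption, not a figure from the link:

```python
# Net waste of "wasted" electricity in winter with a thermostated heat pump.
# The wasted joules still heat the house; the heat pump could have supplied
# the same heat using only e_wasted / cop joules of electricity.

def net_winter_waste_j(e_wasted_j, cop):
    """Joules of electricity truly wasted, given a heat pump with this COP."""
    return e_wasted_j * (1.0 - 1.0 / cop)

# With an assumed COP of 3, about two thirds of each wasted joule is a real loss.
print(net_winter_waste_j(100.0, 3.0))
```

With resistive heating (COP = 1) the net waste is zero, which recovers the grandparent's claim as the special case.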
Re:Nothing new (Score:3, Informative)
Reversible computation is quite real, but it doesn't work in the way you explained. You don't need to actually run the computation backwards. To make a long story short, the only time that a reversible computer needs to expend energy as heat is when it's producing output, or setting/clearing variables to a known state. And then, it only requires energy proportional to the number of bits being output, and the temperature. So if you're testing whether a billion-digit number is prime, the entire calculation can take zero energy, except for the one bit of output.
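The "energy proportional to the number of bits being output, and the temperature" is the Landauer limit, kT ln 2 of heat per bit erased. A quick sketch of just how small that number is (temperature is an assumed room-temperature value):

```python
import math

# Landauer limit: minimum heat dissipated when erasing one bit is k * T * ln 2
K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K

def landauer_energy_j(temp_k, bits=1):
    """Minimum dissipation for irreversibly erasing `bits` bits at temp_k kelvin."""
    return K_BOLTZMANN * temp_k * math.log(2) * bits

# At an assumed 300 K, one bit costs roughly 3e-21 J -- many orders of
# magnitude below what real logic gates dissipate per switching event.
print(landauer_energy_j(300.0))
```

So the one output bit of the primality test really is a negligible energy cost; everything else dissipated by real hardware is engineering overhead, not a thermodynamic requirement.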
Unfortunately, to get truly reversible computing, the computation has to be done arbitrarily slowly.
If you don't have it, Feynman Lectures on Computation [amazon.com] has one of the clearest discussions of reversible computation. Very highly recommended, and fun. We're 35+ years past the time when Feynman gave these lectures, and we're still nowhere close to the limits or the technology that he described. Techniques for varying the power supply on the chip alone would greatly reduce energy usage.