
The Not-So-Cool Future

markmcb writes "Researchers at Purdue University and several other universities are looking to start work on a major problem standing in the way of future chip design: heat. The team is proposing a new center to consolidate efforts to solve a problem expected to become acute within the next 15 years, when chips are projected to produce around 10 times as much heat as today's. The new center would work to develop circuits that consume less electricity and couple them with micro cooling devices."

  • diamond cooling (Score:3, Informative)

    by myukew ( 823565 ) on Saturday April 16, 2005 @12:56PM (#12255558) Homepage
They should look for ways to mass-produce cheap diamonds.
Diamond is about five times better than copper at conducting heat, so it could be used for passive cooling.
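    A quick sanity check on that ratio, as a minimal sketch; both conductivity values below are rough room-temperature figures assumed for illustration, not from the comment:

        # Approximate room-temperature thermal conductivities, W/(m*K).
        # Values are ballpark and vary with purity and grade.
        k_diamond = 2000.0  # natural diamond is often quoted near 2000
        k_copper = 400.0    # pure copper is commonly quoted near 400

        print(f"diamond / copper = {k_diamond / k_copper:.1f}x")  # -> 5.0x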
  • Re:Photonic chips? (Score:5, Informative)

    by Rorschach1 ( 174480 ) on Saturday April 16, 2005 @01:00PM (#12255581) Homepage
    Thermodynamics makes sure of that.

    Minimum Energy Requirements of Information Transfer and Computing [aeiveos.com]

  • Alliances (Score:3, Informative)

    by Brainix ( 748988 ) <brainix@gmail.com> on Saturday April 16, 2005 @01:03PM (#12255605) Homepage
    The alliance proposed in the article, to me, seems similar to the AIM Alliance [wikipedia.org] of the early 90s. Several companies united in a common goal. I've heard the AIM Alliance failed because competitors united in a common goal remain competitors, and as such tend not to fully disclose "trade secrets," even to further the common goal. If this proposed alliance takes off, I fear it will suffer the same fate as the AIM Alliance.
  • by Anonymous Coward on Saturday April 16, 2005 @01:10PM (#12255665)
Unfortunately, with a multi-core/multi-CPU system, you will probably use more power, and you will produce an enormous amount of heat within the case (although not all on one die). That heat still has to be removed from the inside of the case one way or another, so it wouldn't solve the problem.
  • Re:diamond cooling (Score:2, Informative)

    by LiquidCoooled ( 634315 ) on Saturday April 16, 2005 @01:23PM (#12255754) Homepage Journal
    Actually, diamond is looking better and better for use as a replacement for silicon.

    see here [geek.com] for more info.
    (This was reported extensively at the time)
  • Re:diamond cooling (Score:5, Informative)

    by kebes ( 861706 ) on Saturday April 16, 2005 @01:27PM (#12255765) Journal
Actually, many researchers are in fact seriously pursuing diamond as a future replacement for silicon. Both diamond and silicon are *very bad* conductors in their pure state. Both have to be doped (with phosphorus, boron, etc.) to become p-type or n-type semiconductors, which is what makes them useful as a substrate for microprocessors (note that when doped they are semiconductors, not conductors... your microchip would just short out if the entire wafer were made of a metal/conductor).

    Diamond's superior thermal, optical, and chemical-resistance properties make it attractive for future microprocessors... but unfortunately it is more difficult to make it work as a semiconductor, which is why silicon has always been the substrate of choice.

    It's very interesting research, and we'll see where it goes. For more info, this C&E News article is good, [acs.org] or check here, [ferret.com.au] or here [geek.com] and there's a bit here. [wikipedia.org]
  • Re:1kW?! (Score:4, Informative)

    by kebes ( 861706 ) on Saturday April 16, 2005 @01:40PM (#12255866) Journal
    FTA:
    Current chips generate about 50-100 watts of heat per square centimeter.
    "But in the future, say 15 to 20 years from now, the heat generation will likely be much more than that, especially in so-called hot spots, where several kilowatts of heat per square centimeter may be generated over very small regions of the chip..."


    Let's not confuse power with power density. When the article says "10 times the heat" they mean kW/cm^2, not kW. Chips of the future will generate a few kW/cm^2 of heat in their hottest spots, but they will still be supplied from conventional 200W power supplies that run off of normal 120V power lines. It's the dissipation of so much heat in such a small area that is the issue, not the raw amount of energy being consumed.

So, again, it's not that the processor will draw 1 kW of power (it may draw considerably less), but rather that its hottest spots will need to dissipate ~1 kW/cm^2 (i.e., 1000 joules of heat per second per square centimeter).
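    To make the power vs. power density distinction concrete, here's a minimal sketch; the hot-spot area is an assumed figure for illustration, not from the article:

        # Total power (W) = power density (W/cm^2) * area (cm^2).

        def total_power(density_w_per_cm2: float, area_cm2: float) -> float:
            """Heat dissipated by a region of the given area and power density."""
            return density_w_per_cm2 * area_cm2

        # A 1 mm^2 hot spot (0.01 cm^2) at 1 kW/cm^2 is only 10 W in total...
        print(total_power(1000.0, 0.01))  # -> 10.0
        # ...while a full 1 cm^2 die at today's ~100 W/cm^2 is 100 W.
        print(total_power(100.0, 1.0))    # -> 100.0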
  • by kebes ( 861706 ) on Saturday April 16, 2005 @02:04PM (#12256066) Journal
    In principle, yes, any temperature gradient can be harnessed to do some amount of useful work. Thermodynamics certainly allows this (without perfect 100% conversion, obviously).

AFAIK, it really is an engineering issue. Converting a temperature gradient to electricity works great when you have huge temperature gradients (like in nuclear reactors, coal plants, steam engines, etc.), but is not so useful in a computer tower. Firstly, the whole point of putting fins on a chip is to spread the heat out quickly, so that it doesn't build up and make the chip too hot (i.e. melt it and stuff). So for our chips to work, we can't run them any hotter than 60C (or maybe 100C or whatever). The gradient between 60C and room temperature, over a few centimeters, is not that great (imagine putting a paddle wheel above your CPU and letting the current of up-flowing air turn it... now imagine how much useful work that puny paddle wheel is really going to do). If you actually built a device to extract that energy, it wouldn't be worth it. It would take 1000 years (or whatever) of running it before the electricity savings would offset the cost of having built that little device.

So even though in principle you're right, in practice (from an engineering perspective) there's no economic advantage to doing this; a rough payback sketch follows at the end of this comment.

Another fun fact is that it takes about ~7 years of using a solar panel before the energy savings offset the production cost. So solar panels that burn out before this mark are actually *worse* for the environment than getting electricity from coal (or wherever)... (because producing a solar panel also pollutes the environment). Solar power is only going to be viable if panels become either 1. cheaper, 2. longer-lasting, or 3. more efficient than they are now (all of the above would be great).

    Lastly, thermodynamics guarantees that in the winter, in a cold place, it's impossible to waste electricity (if you have a thermostated heating system). Basically any inefficiency in your home (be it from your vacuum cleaner or computer) ends up as heat, which makes the house warmer, and makes the thermostat's job a little easier. In the summer, however, it really is wasted energy.
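    On the economics point above, a rough back-of-envelope payback estimate as a minimal sketch; every input (chip heat, conversion efficiency, device cost, electricity price) is an assumption for illustration:

        # Payback time for a hypothetical CPU heat-recovery device.
        chip_heat_w = 100.0   # heat dissipated by the chip (assumed)
        efficiency = 0.02     # realistic conversion at a ~35 K gradient (assumed)
        device_cost = 50.0    # cost of the recovery device, $ (assumed)
        price_per_kwh = 0.10  # electricity price, $/kWh (assumed)

        recovered_kw = chip_heat_w * efficiency / 1000.0  # -> 0.002 kW
        savings_per_hour = recovered_kw * price_per_kwh   # -> $0.0002/h
        payback_hours = device_cost / savings_per_hour

        print(f"{payback_hours / 8760:.0f} years of 24/7 operation")  # -> 29 years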
  • Re:Photonic chips? (Score:2, Informative)

    by marcosdumay ( 620877 ) <marcosdumay&gmail,com> on Saturday April 16, 2005 @02:43PM (#12256322) Homepage Journal
The technologies we have now for photonics produce an incredible amount of heat (if you use millions of switches). They can't compete with CMOS. And there is no theoretical limitation in either field that makes one more attractive than the other for low power consumption.
  • by zippthorne ( 748122 ) on Saturday April 16, 2005 @03:52PM (#12256749) Journal
    The maximum amount of useful work you can extract from a heat engine with two temperature pools has been derived and is known as Carnot Efficiency:

    eta = (Thot - Tcold)/Thot.

    using absolute temperatures (Kelvin or Rankine)
So assuming the limit is Thot = 60C = 333 K and Tcold = 25C (average room temperature) = 298 K, the maximum efficiency would be about 10%. Assuming further that 100 W is lost by the chip alone, only about 10 W would be potentially recoverable. Unfortunately it gets worse: the Carnot cycle is theoretical, and no real Carnot engine could ever be produced. There are some very efficient cycles available (Stirling and Rankine come to mind), but none can exceed the Carnot efficiency. (A numeric check follows this comment.)

It also gets worse as you make the engine smaller. Consider the tolerances of pistons or turbines: suppose you must leave a 1 mm gap between surfaces. For large engines this is no problem, but as the machine shrinks, that minimum gap becomes a larger fraction of its overall dimensions.

    Machines to extract energy from such a small source at such a low temperature difference have significant theoretical inefficiencies before you even get to the practical ones. This does not mean that you can't recover any of the "wasted heat" but only that you've pretty much gotten all the useful work out of it that you can and recovering the rest would be very impractical.

Have you ever eaten a lobster? Did you suck the meat from the legs?
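    Here is the Carnot arithmetic from this comment as a minimal sketch, using the temperatures assumed above:

        # Carnot bound: eta = (T_hot - T_cold) / T_hot, absolute temperatures.

        def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
            """Upper bound on the fraction of heat convertible to useful work."""
            return (t_hot_k - t_cold_k) / t_hot_k

        t_hot = 60.0 + 273.15   # chip surface, 60 C
        t_cold = 25.0 + 273.15  # room temperature, 25 C

        eta = carnot_efficiency(t_hot, t_cold)
        print(f"Carnot limit: {eta:.1%}")                # -> 10.5%
        print(f"From 100 W of heat: {100 * eta:.1f} W")  # -> 10.5 W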
  • by shadow_slicer ( 607649 ) on Saturday April 16, 2005 @08:49PM (#12258387)
    "The gradient between 60C and room temperature, over a few centimeters, is not that great"
Yeah, but if it's over a few nanometers it's pretty big. If we built a generator on that scale it might be worthwhile...

    "Another fun-fact is that it takes about ~7 years of using a solar-panel before the energy savings offset the production cost."
    Where do you get this from? I keep seeing that argument over and over again, but I can't seem to find any data to back it up.
A little googling found this:
    http://www.thecomma.co.uk/globalism/ [thecomma.co.uk]

    "Lastly, thermodynamics guarantees that in the winter, in a cold place, it's impossible to waste electricity"
I call BS. Most home heating is not resistive heating but heat pumps, which are thermodynamically guaranteed to deliver more heat per unit of electricity than resistive dissipation. Heat pumps operate like air conditioners in reverse, pumping heat from the outside to the inside. The electricity only goes into moving heat that already exists, so heat pumps enjoy effective thermodynamic "efficiencies" of greater than 100% (which aren't real efficiencies, because they don't take into account the heat drawn from the environment, and so are called Coefficients of Performance).
    A little googling provides this informative link:
http://energyoutlet.com/res/heatpump/efficiency.html [energyoutlet.com]
In summary, that means that of the "wasted energy", you have a net energy waste of (1 - 1/COP_hp) * E_wasted in winter, since the heat pump could have delivered the same heat using only E_wasted/COP_hp of electricity.
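    To put that formula in numbers, a minimal sketch; the COP value is an assumption in the range of typical heat-pump ratings:

        # Net waste from dissipating E_wasted as resistive heat in a home
        # heated by a heat pump: the pump could have delivered the same
        # heat using only E_wasted / COP of electricity.

        def net_waste_kwh(e_wasted_kwh: float, cop: float) -> float:
            """Extra electricity used vs. letting the heat pump supply the heat."""
            return e_wasted_kwh * (1.0 - 1.0 / cop)

        print(f"{net_waste_kwh(1.0, 3.0):.2f}")  # 1 kWh wasted at COP 3 -> 0.67 kWh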
  • Re:Nothing new (Score:3, Informative)

    by eliasen ( 566529 ) on Saturday April 16, 2005 @11:03PM (#12259111) Homepage
    Why is the parent moderated funny?

    Reversible computation is quite real, but it doesn't work in the way you explained. You don't need to actually run the computation backwards. To make a long story short, the only time that a reversible computer needs to expend energy as heat is when it's producing output, or setting/clearing variables to a known state. And then, it only requires energy proportional to the number of bits being output, and the temperature. So if you're testing whether a billion-digit number is prime, the entire calculation can take zero energy, except for the one bit of output.

    Unfortunately, to get truly reversible computing, the computation has to be done arbitrarily slowly.

If you don't have it, Feynman Lectures on Computation [amazon.com] has one of the clearest discussions of reversible computation. Very highly recommended, and fun. We're some 20 years past the time when Feynman gave these lectures, and we're still nowhere close to the limits or the technology he described. Techniques for varying the power supply on the chip alone would greatly reduce energy usage.
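    For a sense of scale, the per-bit energy floor this comment alludes to is Landauer's bound: kT*ln(2) joules per bit irreversibly output or erased. A minimal sketch:

        import math

        K_BOLTZMANN = 1.380649e-23  # J/K

        def landauer_joules(bits: float, temp_k: float = 300.0) -> float:
            """Minimum heat dissipated to irreversibly set/output `bits` bits."""
            return bits * K_BOLTZMANN * temp_k * math.log(2)

        print(landauer_joules(1.0))  # one bit at 300 K: ~2.9e-21 J
        print(landauer_joules(8e9))  # a full gigabyte: still only ~2.3e-11 J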
