
The Not-So-Cool Future

markmcb writes "Researchers at Purdue University and several other universities are looking to start work on a major problem standing in the way of future chip design: heat. The team is proposing a new center to consolidate efforts to find solutions to a problem expected to become acute within the next 15 years, as future chips are projected to produce around 10 times as much heat as today's. The new center would work to develop circuits that consume less electricity and couple them with micro cooling devices."
  • by Deltaspectre ( 796409 ) on Saturday April 16, 2005 @11:48AM (#12255508)
    Think about the people up in northern Canada, who need that precious heat! Unless this is some evil conspiracy to kill them off?
  • Nothing new (Score:5, Insightful)

    by koreaman ( 835838 ) <uman@umanwizard.com> on Saturday April 16, 2005 @11:49AM (#12255510)
    What this boils down to is "researchers are looking at ways to make cooler chips." Well, duh, haven't they always?
    • Re:Nothing new (Score:5, Interesting)

      by lrichardson ( 220639 ) on Saturday April 16, 2005 @06:09PM (#12257877) Homepage
      A few years back, I read a couple of articles about reversible chips ... run the op through one way, store the results, then run the exact mirror back through. The net heat result was (theoretically) zero. Reality was about 1-2% of regular heat build-up. But I haven't heard anything more on this. Sure, it effectively halves chip speed. And, even at the time, I thought it would be insane to engineer with pre-emptive tasking coming into vogue. But something that drops heat production by two orders of magnitude seemed worth pursuing. Anyone else heard where this research stands?
      • Re:Nothing new (Score:3, Informative)

        by eliasen ( 566529 )
        Why is the parent moderated funny?

        Reversible computation is quite real, but it doesn't work in the way you explained. You don't need to actually run the computation backwards. To make a long story short, the only time that a reversible computer needs to expend energy as heat is when it's producing output, or setting/clearing variables to a known state. And then, it only requires energy proportional to the number of bits being output, and the temperature. So if you're testing whether a billion-digit nu
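For a sense of scale, the energy floor described above is Landauer's bound: erasing or outputting one bit costs at least kT ln 2 of heat. A minimal sketch of that arithmetic (the 300 K room temperature is an assumption):

```python
import math

BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K
T_ROOM = 300.0            # assumed ambient temperature, K

def landauer_limit_joules(bits: int, temperature_k: float = T_ROOM) -> float:
    """Minimum heat dissipated when irreversibly erasing/outputting `bits` bits."""
    return bits * BOLTZMANN * temperature_k * math.log(2)

# Even a billion bits of output costs only ~3e-12 J at the theoretical limit,
# which is why reversible computation is interesting in the first place:
print(landauer_limit_joules(10**9))
```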

  • Photonic chips? (Score:5, Insightful)

    by Mysticalfruit ( 533341 ) on Saturday April 16, 2005 @11:50AM (#12255525) Homepage Journal
    I thought the future of processors was going to be photonic processors. I'm not sure whether these will produce any heat or not.
    • Re:Photonic chips? (Score:5, Informative)

      by Rorschach1 ( 174480 ) on Saturday April 16, 2005 @12:00PM (#12255581) Homepage
      Thermodynamics makes sure of that.

      Minimum Energy Requirements of Information Transfer and Computing [aeiveos.com]

    • Yes, ultimately they will produce heat. When an electron is excited by a photon it moves to a higher energy orbit, when the electron falls back to its original orbit it gives off that energy as infrared.
      • No. When it falls back down, it will most likely give back the same photon, unless it goes through more than one transition to get back to the ground state.

        Heat is caused by friction, not electron energy state transitions! There is no energy "loss" as heat in electron state transitions.

    • Re:Photonic chips? (Score:3, Insightful)

      by Have Blue ( 616 )
      Everything that performs work produces heat. This is what we mean by "nothing can be 100% efficient".
    • Re:Photonic chips? (Score:2, Informative)

      by marcosdumay ( 620877 )
      The technologies we have now for photonics produce an incredible amount of heat (if you use millions of switches). They can't compete with CMOS. And there is no theoretical limitation in either field that makes one more attractive than the other for low power consumption.
    • Well, apart from the obvious thermodynamic laws, which imply that it must produce some heat, I think photonic processors will produce a lot of heat.

      Think a little bit: photons do not interact directly, so you need some matter to mediate interactions, and photon-matter interactions will definitely generate heat, possibly a lot of heat, since many useful interactions are "second order" effects, i.e. the change in the matter's transparency is a byproduct, which means the light must be very intense to induce the
  • Not Cooling (Score:5, Interesting)

    by LordoftheFrings ( 570171 ) <null@ f r a g fest.ca> on Saturday April 16, 2005 @11:53AM (#12255543) Homepage
    I think the solution to the heat problem will come not from better and more powerful cooling solutions, but rather from radically changing how chips are designed and manufactured. The article doesn't contradict this, but I just want to emphasize it. Having some liquid nitrogen cooling unit is not an optimal, or even a good, solution.
  • diamond cooling (Score:3, Informative)

    by myukew ( 823565 ) on Saturday April 16, 2005 @11:56AM (#12255558) Homepage
    they should look for ways to mass-produce cheap diamonds.
    Diamond is about five times better at conducting heat than copper and could thus be used for passive cooling.
    • Diamonds would not be any better for passive cooling than aluminium (or copper). The rate at which they can transfer heat to the air has nothing to do with how well they conduct heat internally.
      • Diamond will be advantageous for passive cooling because the heatsink can be made much larger for the same thermal drop between the generating element and the air. The more area that can be exposed to the air and still have heat flowing through it, the more effective the heatsink will be.
      • Re:diamond cooling (Score:3, Interesting)

        by N3Bruce ( 154308 )
        Being able to conduct heat internally is a major asset. Heat conduction is driven by the difference in temperature between the heated end and the unheated end of a material of a given shape and surface area. Think about this junior-high-school-level experiment with a cigarette:

        A 1 gram mass of loosely packed tobacco is wrapped in a paper sleeve 0.5 cm in diameter and 10 cm long, and is a very poor conductor of heat. A match is applied to one end for a few seconds, causing the tobacco to smoulder red-
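To put rough numbers on the conduction argument, Fourier's law gives the steady-state heat flow through a slab as Q = kA(dT)/L. A small sketch using textbook conductivities (the slab geometry is an arbitrary assumption for illustration):

```python
# Fourier's law for steady-state conduction through a slab: Q = k * A * dT / L
CONDUCTIVITY_W_PER_M_K = {
    "aluminium": 237.0,
    "copper": 401.0,
    "diamond": 2000.0,  # CVD diamond; high-purity natural diamond can exceed this
}

def heat_flow_watts(material: str, area_m2: float,
                    thickness_m: float, delta_t_k: float) -> float:
    """Heat conducted through a slab of the given material, in watts."""
    return CONDUCTIVITY_W_PER_M_K[material] * area_m2 * delta_t_k / thickness_m

# Hypothetical 1 cm^2 slab, 5 mm thick, 40 K hotter on the chip side:
for material in CONDUCTIVITY_W_PER_M_K:
    print(material, round(heat_flow_watts(material, 1e-4, 5e-3, 40.0), 1), "W")
# Diamond moves roughly five times the heat of copper, matching the figure upthread.
```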
    • I thought diamonds weren't any better at conducting heat and, if anything, are worse; they just don't burn up when heated as quickly as silicone, making them a good replacement for silicone in the processor itself.
      • diamonds may conduct heat, but not electricity, so they aren't usable at all for chip manufacturing.
        FYI: silicone is in breasts. Silicium is in chips.
        • You don't put straight diamond into the chips... you dope it with copper, just as with silicon.
        • Actually, diamond is looking better and better for use as a replacement for silicon.

          see here [geek.com] for more info.
          (This was reported extensively at the time)
        • Re:diamond cooling (Score:5, Informative)

          by kebes ( 861706 ) on Saturday April 16, 2005 @12:27PM (#12255765) Journal
          Actually, many researchers are in fact seriously pursuing diamond as a future replacement for silicon. Both diamond and silicon are *very bad* conductors in their pure state. Both have to be doped (with phosphorus, boron, etc.) to become p-type or n-type semiconductors, which makes them useful as a substrate for microprocessors (note that when doped they are semiconductors, not conductors... your microchip would just short out if the entire wafer were made of a metal/conductor).

          Diamond's superior thermal, optical, and chemical-resistance properties make it attractive for future microprocessors... but unfortunately it is more difficult to make it work as a semiconductor, which is why silicon has always been the substrate of choice.

          It's very interesting research, and we'll see where it goes. For more info, this C&E News article is good, [acs.org] or check here, [ferret.com.au] or here [geek.com] and there's a bit here. [wikipedia.org]
          • but unfortunately it is more difficult to make it work as a semiconductor, which is why silicon has always been the substrate of choice.

            Not to mention that sand is cheap, while DeBeers has been artificially raising the prices of diamonds for ages, and they have usually been expensive and/or difficult to manufacture.

            • You're right, diamond is more expensive. But let me add:

              Most real proposals for using diamond in microprocessors suggest using synthetic diamond, not natural diamond. You can use CVD (chemical vapor deposition) to make good quality artificial diamonds. Currently, growing CVD-diamond is expensive, but then again, taking sand and purifying it into a huge cylinder of single-crystal silicon is also not cheap. If synthetic diamond research continues, it could prove to be competitive with Si.

              The cost of DeBeers
    • The ultimate way to propose to that geek girl you love... a diamond engagement heatsink!
    • they should look for ways to mass produce cheap diamonds. Diamonds are about five times better at heat conducting as...

      If you tried to do that, DeBeers would Jimmy-Hoffa you faster than the oil companies did that guy who invented the 150 mpg engine.
  • 1kW?! (Score:3, Insightful)

    by AaronLawrence ( 600990 ) on Saturday April 16, 2005 @11:56AM (#12255562)
    ("ten times as much heat as today's processors")
    I don't think that 1kW processors will be practical. Nobody is going to want to pay to run that, and nobody will want a heater running in their room all the time either.

    I'd say that they should be looking to limit it to not much more than current figures (100W) - maybe 200W if we are generous. After that it gets silly.
    • Yeah, but processors will get much smaller. If you remember from your school days: what gives off more heat energy, one candle or a fire (a fire as in, ya know, a campfire; I'm not saying candles don't have fire) if they're burning at the same temperature? So even though chips get hotter, they won't be heaters.
    • I would not buy a processor with a rating of 100W. 80W is crazy, but beyond 100W the fan gets noisy as hell.

      100W * 5c/kWh -> ~$45/year to power it (yeah, low power prices in Canada thanks to tons of hydro :). If you raise it to 20c/kWh, you are paying about $180/year to power your 100W processor... Double that? 10x that? Not me.
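That arithmetic generalizes; a quick sketch of the same calculation (the two rates are the commenter's examples, and the load is assumed constant year-round):

```python
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_cost_dollars(watts: float, dollars_per_kwh: float) -> float:
    """Cost of running a constant electrical load for one year."""
    return watts / 1000.0 * HOURS_PER_YEAR * dollars_per_kwh

for watts in (100, 200, 1000):
    for rate in (0.05, 0.20):
        print(f"{watts:5d} W at ${rate:.2f}/kWh: "
              f"${annual_cost_dollars(watts, rate):8.2f}/year")
# 100 W comes to ~$44/year at 5 cents/kWh and ~$175/year at 20 cents/kWh,
# close to the figures above; a hypothetical 1 kW part would cost 10x that.
```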

    • Re:1kW?! (Score:4, Informative)

      by kebes ( 861706 ) on Saturday April 16, 2005 @12:40PM (#12255866) Journal
      FTA:
      Current chips generate about 50-100 watts of heat per square centimeter.
      "But in the future, say 15 to 20 years from now, the heat generation will likely be much more than that, especially in so-called hot spots, where several kilowatts of heat per square centimeter may be generated over very small regions of the chip..."


      Let's not confuse power with power density. When the article says "10 times the heat" they mean kW/cm^2, not kW. Chips of the future will generate a few kW/cm^2 of heat in their hottest spots, but they will still be supplied from conventional 200W power supplies that run off of normal 120V power lines. It's the dissipation of so much heat in such a small area that is the issue, not the raw amount of energy being consumed.

      So, again, it's not that the processor will draw 1 kW of power (it may draw considerably less), but rather that its hottest spots will need to dissipate ~1 kW/cm^2 (i.e., 1000 joules of heat per second per square centimeter).
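The distinction is easy to see numerically. A minimal sketch (the die and hotspot sizes are assumed figures for illustration):

```python
def power_density_w_per_cm2(total_watts: float, area_cm2: float) -> float:
    """Heat flux through a given area of the die."""
    return total_watts / area_cm2

# A whole 1.5 cm x 1.5 cm die dissipating 100 W:
print(power_density_w_per_cm2(100, 1.5 * 1.5))  # ~44 W/cm^2, today's ballpark

# A hypothetical 1 mm x 1 mm hotspot sinking 10 W of that same budget:
print(power_density_w_per_cm2(10, 0.1 * 0.1))   # 1000 W/cm^2, i.e. 1 kW/cm^2
```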
    • 1+ kW processors used to be common, back when processors were built from hundreds, or thousands, of chips. Cooling wasn't that difficult. You just needed a source of chilled air at positive pressure. The power density was low, so all you needed was a steady flow of air over the IC packages.
  • Breeze (Score:5, Funny)

    by MikeD83 ( 529104 ) on Saturday April 16, 2005 @11:56AM (#12255563)
    "Meanwhile, the cloud of electrons would be alternatively attracted to and repelled by adjacent electrodes. Alternating the voltages on the electrodes creates a cooling breeze because the moving cloud stirs the air."

    Amazing, Purdue is developing the same technology used in such high tech devices as the Ionic Breeze air purifier. [as-seen-on...tore-1.com]
    • Great, now we're going to have to worry about chips polluting us [healthdiaries.com]
    • Comment removed based on user account deletion
  • Hot and bothered! (Score:4, Interesting)

    by 3770 ( 560838 ) on Saturday April 16, 2005 @12:00PM (#12255583) Homepage
    Not that I claim to have a solution to the problem of overheating processors, but the power consumption of computers is starting to bother me.

    I used to want the fastest computer around. But a few things have changed I guess.

    First of all, computers are starting to be fast enough for most needs.

    Secondly, the way I use computers has changed with always on Internet. I never turn my computer off because I want to be able to quickly look something up on the web.

    I also have a server that is running 24/7. Most of the time it is idling, but even when it is working I don't need it to be a speed demon.

    So it is starting to be really important for me that a computer doesn't use a lot of power. I don't know if it affects my electric bill in a noticeable way, but it feels wrong.
    • Re:Hot and bothered! (Score:3, Interesting)

      by Hadlock ( 143607 )

      So it is starting to be really important for me that a computer doesn't use a lot of power. I don't know if it affects my electric bill in a noticeable way, but it feels wrong.

      Well, a quick google says it's about five cents per kWh... assume your server spins down the disk drives when idling and your monitor turns off when not in use; you're probably averaging 200 watts. That comes out to about $6.72/month in electricity, or $80 per year.

      If you're looking for power savings, an old laptop with

  • As other Slashdot articles have proposed, future PCs (probably) won't be much more powerful than today's, but will instead, like back in the mainframe days, depend on some supercomputer selling its processing power.
    Obviously such a mainframe can use massively parallel processing techniques, where cooling is less of an issue.
  • Screw this (Score:3, Funny)

    by Timesprout ( 579035 ) on Saturday April 16, 2005 @12:01PM (#12255597)
    We need to start working on the next generation of gerbil powered chips asap!!
  • Alliances (Score:3, Informative)

    by Brainix ( 748988 ) <brainix@gmail.com> on Saturday April 16, 2005 @12:03PM (#12255605) Homepage
    The alliance proposed in the article, to me, seems similar to the AIM Alliance [wikipedia.org] of the early 90s. Several companies united in a common goal. I've heard the AIM Alliance failed because competitors united in a common goal remain competitors, and as such tend not to fully disclose "trade secrets," even to further the common goal. If this proposed alliance takes off, I fear it will suffer the same fate as the AIM Alliance.
  • by ites ( 600337 ) on Saturday April 16, 2005 @12:03PM (#12255606) Journal
    Not a joke.

    The future is multi-core / multi-CPU boards where scaling comes from adding more pieces, not making them individually faster.

    Yes, chips will always get faster and hopefully cooler, but it's no longer the key to performance.

    • by Anonymous Coward
      Unfortunately, with a multi-core/multi-CPU system you will probably use more power, and you will produce an enormous amount of heat within the case (although not all on one die). That heat then has to be removed from the inside of the case one way or another, so it still wouldn't solve the problem.
      • It can help though -- my CPU fan spins at 3000rpm and only manages to keep my CPU around 35C.

        Cooling my case is much easier: I have a 120mm fan in my power supply and a couple of 80mm fans elsewhere, all spinning at 800rpm and making substantially less noise.

        Keep in mind that 100% of the CPU's output, plus the heat from all the other components is dumped into the case, and from there my case fans dump the heat into my office.

        Spreading out the heat from one single core vs. multiple cores can make the cooling pro
    • Monolithic cores with higher speeds are faster than multiple processors for some types of problems, and not for others. Having two processors doesn't make your system twice as fast unless you normally spend an inordinate amount of time context switching. While I have been eagerly awaiting the introduction of multiprocessing into the home market (make no mistake, this IS the first time any significant effort is being made to sell multiple cores to consumers), I'd still rather have a few very fas
  • hardware DRM (Score:2, Interesting)

    When I think of future problems that will happen to hardware, Hardware DRM comes to mind.
  • Working on the latest generation of computers, it's no surprise that the cheaper/generic fans are very noisy as they spin faster to compensate for the greater cooling requirements.
    An efficient and inexpensive cooling solution would be more desirable, IMHO.

    Has anyone else experienced the "jet engine" noise coming from newer systems?

    I guess if you make a chip that needs less cooling, we solve that puzzle too; however, that may be the more expensive road to a solution for fan noise, no?
  • by KarmaOverDogma ( 681451 ) on Saturday April 16, 2005 @12:07PM (#12255642) Homepage Journal
    Especially for those of us with newer motherboards who want a completely silent system with as few fans as possible

    First it was CPUs with big heatsinks and big/slow/no fans, then PSUs and GPUs, and now mobos. My current custom box (now 14 months old) was built to be silent, and I had a hard time settling on a motherboard that was state of the art, stable, and still used a passive heatsink to cool the board chipset fan-free. I finally settled on an Asus P4P800.

    I can definitely believe heat becoming even more of an issue. For those of us who want power/performance and quiet at the same time, this will become even more of a challenge as time goes on. I for one hope not to rely on expensive and/or complicated cooling devices like Peltier units, water pumps, and the like. I hope the focus is on efficient chips that only clock up/power up as they need to, like the Pentium M.

    my 2 cents.
  • by kennycoder ( 788223 ) on Saturday April 16, 2005 @12:12PM (#12255674) Homepage
    Whoa, that's cool; now no more petrol is needed.

    If I take my CPU cooler off, it reaches about 100°C. Now let's see: 100 x 10 = 1000°C in only 15 years of chip industry. If we manage to put this heat to work, let's say we can have 'PC + hairdryer' packages or 'PC + free home heating' winter offers or even 'PC burn-a-pizza' boxes. Think about it, it's only good news.
  • by account_deleted ( 4530225 ) on Saturday April 16, 2005 @12:13PM (#12255683)
    Comment removed based on user account deletion
  • I'd like to hear from some engineering types about why we can't use the excess heat from CPUs to do useful work. I know virtually all large-scale methods of generating electricity involve generating large amounts of heat through some process (nuclear reactions, burning coal or oil, etc), using it to create a hot gas, which turns a turbine, generating electricity.

    I also have some vague handwaving idea that there are processes for generating electricity that have to do with harnessing temperature different
    • Google on "Stirling engines".

      Google on "Thermodynamics, laws of" while you're about it.
    • Okay, according to theory it is possible to use that heat. But it would be economically unsound.

      The most efficient heat transfer can be achieved by convection, using fluids with a high capacity to absorb heat (such as water) and moving the now-hot fluid to the power generator. The required devices would be at least the size of your desktop, and they would be expensive too.

      And, like all thermal and mechanical processes, they are not 100% efficient (2nd law of thermodynamics), neither on the CPU side
    • In principle, yes, any temperature gradient can be harnessed to do some amount of useful work. Thermodynamics certainly allows this (without perfect 100% conversion, obviously).

      AFAIK, it really is an engineering issue. Converting a temperature gradient to electricity works great when you have huge temperature gradients (like in nuclear reactors, coal plants, steam engine, etc.), but is not so useful in a computer tower. Firstly, the whole point of putting fins on a chip is to spread the heat out quickly, s
      • "The gradient between 60C and room temperature, over a few centimeters, is not that great"
        yeah, but if it's over a few nanometers it's pretty big. If we built a generator on that scale it might be worthwhile...

        "Another fun-fact is that it takes about ~7 years of using a solar-panel before the energy savings offset the production cost."
        Where do you get this from? I keep seeing that argument over and over again, but I can't seem to find any data to back it up.
        A little googling found this:
        http://www.thecomm [thecomma.co.uk]
        • yeah, but it it's over a few nanometers it's pretty big. If we built a generator on that scale it might be worthwhile...

          The gradient isn't over a few nanometers. The chip has nano-sized components, but overall it is basically a 10mm X 10mm slice of metal that is getting hot. It will try to equilibrate with its surroundings, and the gradient in temperature near it is really not that substantial.

          "Another fun-fact is that it takes about ~7 years of using a solar-panel before the energy savings offset t
    • by zippthorne ( 748122 ) on Saturday April 16, 2005 @02:52PM (#12256749) Journal
      The maximum amount of useful work you can extract from a heat engine with two temperature pools has been derived and is known as Carnot Efficiency:

      eta = (Thot - Tcold)/Thot.

      using absolute temperatures (Kelvin or Rankine)
      So assuming the limit is Thot = 60C = 333 K and Tcold = 25C (average room temp) = 298 K, the maximum efficiency would be 10%. Assuming further that 100W is lost by the chip alone, only 10W would be potentially recoverable. Unfortunately it gets worse: the Carnot cycle is theoretical, and no real Carnot engine could ever be produced. There are some very efficient cycles available (Stirling and Rankine come to mind), but none can exceed the Carnot efficiency.

      It also gets worse as you make the engine smaller. Consider the tolerance of pistons or turbines. Suppose you must leave 1mm of gap between surfaces. For large engines this is no problem, but as the machines become smaller, the minimum gap becomes a greater percentage of the total area.

      Machines to extract energy from such a small source at such a low temperature difference have significant theoretical inefficiencies before you even get to the practical ones. This does not mean that you can't recover any of the "wasted heat" but only that you've pretty much gotten all the useful work out of it that you can and recovering the rest would be very impractical.

      Have you ever eaten a lobster? Did you suck the meat from the legs?
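The parent's 10% figure drops straight out of the formula; a minimal sketch of the same numbers:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Upper bound on the fraction of heat convertible to useful work."""
    return (t_hot_k - t_cold_k) / t_hot_k

T_HOT = 60 + 273.15   # 60 C chip surface, in kelvin
T_COLD = 25 + 273.15  # 25 C room

eta = carnot_efficiency(T_HOT, T_COLD)
print(f"eta = {eta:.3f}")                           # ~0.105, i.e. about 10%
print(f"best case from 100 W: {100 * eta:.1f} W")   # ~10.5 W recoverable
```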
    • In most temperate climates, the waste heat from computers pretty much has a neutral effect on your heating bill.

      In the wintertime, running a server farm in your office 24/7 might generate enough excess heat to make a noticeable dent in your heating needs, but even so, unless you use resistance heating or burn a very expensive fuel to heat your office, it is probably cheaper to use a heat pump or other device for building heat.

      In the summer however, you get hit with a double whammy. First, you are paying
  • Take off all your....um...inefficient circuits! Its getting hot in here, take off all your inefficient circuits!
  • Using energy creates heat. If they use less energy there is less heat. I think they should ignore the direct problem and fix the indirect problem.
  • w00t (Score:2, Funny)

    w00t, no more heaters! now we just need a new way to cool my house...
  • "The microscopic cloud of ionized air then leads to an imbalance of charge in the micro-atmosphere, and lightning results. "

    Using lightning to cool a CPU?
    Doesn't EMF pose a problem here?

    I guess you could shield it, but that's counterproductive, isn't it?
  • >problem that is expected to become a reality within the next 15 years as future chips are expected to produce around 10 times as much heat as today's chips.

    This is bullshit. I would never even consider buying a >>100W CPU for my desktop, certainly not a 1000W one.

    I'd rather see fewer fans in my machine, not more.

    Looking at heat per area is more reasonable, as area will keep decreasing for a while yet.
  • Missing an option? (Score:3, Interesting)

    by andreMA ( 643885 ) on Saturday April 16, 2005 @12:32PM (#12255802)
    It sounds like (RTFA? Who, me?) they're focusing on either reducing the amount of heat generated or finding ways to dispose of it more efficiently. Important, sure... but what about developing more heat-tolerant processors? If things ran reliably at 600C, you'd have an easier time moving a given amount of waste heat away to the ambient (room-temperature) environment, no? Proportional to the 4th power of the temperature difference, no?

    Or perhaps I'm grossly physics-impaired.

    • Dumping all that extra heat into the environment isn't really an option after a certain point. No one wants computers which will raise the ambient temperature 15 or 20 degrees in a bedroom (AMD and P4 jokes aside)
    • No: convection and conduction are proportional to the temperature difference, while radiation is proportional to the difference of the 4th powers of the temperatures. But I wouldn't rely only on radiation to cool my CPU...
      • I wouldn't rely only on radiation to cool my CPU...
        Spoilsport! Think of what a great night-light it'd make! Or emergency lighting, to help you escape your burning house.

        Thanks for the correction re: radiative heat dissipation vs. conduction.

    • Interesting point. Convection is pretty much a fourth-power process (unless you restrict the air from flowing... then you've just got conduction).
      Don't most materials become less conductive as the temperature increases, though? Thus requiring greater voltage and generating more heat?
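The correction upthread is easy to check numerically: convection scales roughly linearly with the temperature difference, while radiation follows the Stefan-Boltzmann difference of fourth powers, which is where a heat-tolerant 600C part would pay off. A sketch (the heatsink area, emissivity, and convection coefficient are assumed values):

```python
STEFAN_BOLTZMANN = 5.670e-8  # W/(m^2 K^4)

def radiative_watts(area_m2: float, t_surf_k: float, t_amb_k: float,
                    emissivity: float = 0.9) -> float:
    """Net radiative loss: difference of 4th powers (Stefan-Boltzmann law)."""
    return emissivity * STEFAN_BOLTZMANN * area_m2 * (t_surf_k**4 - t_amb_k**4)

def convective_watts(area_m2: float, t_surf_k: float, t_amb_k: float,
                     h: float = 10.0) -> float:
    """Natural convection, roughly linear in delta-T (h ~ 5-25 W/m^2/K)."""
    return h * area_m2 * (t_surf_k - t_amb_k)

AMBIENT_K = 298.0
for t_c in (60, 600):  # today's chip surface vs. the hypothetical 600C part
    t_k = t_c + 273.15
    print(f"{t_c:3d} C: {convective_watts(0.01, t_k, AMBIENT_K):6.1f} W convective, "
          f"{radiative_watts(0.01, t_k, AMBIENT_K):6.1f} W radiative")
# At 60 C the two terms are comparable and small; at 600 C radiation dominates.
```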
  • What would it take to replace x86 with another chip like Crusoe or MIPS and make it better for desktop PCs?
  • "Mechanical engineers at Purdue have filed patents for ... "

    "The patents arose from a research project funded in part by the National Science Foundation."

    Getting the NSF to fund (in part) the research that later leads to mechanical engineers getting the patents would be a great way to make money at the expense of others.

    Should not the patent rights be shared among those who funded the project?

    • Patents are better than, say, making the developed process a trade-secret. When you get a patent, the process is out in the open, for everyone to see. People can license it in order to use it, and, after a while, a license is no longer required.

      A trade secret, on the other hand, need never be released.
    • "Mechanical engineers at Purdue have filed patents for ... "

      "Should not the patent rights be shared among those who funded the project?"

      The people filing the patents are rarely the owners of the patent rights.

  • Various solutions (Score:3, Insightful)

    by jd ( 1658 ) <imipakNO@SPAMyahoo.com> on Saturday April 16, 2005 @01:09PM (#12256102) Homepage Journal
    One "obvious" solution to the chip heating problem would be the following:


    • Have a thin layer of some liquid like Fluorinert over the chip surface. It just has to conduct heat well, but not electricity.
    • Put a Peltier device in contact with the top of the liquid. Peltiers are metal, which is why you want the electrically insulating layer.
    • Have the top layer of the Peltier device double as a cold-plate.


    This would let you get all the benefits of existing tried-and-tested cooling methods, but would eliminate the bugbears of the chip's casing being an insulator and the possibility of condensation screwing everything up.


    A variant on this would be to have the chip stand upright, so that you could have a cooling system on both sides. The pins would need to be on the sides of the chip, then, not on the base.


    A second option would be to look at where the heat is coming from. A lot of heat is going to be produced through resistance, and the bulk of chips still use aluminum (which has a relatively high resistivity) for the interconnects. Copper interconnects would run cooler, and (if anyone can figure out how to do it) silver would be best of all; the sketch after this comment puts rough numbers on that.


    A third option is to look at the layout of the chips. I'm not sure exactly how memory chips are organized, but it would seem that the more interleaving you have, the lower the concentration of heat at any given point, so the cooler the chip will run. Similarly for processors, it would seem that the more spaced out a set of identical processing elements are, the better.


    A fourth option is to double the width of the inputs to the chips (e.g., you'd be looking at 128-bit processors) and to allow instructions to work on vectors or matrices. The idea here is that some of the problem is in the overheads of fetching and farming out the work. If you reduce the overheads by transferring work in bulk, you should reduce the heat generated.
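On the interconnect option above: wire resistance is R = rho * L / A, and the bulk resistivities of the three candidate metals are well known. A minimal sketch (the wire geometry is an arbitrary assumption; real on-chip interconnects also suffer size effects that bulk resistivity ignores):

```python
RESISTIVITY_OHM_M = {  # bulk values near room temperature
    "aluminium": 2.65e-8,
    "copper": 1.68e-8,
    "silver": 1.59e-8,
}

def wire_resistance_ohms(metal: str, length_m: float, cross_section_m2: float) -> float:
    """R = rho * L / A for a uniform interconnect."""
    return RESISTIVITY_OHM_M[metal] * length_m / cross_section_m2

# Hypothetical 1 mm run with a 100 nm x 100 nm cross-section:
for metal in RESISTIVITY_OHM_M:
    r = wire_resistance_ohms(metal, 1e-3, 100e-9 * 100e-9)
    print(f"{metal}: {r:.0f} ohms")
# At the same current, I^2*R heating is ~37% lower for copper than for aluminium;
# silver buys only another ~5% beyond copper.
```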

  • Human brains, powerful processors though they are, don't run as hot... so surely a PC chip doesn't _need_ to either?
  • The difference is that while today's x86 processors run at full clock speed almost all the time, the human brain does no such thing.
    You are never using 100% of your brain all the time; usage depends on how much you need. Thus, your head never overheats.
  • because I want the G8 to go into the PowerBook first when it's released. I'm tired of this whole G5 fiasco
  • Now we know the real cause of global warming.

  • ... recycling the bit-bucket?

    Finding better ways to suck excess energy out of a chip is all very well and good, but it might be better to reduce the energy produced by the chip in the first place.

    Every time a 1 is set to a 0, why not feed that charge into a bank of capacitors rather than the current solution (which I believe is to sink it to ground, thus producing heat)?
