The Not-So-Cool Future 155
markmcb writes "Researchers at Purdue University and several other universities are looking to start work on a major problem standing in the way of future chip design: heat. The team is proposing a new center to consolidate efforts in finding solutions for the problem that is expected to become a reality within the next 15 years as future chips are expected to produce around 10 times as much heat as today's chips. The new center would work to develop circuits that consume less electricity and couple them with micro cooling devices."
But think about the,,, (Score:5, Funny)
Re:[OT]But think about the,,, (Score:2)
Nothing new (Score:5, Insightful)
Re:Nothing new (Score:5, Interesting)
Re:Nothing new (Score:3, Informative)
Reversible computation is quite real, but it doesn't work in the way you explained. You don't need to actually run the computation backwards. To make a long story short, the only time that a reversible computer needs to expend energy as heat is when it's producing output, or setting/clearing variables to a known state. And then, it only requires energy proportional to the number of bits being output, and the temperature. So if you're testing whether a billion-digit nu
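The bound the parent is alluding to is the Landauer limit: erasing or outputting one bit costs at least k*T*ln(2) of heat, while reversible steps can in principle cost nothing. A back-of-the-envelope sketch (room temperature and the billion-bit output are illustrative assumptions):

```python
import math

# Landauer bound: minimum heat per bit erased/output is k*T*ln(2).
k_boltzmann = 1.380649e-23   # Boltzmann constant, J/K
temp = 300.0                 # assumed room temperature, K
per_bit = k_boltzmann * temp * math.log(2)

bits_output = 1e9            # e.g. reporting a billion-digit result
total = per_bit * bits_output

print(per_bit)   # ~2.87e-21 J per bit
print(total)     # ~2.87e-12 J for a billion bits -- utterly negligible
```

Which is the parent's point: the unavoidable cost scales with the output size and temperature, not with the amount of computation done reversibly in between.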
Photonic chips? (Score:5, Insightful)
Re:Photonic chips? (Score:5, Informative)
Minimum Energy Requirements of Information Transfer and Computing [aeiveos.com]
Re:Photonic chips? (Score:1)
Re:Photonic chips? (Score:2)
Heat is caused by friction, not electron energy state transitions! There is no energy "loss" as heat in electron state transitions.
Re:Photonic chips? (Score:2)
Re:Photonic chips? (Score:3, Insightful)
Re:Photonic chips? (Score:2, Informative)
Re:Photonic chips? (Score:2)
Think a little bit: photons do not interact directly, which means you need some matter to create interactions, and photon-matter interactions will definitely generate heat, possibly a lot of heat, as many useful interactions are "second order" effects, i.e. the change in transparency of the matter is a 'byproduct', which means the light must be very intense to induce the
Not Cooling (Score:5, Interesting)
diamond cooling (Score:3, Informative)
Diamond is about five times better at conducting heat than copper and could thus be used for passive cooling.
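The "five times" figure can be sanity-checked with Fourier's law for steady-state conduction through a slab, Q = k*A*dT/L. The conductivities below are rough textbook values (copper ~400 W/m-K, CVD diamond ~2000 W/m-K), and the geometry is a made-up heat-spreader example:

```python
# Fourier's law: heat flow through a slab of conductivity k,
# cross-section area A, temperature drop dT, and thickness L.
def heat_flow_w(k, area_m2, delta_t_k, thickness_m):
    return k * area_m2 * delta_t_k / thickness_m

area = 1e-4        # 1 cm^2 contact area with the die
dt = 40.0          # 40 K drop across the spreader
thickness = 1e-3   # 1 mm slab

copper = heat_flow_w(400.0, area, dt, thickness)
diamond = heat_flow_w(2000.0, area, dt, thickness)
print(copper, diamond, diamond / copper)  # diamond moves ~5x the heat
```

Same geometry, same temperature drop, roughly five times the heat moved, which is what makes diamond attractive as a passive heat spreader.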
Re:diamond cooling (Score:1)
Re:diamond cooling (Score:2)
Re:diamond cooling (Score:3, Interesting)
A 1 gram mass of loosely packed tobacco is wrapped into a paper sleeve
Re:diamond cooling (Score:1)
Re:diamond cooling (Score:1)
fyi silicone is in breasts. silicon is in chips.
Re:diamond cooling (Score:1)
Re:diamond cooling (Score:2, Informative)
see here [geek.com] for more info.
(This was reported extensively at the time)
Re:diamond cooling (Score:5, Informative)
Diamond's superior thermal, optical, and chemical-resistance properties make it attractive for future microprocessors... but unfortunately it is more difficult to make it work as a semiconductor, which is why silicon has always been the substrate of choice.
It's very interesting research, and we'll see where it goes. For more info, this C&E News article is good, [acs.org] or check here, [ferret.com.au] or here [geek.com] and there's a bit here. [wikipedia.org]
Re:diamond cooling (Score:2)
Not to mention that sand is cheap, while DeBeers has been artificially raising the price of diamonds for ages, and diamonds have usually been expensive and/or difficult to manufacture.
Re:diamond cooling (Score:2)
Most real proposals for using diamond in microprocessors suggest using synthetic diamond, not natural diamond. You can use CVD (chemical vapor deposition) to make good quality artificial diamonds. Currently, growing CVD-diamond is expensive, but then again, taking sand and purifying it into a huge cylinder of single-crystal silicon is also not cheap. If synthetic diamond research continues, it could prove to be competitive with Si.
The cost of DeBeers
Re:diamond cooling (Score:3, Funny)
Re:diamond cooling (Score:1)
If you tried to do that, DeBeers would Jimmy Hoffa you faster than the oil companies did that guy who invented the 150 mpg engine.
1kW?! (Score:3, Insightful)
I don't think that 1kW processors will be practical. Nobody is going to want to pay to run that, and nobody will want a heater running in their room all the time either.
I'd say that they should be looking to limit it to not much more than current figures (100W) - maybe 200W if we are generous. After that it gets silly.
Re:1kW?! (Score:1)
Re:1kW?! (Score:2)
100W * 5c/kWh -> ~$45/year to power it (yeah, low power prices in Canada thanks to tons of hydro :). If you raise it to 20c/kWh, you are paying about $180/year to power your 100W processor... Double that? 10x that? Not me.
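The parent's figures check out; a quick Python sketch of the same arithmetic (constant draw and flat per-kWh rates are the parent's assumptions):

```python
# Yearly electricity cost of a device drawing constant power.
def yearly_cost(watts, dollars_per_kwh):
    kwh_per_year = watts / 1000.0 * 24 * 365
    return kwh_per_year * dollars_per_kwh

print(yearly_cost(100, 0.05))   # ~$43.80/yr, roughly the quoted ~$45
print(yearly_cost(100, 0.20))   # ~$175.20/yr, roughly the quoted ~$180
print(yearly_cost(1000, 0.20))  # a hypothetical 1 kW chip: ~$1752/yr
```

At 20c/kWh a 1 kW processor would cost more per year than most people spend on the whole computer.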
Re:1kW?! (Score:4, Informative)
Current chips generate about 50-100 watts of heat per square centimeter.
"But in the future, say 15 to 20 years from now, the heat generation will likely be much more than that, especially in so-called hot spots, where several kilowatts of heat per square centimeter may be generated over very small regions of the chip..."
Let's not confuse power with power density. When the article says "10 times the heat" they mean kW/cm^2, not kW. Chips of the future will generate a few kW/cm^2 of heat in their hottest spots, but they will still be supplied from conventional 200W power supplies that run off of normal 120V power lines. It's the dissipation of so much heat in such a small area that is the issue, not the raw amount of energy being consumed.
So, again, it's not that the processor will draw 1 kW of power (it may draw considerably less), but rather that its hottest spots will need to dissipate ~1 kW/cm^2 (i.e., 1000 joules of heat per second per square centimeter).
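The power vs. power-density distinction is easy to see with numbers. All the figures below are illustrative guesses, not from the article: a 200 W chip whose worst hot spot hits 1 kW/cm^2 over a tiny fraction of the die.

```python
# Power density vs. total power: a kW/cm^2 hot spot need not mean a kW chip.
total_power_w = 200.0            # what the power supply must deliver
die_area_cm2 = 1.0               # assumed die size
hotspot_area_cm2 = 0.01          # small region of the die
hotspot_flux_w_per_cm2 = 1000.0  # 1 kW/cm^2 at the hot spot

hotspot_power_w = hotspot_flux_w_per_cm2 * hotspot_area_cm2
avg_flux_w_per_cm2 = total_power_w / die_area_cm2

print(hotspot_power_w)      # only ~10 W total in the hot spot
print(avg_flux_w_per_cm2)   # ~200 W/cm^2 averaged over the die
```

A 1 kW/cm^2 hot spot over 0.01 cm^2 is only 10 W of actual power; the hard part is getting those 10 W out of such a small area, not supplying them.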
Re:1kW?! (Score:2)
Breeze (Score:5, Funny)
Amazing, Purdue is developing the same technology used in such high tech devices as the Ionic Breeze air purifier. [as-seen-on...tore-1.com]
Re:Breeze (Score:1)
Re: (Score:1)
Re:Breeze (Score:2)
Re:Breeze (Score:2)
Wrong. It does smell good, though.
Hot and bothered! (Score:4, Interesting)
I used to want the fastest computer around. But a few things have changed I guess.
First of all computers are starting to be fast enough for most needs.
Secondly, the way I use computers has changed with always on Internet. I never turn my computer off because I want to be able to quickly look something up on the web.
I also have a server that is running 24/7. Most of the time it is idling, but even when it is working I don't need it to be a speed demon.
So it is starting to be really important for me that a computer doesn't use a lot of power. I don't know if it affects my electric bill in a noticeable way, but it feels wrong.
Re:Hot and bothered! (Score:3, Interesting)
well, a quick google says it's about five cents per kWh... assume your server spins down the disk drives when idling and your monitor turns off when not in use; you're probably averaging 200 watts. That comes out to about $6.72/month in electricity, or roughly $80 per year.
If you're looking for power savings, an old laptop with
what about parallel (Score:1)
obviously such a mainframe can use massively parallel processing techniques, where cooling is less of an issue.
Re:what about parallel (Score:1)
For editing media - maybe...
For playing games - NO
What other high-performance jobs are PCs supposed to perform? Hi-speed decompression of tar-balls?
Re:what about parallel (Score:1)
Re:what about parallel (Score:1)
Screw this (Score:3, Funny)
Alliances (Score:3, Informative)
But can you make a cluster of them...? (Score:4, Insightful)
The future is multi-core / multi-CPU boards where scaling comes from adding more pieces, not making them individually faster.
Yes, chips will always get faster and hopefully cooler, but it's no longer the key to performance.
Re:But can you make a cluster of them...? (Score:1, Informative)
Re:But can you make a cluster of them...? (Score:2)
Cooling my case is much easier: I have a 120mm fan in my power supply and a couple of 80mm fans elsewhere, all spinning at 800rpm, making substantially less noise.
Keep in mind that 100% of the CPU's output, plus the heat from all the other components is dumped into the case, and from there my case fans dump the heat into my office.
Spreading out the heat from one single core vs multiple cores and make the cooling pro
Re:But can you make a cluster of them...? (Score:2)
Re:But can you make a cluster of them...? (Score:2)
Granted, since we seem to be reaching a point where it is prohibitively expensive to get faster, we will get broader, and work on further parallelization. As an AMD fanboy I am compelled to point out that AMD figured this out sooner rather than later and produced a product superior to in
hardware DRM (Score:2, Interesting)
Re:hardware DRM (Score:1)
Do something about the noise first. (Score:2)
An efficient and inexpensive cooling solution would be more desirable, IMHO.
Has anyone else experienced the "jet engine" noise coming from newer systems?
I guess if you make a chip that needs less cooling, we'll solve that puzzle too; however, that may be the more expensive road to fixing fan noise, no?
Re:Do something about the noise first. (Score:1)
Heavy research goes into silencing aircraft turbines, and it's not as easy as one might think. I guess it's the same with the fans cooling your CPU. Unless you want to pay $100 per fan with special widgets to reduce the noise, you won't get very silent fans.
IMHO it's much easier to reduce the overall heat output of a system than to develop silent fans. As a plus, less heat means less power consumption, and nobody wants to pay those bills.
Re:Do something about the noise first. (Score:2)
It would be great if they made a chip that would only need passive cooling instead of using any fans.
Re:Do something about the noise first. (Score:1)
The 24-port switch I picked up recently easily drowns it out.
Re:Do something about the noise first. (Score:2)
The other thing that is annoying about small fans is the lack of supply.
heat has already been MOBO issue (Score:4, Interesting)
First it was CPUs with cooling and big/slow/no fans and big heatsinks, then PSUs and GPUs, and now MOBOs. My current custom box (now 14 months old) was built to be silent, and I had a hard time settling on a motherboard that was state of the art, stable, and still used a passive heatsink to cool the board chipset fan-free. I finally settled on an Asus P4P800.
I can definitely believe heat becoming even more of an issue. For those of us who want power/performance and quiet at the same time, this will become even more of a challenge as time goes on. I for one hope not to rely on expensive and/or complicated cooling devices like Peltier units, water pumps and the like. I hope the focus is on efficient chips that only clock up/power up as they need to, like the Pentium M.
my 2 cents.
Re: (Score:1)
10 times more heat? (Score:3, Funny)
If I take off my CPU cooler it reaches about 100°C. Now let's see: 100 x 10 = 1000°C in only 15 years of the chip industry. If we manage to put this heat to work, let's say we could have 'PC + hairdryer' packages, 'PC + free home-heating' winter offers, or even 'PC + burn-a-pizza' boxes. Think about it, it's only good news.
Funny, -1
Re:10 times more heat? (Score:2)
100C = 373K
10x373K = 3730K = 3457C
Of course mine aren't any less wrong
Comment removed (Score:3, Funny)
Re:Let me get this straight (Score:1)
Re:Let me get this straight (Score:1)
Why is heat reclamation not worth it? (Score:2, Interesting)
I also have some vague handwaving idea that there are processes for generating electricity that have to do with harnessing temperature different
Re:Why is heat reclamation not worth it? (Score:2)
Google on "Thermodynamics, laws of" while you're about it.
Re:Why is heat reclamation not worth it? (Score:1)
The most efficient heat transfer can be achieved by convection, using fluids with a high heat capacity (such as water) and moving the now-hot fluid to the power generator. The required devices would be at least the size of your desktop, and they would be expensive too.
And, as with all thermal and mechanical processes, they are not 100% efficient (2nd law of thermodynamics), nor on the CPU side
Re:Why is heat reclamation not worth it? (Score:3, Informative)
AFAIK, it really is an engineering issue. Converting a temperature gradient to electricity works great when you have huge temperature gradients (like in nuclear reactors, coal plants, steam engine, etc.), but is not so useful in a computer tower. Firstly, the whole point of putting fins on a chip is to spread the heat out quickly, s
Re:Why is heat reclamation not worth it? (Score:3, Informative)
yeah, but if it's over a few nanometers it's pretty big. If we built a generator at that scale it might be worthwhile...
"Another fun-fact is that it takes about ~7 years of using a solar-panel before the energy savings offset the production cost."
Where do you get this from? I keep seeing that argument over and over again, but I can't seem to find any data to back it up.
A little googling, found this:
http://www.thecomm [thecomma.co.uk]
Re:Why is heat reclamation not worth it? (Score:2)
The gradient isn't over a few nanometers. The chip has nano-sized components, but overall it is basically a 10mm X 10mm slice of metal that is getting hot. It will try to equilibrate with its surroundings, and the gradient in temperature near it is really not that substantial.
"Another fun-fact is that it takes about ~7 years of using a solar-panel before the energy savings offset t
Re:Why is heat reclamation not worth it? (Score:4, Informative)
eta = (Thot - Tcold)/Thot.
using absolute temperatures (Kelvin or Rankine)
So assuming the limit is Thot = 60C = 333 K and Tcold = 25C (average room temp) = 298 K, the maximum efficiency would be about 10%. Assuming further that 100W is lost by the chip alone, only about 10W would be potentially recoverable. Unfortunately it gets worse: the Carnot cycle is theoretical, and no real Carnot engine can be built. There are some very efficient cycles available (Stirling and Rankine come to mind), but none can exceed the Carnot efficiency.
It also gets worse as you make the engine smaller. Consider the tolerance of pistons or turbines. Suppose you must leave 1mm of gap between surfaces. For large engines this is no problem, but as the machines become smaller, the minimum gap becomes a greater percentage of the total area.
Machines to extract energy from such a small source at such a low temperature difference have significant theoretical inefficiencies before you even get to the practical ones. This does not mean that you can't recover any of the "wasted heat" but only that you've pretty much gotten all the useful work out of it that you can and recovering the rest would be very impractical.
Have you ever eaten a lobster? Did you suck the meat from the legs?
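The Carnot bound in the parent comment is easy to check numerically, using its own assumed temperatures (a 60C chip surface and a 25C room):

```python
# Carnot efficiency: eta = (Thot - Tcold) / Thot, in absolute temperature.
def carnot_efficiency(t_hot_k, t_cold_k):
    return (t_hot_k - t_cold_k) / t_hot_k

t_hot = 60.0 + 273.0    # 333 K chip surface
t_cold = 25.0 + 273.0   # 298 K room

eta = carnot_efficiency(t_hot, t_cold)
print(eta)          # ~0.105, i.e. about 10%
print(100.0 * eta)  # of 100 W dissipated, at most ~10.5 W recoverable
```

And that 10% is the thermodynamic ceiling before any of the practical losses (tiny machinery, friction, imperfect cycles) are even considered.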
Re:Why is heat reclamation not worth it? (Score:2)
In the wintertime, running a server farm in your office 24/7 might generate enough excess heat to make a noticeable dent in your heating needs, but even so, unless you use resistance heating or burn a very expensive fuel to heat your office, it is probably cheaper to use a heat pump or other device for building heat.
In the summer however, you get hit with a double whammy. First, you are paying
It's getting hot in here... (Score:1)
Energy (Score:1)
w00t (Score:2, Funny)
Mini Lightning next to the CPU??? (Score:2)
Using lightning to cool a CPU?
Doesn't EMF pose a problem here?
Guess you could shield, but thats counter productive isn't it?
Re:Mini Lightning next to the CPU??? (Score:2)
Thinking more along the lines of electric and magnetic fields (EMF).
Shielding [fms-corp.com]
The lightning will induce electrical currents and interference in everything around it.
Here is a good chuckle:
Home project [asilo.com]
(The Windows95 screen shot)
Interesting sidebar:
A new electric producer [theverylas...ternet.com]
not exactly (Score:2)
This is bullshit. I would never even consider buying a >>100W CPU for my desktop, certainly not 1000W.
I'd rather see fewer fans in my machine, not more.
Looking at heat per area is more reasonable, as die area will keep decreasing for a while yet.
Missing an option? (Score:3, Interesting)
Or perhaps I'm grossly physics-impaired.
Re:Missing an option? (Score:2)
Re:Missing an option? (Score:1)
Re:Missing an option? (Score:2)
Thanks for the correction re: radiative heat dissipation vs. conduction.
Re:Missing an option? (Score:2)
Don't most materials become less conductive as the temperature increases, though? Thus requiring greater voltage and generating more heat?
Is this a feature of x86? (Score:1)
optical chips are the answer!!!!! (Score:1)
Patents with funding money. (Score:2)
"The patents arose from a research project funded in part by the National Science Foundation."
The idea of the NSF funding (in part) research that later leads to mechanical engineers getting the patent would be a great way to make money at the expense of others.
Should not the patent rights be shared among those who funded the project?
Re:Patents with funding money. (Score:1)
A trade secret, on the other hand, need never be released.
Re:Patents with funding money. (Score:2)
"Should not the patent rights be shared among those who funded the project?"
The people filing the patents are rarely the owners of the patent rights.
Various solutions (Score:3, Insightful)
This would let you get all the benefits of existing tried-and-tested cooling methods, but would eliminate the bugbears of the chip's casing being an insulator and the possibility of condensation screwing everything up.
A variant on this would be to have the chip stand upright, so that you could have a cooling system on both sides. The pins would need to be on the sides of the chip, then, not on the base.
A second option would be to look at where the heat is coming from. A lot of heat is going to be produced through resistance and the bulk of chips still use aluminum (which has a relatively high resistance) for the interconnects. Copper interconnects would run cooler, and (if anyone can figure out how to do it) silver would be best of all.
A third option is to look at the layout of the chips. I'm not sure exactly how memory chips are organized, but it would seem that the more interleaving you have, the lower the concentration of heat at any given point, so the cooler the chip will run. Similarly for processors, it would seem that the more spaced out a set of identical processing elements are, the better.
A fourth option is to double the width of the inputs to the chips (e.g. you'd be looking at 128-bit processors) and to allow instructions to work on vectors or matrices. The idea here is that some of the problem is in the overhead of fetching and farming out the work. If you reduce the overhead by transferring work in bulk, you should reduce the heat generated.
Re:Various solutions (Score:2)
Human Brains (Score:1)
Re:Human Brains (Score:2)
RE: Human Brains (Score:1)
You never use 100% of your brain at once; the usage depends on how much you need. Thus, your head never overheats.
Re: (Score:1)
its good to get off to an early start (Score:2)
Yah Greehouse Gases (Score:2)
How about... (Score:2)
Finding better ways to pull excess energy off a chip is all very well and good, but it might be better to reduce the energy produced by the chip in the first place.
Every time a 1 is set to a zero, why not feed that charge into a bank of capacitors rather than the current solution (which I believe is to sink it to ground, thus producing heat)?
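The energy the parent wants to recover is the node switching energy: each time a node capacitance C charged to V is driven to ground, about C*V^2/2 is dumped as heat (this is the idea behind adiabatic / charge-recovery logic). The figures below are illustrative guesses, not measured chip data:

```python
# Dynamic switching energy: E = 0.5 * C * V^2 per discharge event.
c_node = 1e-15     # assumed node capacitance, 1 fF
v_dd = 1.2         # assumed supply voltage, volts
f_switch = 2e9     # assumed toggle rate, 2 GHz
n_nodes = 1e8      # assumed nodes switching per cycle (activity-weighted)

e_per_switch = 0.5 * c_node * v_dd ** 2
power = e_per_switch * f_switch * n_nodes

print(e_per_switch)  # ~7.2e-16 J per transition
print(power)         # ~144 W of dynamic power from switching alone
```

Tiny per event, but multiplied across a hundred million nodes at gigahertz rates it accounts for most of a modern chip's heat, which is why recovering even part of it is attractive.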
Re:A strange question, but... (Score:3, Interesting)
Compare the typical light bulb with the typical wire running through your house: the light bulb gets hot because of its thin wire.
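For a fixed current, resistive dissipation is P = I^2 * R, and a thin filament has vastly more resistance than thick house wiring. The resistance values below are illustrative guesses (a hot 60 W filament on a ~120 V circuit works out to roughly 240 ohms):

```python
# Resistive (Joule) heating: P = I^2 * R for a given current.
def power_w(current_a, resistance_ohm):
    return current_a ** 2 * resistance_ohm

current = 0.5        # amps through the series circuit
r_house_wire = 0.05  # a run of thick copper wire (guess)
r_filament = 240.0   # hot incandescent filament (guess)

print(power_w(current, r_house_wire))  # ~0.0125 W -- the wire stays cool
print(power_w(current, r_filament))    # ~60 W -- the filament glows
```

Same current everywhere in the circuit, but nearly all the heat appears where the resistance is, i.e. in the thin wire.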
Re:A strange question, but... (Score:1)
Re:Expect to see Asynchronous Processors instead (Score:3, Insightful)
Where did that "general rule" come from? It's nonsense.