
New Solution For Your Transistor BBQ 191

Posted by timothy
from the but-the-ribs-aren't-done-yet dept.
servantsoldier writes "There's a new solution for the transistor heat problem: Make them out of charcoal... The AP is reporting that Japanese researchers, led by Daisuke Nakamura of Toyota Central R&D Laboratories Inc., have discovered a way to use silicon carbide instead of silicon in the creation of transistor wafers. They build the silicon carbide wafers using a multi-step process in which the crystals are grown in several stages, minimizing defects. Other benefits are decreased weight and a more rugged material. The researchers say that currently only a 3" wafer has been produced and that a marketable product is at least six years away."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Charcoal? (Score:5, Insightful)

    by mikeophile (647318) on Thursday August 26, 2004 @02:34AM (#10075925)
    Think knife-sharpener.

    Silicon carbide is really hard stuff.

    It's not quite diamond, but with a hardness of 9.25, you could use your SiC processor to grind real axes and not just figurative ones in flamewars.
  • by harlemjoe (304815) on Thursday August 26, 2004 @02:39AM (#10075946)
    From the article....
    In an advance that could lead to lighter spacecraft and smarter cars, researchers have developed a new technique for producing a high-quality computer chip that is much more resistant to extreme conditions than the silicon found in most of today's electronics.

    So a chip more resistant to extreme conditions is also somehow 'lighter' and 'smarter'...

    A good step forward for science, but not for science journalism...

  • Re:Wow (Score:2, Insightful)

    by Stripe7 (571267) on Thursday August 26, 2004 @02:52AM (#10075985)
    Developments like this in Japan and other countries tell me the US is not losing its technological edge: it has already lost it. Japan patents brand-new tech like this, while in the US we patent SUDO and 1-click shopping.
  • Re:Wow (Score:2, Insightful)

    by Donny Smith (567043) on Thursday August 26, 2004 @03:55AM (#10076121)
    Before, Taiwan (or Japan) would do just fine by making the same thing cheaper, now that doesn't quite cut it any more.

    Necessity ... I'd say that overall, the ability to innovate is inversely proportional to the well-being of individuals.

    Money-hungry folks from India and China should out-innovate equally smart people from other countries, just because they're trying harder.

    In some ways, I think social injustice is perhaps a motivating factor, too - unless you come up with something new, it's hard to make it to the top by hard work alone.
  • Re:a good idea? (Score:3, Insightful)

    by jimicus (737525) on Thursday August 26, 2004 @04:18AM (#10076172)
    No you won't. Can you imagine Compaq, Dell or IBM voluntarily producing a PC which never wears out?
  • Re:In Japan... (Score:1, Insightful)

    by Anonymous Coward on Thursday August 26, 2004 @04:38AM (#10076204)
    Yes [wikipedia.org].
  • Re:a good idea? (Score:3, Insightful)

    by Y2K is bogus (7647) on Thursday August 26, 2004 @04:45AM (#10076217)
    So the major PC makers wouldn't want to make products that never fail and never require spare parts, except due to catastrophe?

    Producing spares isn't their primary focus, and every RMA for stupid broken stuff is costly. A laptop that exceeds the 3-year warranty without breaking would be music to their ears, and to consumers'.

    Your logic is flawed. It isn't "wearing out" that makes people buy new computers; it's the fact that they're too slow or old. Most computers end up surplused: just check the HUGE secondary market that feeds many multi-million-dollar surplus businesses. A handful of long-time surplus shops in Silicon Valley have built their entire history on the computing industry around here.
  • U lot (Score:3, Insightful)

    by Anonymous Coward on Thursday August 26, 2004 @05:52AM (#10076379)
    Ha, you lot, you think this will be used for CPUs.

    It won't. Silicon/germanium is the fastest you can get at the mo (until they can dope diamond).

    SiC will be used in hi-temp areas (e.g. aircraft engines) or where they want it to run hotter to up the current handling (i.e. power electronics).

    At the mo I am limited to 800A at 1200V for an IGBT, and that is 8 IGBT dies in parallel; each die is limited to 100A at 125C.

    When I get SiC IGBTs I will be able to pass 800A through a single die and let the die heat up to 300C.

    This will mean that expensive heavy heatsinks will be able to shrink.

    SiC will NOT be used for high-speed CPUs!!!
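
    The parallel-die arithmetic above can be roughed out in a quick sketch (the figures are the poster's claims, not datasheet values):

```python
# Current handling as quoted in the comment above: a silicon IGBT module
# reaches 800 A at 1200 V only by paralleling dies, because each die is
# capped at 100 A with its junction held to 125 C.
si_die_limit_a = 100        # per-die limit at a 125 C junction
dies_in_parallel = 8
si_module_a = si_die_limit_a * dies_in_parallel   # the 800 A module

# A SiC die tolerating ~300 C junctions could, per the poster, carry the
# whole module current alone, so the heatsink that must pin the junction
# temperature can shrink accordingly.
sic_die_limit_a = 800       # claimed single-die capability
sic_dies_needed = -(-si_module_a // sic_die_limit_a)  # ceiling division

print(si_module_a, sic_dies_needed)  # → 800 1
```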
  • by Moraelin (679338) on Thursday August 26, 2004 @08:50AM (#10076952) Journal
    Yes, silicon carbide and water cooling will get the heat out of the CPU faster.

    The problem still remains that a metric buttload of heat is produced, and that it comes out of the electricity bill. Sometimes twice: in the summer you also pay for the air conditioning, since that shiny new CPU is heating the room some more.

    I think it's getting ludicrous.

    The Prescott is already over 100 W, and Intel apparently plans dual-core versions. Whoopee for 200+ W CPUs. NVidia 6800 Ultras are rated for 120 W, and they're hyping SLI setups now. Yep, _two_ graphics cards, as if 120W worth of hot air blowing off the back of the case wasn't enough.

    Add hard drives, the motherboard, and the PSU's own inefficiency, and you're already looking at 1000W worth of heat for the whole computer. That's already like a space heater.

    In fact, go ahead and turn a space heater on near your desk in the summer, and you've got a pretty good approximation of what the next generation of computers promises to be like. Now picture some 4 of them in the same room, at the office.

    And it's rising exponentially. Carbide and water cooling will only help them get further along that curve.

    And I'll be damned if I'm thrilled at the prospect.

    This also brings the problem of even more fans. Even with water cooling, you then have to get the heat out of the water. It still means fans. More heat will just mean more fans, bigger fans, or faster fans. Or all the above.

    And I'm not thrilled at the prospect of the return of the noisy computer either. I can jolly well do without the machine sounding like a jumbo jet. Especially when I'm watching a DVD or such, I can do without having to turn the volume sky high just to be able to hear what they're saying. And at the office I can do without four noisy hovercrafts in the same room.
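
    The parent's tally can be added up in a quick back-of-envelope sketch (the component wattages are the poster's figures; the PSU efficiency is my assumption):

```python
# Rough sum of the loads named in the comment above.
loads_w = {
    "dual-core Prescott-class CPU": 2 * 100,  # "200+ W CPUs"
    "two graphics cards in SLI": 2 * 120,     # 120 W each
    "drives, motherboard, fans": 100,         # rough allowance
}
dc_load_w = sum(loads_w.values())         # heat dumped inside the case
psu_efficiency = 0.70                     # assumed for a 2004-era PSU
wall_draw_w = dc_load_w / psu_efficiency  # what the meter actually sees

print(f"DC load ~{dc_load_w} W, wall draw ~{wall_draw_w:.0f} W")
```

    Even with these rough numbers the wall draw lands within sight of the kilowatt the comment mentions, and essentially all of it ends up as heat in the room.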
  • by Anonymous Coward on Thursday August 26, 2004 @07:18PM (#10083433)
    This also brings the problem of even more fans. Even with water cooling, you then have to get the heat out of the water. It still means fans. More heat will just mean more fans, bigger fans, or faster fans. Or all the above.

    Actually, maybe not. Remember the difference between "heat" and "temperature".

    For example, heat transfer away from the CPU by radiation is very small, but it is proportional to the CPU temperature raised to the 4th power.

    Also, passive convection removes heat at a greater rate when the temperature difference between an object and its surroundings is greater. When a CPU isn't much warmer than its surroundings, you've got to blow on it with a fan to keep its equilibrium temp below its operating threshold.

    What this adds up to is the idea that if you have a processor that can survive and operate at a few hundred degrees C, then it gets easier to keep cool. Does that make sense?
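
    The 4th-power point can be made concrete with a small sketch of the Stefan-Boltzmann law, P = εσAT⁴ (the die area, emissivity, and temperatures here are illustrative assumptions, not measured chip values):

```python
# Net radiative heat loss from a hypothetical 1 cm^2 die to 25 C
# surroundings, at a conventional and a SiC-class junction temperature.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 * K^4)
AREA = 1e-4        # 1 cm^2 of radiating surface, in m^2 (assumed)
EMISSIVITY = 0.9   # assumed

def radiated_watts(temp_c: float) -> float:
    """Total power radiated by the surface at the given temperature."""
    t_kelvin = temp_c + 273.15
    return EMISSIVITY * SIGMA * AREA * t_kelvin ** 4

ambient_w = radiated_watts(25.0)
for die_c in (70.0, 300.0):
    net_w = radiated_watts(die_c) - ambient_w
    print(f"{die_c:5.0f} C die: ~{net_w:.3f} W radiated net")
```

    At a few hundred degrees both the radiative term and the convective temperature difference grow, so a hotter-running SiC die sheds the same number of watts with far less forced airflow.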
