
New Solution For Your Transistor BBQ

servantsoldier writes "There's a new solution for the transistor heat problem: Make them out of charcoal... The AP is reporting that Japanese researchers, led by Daisuke Nakamura of Toyota Central R&D Laboratories Inc., have discovered a way to use silicon carbide instead of silicon in the creation of transistor wafers. The Japanese researchers discovered that they can build silicon carbide wafers by using a multiple-step process in which the crystals are grown in several stages. As a result, defects are minimized. Other benefits are decreased weight and a more rugged material. The researchers say that currently only a 3" wafer has been produced and that a marketable product is at least six years away."


  • by Baka_kun ( 647710 ) on Thursday August 26, 2004 @02:31AM (#10075910) Journal
    the text said "... that Japanese researchers, led by Daisuke Nakamura of Toyota Central R&D Laboratories Inc., ..."

    but i read "...that Japanese researchers, led by Duke Nukem of Toyota Central R&D Laboratories Inc., ..."

    other than this, great. If this works in practice we'll be having new smaller CPUs for everything.

    but I'm still waiting for a PDA without a screen, that uses my glasses as a screen... but that's more sci-fi than reality.
  • by chatgris ( 735079 ) on Thursday August 26, 2004 @02:32AM (#10075913) Homepage
    This may be modded as funny.. But realistically, think about this.

    The amount of heat being generated by chips does not seem to be decreasing at all, and this material appears to be designed to be "heat resistant" rather than more efficient.

    How long until your PC puts out enough heat that it would be economical to re-use that heat for a hot water tank, or for winter heating?

    How long until we need special 240V plugs like electric stoves have for power?

    I think that emphasis on more efficient chips is a better venture than heat-resistant materials, as the heat byproduct of CPUs seems to be spiralling out of control.


    • by Anonymous Coward
      Most countries have 240v to begin with.
    • You can do that right now. My Athlon 1800+ keeps my room nicely warm in winter, and it's a (relatively) low consumption chip.
    • A couple of those, connected via heatpipe to a hotplate at the top of the case, would make an excellent hot-plate for a coffee or tea pot =)

      As for the plugs - well, there's some way to go yet. At the moment, power supplies are on the order of 500-600 W. An electric heater can put out up to 3000 or so watts.

      I used to run a constantly-on heater, two PCs, three monitors, some random home networking equipment and a desk lamp all off a series of four-way power bars connected through a single 13A 230V UK plug. The
      • Oh, before anyone tries the stage-lighting thing: it *worked*, but the plug got pretty hot and eventually the circuit breakers tripped. The problem was solved by splitting the load over two plugs on opposite sides of the stage =P

        Ah, the days of helping out with school stage tech. I still don't think the music dept. has forgiven me for blowing up two of their (old, crappy, faulty-but-not-diagnosed-until-they-failed) PA amps in one night...
      • You're lucky. With our lowly 120V supplies here, 2000 watts is about as much as you can ever expect on a single circuit (theoretically 2400 W on a 20 A circuit, but once you're pulling close to 20 A, the wires and cords themselves dissipate enough heat of their own that it adds up).

        On the other hand, I have accidentally touched live AC wires a few times (and even stuck my finger in a light socket as a kid) and had relatively minor effects from it. I'd imagine 220/240 has a bit more of a kick... :)

        - Peter
    • well, here in Australia we already use 240V for everything.

      and then we have three phase for serious stuff...

      Oh and 16amp plugs for real servers...

      hmmm well it was a nice idea.
    • by spellraiser ( 764337 ) on Thursday August 26, 2004 @03:19AM (#10076049) Journal
      Yes, but I still think water cooling [avforums.com] is the way to go, personally.
      • by Moraelin ( 679338 ) on Thursday August 26, 2004 @08:50AM (#10076952) Journal
        Yes, silicon carbide and water cooling will get the heat out of the CPU faster.

        The problem still remains that a metric buttload of heat is produced, and that it comes out of the electricity bill. Sometimes twice: in the summer you also pay for the air conditioning, since that shiny new CPU is heating the room some more.

        I think it's getting ludicrous.

        The Prescott is already over 100 W, and Intel apparently plans dual-core versions. Whoopee for 200+ W CPUs. NVidia 6800 Ultras are rated for 120 W, and they're hyping SLI setups now. Yep, _two_ graphics cards, if just 120 W worth of hot air blowing off the back of the case wasn't enough.

        Add hard drives, motherboard, and the PSU's own inefficiency, and you're already looking at 1000 W worth of heat for the whole computer. That's already like a space heater.

        In fact, go ahead and turn a space heater on near your desk in the summer, and you've got a pretty good approximation of what the next generation of computers promises to be like. Now picture some 4 of them in the same room, at the office.

        And it's rising exponentially. Carbide and water cooling will only help them get further along that curve.

        And I'll be damned if I'm thrilled at the prospect.

        This also brings the problem of even more fans. Even with water cooling, you then have to get the heat out of the water. It still means fans. More heat will just mean more fans, bigger fans, or faster fans. Or all the above.

        And I'm not thrilled at the prospect of the return of the noisy computer either. I can jolly well do without the machine sounding like a jumbo jet. Especially when I'm watching a DVD or such, I can do without having to turn the volume sky high just to be able to hear what they're saying. And at the office I can do without four noisy hovercrafts in the same room.
    • The amount of heat being generated by chips does not seem to be decreasing at all ...

      I disagree. I've just upgraded an Athlon XP 1800+ system to an Athlon64 3500+.
      The new box runs around 20 degrees C cooler than the old one at idle and under heavy load; both use the supplied retail AMD heatsinks. I'm not using "Cool 'n Quiet" on the '64; it might take a bit off the idle temperature, but I don't see the point.
    • by Anonymous Coward
      from the article:

      Devices built with the rugged material would not require cooling and other protections that add size, weight and cost to traditional silicon electronics in power systems, jet engines, rockets, wireless transmitters and other equipment exposed to harsh environments.

      So you see, besides that it is nearly as hard as diamond and can survive the temperatures of re-entry into the Earth's atmosphere, they want to use it to replace silicon electronics that are used in more stressful environments. A
    • This isn't exactly answering your post, but don't forget that there are other uses for silicon than processors. Think industrial power switching, high power drives.


    • All we need is some way to convert heat directly into electricity.... dream on I guess.
      • They do; they're called thermocouples, and they work on the Seebeck effect (the Peltier effect is its reverse).
        Take two different wires, twist them together into two junctions, break one wire and put in a meter; then heat one junction, cool the other, and electrical current flows. A Peltier cooler works the other way round: adding current causes one junction to warm and the other to cool.

        You should be able to take a Peltier cooler, heat one side and cool the other, and get some electricity out of it. I imagine the efficiency is pathetic, but it's just waste heat anyway.
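A rough sketch of the numbers involved, assuming a type-K thermocouple's Seebeck coefficient of about 41 µV/K (a real Peltier module run as a generator behaves differently, and the pathetic-efficiency caveat stands):

```python
# Back-of-the-envelope Seebeck voltage from waste heat.
# Assumes a type-K thermocouple coefficient (~41 uV/K); a real
# Peltier module used as a generator has different characteristics.
SEEBECK_V_PER_K = 41e-6  # volts per kelvin, type-K (chromel/alumel)

def seebeck_voltage(t_hot_c, t_cold_c):
    """Open-circuit voltage of one junction pair, in volts."""
    return SEEBECK_V_PER_K * (t_hot_c - t_cold_c)

# Heatsink at 60 C against room air at 20 C:
print(f"{seebeck_voltage(60, 20) * 1000:.2f} mV per junction pair")  # 1.64 mV
```

Millivolts per junction pair: hence real thermoelectric generators stack many junctions in series.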
    • The amount of heat being generated by chips does not seem to be decreasing at all, and this material appears to be produced to be "heat resistant" instead of more efficient.

      Heat resistance isn't the point -- current ICs don't melt, they get trashed via diffusion processes that will still be there in SiC.

      The advantage of SiC is substantially enhanced (2x) thermal conductivity vs. Si. This makes it easier to get heat out of the chip, allowing it to run cooler at any given heat production rate.
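The thermal-conductivity point can be put in rough numbers with Fourier's law. The geometry below (1 cm² die, 0.5 mm thick, 100 W) is an illustrative assumption, and SiC is simply taken as 2× silicon per the claim above:

```python
# Temperature drop across a die via Fourier's law: dT = P * d / (k * A).
# Geometry is an assumed 1 cm^2 die, 0.5 mm thick, dissipating 100 W.
def temp_drop(power_w, k_w_per_mk, thickness_m=0.5e-3, area_m2=1e-4):
    """Steady-state temperature difference across the slab, in kelvin."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

K_SI = 150.0      # W/(m K), a typical figure for bulk silicon
K_SIC = 2 * K_SI  # simply taking the "2x" claim above

print(f"Si:  {temp_drop(100, K_SI):.2f} K")   # 3.33 K
print(f"SiC: {temp_drop(100, K_SIC):.2f} K")  # 1.67 K
```

Doubling k halves the internal temperature gradient; the heatsink-to-air interface, not modeled here, usually dominates the total budget.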

    • The voltage doesn't matter; it's the wattage. So, you probably won't need more than 120V for future machines, but you may need better wiring so that more amps can be carried to it without blowing a fuse (or lighting your house on fire).
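The point above is just P = V·I for a (nominally resistive) load; the 1000 W figure is the pessimistic whole-box estimate from elsewhere in the thread:

```python
# Line current for a given load, I = P / V (resistive load assumed).
def amps(power_w, volts):
    return power_w / volts

PC_WATTS = 1000  # the pessimistic whole-box figure from upthread

print(f"{amps(PC_WATTS, 120):.1f} A at 120 V")  # 8.3 A
print(f"{amps(PC_WATTS, 240):.1f} A at 240 V")  # 4.2 A
print(f"{15 * 120} W ceiling on a 15 A, 120 V circuit")  # 1800 W
```

So a 1 kW box fits comfortably on one US circuit; it's several of them sharing a circuit (plus a monitor and a space heater) that trips breakers.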
    • Honestly, I do that as it is today! During the winter, I'm a cheap miser, and keep the rest of the house at about 50. I keep my computer in my room and always keep the door closed, and it'll reach a balmy 70 degrees just from the PC.

  • Imagine ... (Score:1, Redundant)

    by valmont ( 3573 )
    ... a lighter and more rugged beowulf cluster of those.
  • Charcoal? (Score:5, Insightful)

    by mikeophile ( 647318 ) on Thursday August 26, 2004 @02:34AM (#10075925)
    Think knife-sharpener.

    Silicon carbide is really hard stuff.

    It's not quite diamond, but with a hardness of 9.25, you could use your SiC processor to grind real axes and not just figurative ones in flamewars.
    • you're right: born from a star [http]
    • Re:Charcoal? (Score:5, Interesting)

      by DarkMan ( 32280 ) on Thursday August 26, 2004 @08:47AM (#10076915) Journal
      Not quite.

      I've got quite a bit of experience with SiC abrasives, what with the materials engineering and being a bit of a lapidary.

      First off, it's nowhere near diamond in terms of hardness. The Mohs scale is semi-arbitrary in assignment, and not even vaguely linear. On a proper hardness scale (in this case Vickers), diamond has a hardness of around 90 GPa, compared to about 25 GPa for SiC. That's the reason I've got a box full of diamond abrasives: despite the cost (about 30 times more expensive), they are much faster, and last almost indefinitely. More later on this.

      Secondly, SiC needs to be rough. If you don't believe me, try grinding a carrot into shape on a window. The glass is very much harder than the carrot, but is nearly perfectly smooth, and as such, the carrot just slides about. Compare with rubbing the carrot on something like a concrete paving slab, which grinds it much better. The relative hardnesses are wrong here, but it shows the need for surface roughness.

      As an aside, if you think that paper cuts are bad from standard office paper, then try getting one from fine SiC abrasive paper. Stiffer paper, cuts deeper, and the abrasive roughs up one side of the cut, so it takes about four times as long to heal. It's a mistake I've made exactly once.

      A processor is not a single pure material - if it was, it wouldn't do anything. They are a complex layered system, with layers of copper and SiO2. Trying to grind anything with a processor die will just succeed in scraping off all that important stuff. The hardness of SiO2 is Mohs 7, well below that of anything actually used as an abrasive for metals. (It's the same as ground glass, near enough, sometimes used for abrading wood or plastics.)

      For comparison silicon has a hardness of 12 GPa Vickers. SiC is only around twice as hard as that.

      So, no, you can't really use it as an abrasive. If you really want to be very careful, you might be able to use the edge of the die as a scraper, but you'd probably just remove the important stuff.

      That's all a moot point, however. I strongly suspect that you'll never see the actual die; it will be under a metal heat spreader. Because they can cope with higher temperatures [0], there is even less need to take the risk of mishandling breaking the die.

      And lest you think that SiC would be less likely to break than silicon, I'm afraid not. Aside from the fact that many broken Athlons are due to the top few layers of SiO2 and metal breaking, SiC is not that much tougher than silicon. As any lapidary will tell you, it's perfectly possible to chip sapphire and diamond if you're not careful.

      Still, I can't deny that, facts aside, it's a wonderfully evocative metaphor.

      [0] And how much higher! Silicon tops out at 350 C; SiC could operate at 600 C, where it is glowing red hot! Sourced from NASA [nasa.gov]
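Taking the Vickers numbers quoted in the comment above at face value, the ratios come out as:

```python
# Ratios between the Vickers hardness figures quoted above (GPa).
vickers = {"diamond": 90.0, "SiC": 25.0, "Si": 12.0}

print(f"diamond/SiC: {vickers['diamond'] / vickers['SiC']:.1f}x")  # 3.6x
print(f"SiC/Si:      {vickers['SiC'] / vickers['Si']:.1f}x")       # 2.1x
```

Which is the comment's point in miniature: SiC is only about twice as hard as silicon, while diamond is in another league entirely.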
  • The article is kind of vague on the details, for instance, just how much hotter are these semiconductors going to be able to run? Is it possible that chips made from these will have to use a non-plastic casing material? If so, that would be very cool. I doubt it though, that'd have to be pretty hot.
  • In Japan... (Score:5, Funny)

    by Johnny Fusion ( 658094 ) <zenmondoNO@SPAMgmail.com> on Thursday August 26, 2004 @02:36AM (#10075937) Homepage Journal
    The researchers say that currently only a 3" wafer has been produced

    Hirohito: Oh! You must have very big wafer!

    Owner: Excuse me?! I was just asking you what you're up to with this manufacturing process!

    Nothing! We are very simple people with very small wafer! Mr. Hosek's wafer is especially small!

    Hosek: He he he! So small!

    Hirohito: We cannot achieve much with so small wafer! But, you Americans! Wow! Wafer so big! SO BIG Wafer!

    Owner: Well, I-I guess it is a pretty good size

  • by harlemjoe ( 304815 ) on Thursday August 26, 2004 @02:39AM (#10075946)
    From the article....
    In an advance that could lead to lighter spacecraft and smarter cars, researchers have developed a new technique for producing a high-quality computer chip that is much more resistant to extreme conditions than the silicon found in most of today's electronics.

    So a chip more resistant to extreme conditions is also somehow 'lighter' and 'smarter'...

    A good step forward for science, but not for science journalism...

  • by gatesh8r ( 182908 ) on Thursday August 26, 2004 @02:45AM (#10075961)
    Gives new meaning to "burning up your CPU". Better hope the non-techies never open up their machines...
  • Your cpus will have a new use when obsolete...
  • by teamhasnoi ( 554944 ) <.teamhasnoi. .at. .yahoo.com.> on Thursday August 26, 2004 @03:02AM (#10076008) Journal
    I'll be able to use these in my flexible paper display ebook with fuel cell technology as I drive to work in my hydrogen powered flying car!

    I can't wait!

  • I'm all for being able to OC the hell outa my proc and not be worried about burning it..


    These CPUs would be far more durable and last a lot longer. Why is that a problem? Think about the last time your job/office/place of business replaced computers. You're gonna be stuck with that slow machine a whole lot longer.
    • Re:a good idea? (Score:3, Insightful)

      by jimicus ( 737525 )
      No you won't. Can you imagine Compaq, Dell or IBM voluntarily producing a PC which never wears out?
      • Re:a good idea? (Score:3, Insightful)

        by Y2K is bogus ( 7647 )
        So the major PC makers wouldn't want to make products that never fail and never require spare parts, except due to catastrophe?

        Producing spares isn't their primary focus, and every RMA for stupid broken stuff is costly. A laptop that exceeds the 3-year warranty without breaking would be music to their ears, and to consumers'.

        Your logic is flawed. It isn't "wearing out" that makes people buy new computers, it's the fact that it's too slow or old. Most computers end up surplused; just check the HUGE secondary market.
  • Next step: diamond (Score:2, Interesting)

    by CityZen ( 464761 )
    If you've got the carbon, why bother with the silicon? Actually, I wonder what they use to "dope" diamond semiconductors?

    http://www.eetimes.com/at/hpm/news/OEG20030822S0005 [eetimes.com]
  • ...or did that new supercomputer finally arrive??
  • by ArcticCelt ( 660351 ) on Thursday August 26, 2004 @03:49AM (#10076112)
    Steve Jobs when asked what's next for the iPod: [nwsource.com]

    "You know, our next big step is we want it to make toast," Jobs answered. "I want to brown my bagels when I'm listening to my music."

    Damn Steve, again, he saw this charcoal technology coming before anybody. :)
  • by Jason1729 ( 561790 ) on Thursday August 26, 2004 @03:51AM (#10076116)
    Silicon carbide is a very hard, brittle material with a very high melting point commonly used to make crucibles and high speed saw blades and drill bits.

    Comparing this to charcoal is like saying that carbon monoxide is the same thing as oxygen because CO contains oxygen.

  • by panurge ( 573432 ) on Thursday August 26, 2004 @04:23AM (#10076181)
    Silicon carbide and diamond both have significant potential use as power semiconductors. Forget CPUs, think I/O. Think smaller power supplies, smaller audio drivers, more rugged automotive systems, and, ultimately, being able to shrink robotics controllers as a next step to producing very small robots. If a robot's motors are running at 80C, you want the power semis to be able to handle that. Furthermore, a lot of possible fuel cell designs run at fairly high temperature and, again, you want the electronics to survive the environment without too much cooling.

    There are also huge potential benefits for rad-hard communications satellites, where cooling is a major problem (it's radiative only).

    • If you google for silicon carbide transistors just about all the hits are for microwave and power applications.
  • Growing the crystals in a multi-step process sounds like a very expensive process. Probably useful for some hot chips though.

    So why the hell do we need hot chips anyway? ARM and MIPS devices run cool. Why does x86 have to be hot? Indeed why the hell are we still wedded to these power hungry devices?

    • Not all silicon is used in processors.

    • The main problem is getting the required purity; silicon-based chips involve a multi-step process to manufacture the substrate now. Basically, they take very pure silica sand (SiO2), purify it as much as possible chemically, reduce it to remove the oxygen, melt it, then extract it by growing a single crystal. The crystal of Si is then heated to just short of the melting point and then, moving it through an electric induction heater, a small portion of the crystal melts, and any remaining impurities travel with the molten zone to the far end of the crystal.
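The final melting step described above is zone refining. A toy one-dimensional model: a one-cell-wide molten zone sweeps along the bar, and because impurities prefer the melt (segregation coefficient k < 1), each pass pushes them toward the far end. The k = 0.3 below is an arbitrary illustrative value, not a real silicon figure:

```python
# Toy 1-D zone refining, one pass of a single-cell-wide molten zone.
# The solid that refreezes behind the zone takes only a fraction k of
# the zone's impurity concentration; the rest rides along in the melt.
def zone_pass(conc, k=0.3):
    conc = list(conc)
    zone = conc[0]                           # melt the first cell
    for i in range(len(conc) - 1):
        conc[i] = k * zone                   # refreeze behind the zone
        zone = zone * (1 - k) + conc[i + 1]  # advance: melt the next cell
    conc[-1] = zone                          # the zone freezes out at the end
    return conc

after = zone_pass([1.0] * 10)                # start with uniform impurity
print(round(after[0], 3), round(after[-1], 3))  # front cleaner, tail dirtier
```

Repeating the pass purifies the front of the bar further; the dirty tail end is simply cut off and discarded.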
    • "ARM and MIPS devices run cool. Why does x86 have to be hot?"

      Different markets. X86 is under extreme competitive pressure to produce the fastest possible processors in the medium price range. This means more complicated circuitry to produce the same function. (As a trivial example, compare a simple adder to a look-ahead-carry adder.) The complication adds heat.
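The adder comparison above can be made concrete. A toy bit-level sketch (Python loops standing in for gates; the point is that the lookahead version spends extra logic, the generate/propagate signals, to avoid waiting on the carry chain, and that extra circuitry is exactly what burns power):

```python
# Ripple-carry vs. carry-lookahead addition, bit by bit.
# (Illustrative only -- real adders are gates, not loops.)

def ripple_carry_add(a, b, width=8):
    """Each bit's full adder must wait for the previous carry."""
    carry, result = 0, 0
    for i in range(width):
        ai, bi = (a >> i) & 1, (b >> i) & 1
        result |= (ai ^ bi ^ carry) << i
        carry = (ai & bi) | (carry & (ai ^ bi))
    return result

def lookahead_add(a, b, width=8):
    """Spend extra logic (generate/propagate signals) so the
    carries can be computed without rippling bit-to-bit."""
    g = [((a >> i) & 1) & ((b >> i) & 1) for i in range(width)]  # generate
    p = [((a >> i) & 1) | ((b >> i) & 1) for i in range(width)]  # propagate
    c = [0] * (width + 1)
    for i in range(width):
        c[i + 1] = g[i] | (p[i] & c[i])  # c_{i+1} = g_i + p_i * c_i
    result = 0
    for i in range(width):
        result |= (((a >> i) & 1) ^ ((b >> i) & 1) ^ c[i]) << i
    return result

print(ripple_carry_add(100, 57), lookahead_add(100, 57))  # 157 157
```

Both compute the same sum; the lookahead one just needs the g and p arrays, i.e. more transistors switching per add.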

      • It makes a mockery of "Green PCs" though. In the last 18 years that I have had various PCs the power usage has gone up from ~100W to ~350W for the box. CRT monitor power has gone up too and only switching to an LCD has improved things.

        A machine built with 8x ARM cores would have as much grunt as a P4, but cost less and would use only a fraction of the power.

  • by Hank the Lion ( 47086 ) on Thursday August 26, 2004 @05:33AM (#10076339) Journal
    It's very nice that SiC can withstand high temperatures and is very hard, but are these the most important features of a semiconductor material?
    I would be more interested in band gap voltage, electron/hole mobility etc.
    Who needs a chip that can run hot when it cannot run fast?
    Maybe for specialized hardened applications like space, but I don't see these being used for mainstream applications.
    • Well, SiC has a wide range for bandgap, 2.2 to 3.25 eV, which is much less stable vs. temperature than Si. This is one of its "problems" for ICs. The other is the difficulty in making large wafers. The huge benefit of its large bandgap is long minority carrier lifetimes....think standard RAM cells that can hold their charge for hundreds of years. The real focus these days for SiC has been discrete power devices since they can function with a much higher junction temperature than silicon devices. Severa
    • Who needs a chip that can run hot when it cannot run fast?

      Not all electronics is high-speed logic. Think about high-power thyristors and diodes.

  • The BBC article (Score:3, Informative)

    by Mixel ( 723232 ) on Thursday August 26, 2004 @05:34AM (#10076341) Homepage
    linky [bbc.co.uk]
  • U lot (Score:3, Insightful)

    by Anonymous Coward on Thursday August 26, 2004 @05:52AM (#10076379)
    Ha, you lot, you think this will be used for CPUs.

    It won't. Silicon/germanium is the fastest you can get at the mo (until they can dope diamond).

    SiC will be used in hi-temp areas (e.g. aircraft engines) or where they want it to run hotter to up the current handling (i.e. power electronics).

    At the mo I am limited to 800 A at 1200 V for an IGBT, and that is 8 IGBT die in parallel. The die is limited to 100 A at 125 C.

    When I get SiC IGBTs I will be able to pass 800 A through a single die and let the die heat up to 300 C.

    This will mean that expensive heavy heatsinks will be able to shrink.

    SiC will NOT be used for hi-speed CPUs!!!
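The module arithmetic in the comment above, spelled out (the 800 A single-die SiC figure at ~300 C is the poster's projection, not a datasheet number):

```python
# The parallel-die arithmetic from the post above.
DIE_LIMIT_A = 100                # amps per silicon IGBT die at 125 C
dies_needed = 800 // DIE_LIMIT_A
print(dies_needed, "Si die in parallel for 800 A")  # 8

# The hoped-for SiC part: one die carrying the full 800 A,
# with the junction allowed up to ~300 C (the poster's projection).
SIC_DIE_LIMIT_A = 800
print(800 // SIC_DIE_LIMIT_A, "SiC die for the same 800 A")  # 1
```

One die instead of eight, plus a much larger allowable junction-to-ambient temperature rise, is where the smaller-heatsink claim comes from.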
  • "The researchers say that currently only a 3" wafer has been produced and that a marketable product is at least six years away."

    duplicate /. article incoming ... estimated time of arrival: 6 years from now ... please update your calendar for Aug 2010

  • by mikael ( 484 ) on Thursday August 26, 2004 @08:00AM (#10076708)
    The good news, your graphics card can be overclocked to 2 Terahertz, and still remain operational at over 650C.

    The bad news is that the aluminum casing of your PC will melt at this temperature, so your PC will need to be built from titanium.
  • Now the chips will get hot enough to ignite combustibles (paper, plastic insulation, dust) and still operate. Then you'll cut your hand on the edge of the SiC chip as you're trying to put out the fire...
  • If I remember correctly diamond chips are interesting because they can easily bind to organic molecules. I believe I saw a sample chip made by some students and Sumitomo is into it too.

    Does silicon carbide have any such properties? (i.e. anything besides heat resistance?)

    The flip side of course is for high temperature operation which I think is a bit scary, maybe the chip itself can handle it but what about the stuff next to it? I would rather have lower temperature circuits. As it is only a very tiny vo
  • First someone sends in a story while under the impression that aluminum == alumina, now we have silicon carbide == charcoal. Somebody sound the gong, please.
