Germanium Diodes Mean Progress Toward Silicon-Chip Lasers

David Orenstein writes "Teams at Stanford and MIT have each reported getting strong light signals from germanium-based diodes on silicon at room temperature. Engineers have long sought to do this because, with further refinement into lasers, such diodes would allow for optical interconnects on chips. Optical interconnects could operate much faster and with less power than electrical (metal) ones that are becoming bottlenecks on current chips."
  • Just don't switch it to overload!

  • by cheebie ( 459397 ) on Thursday July 09, 2009 @10:45PM (#28645639)

    Wow, plant-based electronics! This will surely usher in a new age of biological computers that will be able to . . .

    What? It's not a geranium diode?

    Uh, how 'bout that new version of Firefox? Pretty snazzy, eh.

  • I know for sure that I used Germanium diodes before and I'm pretty sure Germanium-based LEDs have been developed before. Dunno what the news is.

    • by ThreeGigs ( 239452 ) on Thursday July 09, 2009 @11:05PM (#28645715)

      The news is that they've found a way to grow 'em on silicon, which lends itself well to chip production.

      • That IS pretty big news.
        Does anyone here have any inside details?
        While SF has made great use of tunneling diodes, there are some genuinely freaky potential applications of this; the process outlined in TFA makes me think that they might be able to produce very closely matched tunneling diodes... and that is just scary (in a mainly good way).

    • by MichaelSmith ( 789609 ) on Thursday July 09, 2009 @11:05PM (#28645717) Homepage Journal

      I know for sure that I used Germanium diodes before and I'm pretty sure Germanium-based LEDs have been developed before. Dunno what the news is.

      They seem to have improved on Germanium LEDs by doping them differently, to the point where they can look into using photons to transmit information around a silicon chip in place of electrons. I imagine they will look into building light pipes out of silicon, i.e. little optical fibres.

      OT: somebody should teach ascribe how to use the title tag.

      • That's cool, but with modern chip designs using electron tunneling for some of the effects, it can't be used chip-wide. On the other hand, light can cross through light, so you would be able to avoid tediously long tracks currently required.

        There may be some additional interest in the aerospace industry for this. Optical circuits on the chips aren't going to be so affected by radiation, and by having more real-estate available for redundant components and optimal placement, they can improve the resistance to radiation considerably.

        Not sure how much heat this'll cut down on, as the transistors are the big heat-producers. On the other hand, better placement means more even heat production which means they should be able to push the designs a little bit further.

      • Re: (Score:2, Funny)

        by ulski ( 1173329 )
        well - they do know how to use the title tag - they actually used the title "Untitled Document"
    • by fuzzyfuzzyfungus ( 1223518 ) on Thursday July 09, 2009 @11:08PM (#28645723) Journal
      Germanium semiconductors are old news (in fact, I have this vague impression that they might have gotten germanium working in fairly common use earlier than silicon); but, according to TFA, germanium-based light emitters built into silicon structures, under more or less reasonable production and operation conditions, are what is new. This isn't about discrete components, but about structures built into larger silicon ICs.
      • I always laugh when I see the claim that light emission will be more efficient than metal interconnect in ICs. The article claims that 50% of power is currently lost in the metal interconnect in high-density integrated circuits. However, I suspect that the loss in a solid state laser diode will be far worse than a mere 50%. There is also (some) metal interconnect involved in driving the diode in the first place. (Admittedly, this interconnect will probably be shorter than the metal interconnect it is meant to replace.)
        • Re: (Score:2, Interesting)

          by Anonymous Coward

          2-3 GHz is around the frequency where FR4 (fibreglass) PCB material starts to become lossy. You know how some cheap plastic gets hot in the microwave oven? That's because the material becomes lossy and turns the energy into heat. Same principle here.
          We played with 3 GHz and were already pushing it back then.

          So transmitting that type of signal outside a chip over a long distance (~30-40 cm to a backplane and onto another card, for say a router core or a blade server) is going to take a bit more work.
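To put rough numbers on the FR4 loss described above, here's a back-of-envelope Python sketch. The dB-per-inch rule of thumb and the εr/tanδ values are typical textbook FR4 figures I'm assuming, not measurements from TFA:

```python
import math

def fr4_dielectric_loss_db(f_ghz, length_cm, er=4.4, tan_d=0.02):
    """Approximate dielectric loss of an FR4 stripline run.

    Uses the common rule of thumb: alpha ~= 2.3 * f[GHz] * tan(delta) * sqrt(er)
    in dB per inch. Material numbers are typical FR4, not measured values.
    """
    db_per_inch = 2.3 * f_ghz * tan_d * math.sqrt(er)
    return db_per_inch * length_cm / 2.54

# A hypothetical ~35 cm backplane run at 3 GHz:
loss = fr4_dielectric_loss_db(3.0, 35)
print(f"{loss:.1f} dB")  # roughly 4 dB -- well over half the power gone
```

At that loss, more than half the launched power ends up as heat in the board, which is consistent with the "pushing it at 3 GHz" experience above.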

        • by LoRdTAW ( 99712 ) on Friday July 10, 2009 @02:34AM (#28646471)

          The real benefit is you won't have to worry about cross talk or other electromagnetic interference. The short haul of board-level optical interconnects means we can have very high speed chip-to-chip interconnects without worrying too much about trace routing or length. And LEDs are quite efficient when it comes to turning electrical power into light. Metal wires at high frequencies develop a high resistance which has to be overcome by using more energy.
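The high-frequency resistance mentioned here is the skin effect: current crowds into a thin shell at the conductor surface. A quick Python sketch of the standard skin-depth formula; the trace geometry is a hypothetical example, not from TFA:

```python
import math

RHO_CU = 1.68e-8        # copper resistivity, ohm*m
MU0 = 4e-7 * math.pi    # vacuum permeability, H/m

def skin_depth(f_hz, rho=RHO_CU):
    """Skin depth in metres: delta = sqrt(rho / (pi * f * mu0))."""
    return math.sqrt(rho / (math.pi * f_hz * MU0))

d = skin_depth(3e9)
print(f"skin depth at 3 GHz: {d * 1e6:.2f} um")

# Crude AC/DC resistance ratio for a hypothetical 0.2 mm x 35 um trace:
# current effectively flows in a shell ~one skin depth thick around the perimeter.
w, t = 200e-6, 35e-6
ratio = (w * t) / (2 * (w + t) * d)
print(f"AC resistance is roughly {ratio:.0f}x the DC resistance")
```

At ~1.2 µm skin depth, even a chunky trace loses most of its copper cross-section at GHz rates, which is the extra energy cost the comment refers to.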

          • by kinnell ( 607819 ) on Friday July 10, 2009 @07:03AM (#28647661)

            The real benefit is you won't have to worry about cross talk or other electromagnetic interference. The short haul of board-level optical interconnects means we can have very high speed chip-to-chip interconnects without worrying too much about trace routing or length. And LEDs are quite efficient when it comes to turning electrical power into light. Metal wires at high frequencies develop a high resistance which has to be overcome by using more energy.

            I'm not convinced. You can still get electromagnetic interference with light - look at TV remotes. Of course, if you use fibre optic cable it's not a problem, but that's akin to using coaxial cable to route electrical signals. While it would be possible to embed coaxial structures in PCBs to eliminate the possibility of cross-talk and noise, in practice this would be prohibitively expensive and the same result can be achieved with stripline and careful routing.

            The question is, what does an optical PCB look like? You'll still need copper for power distribution, so the optical PCB will need to tolerate soldering temperatures. Do you have a layer of interwoven fibre optic cables? How do these interface with the components such that there is tolerance in the size and position of the terminals? Do you use mirrors and optical waveguides embedded in the substrate? If so, how do you make this cost effective to manufacture? If you use fibre optics, you have a minimum bend radius, so you open up a whole new set of routing problems.

            While there are obviously clear benefits in theory, when it comes to actually implementing this as a cost-effective PCB interconnect you'll have a whole set of new problems to deal with, and it's unlikely to be anywhere close in cost to gluing layers of copper and plastic together.

            • Re: (Score:3, Informative)

              by limaxray ( 1292094 )
              It's actually very common to use coaxial-like structures on PCBs by placing sensitive signal traces between ground layers in a grounded copper pour. This works very well for shielding analog signals, but it doesn't work for high-frequency digital signals. The grounded copper surrounding the trace creates a significant amount of capacitance that needs to be overcome every time there is a change in state. Once you get into the hundreds or thousands of megahertz, you start to consume more and more power.
            • I'm not convinced. You can still get electromagnetic interference with light - look at TV remotes.

              It's not my area of expertise, but my understanding was that part of the point of lasers is that the light is coherent. If it has an appropriate wavelength and you aim it correctly, you can transmit data at a high rate despite a fairly high level of environmental interference.

      • Vague impression? (Score:5, Interesting)

        by Kupfernigk ( 1190345 ) on Friday July 10, 2009 @03:59AM (#28646853)

        You do know that all the first transistors were germanium-based and that early transistor computers used germanium? Before Schottky diodes, computer power supplies used germanium rectifiers because they were twice as efficient (half the heat) as silicon ones. And early audio amplifiers used germanium power transistors in the output stages because at the time they offered lower distortion than silicon, as they had better transfer characteristics in the crossover region. You could easily hear the difference between class AB tube amps, class B germanium amps and class B silicon amps into the early 70s. Germanium was initially seen as a low-frequency technology because thin junctions were hard to form, but this is not necessarily true (witness Esaki (tunnel) diodes).

        Having said that you are entirely right in your main observation. The main problem for germanium has always been fabrication; no germanium ICs. This is because there is no germanium equivalent of planar technology. It has been known for a long time that if this could be overcome there would be a role for germanium. It's just that, as with so many apparently breakthrough technologies, making it happen turns out to be very hard.

        • by Agripa ( 139780 )

          Germanium also makes better PNP transistors than silicon, so for a while high-performance complementary circuits used PNP germanium transistors with NPN silicon ones. I have a whole reel of 1N270 germanium diodes which come in handy sometimes.

          Ultimately though the increased leakage and lower peak junction temperature made germanium obsolete for most applications.

        • Indeed, I seem to recall that a germanium diode had a ~0.2V forward voltage drop which made them better in rectifiers and such than the silicon diodes with ~0.6V. At high currents, that voltage drop means less power wasted by the device. With transistors, that means a much lower base open voltage, though I don't remember exactly why or if that was useful...
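The rectifier argument above is just P = Vf × I; a tiny Python sketch using the rough forward-voltage figures from the comment (the 10 A load is made up for illustration, and dynamic resistance is ignored):

```python
def conduction_loss_w(v_forward, current_a):
    """Power dissipated in a conducting diode: P = Vf * I.

    Ignores dynamic resistance; good enough for a rough comparison.
    """
    return v_forward * current_a

i = 10.0  # hypothetical 10 A rectifier load
ge = conduction_loss_w(0.2, i)  # ~0.2 V germanium drop
si = conduction_loss_w(0.6, i)  # ~0.6 V silicon drop
print(f"Ge: {ge:.0f} W, Si: {si:.0f} W")  # 2 W vs 6 W per diode at 10 A
```

At high currents that three-to-one difference in heat is exactly why germanium rectifiers hung on as long as they did.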
    • The Birth of the Transistor []. Circuits In Stone []. Watch and learn, kid.

    • Germanium-based LEDs have most definitely NOT been developed before. The first ever LED was GaAs-based, and it was deep in the infrared.

  • Everybody's talking about how this will be useful when they do X. Why can't it be useful now?

    If there's a nice open layer somewhere, maybe on the bottom of the chip, how about sending out a clock signal across the entire chip, faster than the current tree/mesh methods? Getting the entire chip in sync this way could probably let it run a good deal faster, too.

    Or would reflections be a problem or something?
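For scale, here's a rough Python estimate of how long light takes to cross a die compared with one clock period. The 2 cm die size and 3 GHz clock are assumptions for illustration, not figures from TFA:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

die = 0.02                    # hypothetical 2 cm die edge
t_cross_ps = die / C * 1e12   # free-space flight time across the die
period_ps = 1e12 / 3e9        # one cycle at an assumed 3 GHz clock

print(f"light crosses the die in ~{t_cross_ps:.0f} ps")
print(f"one 3 GHz clock period:  ~{period_ps:.0f} ps")
```

So a single optical flash reaches the far corner in roughly a fifth of a cycle; path-length differences across the die would still cause some skew, but far less than a poorly balanced wire tree.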

    • by jd ( 1658 )

      If the light-sensitive components are not directional, you could have the clock emitter ABOVE the silicon on a completely independent layer. That wouldn't require them to perfect the transmission side, so could be done a lot quicker.

  • If any of the fabrication goes wrong, they can always send out these germanium-on-silicon diodes as parts for the world's most expensive foxhole radios [] ;)

  • If this comes to pass, will it mean that things will not be 'solid state' inside the computer any more?

  • by Laaserboy ( 823319 ) on Friday July 10, 2009 @12:53AM (#28646075)
    The promise of making a laser from indirect bandgap semiconductors, then gathering investors, then losing the investors' money goes back to the Sixties at least.

    Some scientists showed off SiC blue LEDs in the '60s that shone brilliantly like laser light, but were not the real deal. The real blue room-temperature laser had to wait for Nakamura and a direct bandgap material.

    Doping, adding nitrogen, and adding defects to the lattice to produce more light is nothing new. Look at your stop lights. It's working there, but don't count on these indirect materials suddenly turning into lasers. No need to hold your breath.

    A quick scientific note. Photons have a lot of energy, but not much momentum. You get hot on a sunny day, but not blown over by the sun. Electrons fall almost directly down in the bandgap diagram to produce light. This makes direct-gap semiconductors useful for lasers. The trick one can use is to provide momentum-shifting impurities to the lattice of an indirect bandgap crystal. The electron creates a photon by dropping directly down, but some other mechanism shifts the electron momentum to create an overall diagonal transition. It's not efficient, but it works.
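A rough Python sketch of why that momentum mismatch matters: compare a band-edge photon's momentum with an electron's crystal momentum at the Brillouin zone boundary. Textbook constants throughout; the 1550 nm wavelength is approximately germanium's band edge, used here for illustration:

```python
import math

H = 6.626e-34           # Planck constant, J*s
HBAR = H / (2 * math.pi)
A_GE = 5.65e-10         # germanium lattice constant, m

p_photon = H / 1.55e-6                 # momentum of a ~1550 nm photon, p = h/lambda
p_zone_edge = HBAR * math.pi / A_GE    # electron crystal momentum at the zone boundary

print(f"photon momentum:    {p_photon:.2e} kg*m/s")
print(f"zone-edge momentum: {p_zone_edge:.2e} kg*m/s")
print(f"ratio: ~{p_zone_edge / p_photon:.0f}x")
```

The photon carries three orders of magnitude less momentum than the electron needs to shed, so something else (a phonon or an impurity) has to make up the difference, just as the comment says.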
    • Re: (Score:3, Informative)

      Yes, Germanium has an indirect band-gap, but SiC, I thought, had a direct one. The problem with SiC was (is?) to grow a crystal of a determined orientation. As it is now, the crystal structure of SiC is pretty much random. That said, growing a thin layer of SiC (by simple CVD) on Si is promising.

    • Re: (Score:3, Informative)

      That "some other mechanism" is phonons: lattice vibrations. The lattice vibrations temporarily turn it into a direct band gap semiconductor.

      Even after all these years of research, it's still largely inefficient to have to create the phonons (heat) so that you can create the photons (the laser).

  • I get empty pages when I turn to the websites of Optics Express and Optics Letters to find the articles mentioned in the 'article' that was linked to. Can someone point me to the PDFs of the articles?

  • by mlts ( 1038732 ) * on Friday July 10, 2009 @04:35AM (#28647039)

    I can see this technology being used to help with inter-chip communication, perhaps to run more tasks in parallel, or to lock and unlock memory segments shared by the CPUs.

    The only thing I see that would be a limit is having to mux/demux a lot of signals before they get put on the fiber optic cable. However fiber optic cables have a lot of bandwidth, so this may not be a big issue.

    It would be nice if silicon chip lasers could replace most signal circuits on a PC board, mainly because it would allow components to be positioned for better cooling and heat dissipation. Ultimately, if several fiber optic connections can replace the hundreds (going on thousands) of pins needed to connect a CPU to the motherboard, it would be a great advance in reliability.

    Fiber optics on chips isn't new though. I remember talk about the PowerPC 603 having the ability to have this for better SMP communication.

  • There must be an air gap to see an increase in speed over wire. Once fiber is used, speed goes down from ~0.8c to ~0.6c, but RFI and VSWR are gone.
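The velocity-factor point can be put in numbers: a small Python sketch of propagation delay over a 30 cm link in different media. The velocity factors are typical textbook values, not exact:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def delay_ns(length_m, velocity_factor):
    """One-way propagation delay in nanoseconds."""
    return length_m / (C * velocity_factor) * 1e9

L = 0.30  # a hypothetical 30 cm interconnect
for name, vf in [("free space / air gap", 1.0),
                 ("glass fibre (n ~ 1.5)", 1 / 1.5),
                 ("FR4 stripline (er ~ 4.4)", 1 / 4.4 ** 0.5)]:
    print(f"{name}: {delay_ns(L, vf):.2f} ns")
```

Fibre is indeed slower per metre than a free-space path, and even a copper stripline is slower still; the win for optics is bandwidth and noise immunity, not raw propagation speed.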
  • by bazorg ( 911295 ) on Friday July 10, 2009 @08:14AM (#28648019) Homepage
    that was my WTF moment for today. I'll get some coffee now. sorry for the interruption.
  • Now, if only we can make these diodes from Ironium and Boranium as well we will be able to conquer the universe!
  • What's the most efficient laser tech, in terms of watts of electrical power in to watts of laser power out? Are there any all-optical laser devices in the high-efficiency (>80%, or even >50%) class, that are powered by incoming non-coherent light (like sunlight) but emit coherent light?