Data Storage Technology

100x Denser Chips Possible With Plasmonic Nanolithography

Roland Piquepaille writes "Maskless nanolithography is a flexible nanofabrication technique, but the semiconductor industry has long regarded it as suffering from low throughput. Now, engineers at the University of California at Berkeley have developed a new approach: by 'flying' an array of plasmonic lenses just 20 nanometers above a rotating surface, they can increase throughput by several orders of magnitude. The 'flying head' they've created looks like the stylus on the arm of an old-fashioned LP turntable. With this technique, the researchers were able to create line patterns only 80 nanometers wide at speeds up to 12 meters per second. The lead researcher said that by using 'this plasmonic nanolithography, we will be able to make current microprocessors more than 10 times smaller, but far more powerful' and that 'it could lead to ultra-high density disks that can hold 10 to 100 times more data than today's disks.'"
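A quick back-of-envelope estimate shows why an array of lenses matters for the "orders of magnitude" throughput claim. The 80 nm line width and 12 m/s scan speed are from the summary; the array size here is purely illustrative, since the summary doesn't give one.

```python
# Areal write rate for one plasmonic beam, using the summary's figures.
line_width_m = 80e-9       # 80 nm line width
scan_speed_mps = 12.0      # 12 m/s scan speed
per_beam_m2_per_s = line_width_m * scan_speed_mps  # ~9.6e-7 m^2/s

# Hypothetical 1,000-lens array (illustrative, not from the article):
array_m2_per_s = per_beam_m2_per_s * 1_000

print(f"one beam: {per_beam_m2_per_s * 1e6:.2f} mm^2/s, "
      f"array: {array_m2_per_s * 1e6:.0f} mm^2/s")
```

Each additional lens multiplies the write rate directly, which is where the claimed throughput gain over a single maskless beam would come from.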
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Fragility (Score:5, Interesting)

    by Renraku ( 518261 ) on Sunday October 26, 2008 @04:47PM (#25520267) Homepage

    A question for the physics people out there.

    At what point does Brownian motion become a serious consideration? What about tunneling electrons and other quantum-ish effects?

  • by tylerni7 ( 944579 ) on Sunday October 26, 2008 @05:01PM (#25520385) Homepage
    Do current chip manufacturers like Intel and AMD work on new lithography techniques, or do they focus more on architectural changes?
    It seems that they shrink their process at a fairly slow rate, and both companies seem to do it at about the same speed.

    Also, if they both have been just advancing the standard techniques using high frequency light to etch all the chips, how easily could they change their manufacturing process over to something radically different?

    Chips with 100 times the density would offer incredible benefits in speed and power savings, judging by the gains the move from 65nm to 45nm has brought. Hopefully we'll actually see this process in use within the next 10 years.
  • Re:Fragility (Score:5, Interesting)

    by mehtars ( 655511 ) on Sunday October 26, 2008 @05:36PM (#25520713)
    Actually, with processors at the 90 and 45 nanometer nodes, there is a very high likelihood that a number of transistors will fail over the lifetime of the chip due to diffusion alone. Modern processors handle this by routing data through parts of the chip that are still active, though this has the interesting effect of slowing the processor down as it ages.
  • by Moraelin ( 679338 ) on Sunday October 26, 2008 @05:37PM (#25520719) Journal

    Well, that's kinda the whole point. Given that today's transistors are 45nm or so, 10 times smaller would be 4.5nm, or about 15 silicon atoms IIRC. I think we can worry about that already.
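A rough sanity check of the parent's "about 15 silicon atoms" figure, assuming an atom-to-atom spacing of roughly 0.3 nm in silicon (a simplification; the actual spacing depends on crystal direction):

```python
# "10 times smaller" than a 45 nm transistor, measured in Si atoms.
feature_nm = 45 / 10          # 4.5 nm feature size
atomic_spacing_nm = 0.3       # approximate Si atom-to-atom distance
atoms_across = feature_nm / atomic_spacing_nm

print(f"{feature_nm} nm is about {atoms_across:.0f} atoms across")
```

At that scale, quantum effects like the tunneling the grandparent asks about are very much in play.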

  • Re:Fragility (Score:2, Interesting)

    by ThisNukes4u ( 752508 ) * <tcoppi@@@gmail...com> on Sunday October 26, 2008 @06:13PM (#25520985) Homepage
    Do you have any source/references on techniques used to compensate for this effect?
  • Re:5-10 years (Score:1, Interesting)

    by cong06 ( 1000177 ) on Sunday October 26, 2008 @07:17PM (#25521535)
    And artificial intelligence. That's always 20 years away.
  • by Yarhj ( 1305397 ) on Sunday October 26, 2008 @09:59PM (#25522633)

    Modern 40/45nm chips, and the upcoming 32nm node, require very short exposure wavelengths to produce, which is expensive.

    The new technique uses relatively long ultraviolet light wavelengths.

    There's certainly a cost advantage to using longer-wavelength light for the exposure: cheaper lamps, mirrors, and optics. But there's also a tradeoff in device complexity, and that added complexity is going to add a lot of cost to the design and maintenance of these tools.

    A conventional stepper performs a series of mechanical and optical alignments before exposing a die on the wafer, then steps to the next die to continue the process. A lithography tool based on floating-head plasmonic technology requires at least two things:

    1. Precise control of the rotation speed. We need the pattern to be written uniformly across the entire wafer, and we want ALL the devices on the wafer to be exactly the same.

    2. Fine control of the exposure lamp. We need to expose nanometer-scale sections of a 300mm wafer spinning at 12m/s (roughly 760rpm at the wafer edge). Furthermore, we need to align exposures in each exposure "track" to one another.
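For reference, the rotation rate implied by the 12 m/s figure can be checked directly (using the outer edge of a 300mm wafer; the inner tracks spin the same rpm but move slower, which is itself a control headache):

```python
import math

# Rotation rate needed for a 12 m/s linear write speed
# at the outer edge of a 300 mm wafer.
wafer_diameter_m = 0.300
linear_speed_mps = 12.0

radius_m = wafer_diameter_m / 2
revs_per_second = linear_speed_mps / (2 * math.pi * radius_m)
rpm = revs_per_second * 60

print(f"{rpm:.0f} rpm at the wafer edge")
```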

    The hardware and software to provide this level of dynamic control will add a lot of complexity (read: $$$) to the design and operation of these tools.

    One other thing: the entire point of the floating write-head is to keep the separation between the head and the wafer constant as it scans, but the surface of a silicon wafer becomes more and more uneven as processing continues. Surface variations are on the scale of the transistor dimensions (~50-100nm these days). This would tend to hamper the massive parallelization that the article authors hope for, as a 100k-microlens write head will be significantly larger than a single transistor, and won't be able to float accurately over the surface of the wafer.
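A quick sketch of the scale mismatch the parent describes. The microlens pitch here is a pure assumption (the article gives no number); the fly height is from the summary and the surface variation from the parent's own estimate:

```python
import math

# Hypothetical microlens pitch of ~1 um, arranged in a square array
# (assumption for illustration only -- not from the article).
n_lenses = 100_000
pitch_um = 1.0

head_width_um = math.sqrt(n_lenses) * pitch_um  # side of a square array
fly_height_nm = 20           # gap from the summary
surface_variation_nm = 100   # upper end of the parent's estimate

print(f"head spans ~{head_width_um:.0f} um, but must hold a "
      f"{fly_height_nm} nm gap over ~{surface_variation_nm} nm topography")
```

Even under these rough assumptions, the head is hundreds of microns across while the gap it must maintain is smaller than the bumps beneath it, which is the crux of the parallelization problem.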
