New Flat Lens Focuses Without Distortion

yahyamf writes "Applied physicists at Harvard have created an ultrathin, flat lens that focuses light without the distortions of conventional lenses. 'Our flat lens opens up a new type of technology,' says principal investigator Federico Capasso. 'We're presenting a new way of making lenses. Instead of creating phase delays as light propagates through the thickness of the material, you can create an instantaneous phase shift right at the surface of the lens. It's extremely exciting.'" And by "ultrathin," they mean it — 60 nanometers thin. The big advantage for this technology, aimed at telecommunications signals, is that "the flat lens eliminates optical aberrations such as the 'fish-eye' effect that results from conventional wide-angle lenses. Astigmatism and coma aberrations also do not occur with the flat lens, so the resulting image or signal is completely accurate and does not require any complex corrective techniques."
  • But... (Score:5, Funny)

    by craftycoder ( 1851452 ) on Sunday August 26, 2012 @11:05PM (#41133597)

    Will it make my ass look big?

    • Re:But... (Score:5, Informative)

      by Anonymous Coward on Sunday August 26, 2012 @11:24PM (#41133709)

      Have the cameraman back up, which lessens perspective distortion [stepheneastwood.com]. When taking pictures of people you should always get as far back as possible and zoom in. Staying close and zooming out is bad.
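
The parent's advice is pure geometry: apparent size scales as one over distance, so features at different depths converge in relative size as the camera backs off. A minimal sketch, assuming a simple pinhole model and an illustrative 10 cm nose-to-ear depth:

```python
# Apparent size in a pinhole model scales as 1/distance, so a nose 10 cm
# closer to the camera than the ears looks disproportionately large up close.

def nose_to_ear_ratio(subject_dist_m, nose_offset_m=0.10):
    """How much larger the nose appears relative to the ears."""
    return subject_dist_m / (subject_dist_m - nose_offset_m)

for d in (0.5, 1.0, 3.0):
    print(f"camera at {d:.1f} m: nose appears {nose_to_ear_ratio(d):.2f}x larger")

# 0.5 m -> 1.25x (the big-nose look); 3.0 m -> 1.03x (flattering).
```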

      • Re:But... (Score:4, Interesting)

        by mug funky ( 910186 ) on Sunday August 26, 2012 @11:31PM (#41133745)

        depends on the effect you want. sometimes the photographer doesn't wish to be flattering.

        i've seen some stunning stuff shot on 16mm film with a 2mm lens up real close. it makes their nose look a metre long and their neck seem far away, but it's often just what you need.

        also, real estate pics.

      • Re:But... (Score:5, Informative)

        by mcgrew ( 92797 ) * on Monday August 27, 2012 @10:19AM (#41136337) Homepage Journal

        Perspective distortion was used extensively in the filming of LOTR. It's how they made the hobbits look much smaller than the actors actually were.

        • It wasn't done very well, however. Every other shot the hobbits look like they're different heights. Sometimes they look waist high, sometimes chest high. And the height conveyed in most perspective shots is very different from the height when they used the body-double stand-ins (i.e. any shot where you can't see the actor's face). It's quite jarring once you notice it, and it gets worse every time.
          • the hobbits look like they are of different heights. Sometimes they look waist high, sometimes chest high.

            And after they smoke some weed they are high as kites!

    • by Hentes ( 2461350 )

      Yes, but only at one wavelength.

  • by a_hanso ( 1891616 ) on Sunday August 26, 2012 @11:08PM (#41133619) Journal
    Does this mean that very large refractive telescopes will make sense again? If we sandwich a few of these with the metasurfaces tuned right, could we build a telescope that is a slab instead of a tube? How about telephoto lenses built into camera phones? Or cheaper orbital telescopes?
    • by ceoyoyo ( 59147 ) on Sunday August 26, 2012 @11:16PM (#41133661)

      Nope. It's IR and down, and it sounds like it probably only works in a fairly narrow frequency band. It also seems like it's probably going to stay that way, since the feature size determines the frequency it's tuned for. Visible light may require impractically small features.

      You could probably build an IR telescope using it, but it would still be a tube; it's just that the lens would be very thin (which is likely a problem, rather than an advantage, for a large aperture: how do you keep it from flexing?). Plus your telescope would probably only work properly in a narrow frequency band (and you'd have to filter out other frequencies).

      • The article I read said near-IR up to THz. The problem would probably be the 1mm aperture of the stated lens.
      • Comment removed (Score:4, Interesting)

        by account_deleted ( 4530225 ) on Sunday August 26, 2012 @11:49PM (#41133861)
        Comment removed based on user account deletion
        • by ceoyoyo ( 59147 ) on Monday August 27, 2012 @12:01AM (#41133931)

          "because the temperature difference can distort images during the cooling down phase."

          If someone told you that, they were either way too credulous or thought you were. You may want to let your camera adjust to the ambient temperature (either cooler or hotter), mostly to avoid condensation, which is a pain to wipe off constantly and will make all your pictures look like you took them in the fog. If you're doing astrophotography you want the sensor to be as cool as possible to decrease the thermal noise. But heating or cooling in a lens on a regular camera doesn't affect the image quality noticeably. Unless of course the lens actually shatters, which I've seen happen, but only growing up in northern Canada.

          But if you don't think flexing might be a problem, take a piece of plastic wrap, stretch it across a five-gallon pail, and blow on it. Try to get it tight enough that it doesn't move but also doesn't tear. Now consider that this lens is thinner than that.

        • by Bengie ( 1121981 )
          I'm sure a telescope-sized 60nm-thick lens will have flexing problems in anything other than a vacuum. The weakest of air currents will cause it to flex all over the place. And that's assuming it's flexible and doesn't just break once you get past a certain size.
      • An array of small thin flat lenses in the tube could do about the same thing as a big conventional lens. I suspect the new method isn't as stable as conventional lenses, even if you support it with a rigid grid.

        Also, if this is SWIR or thereabouts, those cameras and their thick lenses aren't exactly cheap nowadays - it could compete with the traditional IR lens materials pretty easily.

        • by ceoyoyo ( 59147 )

          It might replace some special purpose instruments. The problem is, it's still narrowband, so you wouldn't be able to just stick a new camera on your telescope and look at a different part of the spectrum. You'd probably still be better off with a big mirror and a little lens instead of a refractor.

    • This could see application in extreme close-up technology similar to that found in videoscopes (like this http://www.rfsystemlab.us/vj-adv-4mm.html [rfsystemlab.us]); the wider the angle, the greater the distortion. What is interesting is the method they used to create the lens effect: they coated the flat lens in concentric circles with slightly different coatings that change the phase delay for each ring, thus focusing the incoming light. Very interesting technology.
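
For the curious, a flat lens of this kind has to imprint a hyperboloidal phase profile so that every ray arrives at the focus in phase. A minimal sketch of that profile; the wavelength, focal length, and aperture below are illustrative, not the paper's actual parameters:

```python
import numpy as np

# Phase compensation a flat lens must imprint at radius r so that all rays
# arrive at the focal point in phase:
#   phi(r) = (2*pi/lam) * (sqrt(r**2 + f**2) - f), wrapped to [0, 2*pi).
# lam and f below are illustrative, not the paper's actual parameters.
lam = 1.55e-6   # 1550 nm, a telecom wavelength
f = 3e-2        # 3 cm focal length

def required_phase(r):
    return ((2 * np.pi / lam) * (np.sqrt(r**2 + f**2) - f)) % (2 * np.pi)

for r in np.linspace(0, 0.5e-3, 6):   # out to a 0.5 mm radius aperture
    print(f"r = {r*1e3:.2f} mm -> phase {required_phase(r):.2f} rad")
```
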
    • no (Score:5, Informative)

      by SuperBanana ( 662181 ) on Sunday August 26, 2012 @11:47PM (#41133843)

      If we sandwich a few of these with the metasurfaces tuned right, could we build a telescope that is a slab instead of a tube?

      Only in limited cases, because it's only applicable from near-infrared to terahertz frequencies. UV and visible band are pretty much all out from the sounds of it.

      Also: the lens is very thin. Nothing else is - just the lens. I.e., the objective or sensor still has to be some distance behind it, and I'm sure there are limitations with respect to angles. So you still need a tube - especially if the lens is very large in diameter.

      This is fascinating, because it sounds like it is operating as a phased array; they *delay* the light depending on where it strikes on the lens. Wild! Phased arrays work by delaying the signal, thus steering the electromagnetic wave, but that's when you're generating or receiving...not modifying and retransmitting!

      However, they're doing it in this case by physical manipulation of the gold/silicon structures at construction time. It's not tuneable afterward.

      That's fine for telecom / fiber applications, where you only have a fixed number of specific wavelengths. Astronomers, too, might not mind being restricted to imaging just one wavelength, or one narrow slice of the spectrum.

      Sadly, this limitation also makes it useless for semiconductor lithography, which is UV to x-ray range.
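
To put rough numbers on the phased-array analogy above, here's a minimal sketch of the per-element delays a conventional array would use to steer a beam; the element spacing and angle are made up for illustration:

```python
import numpy as np

# A conventional phased array steers a beam by giving element n the delay
#   t_n = n * d * sin(theta) / c.
# The flat lens freezes an analogous phase pattern into its surface.
c = 3e8                  # speed of light, m/s
d = 5e-3                 # element spacing, 5 mm (made up)
theta = np.deg2rad(20)   # desired steering angle (made up)

for n in range(8):
    t_n = n * d * np.sin(theta) / c
    print(f"element {n}: delay {t_n*1e12:.1f} ps")
```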

    • by pcjunky ( 517872 )

      This technique would seem to operate on only a narrow range of wavelengths. Although it may work at visible wavelengths, each lens is tuned to a single wavelength. Optical telescopes need to refract at least an octave with one lens, while this lens would need to be tuned to a single frequency. Not a problem for laser light, which is monochromatic, but no substitute for standard lenses for imaging.

    • Canon already has a couple [the-digital-picture.com] production lenses [the-digital-picture.com] which use a diffractive optic element. The construction of the diffractive optics is different from TFA, but the principle is the same.

      The results have been... mixed. They do yield smaller and lighter lenses, but also introduce new distortions of their own. The tradeoff is worth it in most photographic applications, but for precision astronomical work I think the loss of contrast and sharpness may limit its usefulness.

      Also note that DO reduces but does n
      • for precision astronomical work I think the loss of contrast and sharpness may limit its usefulness

        That's only a problem for that subset of astronomical work that requires precise pretty pictures - for the rest (such as taking spectrograms), not so much.

  • by Grayhand ( 2610049 ) on Sunday August 26, 2012 @11:11PM (#41133645)
    Look at pinhole cameras. They actually lack lenses but focus to infinity. The trick is to filter out the incidental indirect rays that cause the blurring. The downside with pinholes is they only allow in a small amount of light. I'd love to see a fast lens, something below f/2.8, that doesn't require focusing.
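
For scale, Lord Rayleigh's rule of thumb for the sharpest pinhole diameter, d ≈ 1.9·√(λf), shows why pinholes are so slow. A minimal sketch with illustrative focal lengths:

```python
import math

# Rayleigh's optimum pinhole: d ~ 1.9 * sqrt(lam * f). The resulting
# f-number shows why pinholes gather so little light.
lam = 550e-9  # green light

for f in (0.025, 0.05, 0.1):  # pinhole-to-film distances, metres
    d = 1.9 * math.sqrt(lam * f)
    print(f"f = {f*1000:.0f} mm: pinhole {d*1000:.2f} mm wide, about f/{f/d:.0f}")

# Roughly f/112 to f/224: thousands of times less light than f/2.8.
```
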
    • by mug funky ( 910186 ) on Sunday August 26, 2012 @11:35PM (#41133773)

      why go with such big apertures if you want everything in focus? the beauty of such apertures is you can isolate your subject and blur the tits off everything else in the frame.

      sensor dynamic range is increasing all the time - i'd say by the time one of these lenses works for visible light, they'd be unnecessary.

      of course, there'd still be a need to focus with one of these - the focal point depends on where the subject is.

      • by viperidaenz ( 2515578 ) on Sunday August 26, 2012 @11:50PM (#41133871)
        Sometimes you want to focus on the tits and blur everything else...
      • by smellotron ( 1039250 ) on Sunday August 26, 2012 @11:59PM (#41133919)

        the beauty of such apertures is you can isolate your subject and blur the tits off everything else in the frame.

        Either I'm taking you too literally, or you're doing it wrong.

      • why go with such big apertures if you want everything in focus? the beauty of such apertures is you can isolate your subject and blur the tits off everything else in the frame.

        sensor dynamic range is increasing all the time - i'd say by the time one of these lenses works for visible light, they'd be unnecessary.

        of course, there'd still be a need to focus with one of these - the focal point depends on where the subject is.

        Because sometimes the entire landscape *is* the subject and you want to take the photo without long exposure times, which necessitate a tripod.

      • why go with such big apertures if you want everything in focus? the beauty of such apertures is you can isolate your subject and blur everything else in the frame.

        Note to prospective camera geeks: don't take that at face value. Consumer pocket camera lenses and sensors don't provide that type of blur, and the number means different things for different lenses. Besides, the fastest apertures on offer at brick-and-mortar stores a month ago were 3.5 to 3.1. My Android phone does about the same -- everything is "in focus". The end result is zero "blur" for your average group shots. My advanced point-and-shoot forces 2.0 whenever it can and has a large sensor... yet its "blur" effect

        • by wisty ( 1335733 )

          Yeah, there's a few factors:

          Having a small DoF (lots of blur for out-of-plane stuff) relies on real lens length (a 150mm full-frame-equivalent P&S might be more like a 25mm DSLR) and a wide aperture.

          Image quality is ... complicated. If you want to take pictures fast (to minimize motion blur) you need good usable ISO and a good aperture number. Usable ISO will very roughly scale with crop factor, because big sensors tend to be better, but it also depends on other factors.

          So all the people whinging about t
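
A minimal sketch of the crop-factor point, using the standard thin-lens depth-of-field approximation DoF ≈ 2·N·c·s²/f²; the circle-of-confusion values are typical figures, not measurements:

```python
# Total depth of field for a subject at distance s, well inside the
# hyperfocal distance: DoF ~ 2*N*c*s^2 / f^2 (thin-lens approximation).
# c (circle of confusion) shrinks with the sensor; values are typical.
def dof_mm(f_mm, N, c_mm, s_mm):
    return 2 * N * c_mm * s_mm**2 / f_mm**2

s = 3000  # subject at 3 m, same framing for both cameras
print(f"P&S,  25 mm f/2.8, c=0.005 mm: {dof_mm(25, 2.8, 0.005, s)/1000:.2f} m")
print(f"DSLR, 150 mm f/2.8, c=0.030 mm: {dof_mm(150, 2.8, 0.030, s)/1000:.2f} m")

# ~0.40 m vs ~0.07 m: the "150 mm equivalent" P&S can't blur the background.
```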

          • by wisty ( 1335733 )

            Also, small DoF comes from a close subject. That's why all the iPhone bokeh shots are of small subjects (flowers, sushi, other iPhones).

  • That's really all that matters these days. Everything else can be easily corrected in software. Chromatic aberrations are more difficult to deal with nicely.

    -Matt

    • maybe we should build a sensor with more than just RGB? like RYGCBM? depending on how much aberration there is, treating all those channels separately will get you within a subpixel's width of the desired lack of aberration.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      That's really all that matters these days. Everything else can be easily corrected in software. Chromatic aberrations are more difficult to deal with nicely.

      -Matt

      Nope Matt...

      Every optical aberration suppresses the system's Modulation Transfer Function (MTF), fundamentally resulting in a loss of information. Despite what CSI has taught you, you can never fully "correct it in software" after the fact. How well you can do depends primarily on signal-to-noise ratio (SNR), your detector's linearity calibration, and your prior knowledge of the object's power spectral density (usually just a rough semi-educated guess). Therefore, it's still important to have low-aberratio
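
A minimal illustration of the SNR point: 1-D Wiener deconvolution, assuming a known Gaussian blur and a flat noise spectrum. Where the transfer function is small, the filter backs off rather than amplify noise, so MTF lost to aberration stays lost:

```python
import numpy as np

# 1-D Wiener deconvolution: restoration quality is capped by SNR. Where
# the blur's transfer function H is small, the filter attenuates instead
# of amplifying noise, so information lost to a low MTF stays lost.
rng = np.random.default_rng(0)
n = 256
scene = (np.arange(n) % 32 < 16).astype(float)          # square-wave "scene"
h = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)
h /= h.sum()                                            # Gaussian blur kernel

H = np.fft.fft(np.fft.ifftshift(h))
blurred = np.fft.ifft(np.fft.fft(scene) * H).real
noisy = blurred + 0.02 * rng.standard_normal(n)

snr = 100.0                                             # assumed power SNR
wiener = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
restored = np.fft.ifft(np.fft.fft(noisy) * wiener).real

print("rms error, blurred :", np.sqrt(np.mean((blurred - scene) ** 2)))
print("rms error, restored:", np.sqrt(np.mean((restored - scene) ** 2)))
```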

    • Chromatic aberrations are probably going to be a mess. As soon as I saw the summary, I said to myself "they are probably using an absorptive layer to do the phase changes." I read it, and... yup... gold. This means it will respond very strongly to wavelength. However, if they are trying to make lenses for lasers, it will probably do just fine.
      • by Skapare ( 16644 )

        Maybe it could still work for a wider spectral range by stacking several layers tuned to various wavelengths. They are thin, so there could be space for a lot.

  • interestingly... (Score:5, Informative)

    by Tastecicles ( 1153671 ) on Sunday August 26, 2012 @11:43PM (#41133819)

    according to this [business-standard.com] report it's not a lens, but a diffraction grating.

    From linked article:

    "Our flat lens opens up a new type of technology. We're presenting a new way of making lenses. It's extremely exciting," says principal investigator Federico Capasso, professor of applied physics at the Harvard School of Engineering and Applied Sciences (SEAS).

    Sorry, matey, it ain't that new, it's just a new application of a well-established physical property. I do seem to remember using diffraction gratings to magnify light-bending effects at college in 1992 - specifically to fire an EM pulse at 450nm (near the blue part of the visible spectrum) through a sample and use a calibrated* diffraction grating to amplify the signal to a photographic plate.

    What you end up with, essentially, is a highly magnified image (on the order of millions of times) with very low distortion, with which you can determine the structure of the sample (be it a crystal lattice, e.g. graphite, or a double helix, e.g. DNA; each molecule has its own unique diffraction pattern). Generally you would use X-rays, as pretty much anything is at least partially transparent at that wavelength, but since we had to use visible light from a very low-powered lasing LED, we had to use visible-transparent samples. We got stuck with a quartz crystal. Still interesting physics, though, and some very pretty pictures.

    *calibrating a diffraction grating is very simple: all you do is make the spacing between the lines on the plate equal to the wavelength of the light you're using. For far blue, you'd use a 400nm grating; for red, a 700nm one. These are but two of several calibrated plates available.
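
For reference, the grating relation the parent is describing is d·sin(θ) = m·λ. A minimal sketch with the 450nm source mentioned above:

```python
import math

# Grating equation: d * sin(theta) = m * lam. With line spacing equal to
# the wavelength, only the m = 0 and m = +/-1 orders exist, and the first
# order leaves at 90 degrees.
lam = 450e-9  # the 450 nm near-blue source mentioned above

for d in (400e-9, 450e-9, 700e-9):
    s = lam / d  # sin(theta) for the first order
    if s <= 1:
        print(f"d = {d*1e9:.0f} nm: first order at {math.degrees(math.asin(s)):.1f} deg")
    else:
        print(f"d = {d*1e9:.0f} nm: no first-order beam (sin(theta) > 1)")
```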

    • by Mkoms ( 910273 )
      I apologize, but you are not correct. This is certainly not a diffraction grating. In a diffraction grating you are repeating a unit cell over and over (usually a thinner region, then a thicker one, and so forth) and using the fact that light scattered from each one of these regions will end up constructively interfering in some regions, destructively in others, etc. While I don't want to say that you can't use a diffraction grating to magnify an image (there are some approaches with some particularly des
    • by zalas ( 682627 )

      This new lens is not just a diffraction grating since it uses sub-wavelength structures to alter the apparent optical density (refractive index) of the material, i.e. metamaterials [wikipedia.org].

  • by cyn1c77 ( 928549 ) on Monday August 27, 2012 @12:07AM (#41133949)

    The lens is tuned to a single wavelength of light and was demonstrated with a laser.

    It's not apochromatic and not instantly useful to most lensing applications.

    The authors say that it could potentially be, in the future. But that often means "give us more funding."

    • by Skapare ( 16644 )

      But ... 3 of them could focus red, green, and blue lasers, and give us big screen images.

    • It might be possible to make a broader-wavelength lens by stacking layers tuned to different frequencies. I'd bet money on it.
  • by Yevoc ( 1389497 ) on Monday August 27, 2012 @12:44AM (#41134121) Homepage
    My colleagues work on the exact same gold-based nano-antennae used by this work. All of the nano-antennae on the lens' surface are basically arranged to absorb and re-transmit the incoming light into a near perfect spot. Because it uses metal on nanoscopic scales to manipulate light in a way other than pure reflection (like a mirror), it's in the field of plasmonics. (Below a certain frequency [of light] the electrons in a metal react like a plasma, hence the name.)

    Whenever we optical engineers hear about plasmonics, we internally roll our eyes, because metal almost always absorbs far too much light to be useful. Even tens of nanometers of penetration and/or propagation can extinguish almost all of the light. This essentially relegates the entire field to the realm of theoretical curiosities and nothing more. (This work uses 60nm-thick gold.)

    The authors of this paper admit that absorption is their biggest obstacle, as this lens only passes 10% of the incoming light. There are other issues for making this work a reality, but they pale in comparison to the classic brick wall you get when passing light through metal.
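
A rough feel for that brick wall, via Beer-Lambert attenuation I/I₀ = exp(−4πkt/λ) through a solid gold film; the extinction coefficient here is a ballpark literature value, not a number from the paper:

```python
import math

# Intensity through a metal film: I/I0 = exp(-4*pi*k*t/lam), where k is
# the extinction coefficient. k ~ 10 for gold near 1550 nm is a ballpark
# literature value, not a number from the paper.
lam = 1.55e-6
k = 10.0

for t in (20e-9, 60e-9, 100e-9):
    trans = math.exp(-4 * math.pi * k * t / lam)
    print(f"{t*1e9:.0f} nm of solid gold: {trans*100:.2f}% transmitted")

# A solid 60 nm film passes well under 1%; the lens is a sparse antenna
# array rather than a solid sheet, which is how it manages ~10%.
```
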
  • by Mkoms ( 910273 ) on Monday August 27, 2012 @01:27AM (#41134281)
    Hi everyone. I'm a co-author on the article, and I'd be happy to answer any questions you may have, though probably tomorrow. I'm hoping that this goes better than the last time I tried this (see here: http://slashdot.org/comments.pl?sid=1747464&cid=33185134 [slashdot.org]), where no questions were asked and most of the discussion centered around mildly funny jokes. I appreciate those as much as the next person, but if anyone likes, we can discuss science =].
    • Re: (Score:3, Interesting)

      by jeti ( 105266 )

      There's a type of lens called a Beugungslinse in German; I think the English term is diffraction lens. It is similar to a Fresnel lens, but the size of the structures is below the wavelength of visible light. What are the differences between these lenses, diffraction gratings, and the type of lens you're working on?

    • it sounds like you're etching a field of micro-lenses (why do you call them antennas??) - like grooves on a record, the angles of these grooves are tuned to have a slope which graduates across the array - directing the light in a way that finely matches a convex or concave pattern.

      if this is the case, it sounds like much more would be possible - perhaps you could go on to simulate custom glass geometries by modulating the algorithm by which you apply the angles and spacing using some sort of stipplegen??

      ver

    • For starters, how about supplying a link to a free preprint of your paper. Without that, all you are going to get are /. jokes, since there is no tech to discuss.
  • by FlyingGuy ( 989135 ) <flyingguy&gmail,com> on Monday August 27, 2012 @01:37AM (#41134313)

    and I don't mean those little lines on the globe.

    I have been an amateur photographer most of my life. The holy grail of photography, for me, has been to find film or techniques that bring film images as close as possible to the latitude of the human eye. In film-speak, the human eye can handle around 12 to 14 f-stops of range in a given lighted scene. Which is to say that the information your retina takes in (given a central point that has a lighted value of n) can be discriminated 12 to 14 f-stops darker or brighter.

    We have all experienced taking a picture of a brightly or darkly lit scene. Sunsets are a great example. As viewers we can enjoy a sunset and see all the detail (quite clearly) around us AND enjoy the sunset. This is one of the hardest, if not outright impossible, things to do with any camera, digital or otherwise, for the simple reason that to correctly expose for the sky (the sunset) we will always drastically underexpose everything else around us by a large factor.

    I think this can be solved with a digital camera, but not until computing speeds drastically increase, and not just by a little bit but by several orders of magnitude, since it would mean that each individual pixel would have to be processed and recorded for a sufficient amount of time to record the detail level in a still shot. So in a Nikon D5100 the sensor has ~16 million pixels. To obtain a shutter speed of 1/125 of a second (0.008 seconds), each pixel would have to be processed in about 500 picoseconds (0.008 s / 16 million = 5 x 10^-10 s), and of course faster shutter speeds... well, you get the point.
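
To put the numbers in one place, a minimal sketch (and note the reply below: real sensor readout is massively parallel, so the serial budget is a thought experiment):

```python
# Serial per-pixel time budget for a 16 MP frame at 1/125 s, plus the
# eye's claimed latitude expressed as a contrast ratio.
pixels = 16_000_000
exposure = 1 / 125                     # seconds

print(f"serial budget: {exposure / pixels * 1e12:.0f} ps per pixel")  # ~500 ps

eye_stops = 14                         # 12-14 f-stops claimed above
print(f"{eye_stops} stops = {2**eye_stops:,}:1 contrast ratio")       # 16,384:1
```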

    • by tibit ( 1762298 )

      The issue is not about processing speed, but about sensor manufacturing processes. Your 500 ps figure is right but meaningless: you don't have to serially "process" pixels as you imply; it'd be useless to do so.

      In current sensors, it's hard to put significant electronics behind every pixel without making the pixels less efficient. To have the ultimate sensor, you'd want something like an HP multislope A/D converter right behind every pixel, so that it wouldn't matter how much light falls on it -- it could

  • Speaking as both a photographer and a very near-sighted person, it would be really awesome if these new lenses were also free(r) from dispersion as compared with standard lenses. (Dispersion is responsible for colour separation in a prism. It also causes orange-yellow and blue-violet fringing in simple lenses such as eyeglasses. Cameras use compound lens elements in a clunky and expensive way to address this problem - see http://en.wikipedia.org/wiki/Achromatic_lens [wikipedia.org] )

    Cameras would be lighter and more accura

    • by tibit ( 1762298 )

      Those lenses are horrible at chromatic aberration, they are designed for monochromatic light, and there's no trivial way so far to change that. So no dice there.

  • Would this allow the creation of lasers that stay focused over a longer distance? That could allow easier very-long-range communication, or even power transmission over very large distances.
  • As someone who wears small telescopes on his face on a daily basis, can we please get this technology into eyeglasses?

    Sure I would still have to wear glasses, but at least they might weigh a bit less than my 1kilo "ultrathins" I currently endure.

    60nm sounds great! Though they might have to buffer them up a bit so I don't cut my face or nose off or something by mistake.

  • Of course there's a lot of detail missing from the article, but something that has to be said is that some of those "annoying" distortions that they talk about are in fact valuable. The ideal camera is assumed to have a projective transformation and no chromatic aberration. But a true projective transform has some undesirable characteristics. For example, assuming that the photograph will eventually be shown on a flat surface, there will be a 1/r^2 drop off in intensity because the angle of light is bein
  • Sounds like they actually invented something...
