Man Says CES Lidar's Laser Was So Powerful It Wrecked His Camera (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: A man attending this week's CES show in Las Vegas says that a lidar sensor from startup AEye has permanently damaged the sensor on his $1,998 Sony camera. Earlier this week, roboticist and entrepreneur Jit Ray Chowdhury snapped photos of a car at CES with AEye's lidar units on top. He discovered that every subsequent picture he took was marred by two bright purple spots, with horizontal and vertical lines emanating from them. "I noticed that all my pictures were having that spot," he told Ars by phone on Thursday evening. "I covered up the camera with the lens cap and the spots are there -- it's burned into the sensor." In an email to Ars Technica, AEye CEO Luis Dussan confirmed that AEye's lidars can cause damage to camera sensors -- though he stressed that they pose no danger to human eyes. "Cameras are up to 1000x more sensitive to lasers than eyeballs," Dussan wrote. "Occasionally, this can cause thermal damage to a camera's focal plane array." Chowdhury says that AEye has offered to buy him a new camera. The potential issue is that self-driving cars also rely on conventional cameras. "So if those lidars are not camera-safe, it won't just create a headache for people snapping pictures with handheld camera," reports Ars. "Lidar sensors could also damage the cameras on other self-driving cars."

"It's worth noting that companies like Alphabet's Waymo and GM's Cruise have been testing dozens of vehicles with lidar on public streets for more than a year," adds Ars. "People have taken many pictures of these cars, and as far as we know none of them have suffered camera damage. So most lidars being tested in public today do not seem to pose a significant risk to cameras."

Comments Filter:
  • by Anonymous Coward

    "Cameras are up to 1000x more sensitive to lasers than eyeballs"

    And what was the shutter speed he was using? Something like 1/500th of a second. If eyes really are 1000x less sensitive, then staring at the same beam for 2 seconds (1/500th of a second times 1000) would do just as much damage to an eye as the beam did to the camera.
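
    That arithmetic as a quick Python sketch; the 1/500 s shutter and the quoted "up to 1000x" ratio are assumptions from this thread, not measured values:

    # Dose comparison implied above. Assumed, not measured:
    shutter_s = 1 / 500        # exposure that damaged the camera
    sensitivity_ratio = 1000   # camera vs. eye damage threshold ("up to 1000x")

    # Time for an eye to soak up the same damaging dose from the same beam:
    eye_exposure_s = shutter_s * sensitivity_ratio
    print(f"equivalent eye exposure: {eye_exposure_s:.1f} s")  # -> 2.0 s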

    • by Anonymous Coward

      That's assuming it works like the mythical EM sensitivity. The reality here is that the mechanism of interaction is absent in human eyes. DSLR cameras use photodetector chips of silicon or more exotic materials, and the circuitry was damaged by energy in a part of the spectrum that human eyes are not sensitive enough to even detect.

      • by Anonymous Coward

        No, sorry. The article says thermal damage... and silicon is -less- sensitive to thermal damage than a retina.

        The mechanism is local heating, and anyone who uses lasers for anything knows that retina damage is not only possible, but likely. That is what a laser's class rating is about; it is directly related to the optical power output.

        • Depends on the wavelength of the laser. An Nd:YAG is much more likely to cause retinal damage than a CO2 laser of the same power, since your cornea and lens will pass 1064 nm IR easily but are much more opaque at 10600 nm, so you'll feel the heating/burning and blink or otherwise get out of the beam before it can do much (if any) damage to the retina. Of course, if it's a 5 MW steel cutter, you're just well and truly screwed.

    • by GrpA ( 691294 )

      Science isn't complete unless you consider every aspect, and determining the energy delivered to an FPA is about as easy as calculating turbulence.

      Easy enough to test, though. Take the camera and the car and take more pictures. If the car damaged the camera, then more damage will occur. If the car appeared to damage the camera initially but no further damage occurs, then the camera was already in the process of failing and just waiting for the right opportunity, but most likely it would mean, beyond reasonable doubt, that the came

    • by Z00L00K ( 682162 )

      Indoor photography would probably be at something like 1/25th to 1/125th of a second, depending on light conditions.

      • Only if you're using ISO 100-400 film. With modern DSLRs you can use much faster shutter speeds than that, particularly if it was a brightly-lit CES stage.
        • by Z00L00K ( 682162 )

          That depends on the quality of the pictures you want to get. Higher ISO means more noise and less dynamic range, which makes the pictures look flat and uninteresting.

    • by JoeyRox ( 2711699 ) on Saturday January 12, 2019 @04:30AM (#57949000)
      The camera in question is a Sony A7rII, which is a mirrorless camera. Such cameras constantly expose the sensor to light in the scene [while not taking photos], which is necessary to provide the video-like image stream used for the electronic viewfinder and LCD display.
    • by Anonymous Coward

      It is not true at all that the retina is less sensitive to light energy than a CMOS sensor. For example, try making a movie of a 50mW laser pointer beam going into the camera (not a good camera, obviously). It will take more than 20 seconds of exposure before damage appears on most sensors, if it appears at all. On the other hand, that same 50mW beam will cause a retina burn in a few tens of milliseconds.

      I think they are using a high-powered pulsed system at a wavelength that is not passed through the human eye, and so neve

    • by spth ( 5126797 )

      The lidar beam sweeps the area (after all, the lidar is meant to give an image of the surroundings, not just a distance measurement to a single point). I don't think a human could move quickly enough to keep the beam on their eye.

      I hope lidar systems are designed fail-safe enough to not keep emitting the beam when the mirror rotation fails.

  • by Anonymous Coward

    If this is powerful enough to damage a CMOS/CCD sensor then it is most certainly also doing damage to biological tissue in eyeballs.

    If this is doing "thermal damage" to CMOS/CCDs, essentially chunks of glass, then it is doing more damage to biological tissues.

    • The retina is submerged in a water bath while the camera sensor is surrounded by insulating air and plastic. The sensor may have a higher absolute temperature rating but it can't dissipate heat nearly as well as your retina.

      • by Bongo ( 13261 )

        Retina is normally submerged, but not when you've just had a vitrectomy and are walking around empty.

      • by eclectro ( 227083 ) on Saturday January 12, 2019 @12:41AM (#57948618)

        Stop downplaying the dangers of laser technology. Any coherent radiation hitting the eye should be considered very dangerous. Even cheap laser pointers have a yellow caution sticker on them!

        Here is a story about lasers blinding concert goers in Russia. [reuters.com]

        • I'm not trying to downplay the danger of lasers in general; as the saying goes, do not stare into laser with remaining eye.

          But it is also true that localized heating will damage most camera sensors faster than the human retina. Even an 'eye-safe' laser can leave spots and streaks on CMOS and CCDs.

        • I had to remove a ")" character from the end of that link to read it, but it was fascinating. Thanks for sharing it!
      • It's much more complicated than that. The silicon substrate of the image sensor has a thermal conductivity over 100x higher than that of water or human tissue, which more than compensates for the smaller volume of material. But a high-quality camera lens can focus a laser beam much more tightly than the eye's lens, at least for visible light. And the aperture of a camera lens (on a DSLR or mirrorless camera) is larger than that of the human eye, so it may collect more laser power, depending on the beam diameter.

    • by tlhIngan ( 30335 )

      If this is powerful enough to damage a CMOS/CCD sensor then it is most certainly also doing damage to biological tissue in eyeballs.

      If this is doing "thermal damage" to CMOS/CCDs, essentially chunks of glass, then it is doing more damage to biological tissues.

      It depends on the laser. The common wavelengths used for LIDAR are 800nm and 1550nm. 800nm can injure the eye, so its use in LIDARs and the like, where it might hit an eye accidentally, means its power is limited to prevent damage.

      The thing with 1550nm is that it can't make

  • by Anonymous Coward
    Cool! I wonder if a powerful IR laser would also zorch speed cameras and/or surveillance cameras.
  • by SlaveToTheGrind ( 546262 ) on Friday January 11, 2019 @10:51PM (#57948346)

    "People have taken many pictures of these cars, and as far as we know none of them have suffered camera damage. So most lidars being tested in public today do not seem to pose a significant risk to cameras."

    Or maybe, just maybe, this was one of the few instances where (1) camera damage happened; (2) the camera owner realized the damage must have been due to snapping a picture of a self-driving car; and (3) the camera owner knew who owned the self-driving car so they could complain?

    • More likely these guys dialed it up to 11 for CES demos and got caught.

      • by jrumney ( 197329 )
        Or, the guy deliberately pointed his camera directly at the laser at short range hoping to see it.
    • by Dayze!Confused ( 717774 ) <(slashdot.org) (at) (ohyonghao.com)> on Friday January 11, 2019 @11:10PM (#57948386) Homepage Journal

      It's pretty obvious which picture it is when 1-100 are good, picture 101 has two dots and a self driving car, and pictures 102 and beyond have the dots too. Anyone with a camera who notices that problem would go back through their photos and find 101. And since self driving cars aren't being sold to the masses yet, it would be easy to track down who owns it; it's not one of millions of individuals, it's one of a handful of companies licensed to operate them.

      • picture 101 has two dots

        Assuming every incidence of lidar damage to a digital camera is this extreme and well-defined. That seems like it would have a lot to do with the angle the light hit the lens, the distance from the car, the focal length of the beam, and so on.

        and a self driving car

        Which apparently the average Joe would immediately deduce from the big banner reading "SELF DRIVING" on the side?

      • by Kjella ( 173770 )

        Well, since it's a mirrorless camera the sensor doesn't have to be damaged by taking a picture; it's constantly exposing. So if the camera is, say, on a tripod pointing towards an intersection and you're busy fiddling with some camera settings, it could easily be that this car stops at the intersection, damages your sensor - I assume it takes more than one burst, with the laser pounding the same spot - and drives on before you take a photo. Heck, if you're wrapping up your shoot you might not even notice unti

  • Responsibility. (Score:2, Interesting)

    by msauve ( 701917 )
    A camera should withstand being pointed at the sun. If something puts out more power and damages a camera, shame (and liability) on them.

    Broadcasting light interference is no different than broadcasting radio interference (in terms of responsibility, not physics).
    • Re:Responsibility. (Score:4, Informative)

      by Anonymous Coward on Friday January 11, 2019 @11:11PM (#57948390)

      Eyes don't withstand being pointed at the sun for very long. Neither do most cameras.

    • by Anonymous Coward

      A camera should withstand being pointed at the sun. If something puts out more power and damages a camera, shame (and liability) on them.
       

      In the good old days, if you pointed any reflex camera, in fact any camera, directly at the sun, it would burn the camera's shutter. What makes you think that pointing a digital reflex directly at the sun wouldn't damage the sensor?

    • A camera should withstand being pointed at the sun.

      a) Why? It's not what they are designed for.
      b) The sun happily damages cameras too; it all depends on exposure.

      Broadcasting light interference is no different than broadcasting radio interference (in terms of responsibility, not physics).

      Which is why we have part 15 of the FCC rules. You will accept the interference and you will like it.

      • Part 15 applies to narrow bands of the spectrum. Small 'free for all' bands. Said equipment MUST only radiate within that band. Equipment that produces harmonic radiation outside that band is prohibited.

        • by thogard ( 43403 )

          Lasers are controlled in the US by 21 CFR 1040 and have been for a very long time. There is an odd loophole: if your laser passes through a lens that spreads the beam so that, at the laser's exit, it is larger than an eyeball, the laser can deliver far more power, even though the beam is much smaller far away. Some early traffic speed lasers took advantage of that. The ANSI standard has the same problem. They should require testing both at the laser's exit and at 100 meters.

        • Part 15 applies to narrow bands of the spectrum.

          It does not. Several subparts of Part 15 apply to broad spectrum and to unlicensed spectrum. Your post is only correct for some clauses of Part 15.

          Some parts definitely do apply to narrow bands of the spectrum. But quite critically, Part 15 in the general case of a non-intentional radiator (read: consumer electronics) specifically says a device must accept interference, without any specification of that interference's spectrum. So the idea of having the FCC regulate this the same way as they would the radio spectrum w

  • Not just cameras (Score:5, Interesting)

    by ColaMan ( 37550 ) on Friday January 11, 2019 @11:03PM (#57948368) Journal

    When I use multiple LIDARs on a machine, their beam sweeps have to be synchronised, otherwise the reflections of one beam can interfere with the other.

    I'm waiting to see what happens with a freeway full of cars with LIDARs, all flinging their beams at each other willy-nilly with direct beams and reflections all over the place. If you're unlucky you'll get a beam from another vehicle just after yours has sent a pulse out - resulting in a false return showing something right in front of you.

    I'm guessing that most of the time, with enough units around, all you'd get is the equivalent of "static" on your laser sweeps, where you briefly get invalid results for a few degrees of sweep. If you're really unlucky, you blind your sensor temporarily (bad) or damage it permanently (bad and expensive).

    • Your "static" scenario is exactly what will happen with AM pulsed lidar. It's one of the (many) dirty secrets of the lidar industry: AM lidar doesn't scale. There are alternative ways to do lidar (FMCW, for one) that can scale much, much better, but these kinds of systems are still a bit expensive and so won't see widespread adoption for a few years.

    • Re:Not just cameras (Score:5, Interesting)

      by Solandri ( 704621 ) on Saturday January 12, 2019 @12:43AM (#57948620)
      Radar and sonar overcome this problem by constantly varying the frequency [wikipedia.org] in a series of chirps [wikipedia.org]. It's highly unlikely that another radar/sonar is transmitting at the same frequency at nearly the same time. And even if there is, it's unlikely to be varying its frequency at the same rate or over the same range.

      Another advantage of this is that you don't need as strong a sweep signal. With a single frequency, you're emitting a pulse, then waiting for the reflections of that pulse. To avoid spurious noise from another source being interpreted as a reflection, your pulse has to be high-power (basically, the reflected signal has to be stronger than any noise). 1000 to 5000 Watts was typical for boat radars using pulsed beams. But when you vary the frequency, you can compare reflections at one frequency with subsequent reflections at a different frequency (there's no need to wait for return reflections; subsequent pulses will not interfere with previous ones, so they can be sent before reflections from previous pulses arrive). Noise will show up at just one frequency, making it easy to spot and trivial to filter out. Consequently, newer frequency-sweeping boat radars only need to emit a few tens of Watts.

      That said, the parking sensors in your car use this frequency-varying sonar, and I've noticed other cars' parking sensors trigger mine about once a day. So some more work needs to be done on standardizing frequency sweeps and noise filtering to reduce signal collisions. But the problem is not as insurmountable as you'd think from your LIDAR experience.
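
      A minimal numpy sketch of that matched-filter idea; every number here (sample rate, sweep range, delay, amplitudes) is invented for illustration:

      import numpy as np

      fs = 1_000_000                    # sample rate, Hz (made up)
      t = np.arange(0, 1e-3, 1 / fs)    # 1 ms of samples
      f0, f1 = 50e3, 150e3              # chirp sweeps 50 -> 150 kHz
      tx = np.sin(2 * np.pi * (f0 + (f1 - f0) * t / (2 * t[-1])) * t)

      rng = np.random.default_rng(0)
      rx = np.zeros_like(t)
      delay = int(200e-6 * fs)          # our echo comes back 200 us later
      rx[delay:] += 0.3 * tx[:tx.size - delay]
      rx += 0.5 * np.sin(2 * np.pi * 100e3 * t)  # someone else's fixed-frequency pulse
      rx += 0.2 * rng.standard_normal(t.size)    # receiver noise

      # Matched filter: our chirp echo stacks into one sharp peak, while the
      # fixed-frequency interferer smears out and stays low.
      corr = np.correlate(rx, tx, mode="full")[t.size - 1:]
      print(f"echo delay: {corr.argmax() / fs * 1e6:.0f} us")  # ~200 us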
      • by ColaMan ( 37550 )

        I didn't say it was insurmountable, but as another poster has pointed out, there are very few LIDARs on the market right now that modulate their beam. Unlike radar, it's difficult to vary the actual optical frequency, since these are diode lasers and the wavelength is generally fixed. So we're kind of stuck with just beam modulation, unless we want to do something fancy like driving multiple lasers, which gets tricky when you're all sharing the same optical path.

        LIDARs also have the difficulty (or advantage, depending on which way yo

        • Why use expensive Lidar when stereo vision will also provide good depth perception? Two cameras pointing in the same direction; line up the dots and do the trig.

          That is a genuine question. I wonder why the focus on Lidar.
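
          For reference, the trig in question reduces to one line; the focal length and baseline below are made-up numbers, not from any real rig:

          # depth = focal_length * baseline / disparity (pinhole stereo model)
          focal_px = 1400.0    # focal length in pixels (assumed)
          baseline_m = 0.30    # spacing between the two cameras (assumed)

          def depth_from_disparity(disparity_px: float) -> float:
              """Distance to a point whose images are disparity_px apart."""
              return focal_px * baseline_m / disparity_px

          print(depth_from_disparity(21.0))  # -> 20.0 m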

          • A good lidar system doesn't just provide depth perception, it provides a centimeter-ish (blah, blah, Gaussian) accurate 3D model of the world. It's actually pretty easy to take the point cloud from a lidar and, as long as it's accurately timestamped (PTP works well), synchronize it with GPS and an IMU and get a very accurate 3D model of your surroundings. Some flavors of lidar also provide doppler on every point in the point cloud. You aren't comparing frames against each other to determine if something is

            • by Anonymous Coward

              So how is a "3D model" different from "depth perception"? You want to know how far away things are so that you can build a 3D model. Both passive stereo vision and Lidar do that.

              Incidentally, there is an intermediate approach with active stereo: you use a laser next to the camera and measure the offset rather than the timing, which should be cheaper.

              All three address the same problem. The question is, why do people prefer Lidar.

              • by tepples ( 727027 )

                Perhaps they're making do with time of flight lidar while waiting for active stereo patents to expire. Or they don't want resolution to decrease dramatically for faraway objects.

      • You are 100% correct, but you are describing frequency modulation. The vast majority of lidar companies (on the order of 99%) are producing amplitude-modulation systems that shoot out strong amplitude signals and cross their fingers hoping they can see and distinguish them when they come back. These systems often use "avalanche detectors" to improve their probability of return detection (look it up, it's insane). Driving directly towards a sunset can literally cause these systems to emit their magic smoke. Acc

        • by Anonymous Coward

          Avalanche detectors are not damaged by too-high light input (well, within reason), assuming competent sensor readout circuit design. Otherwise they would fail when imaging a diffuse mirror :).

    • by jrumney ( 197329 )

      I'm waiting to see what happens with a freeway full of cars with LIDARs, all flinging their beams at each other willy-nilly with direct beams and reflections all over the place. If you're unlucky you'll get a beam from another vehicle just after yours has sent a pulse out - resulting in a false return showing something right in front of you.

      If only someone could come up with a way of encoding signals so that a desired signal can be distinguished amongst a mass of other signals using the same wavelength.

      • Yes, that is the challenge with laser modulation. It has significant limits that mean it won't scale very well. Others have explained it well in other comments in this discussion.

        • by jrumney ( 197329 )
          It's a solved challenge. You're probably thinking that many of the solutions from current digital radio technology are not directly usable because they rely on spread spectrum to further increase noise immunity and thus the bandwidth that can be carried, but there are still solutions in there that can be applied in fixed-wavelength scenarios.
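
          One classic fixed-wavelength option is code-division: each unit fires its own pseudo-random pulse code and correlates against that code on receive. A toy sketch, all values invented:

          import numpy as np

          rng = np.random.default_rng(42)
          mine = rng.choice([-1.0, 1.0], size=127)    # my pulse code
          theirs = rng.choice([-1.0, 1.0], size=127)  # another unit's code

          # Channel: my echo at lag 40, their transmission at lag 10, plus noise.
          rx = np.zeros(300)
          rx[40:40 + 127] += 0.5 * mine
          rx[10:10 + 127] += 1.0 * theirs
          rx += 0.3 * rng.standard_normal(300)

          # Correlating against my own code picks out my echo and suppresses
          # the other unit's signal, even though it is stronger.
          corr = np.correlate(rx, mine, mode="valid")
          print(f"my echo at lag {corr.argmax()}")  # -> 40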
  • by Arzaboa ( 2804779 ) on Friday January 11, 2019 @11:26PM (#57948432)

    All I hear is that there will now be some "smart" people trying to outsmart new tech and cameras by firing lasers that are not strong enough to poke your eyes out, but strong enough to burn the retina of the machine.

    --
    When you have confidence, you can have a lot of fun. And when you have fun, you can do amazing things. - Joe Namath

    • by AmiMoJo ( 196126 )

      This has been happening for ages already. You can buy high power laser pointers and even higher power laser modules online for very little money, and they are more than capable of destroying CCTV cameras and the like. The tricky part is doing it without being caught on the camera itself.

  • We need to develop passive sensors. I mean, we have them, we're just not using them.

  • by Vadim Makarov ( 529622 ) <makarov@vad1.com> on Saturday January 12, 2019 @12:05AM (#57948526) Homepage

    As the original article [arstechnica.com] duly explains, the laser light at the 1550 nm wavelength used by this lidar scanner does NOT reach the retina of the eye. At this wavelength, the light is fully absorbed in the outer parts of the eye (cornea, lens, etc.) before it can be focused into a tight spot on the retina. This makes the wavelength (relatively) eye-safe compared to visible and some other wavelength ranges. There is no such protection for the camera, however, whose glass optics happily focus 1550 nm into a small spot... so sensor damage may happen.

    Laser safety regulations are primarily concerned with (a) no damage to humans, especially their eyes, and (b) laser beams not setting things on fire. Neither of these happened in this incident. So we are good.

    If you are interested in technical details of laser safety, read ANSI Z136.1 standard. Warning: it requires technical expertise.

    • The filter built into a DSLR would block light of that wavelength. That's the reason I had it removed for taking astrophotos: to get maximum red transmission. With it in place, transmission of wavelengths greater than 600nm is significantly reduced and close to zero by 1100nm. 1550nm is well into the IR band, and the longer the wavelength, the less energetic the photons. Also, the camera lens wouldn't focus it: glass lenses are like prisms and refract different wavelengths of light differently. Camera lenses and
      • If the camera was focused for visible light, IR would be out of focus.

        ...unless the camera was focused on something else at another distance, or simply moved out of focus, and the 1550 nm image accidentally came into focus.

        Also, as I've written in another comment, the filter doesn't have to be effective at 1550 nm, because the Si sensor itself is insensitive that far into the IR. This depends on the technology used to make the filter. For example, an interference-type spectral filter may perform very well in its designed wavelength range, but simply becomes transparent outside of i

      • by mark-t ( 151149 )

        The filter built into a DSLR would block light of that wavelength. That's the reason I had it removed for taking astrophotos: to get maximum red transmission.

        One is compelled to wonder why one would use a camera that has been custom modified to be sensitive to frequencies needed for astronomical observation to take pictures of passing automobiles.

    • Retinal damage is not the only ocular risk.

      https://velodynelidar.com/newsroom/guide-to-lidar-wavelengths/ [velodynelidar.com]

      1550 nm systems use a wavelength that is allowed to run at higher power than 905 nm. However, under certain conditions, the 1550 nm wavelength of light can still cause corneal damage and potential damage to the eye's lens [lbl.gov].

      • by Vadim Makarov ( 529622 ) <makarov@vad1.com> on Saturday January 12, 2019 @01:38AM (#57948720) Homepage

        Maximum permissible exposure (MPE) in the 1500-1800 nm band is the same for the eye and the skin. For continuous-wave light it is 0.1 W/cm2; for pulsed light it is 1 J/cm2. Reference: ANSI Z136.1, Tables 5a and 7.

        In other words, if the 1550 nm laser beam is not burning your skin, it is safe for your eye.

        This is remarkably untrue at other wavelengths, where light is dramatically more dangerous to the eye than it is to the skin.
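
        A trivial sanity check against that CW limit (the beam power and diameter are invented examples, not AEye's specs):

        import math

        mpe_w_per_cm2 = 0.1  # CW MPE, 1500-1800 nm, per ANSI Z136.1 (as quoted above)

        def irradiance_w_per_cm2(power_w: float, beam_diameter_cm: float) -> float:
            """Average irradiance of a beam over its circular cross-section."""
            area_cm2 = math.pi * (beam_diameter_cm / 2) ** 2
            return power_w / area_cm2

        beam = irradiance_w_per_cm2(power_w=0.05, beam_diameter_cm=1.0)
        print(f"{beam:.3f} W/cm^2:", "OK" if beam <= mpe_w_per_cm2 else "over MPE")
        # -> 0.064 W/cm^2: OK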

        • This is remarkably untrue at other wavelengths, where light is dramatically more dangerous to the eye than it is to the skin.

          For sure. You can let an unfocused 100 watt Nd:YAG beam fall on your hand and you might feel a tiny bit of warmth. Letting that same beam hit you in the eyes is a different story.

  • by Flexagon ( 740643 ) on Saturday January 12, 2019 @02:52AM (#57948840)
    So, how long will the various municipalities' automatic red light ticketing cameras last with this?
  • As someone who has aimed a laser at a CMOS sensor for hours at a time, I would want to check whether this laser was over the legal limit. Laser shined at a CMOS sensor for hours: https://www.youtube.com/watch?... [youtube.com] I have seen videos of laser strikes ruining sensors, but often at laser light shows, which might also be over the legal limit.
  • Manufacturer designs camera-killing laser to be mounted on the front of vehicles which have cameras on their rears. What could possibly go wrong?

  • by zmooc ( 33175 ) <zmooc@zmooc.DEGASnet minus painter> on Saturday January 12, 2019 @09:11AM (#57949402) Homepage

    Sony does not have a good security track record so this does not come as a surprise to me.

    OWASP Secure Coding Practices Checklist section 1 about input validation was clearly not applied at all. Specifically, they failed to implement "Validate data range" :p

  • Also, what effect would that LIDAR have on surveillance cameras? I assume they could be fitted with filters, provided their IR operations are not affected.
  • "Do not look into LASER with remaining eye / camera."
