Man Says CES Lidar's Laser Was So Powerful It Wrecked His Camera (arstechnica.com) 129
An anonymous reader quotes a report from Ars Technica: A man attending this week's CES show in Las Vegas says that a lidar sensor from startup AEye has permanently damaged the sensor on his $1,998 Sony camera. Earlier this week, roboticist and entrepreneur Jit Ray Chowdhury snapped photos of a car at CES with AEye's lidar units on top. He discovered that every subsequent picture he took was marred by two bright purple spots, with horizontal and vertical lines emanating from them. "I noticed that all my pictures were having that spot," he told Ars by phone on Thursday evening. "I covered up the camera with the lens cap and the spots are there -- it's burned into the sensor." In an email to Ars Technica, AEye CEO Luis Dussan confirmed that AEye's lidars can cause damage to camera sensors -- though he stressed that they pose no danger to human eyes. "Cameras are up to 1000x more sensitive to lasers than eyeballs," Dussan wrote. "Occasionally, this can cause thermal damage to a camera's focal plane array." Chowdhury says that AEye has offered to buy him a new camera. The potential issue is that self-driving cars also rely on conventional cameras. "So if those lidars are not camera-safe, it won't just create a headache for people snapping pictures with handheld camera," reports Ars. "Lidar sensors could also damage the cameras on other self-driving cars."
"It's worth noting that companies like Alphabet's Waymo and GM's Cruise have been testing dozens of vehicles with lidar on public streets for more than a year," adds Ars. "People have taken many pictures of these cars, and as far as we know none of them have suffered camera damage. So most lidars being tested in public today do not seem to pose a significant risk to cameras."
"It's worth noting that companies like Alphabet's Waymo and GM's Cruise have been testing dozens of vehicles with lidar on public streets for more than a year," adds Ars. "People have taken many pictures of these cars, and as far as we know none of them have suffered camera damage. So most lidars being tested in public today do not seem to pose a significant risk to cameras."
Lies pushed by big Optometry. (Score:2, Insightful)
"Cameras are up to 1000x more sensitive to lasers than eyeballs"
And what was the shutter speed he was using? Something like 1/500th of a second. If the laser did that damage to the camera in 1/500 s, and eyes are 1000x less sensitive, then staring at the beam for 1000 x (1/500) = 2 seconds would do just as much damage to an eye.
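The parent's back-of-envelope math can be made explicit. A sketch, taking the CEO's "up to 1000x" figure and a 1/500 s shutter at face value (both are assumptions, not measurements):

```python
# Back-of-envelope: if a 1/500 s exposure damaged the sensor, how long
# would an eye (assumed 1000x less sensitive) need to absorb the same
# relative dose? Pure illustration of the comment's arithmetic.
shutter_s = 1 / 500          # assumed shutter speed
sensitivity_ratio = 1000     # the "up to 1000x" claim, taken at face value

eye_equivalent_s = shutter_s * sensitivity_ratio
print(eye_equivalent_s)      # 2.0 seconds of staring for the same dose
```

Of course this assumes damage scales linearly with total deposited energy, which is exactly the assumption the replies below dispute.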
Re: (Score:1)
That's assuming it works like the mythical EM sensitivity. The reality here is that the mechanism of interaction is absent in human eyes. DSLR cameras use silicon (or more exotic) light-detector chips, and the circuitry was damaged by energy in a part of the spectrum that human eyes are not sensitive enough to even detect.
Re: Lies pushed by big Optometry. (Score:1)
No, sorry. The article says thermal damage... and silicon is -less- sensitive to thermal damage than a retina.
The mechanism is local heating, and anyone that uses lasers for anything knows that retina damage is not only possible, but likely. That is what the type rating on a laser is about. Directly related to the optical power output.
Re: (Score:1)
dude seriously what in the fuck, does everyone have a UID >5M now?
Re: (Score:2)
Depends on the wavelength of the laser. A Nd:YAG is much more likely to cause retinal damage than a CO2 of the same power, since your cornea and lens will pass 1064 nm IR easily, but are much more opaque to 10600 nm, so you'll feel the heating/burning and blink or otherwise get out of the beam before it can do much (if any) damage to the retina. Of course, if it's a 5 MW steel cutter, you're just well and truly screwed.
Re: (Score:2)
Science isn't complete unless you consider every aspect, and determining the energy deposited on an FPA is about as easy as calculating turbulence.
Easy enough to test, though. Take the camera and the car and take more pictures. If the car damaged the camera, then more damage will occur. If the car did damage the camera initially but no more damage occurs, then maybe the camera was already in the process of failing and just waiting for the right opportunity -- but most likely it would mean, beyond reasonable doubt, that the camera…
Re: (Score:2)
Indoors photography would probably be like 1/25 to 1/125th of a second depending on light conditions.
Re: (Score:2)
Re: (Score:2)
That depends on the quality of the pictures you want to get. Higher ISO means more noise and less dynamic range, which makes the pictures look flat and uninteresting.
Shutter speed doesn't matter for mirrorless camera (Score:5, Informative)
Re: (Score:1)
It is not true at all that the retina is less sensitive to light energy compared to a cmos sensor. For example try making a movie of a 50mW laser pointer beam going into the camera (not a good one obviously). It will take more than 20 seconds of exposure for most sensors before damage appears, if it does at all. On the other hand, this 50mW beam will cause a retina burn in a few tens of ms.
I think they are using a high powered pulse system on a frequency that is not passed through the human eye, and so never…
Re: (Score:2)
The lidar beam sweeps the area (after all the lidar is meant to give an image of the surroundings, not just a distance measure to a single point). I don't think a human could move quickly enough to keep the beam on their eye.
I hope lidar systems are designed fail-safe enough to not keep emitting the beam when the mirror rotation fails.
Re: (Score:2)
Bullshit about eye safety. (Score:2, Interesting)
If this is powerful enough to damage a CMOS/CCD sensor then it is most certainly also doing damage to biological tissue in eyeballs.
If this is doing "thermal damage" to CMOS/CCDs, essentially chunks of glass, then it is doing more damage to biological tissues.
Re: (Score:3)
The retina is submerged in a water bath while the camera sensor is surrounded by insulating air and plastic. The sensor may have a higher absolute temperature rating but it can't dissipate heat nearly as well as your retina.
Re: (Score:2)
Retina is normally submerged, but not when you've just had a vitrectomy and are walking around empty.
Re:Bullshit about eye safety. (Score:5, Informative)
Stop downplaying the dangers of laser technology. Any coherent radiation hitting the eye should be considered very dangerous. Even cheap laser pointers have a yellow caution sticker on them!
Here is a story about lasers blinding concert goers in Russia. [reuters.com]
Re: (Score:2)
I'm not trying to downplay the danger of lasers in general; as the saying goes, do not stare into laser with remaining eye.
But it is also true that localized heating will damage most camera sensors faster than the human retina. Even an 'eye-safe' laser can leave spots and streaks on CMOS and CCDs.
Re: Bullshit about eye safety. (Score:2)
Re: (Score:1)
If you've never gotten your foreskin stuck in your zipper you've missed a new threshold of suffering. I once zipped an area of it INTO the zipper. The only way to stop the pain was to zip BACK over it to get it free.
Re: (Score:2)
No shit right. After the first time that happened I decided I needed protection.
Re: (Score:2)
Re: (Score:2)
It's much more complicated than that. The silicon substrate of the image sensor has a thermal conductivity that is over 100x higher than that of water and human tissue, which more than compensates for the smaller volume of material. But a high-quality camera lens can focus a laser beam much better than the eye's lens, at least for visible light. The camera lens aperture (for a DSLR or mirrorless camera) is larger than that of the human eye, so it may collect more laser power, depending on the beam diameter.
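The conductivity point can be sharpened into thermal diffusivity, which governs how fast a hot spot spreads out. A sketch using rough room-temperature textbook constants (not measured values for any particular sensor or tissue):

```python
# Rough comparison of how fast heat spreads in silicon vs. water
# (water standing in for the retina's surroundings).
# Thermal diffusivity alpha = k / (rho * c); approximate textbook values.
def diffusivity(k, rho, c):
    """k: W/(m*K), rho: kg/m^3, c: J/(kg*K) -> alpha in m^2/s"""
    return k / (rho * c)

alpha_si    = diffusivity(k=150.0, rho=2330.0, c=700.0)   # crystalline Si
alpha_water = diffusivity(k=0.6,   rho=1000.0, c=4186.0)  # liquid water

print(f"silicon: {alpha_si:.2e} m^2/s")    # ~9e-5
print(f"water:   {alpha_water:.2e} m^2/s") # ~1.4e-7
print(f"ratio:   {alpha_si / alpha_water:.0f}x")  # silicon spreads heat faster
```

By this crude measure silicon diffuses heat a few hundred times faster than water, supporting the "more complicated than that" point: conductivity favors the sensor even if bulk heat removal favors the eye.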
Re: (Score:3)
It depends on the laser. Common wavelengths used for LIDAR are 800 nm and 1550 nm. 800 nm can injure the eye, so its use in LIDARs and such, where it might hit an eye accidentally, means its power is limited to prevent damage.
The thing with 1550nm is that it can't make…
Cool! (Score:1)
"as far as we know" (Score:5, Insightful)
"People have taken many pictures of these cars, and as far as we know none of them have suffered camera damage. So most lidars being tested in public today do not seem to pose a significant risk to cameras."
Or maybe, just maybe, this was one of the few instances where (1) camera damage happened; (2) the camera owner realized the damage must have been due to snapping a picture of a self-driving car; and (3) the camera owner knew who owned the self-driving car so they could complain?
Re: (Score:2)
More likely these guys dialed it up to 11 for CES demos and got caught.
Re: (Score:2)
Re: (Score:2)
Re:"as far as we know" (Score:4, Interesting)
It's pretty obvious which picture it is when 1-100 are good, picture 101 has two dots and a self-driving car, and pictures 102 and beyond have the dots too. Anyone with a camera who notices the problem would go back through their photos and find 101. And since self-driving cars aren't being sold to the masses yet, it would be easy to track down who owns it; it's not one of millions of individuals, it's one of a handful of companies licensed to operate them.
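The "go back through the photos and find 101" step is just a first-true binary search, since the damage is permanent once it appears. A sketch with a made-up `is_damaged` predicate (a real one might look for the fixed purple spots):

```python
# Find the first damaged photo in a chronological series, assuming
# damage is permanent (good, good, ..., bad, bad). O(log n) checks
# instead of eyeballing every frame.
def first_damaged(photos, is_damaged):
    lo, hi = 0, len(photos)          # answer lies in [lo, hi]
    while lo < hi:
        mid = (lo + hi) // 2
        if is_damaged(photos[mid]):
            hi = mid                 # damage started at mid or earlier
        else:
            lo = mid + 1             # damage started after mid
    return lo                        # index of first damaged photo

photos = ["ok"] * 100 + ["spots"] * 20
print(first_damaged(photos, lambda p: p == "spots"))  # -> 100 (i.e. "picture 101")
```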
Re: (Score:2)
picture 101 has two dots
Assuming every incidence of lidar damage to a digital camera is this extreme and well-defined. That seems like it would have a lot to do with the angle the light hit the lens, the distance from the car, the focal length of the beam, and so on.
and a self driving car
Which apparently the average Joe would immediately deduce from the big banner reading "SELF DRIVING" on the side?
Re: (Score:2)
Well, since it's a mirrorless camera the sensor doesn't have to be damaged by taking a picture; it's constantly exposing. So if the camera is, say, on a tripod pointing towards an intersection and you're busy fiddling with some camera settings, it could easily be that this car stops at the intersection, damages your sensor -- I assume it takes more than one burst, with the laser pounding the same spot -- and drives on before you take a photo. Heck, if you're wrapping up your shoot you might not even notice until…
Responsibility. (Score:2, Interesting)
Broadcasting light interference is no different than broadcasting radio interference (in terms of responsibility, not physics).
Re:Responsibility. (Score:4, Informative)
Eyes don't withstand being pointed at the sun for very long. Neither do most cameras.
Re: (Score:1)
A camera should withstand being pointed at the sun. If something puts out more power and damages a camera, shame (and liability) on them.
In the good old days, if you pointed any reflex camera (in fact, any camera) directly at the sun, it would burn the camera's shutter. What makes you think that pointing a digital reflex directly at the sun wouldn't damage the sensor?
Re: (Score:3)
A camera should withstand being pointed at the sun.
a) why? It's not what they are designed for.
b) sun happily damages cameras too, it all depends on exposure.
Broadcasting light interference is no different than broadcasting radio interference (in terms of responsibility, not physics).
Which is why we have part 15 of the FCC rules. You will accept the interference and you will like it.
Re: (Score:1)
Part 15 applies to narrow bands of the spectrum. Small 'free for all' bands. Said equipment MUST only radiate within that band. Equipment that produces harmonic radiation outside that band is prohibited.
Re: (Score:3)
Lasers are controlled in the US by 21 CFR 1040 and have been for a very long time. There is an odd loophole: if you have your laser hit a lens that spreads out the beam so that it is larger than an eyeball at the exit aperture, it can deliver far more power even if the beam is much smaller far away. Some early traffic speed lasers took advantage of that. The ANSI standard has the same problem. They should require the test both at the laser's exit and 100 meters away.
Re: (Score:2)
Part 15 applies to narrow bands of the spectrum.
It does not. Several subparts to part 15 apply to broad spectrum, and unlicensed spectrum. Your post is only correct for some clauses of Part 15.
Some parts definitely do apply to narrow bands of the spectrum. But quite critically, Part 15 in the general case for a non-intentional radiator (read: consumer electronics) specifically says it must accept interference, without any specification of the interference's spectrum. So the idea of having the FCC regulate this the same way as they would the radio spectrum…
Not just cameras (Score:5, Interesting)
When I use multiple LIDARs on a machine, their beam sweeps have to be synchronised; otherwise the reflections of one beam can interfere with the other.
I'm waiting to see what happens with a freeway full of cars with LIDARs, all flinging their beams at each other willy-nilly with direct beams and reflections all over the place. If you're unlucky you'll get a beam from another vehicle just after yours has sent a pulse out - resulting in a false return showing something right in front of you.
I'm guessing that most of the time with enough units around you all you'd get is the equivalent of "static" on your laser sweeps, where you briefly get invalid results for a few degrees of sweep. If you're really unlucky, you blind your sensor, temporarily (bad), or permanently damage it (bad and expensive).
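The false-return scenario above is simple to quantify: a naive time-of-flight lidar converts any pulse arrival into a range via c*dt/2, whoever actually emitted it. A sketch (the 100 ns figure is an illustrative assumption, not a real measurement):

```python
# If another car's pulse arrives dt seconds after ours went out, a naive
# time-of-flight lidar interprets it as a reflection from range c*dt/2.
# Illustration of the false-return scenario, not any vendor's firmware.
C = 299_792_458.0  # speed of light, m/s

def apparent_range_m(dt_s):
    return C * dt_s / 2.0

# A stray pulse landing 100 ns after our own emission looks like an
# obstacle about 15 m ahead -- right in the middle of the road.
print(apparent_range_m(100e-9))  # ~14.99 m
```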
Re: (Score:1)
LIDARs that are in close proximity to each other - you mean like on a freeway?
Re: (Score:3)
Your "static" scenario is exactly what will happen with AM pulsed lidar. It's (one of many) dirty secrets of the lidar industry: AM lidar doesn't scale. There are alternative ways to do lidar (FMCW for one) that can scale much, much better but, these kinds of systems are still a bit expensive and so won't see widespread adoption for a few years.
Re: (Score:3)
People in the industry know. Right now self-driving cars are an easy way to raise funds. Much like the .com era, everyone wants a piece of the pie. Eventually, after about 20 years, we'll be laughing at the time we were awed by the self-driving car equivalent of Badger badger and lament the situation with Pewdiepie, while everyone just uses self-driving cars to get delivery of hookers and blow.
There is less and less reason to keep driving around like fools every day for work. Cars, self driving or not, will feel…
Re: (Score:2)
Boober?
Re:Not just cameras (Score:5, Interesting)
Another advantage of this is that you don't need as strong a sweep signal. With a single frequency, you're emitting a pulse, then waiting for the reflections of the pulse. In order to avoid the possibility of spurious noise from another source being interpreted as a reflection, your pulse has to be high-power (basically making the reflected signal stronger than any noise). 1000 to 5000 Watts was typical for boat radars using pulsed beams. But when you use a varying frequency, you can compare reflections at one frequency with subsequent reflections at a different frequency (there's no need to wait for return reflections -- subsequent pulses will not interfere with previous pulses, so they can be sent before reflections from previous pulses arrive). Noise will show up at just one frequency, making it easy to spot and trivial to filter out. Consequently, newer frequency-sweeping boat radars only need to emit a few tens of Watts.
That said, the parking sensors in your car use this kind of frequency-varying approach with sonar. And I've noticed other cars' parking sensors trigger mine about once a day. So some more work needs to be done on standardizing frequency sweeps and noise filtering to reduce signal collisions. But the problem is not as insurmountable as you'd think from your LIDAR experience.
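In a frequency-swept (FMCW) system, range comes from the beat between the outgoing chirp and its echo rather than from raw pulse timing. A sketch of the standard textbook relation R = c * f_beat * T / (2B); the chirp parameters below are made up for illustration:

```python
# FMCW range from beat frequency: the chirp sweeps bandwidth B (Hz)
# over duration T (s); an echo delayed by the round trip produces a
# beat tone f_beat, giving R = c * f_beat * T / (2 * B).
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(f_beat_hz, chirp_s, bandwidth_hz):
    return C * f_beat_hz * chirp_s / (2.0 * bandwidth_hz)

# 1 GHz sweep over 1 ms: a 1 MHz beat tone corresponds to ~150 m.
print(fmcw_range_m(1e6, 1e-3, 1e9))  # ~149.9 m
```

An interferer at a fixed frequency shows up as a single spurious beat tone, which is why it is "easy to spot and trivial to filter out."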
Re: (Score:3)
I didn't say it was insurmountable, but as another poster has pointed out, there are very few LIDARs on the market right now that modulate their beam. Unlike radar, it's difficult to vary the actual optical frequency, since they're diode lasers and generally fixed-wavelength. So we're kind of stuck with just beam modulation, unless we want to do something fancy like driving multiple lasers, which gets tricky when you're all sharing the same optical path.
LIDARs also have the difficulty (or advantage, depending on which way you…
Why not just use Stereo Vision (Score:2)
Why use expensive Lidar when stereo vision will also provide good depth perception? Two cameras pointing in the same direction: line up the dots and do the trig.
That is a genuine question. I wonder why the focus on Lidar.
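The "do the trig" step is the standard pinhole-stereo relation Z = f*B/d: focal length f (in pixels), baseline B between the cameras, and disparity d (in pixels) between the two views. A sketch with illustrative numbers:

```python
# Depth from a calibrated stereo pair: Z = f * B / d.
# focal_px: focal length in pixels, baseline_m: camera separation,
# disparity_px: pixel offset of the same point between the two views.
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    return focal_px * baseline_m / disparity_px

# 800 px focal length, 25 cm baseline: a 10 px disparity puts the point 20 m out.
print(stereo_depth_m(800.0, 0.25, 10.0))  # 20.0
```

Note the downside: at that range a single pixel of disparity error (9 px instead of 10) moves the estimate from 20 m to about 22 m, and the error grows quadratically with distance, which is one common argument for lidar at long range.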
Re: (Score:2)
A good lidar system doesn't provide depth perception, it provides a centimeter(-ish, blah, blah, Gaussian) accurate 3D model of the world. It's actually pretty easy to take the point cloud from a lidar and, as long as it's accurately timestamped (PTP works well), synchronize it with GPS and an IMU and get a very accurate 3D model of your surroundings. Some flavors of lidar also provide doppler on every point in the point cloud. You aren't comparing frames against each other to determine if something is…
Re: (Score:1)
So how is "3D model" different from "Depth Perception"? You want to know how far away things are, so that you can build a 3D model. Both passive Stereo vision and Lidar do that.
Incidentally, there is an intermediate approach with active stereo -- you use a laser next to the camera, and measure the offset rather than the timing, which should be cheaper.
All three address the same problem. The question is, why do people prefer Lidar.
Re: (Score:2)
Perhaps they're making do with time of flight lidar while waiting for active stereo patents to expire. Or they don't want resolution to decrease dramatically for faraway objects.
Re: (Score:3)
You are 100% correct, but you are describing frequency modulation. The vast majority of lidar companies (on the order of 99%) are producing amplitude modulation systems that shoot out strong amplitude signals and cross their fingers hoping they can see and distinguish the signal when it comes back. These systems often use "avalanche detectors" to help their probability of return detection (look it up, it's insane). Driving directly towards a sunset can literally cause these systems to emit their magic smoke. Acc…
Re: Not just cameras (Score:1)
Avalanche detectors are not damaged by too much light input (well, within reason), assuming competent sensor-readout circuit design. Otherwise they would fail when imaging a diffuse mirror :).
Re: (Score:2)
I'm waiting to see what happens with a freeway full of cars with LIDARs, all flinging their beams at each other willy-nilly with direct beams and reflections all over the place. If you're unlucky you'll get a beam from another vehicle just after yours has sent a pulse out - resulting in a false return showing something right in front of you.
If only someone could come up with a way of encoding signals so that a desired signal can be distinguished amongst a mass of other signals using the same wavelength.
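One classic answer to the parent's (rhetorical) question is code division: tag each emitter's pulses with its own pseudorandom sequence and correlate returns against it, CDMA-style. A toy sketch with +/-1 chip sequences; real lidar pulse coding is far more involved:

```python
# Toy code-division discrimination: each lidar modulates its pulses
# with its own pseudorandom +/-1 code. Correlating a return against
# your own code rejects other emitters' signals statistically.
import random

def correlate(a, b):
    return sum(x * y for x, y in zip(a, b))

rng = random.Random(42)                                # fixed seed for repeatability
my_code    = [rng.choice((-1, 1)) for _ in range(127)]
other_code = [rng.choice((-1, 1)) for _ in range(127)]

# A return carrying our own code correlates perfectly; a stranger's
# code correlates only weakly (on the order of sqrt(127), not 127).
print(correlate(my_code, my_code))          # 127 (perfect match)
print(abs(correlate(my_code, other_code)))  # small compared to 127
```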
Re: (Score:1)
Yes, that is the challenge with laser modulation. It has significant limits that mean it won't scale very well. Others have explained it well in other comments in this discussion.
Re: (Score:2)
What's good for the goose... (Score:3)
All I hear is that there will now be some "smart" people trying to outsmart new tech and cameras by firing lasers, not strong enough to poke your eyes out, but strong enough to burn the retina of the machine.
--
When you have confidence, you can have a lot of fun. And when you have fun, you can do amazing things. - Joe Namath
Re: (Score:2)
This has been happening for ages already. You can buy high power laser pointers and even higher power laser modules online for very little money, and they are more than capable of destroying CCTV cameras and the like. The tricky part is doing it without being caught on the camera itself.
Simple solution, well, not *simple* (Score:1)
We need to develop passive sensors. I mean, we have them, we're just not using them.
1550 nm wavelength is (relatively) eye-safe (Score:5, Interesting)
As the original article [arstechnica.com] duly explains, laser light at the 1550 nm wavelength used by this lidar scanner does NOT reach the retina of the eye. At this wavelength, it is fully absorbed in the outer parts of the eye (cornea, lens, etc.) before it can get focused into a tight spot on the retina. This makes the wavelength (relatively) eye-safe compared to visible and some other wavelength ranges. There is no such protection for the camera, however, whose glass optics happily focus 1550 nm into a small spot... so sensor damage can happen.
Laser safety regulations are primarily concerned with (a) no damage to humans, especially their eyes, and (b) laser beams not setting things on fire. Neither of these happened in this incident. So we are good.
If you are interested in technical details of laser safety, read ANSI Z136.1 standard. Warning: it requires technical expertise.
Re: (Score:3)
While I don't know the actual filter construction, a couple of possibilities come to mind. First, the filter may be bonded or integrated or be a deposited layer on the sensor itself, and a physical crack in the filter may propagate into the sensor chip. Second, any filter has finite suppression, and I don't see why the camera's needs to be particularly high. So some fraction of the light can still get through it, and that could be enough to damage the sensor.
Re: (Score:2)
P.S. An additional possibility: for Si sensor, the filter has to reject 700-1100 nm band. Si is transparent beyond 1100 nm, i.e., the sensor is completely insensitive to longer IR wavelengths. So the filter doesn't need to be effective at rejecting 1550 nm, and I guess it isn't.
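The "Si is transparent beyond 1100 nm" point follows from the photon energy: to be detected, a photon must carry at least silicon's bandgap energy (about 1.12 eV). A sketch using standard physical constants:

```python
# Why silicon goes blind past ~1100 nm: a photon needs at least the
# bandgap energy (~1.12 eV for Si) to promote an electron across the
# gap. Photon energy E = h*c / lambda; cutoff wavelength = h*c / E_gap.
H  = 6.62607015e-34   # Planck constant, J*s
C  = 299_792_458.0    # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def photon_ev(wavelength_m):
    return H * C / wavelength_m / EV

print(f"{photon_ev(1550e-9):.2f} eV")               # ~0.80 eV, below Si's 1.12 eV gap
print(f"cutoff: {H * C / (1.12 * EV) * 1e9:.0f} nm")  # ~1107 nm
```

So a 1550 nm photon simply cannot create a photoelectron in silicon, which is consistent with the guess that the IR-cut filter isn't designed to reject that band.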
Re: (Score:2)
Re: (Score:2)
If the camera was focused for visible light, IR would be out of focus.
...unless the camera was focused at something else at another distance, or simply moved out-of-focus, and the 1550 nm image accidentally came in-focus.
Also, as I've written in another comment, the filter doesn't have to be effective at 1550 nm, because the Si sensor itself is insensitive that far into the IR. This depends on the technology used to make the filter. For example, an interference-type spectral filter may perform very well in its designed wavelength range, but simply become transparent outside of it…
Re: (Score:2)
One is compelled to wonder why one would use a camera that has been custom modified to be sensitive to frequencies needed for astronomical observation to take pictures of passing automobiles.
Re: (Score:2)
Re: (Score:3)
Retinal damage is not the only ocular risk.
https://velodynelidar.com/newsroom/guide-to-lidar-wavelengths/ [velodynelidar.com]
Re:1550 nm wavelength is (relatively) eye-safe (Score:5, Informative)
Maximum permissible exposure (MPE) in the 1500-1800 nm band is the same for the eye and the skin. For continuous-wave light it is 0.1 W/cm2, for pulsed light it is 1 J/cm2. Reference: ANSI Z136.1, see Tables 5a and 7.
In other words, if the 1550 nm laser beam is not burning your skin, it is safe for your eye.
This is remarkably untrue at other wavelengths, where light is dramatically more dangerous to the eye than it is to the skin.
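The quoted CW limit is easy to check a beam against: irradiance is just power over spot area. A sketch using the parent's 0.1 W/cm^2 figure; the beam parameters are made up for illustration:

```python
# Sanity-check a CW 1550 nm beam against the MPE quoted in the parent
# comment (0.1 W/cm^2 for skin and eye, 1500-1800 nm band, ANSI Z136.1).
import math

def irradiance_w_per_cm2(power_w, beam_diameter_cm):
    area_cm2 = math.pi * (beam_diameter_cm / 2.0) ** 2
    return power_w / area_cm2

MPE_CW = 0.1  # W/cm^2, from the parent comment

# A 50 mW beam spread over a 5 mm spot:
e = irradiance_w_per_cm2(0.05, 0.5)
print(f"{e:.3f} W/cm^2 -> {'over' if e > MPE_CW else 'under'} the CW MPE")
```

By this check, even a modest 50 mW beam in a 5 mm spot exceeds the CW limit by about 2.5x; spreading the same power over a larger exit aperture is exactly how "eye-safe" ratings are achieved.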
Re: (Score:2)
This is remarkably untrue at other wavelengths, where light is dramatically more dangerous to the eye than it is to the skin.
For sure. You can let an unfocused 100 watt Nd:YAG beam fall on your hand and you might feel a tiny bit of warmth. Letting that same beam hit you in the eyes is a different story.
Re: (Score:2)
This has nothing to do with the LIDAR that is currently in use on self-driving cars. This is entirely different (and fundamentally unsafe) new tech.
Current-generation LIDAR uses low-power (5 mW-ish) lasers in the short end of the near-infrared spectrum (905 nm), which can be detected by normal cameras and do not cause them harm. The light frequency is within the detection range of cameras, and they produce about the same amount of output per unit of area as the sun. Pointing at the sun for seconds at a time…
There go the police ticketing cameras (Score:4, Interesting)
Laser Might Be Over Legal Limit (Score:2)
Rear view cameras (Score:2)
Manufacturer designs camera-killing laser to be mounted on the front of vehicles which have cameras on their rears. What could possibly go wrong?
Sony and security... (Score:3)
Sony does not have a good security track record so this does not come as a surprise to me.
OWASP Secure Coding Practices Checklist section 1 about input validation was clearly not applied at all. Specifically, they failed to implement "Validate data range" :p
Car, or mobile speed-camera zapper? (Score:2)
He missed the warning sticker. (Score:2)