Power rules ensure lasers are safe for human eyes—but not necessarily for cameras.
A man attending this week’s CES show in Las Vegas says that a lidar sensor from startup AEye has permanently damaged the sensor on his $1,998 Sony camera. Earlier this week, Jit Ray Chowdhury, an autonomous vehicle engineer at the startup Ridecell, snapped photos of a car at CES with AEye’s lidar units on top. He discovered that every subsequent picture he took was marred by two bright purple spots, with horizontal and vertical lines emanating from them.
“I noticed that all my pictures were having that spot,” he told Ars by phone on Thursday evening. “I covered up the camera with the lens cap and the spots are there—it’s burned into the sensor.”
In an email to Ars Technica, AEye CEO Luis Dussan stressed that AEye lidars pose no danger to human eyes. But he didn’t deny that AEye’s lidars can cause damage to camera sensors.
“Cameras are up to 1000x more sensitive to lasers than eyeballs,” Dussan wrote. “Occasionally, this can cause thermal damage to a camera’s focal plane array.”
Chowdhury says that AEye has offered to buy him a new camera.
Lidar is essential for self-driving cars; many experts believe that it won’t be possible to reach full autonomy any time soon without it. So in the coming years, we can expect to see more and more lidar sensors atop cars on public streets.
Crucially, self-driving cars also rely on conventional cameras. So if those lidars are not camera-safe, it won’t just create a headache for people snapping pictures with handheld cameras. Lidar sensors could also damage the cameras on other self-driving cars.
The big question is how common this kind of damage is—and whether it’s specific to AEye’s lidar or is a problem across the industry. Dussan wrote that AEye is “fully committed to implementing mitigation technology” and described camera safety as “a complex issue that the entire LiDAR and laser community will need to address.”
But at least one competitor disputed that statement.
“Camera safety is not a complex issue for Ouster products,” wrote Angus Pacala, CEO of lidar startup Ouster. “Our sensors are camera and eye safe. Period.”
It’s worth noting that companies like Alphabet’s Waymo and GM’s Cruise have been testing dozens of vehicles with lidar on public streets for more than a year. People have taken many pictures of these cars, and as far as we know none of them have suffered camera damage. So most lidars being tested in public today do not seem to pose a significant risk to cameras.
AEye uses powerful laser pulses to measure distances
There’s no dispute that powerful lasers can damage cameras. The website of the International Laser Display Association, which represents companies running laser light shows, says that “camera sensors are, in general, more susceptible to damage than the human eye” from lasers. It warns consumers to never point a camera directly at laser emitters during a laser show.
In the video below, a video camera appears to be permanently burned out after focusing directly on a laser source at a wedding.
While laser light shows operate in the visible spectrum—viewers wouldn’t be able to see them otherwise—lidar units tend to operate outside the visible spectrum. Lidar designed for automotive applications tends to use one of two wavelength ranges.
Some lidar companies use lasers with a wavelength of 905nm (or 850nm in Ouster’s case). These wavelengths are attractive to companies because sensors can be made using conventional silicon-based fabrication techniques. The downside, however, is that it’s relatively easy for lasers at these wavelengths to damage the human retina. So safety requires strict limits on laser power.
Other lidar makers use lasers with a wavelength of 1550nm. This tends to be more expensive because sensors have to be made out of exotic materials like indium gallium arsenide rather than silicon. But it also has a big advantage: the fluid in the human eye is opaque to 1550nm light, so the light can’t reach the retina at the back of the eye. This means lasers can operate at much higher power levels without posing an eye safety risk.
AEye uses 1550nm lasers. And unfortunately for Chowdhury, cameras are not filled with fluid like human eyes are. That means that high-power 1550nm lasers can easily cause damage to camera sensors even if they don’t pose a threat to human eyes.
AEye is known for claiming that its lidar units have much longer range than those of competitors. While most lidar makers say their high-end lidars can see 200 or 300 meters, AEye says that its lidar has a range of 1,000 meters. When I talked to AEye CEO Luis Dussan about this claim last month, he said that one factor in AEye’s long range is the use of a powerful fiber laser.
“One of the most important things about fiber lasers is that they can be amplified,” Dussan said. “Very short pulse, huge amount of signal.”
Lidar maker Blackmore argues continuous-wave lidar is safer
AEye’s lidar is a time-of-flight system. This means that it measures the distance to an object by sending out a short pulse of light and measuring how long that pulse takes to bounce back.
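The arithmetic behind time-of-flight ranging is simple enough to sketch. The numbers below are illustrative, not AEye’s actual parameters:

```python
# Time-of-flight ranging: distance from a pulse's round-trip travel time.
# Illustrative sketch only; values here are not AEye's real specifications.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """The pulse travels out to the target and back, so the one-way
    distance is half the total path covered in the round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after roughly 6.67 microseconds came from about
# 1,000 meters away -- the range AEye claims for its unit.
print(round(tof_distance(6.67e-6)))  # ~1000 meters
```

This also shows why range claims translate into timing requirements: at 1,000 meters, the receiver is resolving delays of only a few microseconds.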
But not all lidars are designed this way. Others use an alternative approach called continuous-wave frequency modulation (CWFM), which (as the name implies) sends out a continuous laser beam with a steadily changing frequency. This kind of lidar measures distance by observing how much the beam’s frequency has shifted between the moment it was sent out and the moment its reflection returns.
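The frequency-shift approach can be sketched the same way. In this hedged example, the chirp parameters (a hypothetical 1 GHz sweep over 10 µs) are invented for illustration and don’t describe any particular vendor’s sensor:

```python
# CWFM (continuous-wave frequency-modulated) ranging sketch.
# Chirp parameters below are made up for illustration.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def cwfm_distance(beat_hz: float, chirp_bandwidth_hz: float,
                  chirp_duration_s: float) -> float:
    """The laser's frequency ramps linearly by chirp_bandwidth_hz over
    chirp_duration_s. The returning beam is delayed, so mixing it with
    the outgoing beam yields a beat frequency proportional to range:
    R = c * f_beat * T / (2 * B)."""
    slope = chirp_bandwidth_hz / chirp_duration_s  # Hz per second of delay
    round_trip = beat_hz / slope                   # round-trip delay, seconds
    return SPEED_OF_LIGHT * round_trip / 2

# With a 1 GHz chirp over 10 microseconds, a 20 MHz beat tone
# corresponds to a target about 30 meters away.
print(round(cwfm_distance(20e6, 1e9, 10e-6), 1))  # ~30.0 meters
```

Because the energy is spread out in time rather than packed into short pulses, a CWFM system can achieve comparable range at lower peak power, which is the basis of Crouch’s safety argument below.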
Stephen Crouch is the CTO of Blackmore, an AEye competitor that uses a CWFM approach to measure distance. He told Ars that AEye’s pulsed approach is more likely to damage cameras.
“Pulsed lidar uses bursts of energy that can overwhelm a camera’s detector elements,” Crouch wrote by email. “Continuous wave FM lidar uses less power” for comparable range, “so by default it would be more eye- and camera-safe.”
Blackmore hasn’t done a systematic study of camera safety for its lidar units, Crouch said. But he did say that when filming a promotional video recently, “we positioned a high-end video camera facing our Doppler sensor—about four feet apart—and recorded it in operation for 45 minutes. The lens and the resulting footage are undamaged and unmarked.”
Update: I originally said that Luis Dussan confirmed that AEye lidars could damage camera sensors. But the company says that when Dussan wrote “this can cause thermal damage” he was referring to lasers in general rather than AEye’s lasers in particular. So I’ve updated the story to say that Dussan didn’t deny that AEye lasers can damage camera sensors.
Listing image by Jit Ray Chowdhury