
Automotive Sensor Technology: LiDAR vs. Radar vs. Cameras

Dozens of startups are pushing LiDAR as the future of autonomous vehicle technology, but a growing number of companies are turning to alternatives, including cameras and radar. Which technology will come out on top? That question has yet to be answered, but a few unique companies are rethinking automotive sensor technology, and their creations might offer clues about what the future holds.

“We had a booth at CES a year and a half ago, and the most common comment we got was, ‘Where’s your LiDAR? You guys aren’t a LiDAR company,’” said Paul Banks, founder and CEO of TetraVue, a startup building 4D LiDAR video cameras. “In a sense that’s true. We are a camera company and the camera is able to make a distance measurement for every single pixel and every frame.” TetraVue’s technology isn’t technically LiDAR, but Banks said his firm uses “the same basic physics measurements.”

“We have what we call ‘optical time of flight,’” Banks explained. “We have this optical modulator that we put in front of a normal camera sensor, just like the one that’s in your cell phone, and that modulator gets us a distance measurement from every single pixel for the same picture. Instead of 64 points [as from a 64-line scanning LiDAR], we have done cameras that are HD, so you get 2 million distance measurements at the same time.” This results in a sensor that can clearly visualize a wide range of details – not just other cars but also potential obstacles, pedestrians or a kid riding a tricycle.
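The arithmetic behind per-pixel time of flight is easy to sketch. The following is a minimal illustration, not TetraVue’s modulator design: it assumes a round-trip time is already available for each pixel and converts it to distance with d = c·t/2. An HD frame (1920 × 1080) holds roughly 2.07 million pixels, which is where the “2 million distance measurements” figure comes from.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_depth_map(round_trip_s: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times (seconds) into distances (meters).

    The light pulse travels out and back, so the one-way distance
    is d = c * t / 2.
    """
    return C * round_trip_s / 2.0

# Hypothetical HD frame: 1920 x 1080 = ~2.07 million pixels, one time
# sample per pixel. A target 100 m away returns in about 667 ns.
frame = np.full((1080, 1920), 667e-9)
depth = tof_depth_map(frame)
print(depth.shape, round(float(depth[0, 0]), 1))  # (1080, 1920) ~100.0
```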

TetraVue isn’t the only company trying to use cameras to overcome LiDAR’s shortcomings. Outsight is another such venture, developing a 3D semantic camera that can detect the size, position and chemical composition of objects – including skin, plastic, metal and snow – without machine learning. Co-founder Raul Bravo sees that independence from machine learning as central to the camera’s design.

“There’s a tendency [toward] machine learning,” said Bravo. “Our contrarian approach is that machine learning is not a silver bullet. It’s not something that should be used in every situation.” Bravo envisions a world in which vehicles are capable of recognizing that something is there – a person or object that doesn’t belong – without necessarily worrying about the specifics.

“If it’s in front of you, in your lane and shouldn’t be there, sometimes you just have to react,” said Bravo. He worries that a car relying on machine learning might waste too much time evaluating the scenario instead of reacting. With Outsight, he hopes cars will achieve a greater degree of situational awareness.
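A minimal sketch of the geometry-first logic Bravo describes might look like the following. The fields, thresholds and function names are illustrative assumptions, not Outsight’s API: the point is that the decision uses only where the object is and how big it is, with no classifier in the loop.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    lateral_m: float  # offset from lane center, meters (left is negative)
    ahead_m: float    # longitudinal distance ahead, meters
    height_m: float   # measured height of the object, meters

LANE_HALF_WIDTH_M = 1.8       # assumed half-width of the ego lane
MIN_OBSTACLE_HEIGHT_M = 0.15  # ignore road-surface clutter below this

def must_react(det: Detection, braking_distance_m: float) -> bool:
    """React to anything of real height that is in our lane and inside
    braking distance -- without asking what the object is."""
    in_lane = abs(det.lateral_m) <= LANE_HALF_WIDTH_M
    is_solid = det.height_m >= MIN_OBSTACLE_HEIGHT_M
    return in_lane and is_solid and det.ahead_m <= braking_distance_m

# Example: an unclassified 0.5 m object dead ahead at 30 m, with a 45 m
# braking distance -> react now rather than wait for a classification.
print(must_react(Detection(0.2, 30.0, 0.5), 45.0))  # True
```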

“It means not only feeling the environment but also understanding the environment,” he added. “We are fusing, in one sensor, the sensing and understanding that you need for smart machines to work.”

John Xin, co-founder and CEO of Lunewave, a startup developing high-performance, high-value automotive radar sensor systems for AVs, sees value in all of these technologies. But he also recognizes their weaknesses.

“I think cameras [have] a distinct advantage of interpreting signs, so [they] are extremely important,” said Xin, whose company offers custom-made Luneburg lens antennas in various sizes. “Ultrasound is mostly for parking – the tricky part is that it’s very close range; it can’t really detect more than a few feet.” LiDAR, on the other hand, has very fine angular resolution, which makes it ideal for differentiating between objects. But when fog or a snowstorm hits, both LiDAR and cameras struggle to perform at full capacity.
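Xin’s point about angular resolution is easy to quantify. For small angles, the cross-range spacing between adjacent measurements at range R is roughly s ≈ R·θ, with θ in radians. The sensor figures below are illustrative assumptions, not Lunewave’s specifications:

```python
import math

def lateral_resolution_m(range_m: float, angular_res_deg: float) -> float:
    """Approximate cross-range spacing of adjacent beams at a given range.

    Small-angle approximation: s ~= R * theta, theta in radians.
    """
    return range_m * math.radians(angular_res_deg)

# A hypothetical 0.1-degree LiDAR separates points ~17 cm apart at 100 m,
# while a 1-degree radar resolves only ~1.7 m at the same distance -- one
# reason LiDAR is better at telling adjacent objects apart.
print(lateral_resolution_m(100.0, 0.1))  # ~0.175
print(lateral_resolution_m(100.0, 1.0))  # ~1.745
```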

“This is why the industry knows that radar is here to stay,” said Xin. “It is the only one that functions well during poor weather conditions.”

About the author:

Louis Bedigian is an experienced journalist and contributor to various automotive trade publications. He is a dynamic writer, editor and communications specialist with expertise in the areas of journalism, promotional copy, PR, research and social networking.