Benchmarking Tests Pave the Way for Automotive Lidar Standards

As research on autonomous vehicles slowly gains momentum, an absence of automotive lidar standards creates problems in a market of closely guarded IP
21 January 2020
By Gwen Weerts
Lidar image collected by Lincoln ladar sensor. Credit: Active Optical Systems Group, MIT Lincoln Laboratory

Updated 17 March 2020

In the near future, engineers and representatives from competing lidar companies plan to gather to test automotive lidar, a key technology that enables autonomous vehicles. They'll need to find a flat, open area 250 meters long, the typical distance at which auto lidar is expected to detect (and help a vehicle plan for) an obstacle.

Except that 250 meters isn't always the standard range for auto lidar, because the functional distance of lidar is directly related to the vehicle's speed. A car driving on the autobahn, for example, needs to be equipped with 250-meter lidar, but at lower speeds a much shorter range is acceptable. Lidar for some applications, like congested city driving, really only needs to detect objects 50 meters away or less.
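To see why range scales with speed, consider a rough back-of-the-envelope sketch (not from the article): the lidar must see at least as far as the reaction distance plus the braking distance. The reaction time and deceleration below are assumed, textbook-style values.

```python
# Illustrative sketch (not from the article): estimate how much detection
# range a lidar needs so a vehicle can perceive an obstacle, react, and
# brake to a stop. Reaction time and deceleration are assumed values.

def required_detection_range(speed_kmh, reaction_time_s=1.5, decel_ms2=6.0):
    """Return the range in meters: reaction distance plus braking distance."""
    v = speed_kmh / 3.6                      # km/h to m/s
    reaction_dist = v * reaction_time_s      # distance covered before braking starts
    braking_dist = v ** 2 / (2 * decel_ms2)  # kinematics: v^2 / (2a)
    return reaction_dist + braking_dist

for speed in (50, 100, 160):                 # city, highway, autobahn-like speeds
    print(f"{speed} km/h -> {required_detection_range(speed):.0f} m")
```

Under these assumptions, autobahn speeds land near the 250-meter figure, while city speeds need well under 50 meters.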

The fuzziness surrounding lidar's functional range requirements illustrates an issue with current lidar specs: there are no standard measurement methods, which means that neither lidar suppliers nor auto manufacturers have a way to compare lidar products. No one is measuring the same thing.

"It's a wild west right now," says Paul McManamon, lidar expert, author, and president of Exciting Technology, LLC. "You can't compare between one and the other. And no one tells you the performance. They won't tell you how it works, and they won't tell you the performance."

Frustrated by this lack of transparency, McManamon plans to develop a series of multi-vendor benchmarking tests so that auto lidar companies will have something to compare against. Since lidar is an optical technology, he saw SPIE, the international society for optics and photonics, as an obvious host for a lidar benchmarking event.

"SPIE is delighted to support this new project in lidar benchmarking as part of the frontier of applied 3D sensing," says Peter Hallett, SPIE Director of Industry Relations. "We are eager to help suppliers and manufacturers work toward standards in one of the fastest growing markets based on optical engineering and machine learning. We encourage all lidar developers to provide input on the test plan and sign up early to participate."

The idea is to set up a large parking lot, likely at a sports stadium, to test key lidar competencies: range of effective object detection; resolution, meaning how accurately the lidar identifies and classifies objects, both moving and stationary; reflectance confusion, meaning how well the lidar can see something in the presence of bright objects, like reflective signs or bright sun; and obscurants, like fog and heavy rain, which can scatter the laser pulses. Since lidar uses lasers to detect objects, the organizers will also test eye safety in these scenarios. Although laser eye safety is already carefully regulated, there's no data about which lidar companies sit at the highest or lowest end of the safety threshold, nor about the impact on automated vehicles when multiple lidar-equipped cars operate in close proximity.
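As a loose illustration of why obscurants matter (a sketch under assumed numbers, not part of the published test plan), lidar returns attenuate exponentially with range per the Beer-Lambert law, and the pulse traverses the path twice:

```python
import math

# Illustrative sketch (not part of the published test plan): received lidar
# power through an obscurant falls off exponentially with range (Beer-Lambert
# law), and the pulse travels the path out and back. Extinction coefficients
# below are assumed, order-of-magnitude values.

def round_trip_transmission(range_m, extinction_per_m):
    """Fraction of the pulse energy surviving the out-and-back path."""
    return math.exp(-2.0 * extinction_per_m * range_m)

for label, alpha in (("clear air", 1e-4), ("light fog", 5e-3)):
    frac = round_trip_transmission(100, alpha)
    print(f"{label}: {frac:.1%} of the signal returns from 100 m")
```

Even light fog can cut the returning signal by more than half at 100 meters, which is why obscurant performance is one of the planned test categories.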

Although initially planned to take place in the Angel Stadium parking lot during the SPIE Defense + Commercial Sensing conference in April, the event will be rescheduled due to the COVID-19 outbreak and resulting restrictions. When the lidar tests ultimately take place, McManamon would like to follow up with what he hopes will be a "raucous" discussion, where lidar companies can discuss the tests, where they encountered problems, and what should be done next. He hopes that these tests and discussions will result in a set of official standards sometime in the next three to four years.

McManamon is aware that neither he nor SPIE is an expert in standards or benchmarking. Fortunately, two representatives from the National Institute of Standards and Technology (NIST) are on the organizing committee, and they will guide the group on how to conduct benchmarking. "We're optics people, but not standards people," says McManamon. "They'll help us understand what has to happen in order to get to a standard."

One thing everyone agrees on is that the results of the tests will need to be published, and the participating companies will need to remain anonymous. The organizers plan to assign secret identifiers to the participating lidar companies, such as A, B, C, D, and so on. Each company will know its own letter, but not the letters of the other participants. McManamon expects this approach to be helpful for lidar companies. "They'll be able to go to their investors and say, ‘We have unbiased measurements of our performance in a standardized format.' If you have intelligent investors, auto lidar companies need to have independent verification," he says.

In five to ten years, lidar is expected to be a key enabler of a growing market for autonomous vehicles, which means that lidar performance will have a direct impact on the safety of those vehicles. Standards of measurement will need to be developed, and one way to start is by bringing together many lidar manufacturers. "I think this will be a very useful thing," says McManamon. "Besides that, it's fun."
