Shedding Light on New Automotive Applications

Light-based applications are increasing as vehicles become autonomous, even if many questions remain concerning LiDAR
23 March 2020
By Andy Extance
LiDAR systems today can detect large objects, such as cars, from as far as 200 meters, pick up a pedestrian 70 meters away, or spot road debris at 50 meters. Image credit: Osram Opto Semiconductors

Photonics-based sensing for vehicles is "currently in a very exciting phase," according to Jörg Strauss, General Manager and Vice President, Visualization and Laser, at Regensburg, Germany-headquartered Osram Opto Semiconductors. He notes that advanced driver assistance systems (ADAS) that exploit technologies like infrared light-based driver monitoring and pre-crash sensing are becoming more common. "We are even seeing autonomous vehicles on public roads in some communities," Strauss stresses. "The amount of light-based applications within the automotive area increases year by year."

Cars are becoming increasingly driver-friendly, safer and more comfortable, Strauss underlines. "Thanks to automation, drivers have fewer systems to operate manually, allowing them to concentrate more on traffic," he says. "Many systems automate, generate or analyze visible and non-visible light for such tasks. Current examples include adaptive speed control, pre-crash sensors, and blind spot monitoring. Besides this, driver-monitoring systems are getting more important - and will still be relevant for level 3 and 4 of autonomous driving."

Delivering these capabilities challenges car makers because current ADAS are very complex. "A fully autonomous vehicle, for example, needs a full 3D view of its surroundings for the algorithms to determine the car's next action," Strauss says. "Cameras, radar, and Light Detection and Ranging (LiDAR) are the key sensor technologies. Individual systems will be combined. The top players in the field are having fundamental evaluations about which direction their business should go in future."

The optical technology getting perhaps the greatest attention for automation in vehicles is LiDAR. "Unlike human drivers, these systems never get distracted, nor do they take precious seconds to act," says Strauss. "Scanning LiDAR sweeps an infrared laser beam across the car's surroundings and creates a high-resolution 3D image. The systems today can detect large objects, such as cars, from as far as 200 meters."

There are two main types of LiDAR systems, explains Jake Li, Business Development Manager - Auto LiDAR, at Hamamatsu, Japan-headquartered Hamamatsu Photonics. In time-of-flight (ToF) LiDAR, pulses of light emitted from a light source travel through space; when they hit objects, light is reflected back and detected by a photodetector. In this approach, Li explains, the round-trip time between light emission and return indicates the distance to an object. Frequency-modulated continuous wave (FMCW) LiDAR instead measures the frequency shift between the transmitted reference signal and the light it receives back, which provides information about both an object's distance and, through the Doppler effect, its velocity.
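To make the two ranging principles concrete, here is a minimal sketch of the underlying arithmetic: a round-trip time converts to distance in ToF, while the beat frequencies of an FMCW triangular chirp yield both range and radial velocity. It is purely illustrative, not vendor code, and the chirp bandwidth, duration, and wavelength are assumed values.

```python
# Illustrative ranging math for the two LiDAR approaches described above.
# All parameters are assumed example values, not any product's specification.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Time of flight: distance = c * t / 2, since the light travels out and back."""
    return C * round_trip_time_s / 2.0

def fmcw_range_and_velocity(f_beat_up_hz: float, f_beat_down_hz: float,
                            bandwidth_hz: float = 1e9,      # assumed chirp bandwidth
                            chirp_period_s: float = 10e-6,  # assumed chirp duration
                            wavelength_m: float = 1550e-9) -> tuple:
    """FMCW with a triangular chirp: the average of the up- and down-chirp beat
    frequencies gives range; their difference gives the Doppler velocity."""
    slope = bandwidth_hz / chirp_period_s              # chirp rate, Hz per second
    f_range = (f_beat_up_hz + f_beat_down_hz) / 2.0    # range-induced beat
    f_doppler = (f_beat_down_hz - f_beat_up_hz) / 2.0  # Doppler-induced shift
    distance = C * f_range / (2.0 * slope)
    velocity = wavelength_m * f_doppler / 2.0          # positive = approaching
    return distance, velocity

# Example: a pulse returning after ~667 ns corresponds to a target ~100 m away.
print(f"{tof_distance(667e-9):.1f} m")
```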


Time of Flight LiDAR uses the time between light emission and detection after reflecting off an object to determine its distance from the detector. Credit: Hamamatsu

Building images requires scanning light across the environment using beam-steering components, such as MEMS mirrors or mechanical spinning mirrors. "LiDAR is the leading sensor, and for good reason - it provides both day and night vision," adds Joseph Shaw, from Montana State University. However, Shaw notes that the need to avoid damaging people's eyes places constraints on laser power, which limits LiDAR range. Larger receiver optics can extend range, but also increase the size of the LiDAR, which must be as compact as possible. Shaw adds that atmospheric conditions including fog, rain, and snow all affect LiDAR performance, which "gets talked about less than it probably should." Another challenge is building a picture of the surrounding world fast enough. To achieve this, system designers are increasingly adopting several inexpensive LiDARs with narrow fields of view, rather than a single expensive unit that scans the entire environment, Shaw says.
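A back-of-the-envelope calculation shows why the refresh-rate challenge favors splitting coverage among narrow-field units: the number of measurement points a scanning LiDAR must acquire per second grows with field of view, angular resolution, and frame rate. The figures below are assumptions chosen for illustration, not specifications of any particular product.

```python
# Rough point-budget estimate for a scanning LiDAR (illustrative numbers only).

def points_per_second(h_fov_deg: float, v_fov_deg: float,
                      angular_res_deg: float, frame_rate_hz: float) -> float:
    """Measurement points required per second to cover the field of view."""
    points_per_frame = (h_fov_deg / angular_res_deg) * (v_fov_deg / angular_res_deg)
    return points_per_frame * frame_rate_hz

# One wide-field sensor scanning 360 x 30 degrees at 0.1-degree resolution, 20 Hz:
wide = points_per_second(360, 30, 0.1, 20)    # about 21.6 million points/s

# A forward-looking unit covering only 60 x 20 degrees under the same conditions:
narrow = points_per_second(60, 20, 0.1, 20)   # about 2.4 million points/s

print(f"wide FOV: {wide:.2e} pts/s, narrow FOV: {narrow:.2e} pts/s")
```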

LiDAR's unclear outlook

Shaw suggests that the best solution for autonomous vehicles will probably be a synergistic combination of LiDAR and passive imaging. "I've never seen a problem yet that was solved by just one sensor," he says. Shaw believes that thermal imaging is likely to be an important addition because of its low cost, wide availability, and ability to see in the dark.

Yet he also disagrees with Tesla founder Elon Musk's assertion that LiDAR is "a fool's errand." "Information produced by a LiDAR system is very valuable for the perception problem," Shaw says, superior to visible and thermal cameras alone. "You really can't beat the idea of LiDAR for reaching out into the dark." Yet the stringent automotive qualification process is a key challenge for optics companies looking to sell products for use in LiDAR, Li adds. To pass such tests, components must perform reliably in harsh and humid environments, over temperature ranges spanning -40°C to 105°C. Longer detection range is also needed, requiring higher-power lasers with narrower pulses, and detectors with higher sensitivity and lower noise, to improve signal-to-noise ratios.
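The range requirement is demanding because, for direct-detection LiDAR, the power returned from a diffusely reflecting target falls off with the square of the distance. The simplified link budget below (Lambertian target, no atmospheric loss; all values are assumed for illustration) shows why reaching 200 meters instead of 50 calls for roughly sixteen times more signal, to be made up through laser power, receiver aperture, or detector sensitivity.

```python
# Simplified direct-detection LiDAR link budget (Lambertian target, no atmosphere).
# All values are illustrative assumptions, not measured figures.

import math

def received_power_w(p_tx_w: float, range_m: float,
                     target_reflectivity: float = 0.1,   # dark target, assumed
                     rx_aperture_diam_m: float = 0.025,  # 25 mm receiver optics, assumed
                     system_efficiency: float = 0.5) -> float:
    """Power scattered back into the receiver aperture; falls as 1/R^2."""
    rx_area = math.pi * (rx_aperture_diam_m / 2.0) ** 2
    return (p_tx_w * target_reflectivity * rx_area * system_efficiency
            / (math.pi * range_m ** 2))

p_50 = received_power_w(75.0, 50.0)     # e.g., road debris at 50 m
p_200 = received_power_w(75.0, 200.0)   # a car at 200 m
print(f"return at 200 m is {p_50 / p_200:.0f}x weaker than at 50 m")  # ~16x
```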

Most systems exploit 905nm light, which can be paired with lower-cost silicon detectors, Li explains. However, this near-infrared wavelength still reaches the retina, so eye-safety concerns impose tighter restrictions on laser power. Light at 1550nm, which requires InGaAs detectors, is considered safer for human vision because it is largely absorbed before it reaches the retina. Companies can therefore use much higher-output sources, such as fiber lasers, at this wavelength, enabling longer detection ranges, Li explains.
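The wavelength choice essentially sets the detector technology, because silicon stops absorbing efficiently beyond roughly 1100nm while InGaAs responds out to about 1700nm. The snippet below is a simple illustration of that pairing; the cutoff values are approximate textbook figures, not a design rule from Hamamatsu.

```python
# Sketch of the wavelength/detector pairing described above. The cutoffs are
# approximate: silicon's response ends near 1100 nm (set by its bandgap), while
# InGaAs covers roughly 900-1700 nm.

def detector_material(wavelength_nm: float) -> str:
    if wavelength_nm <= 1100:      # within silicon's spectral response
        return "silicon (APD, SPAD, or SiPM)"
    if wavelength_nm <= 1700:      # within the InGaAs response band
        return "InGaAs (APD or PIN photodiode)"
    raise ValueError("outside the bands considered in this sketch")

for wavelength in (905, 1550):
    print(wavelength, "nm ->", detector_material(wavelength))
```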

Hamamatsu discussed such issues at an all-day event at SPIE Photonics West 2020 in February. There, at its booth, and elsewhere, Hamamatsu sought to help LiDAR system makers navigate the range of component choices available for the different wavelengths. "Unlike others, Hamamatsu offers a very complete product line of detectors and light sources," Li says. "Therefore we are in the position to provide most unbiased recommendations for each unique LiDAR design."

Reducing cost is also critical to make the devices suitable for the high volumes demanded by automotive applications. Li says that this is forcing LiDAR and component makers to make significant improvements. "We're working on different manufacturing refinements and design changes, to hopefully allow our customers to meet pricing targets," Li says.

Hamamatsu also presented its view on the automotive LiDAR market's challenges and trends at Photonics West. Li explains that his company can help its customers by offering different levels of optical assemblies and, in the future, higher-level integrations of detectors, light sources, and various electronics. Hamamatsu is working towards enabling more integration possibilities to reduce the manufacturing complexity of LiDAR system designs. All detectors need electronic components like amplifiers to boost the output signal, filters to block ambient light, and application-specific integrated circuits (ASICs) for signal processing, he emphasizes. Integrating such components into detector packages provides multiple advantages. Most critical is reducing the number of components that need to be put through qualification processes before use in cars, Li says.

Due to such challenges, widespread LiDAR adoption in commercial vehicles will probably start in 2021-2025 in ADAS safety systems, Li says. LiDAR in fully automated vehicles will come much later, after 2030, although LiDAR will also be adopted in fleet vehicles, buses, and taxis, Li adds. Delivery systems, industrial automation, robotics, mining, and agriculture will probably adopt LiDAR before automotive applications. Yet Li notes great market diversity, with no consensus on the optimum solution; companies are instead exploring different concepts.

Resolution revolution

Jennifer Ruskowski, Head of 3D Sensors at the Fraunhofer Institute for Microelectronic Circuits and Systems (IMS) in Duisburg, Germany, echoes this point. "Nobody knows what is right, what is the best choice, and what is the cheapest choice," she says. And when it comes to vehicle automation, which application to focus on is also an open question, Ruskowski believes.

In terms of detector choice, avalanche photodiodes (APDs) are popular for most designs that exploit 905nm light, in part because they have good gain and high photon detection efficiency (PDE), Ruskowski explains. But they are difficult to form into the arrays that create the images in many LiDAR system architectures, as they are bigger and "consume a lot of power," she says. This makes it hard to achieve the resolution necessary for automotive applications. APDs are also sensitive to changes in temperature, making them challenging to use in the extremes of heat and cold that vehicles can experience.
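One practical consequence of that temperature sensitivity, speaking generally rather than about any specific product, is that an APD's gain depends on how far its bias sits below the breakdown voltage, and the breakdown voltage drifts with temperature. The sketch below shows the common workaround of temperature-compensating the bias supply; the breakdown voltage, drift coefficient, and margin are assumed, illustrative values.

```python
# Why APD temperature sensitivity matters: gain is set by how far the bias sits
# below the breakdown voltage, which itself drifts with temperature, so the bias
# supply is typically temperature-compensated. All values here are assumed.

def compensated_bias_v(temp_c: float,
                       v_breakdown_25c: float = 150.0,    # assumed breakdown at 25 C
                       temp_coeff_v_per_c: float = 0.65,  # assumed breakdown drift
                       margin_v: float = 10.0) -> float:
    """Hold the APD a constant margin below breakdown across -40 C to +105 C."""
    v_breakdown = v_breakdown_25c + temp_coeff_v_per_c * (temp_c - 25.0)
    return v_breakdown - margin_v

for temp in (-40, 25, 105):
    print(f"{temp:>4} C -> bias of about {compensated_bias_v(temp):.1f} V")
```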


Fraunhofer IMS has built a flash LiDAR camera to show the capabilities of its SPAD detectors. Credit: Fraunhofer IMS

As such, Ruskowski sees a trend away from the use of APDs in favor of single photon avalanche diodes, or SPADs. SPADs can be made cheaply using silicon CMOS processes, and their associated electronics are easy to implement, but currently SPADs don't have the same PDE as APDs. That's one of the many aspects of SPADs that the IMS team is working on improving.

And in the next few years, the IMS team will work to improve SPAD technology and to enable 3D integration techniques that should improve other performance metrics. "When you think of flash LiDAR, you need high resolution - VGA or QVGA - it's nice to have," Ruskowski says. "The biggest thing is to achieve a fill-factor with high pixel resolution, and also increasing PDE at the same time." On Sunday morning, February 2nd, in session 2 of Quantum Sensing and Nano Electronics and Photonics XVII, she presented a new SPAD detector architecture for high-resolution 2D arrays.
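Fill factor and resolution pull against each other because each SPAD pixel needs a fixed amount of area for its quenching and readout circuitry, so shrinking pixels to reach VGA resolution leaves a smaller photosensitive fraction. The toy calculation below makes that trade-off visible; the die size, circuit overhead, and photon detection probability are assumed numbers for illustration only.

```python
# Illustration of the fill-factor versus resolution trade-off in a SPAD array.
# Die size, per-pixel circuit area, and photon detection probability are assumed.

def effective_sensitivity(die_w_mm: float, die_h_mm: float,
                          cols: int, rows: int,
                          circuit_area_um2: float, pdp: float) -> float:
    """Effective PDE ~ photon detection probability x geometric fill factor."""
    pixel_area_um2 = (die_w_mm * 1e3 / cols) * (die_h_mm * 1e3 / rows)
    fill_factor = max(0.0, 1.0 - circuit_area_um2 / pixel_area_um2)
    return pdp * fill_factor

qvga = effective_sensitivity(8.0, 6.0, 320, 240, 100.0, 0.30)  # ~0.25
vga = effective_sensitivity(8.0, 6.0, 640, 480, 100.0, 0.30)   # ~0.11
print(f"QVGA effective PDE ~ {qvga:.2f}, VGA ~ {vga:.2f}")
```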

Another feature of IMS's LiDAR detectors is that they account for ambient conditions, in particular "background light": noise originating from sunlight. The researchers implement several algorithms at the chip level, using special measurement methods. This approach enables them to identify sunlight photons because their arrival is not correlated with the emission of a laser pulse. Several SPADs "have to be fired in a distinct time frame before you can say, 'OK, this is a signal and not sunlight,'" Ruskowski says. This approach offers an improved signal-to-noise ratio, which allows longer measurement distances, even under strong sunlight.
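A minimal sketch of the kind of coincidence test Ruskowski describes is shown below: photons from a laser return arrive bunched in time across neighboring SPADs, while sunlight photons arrive at random, so only events with enough near-simultaneous companions are kept. This illustrates the general principle rather than Fraunhofer IMS's on-chip implementation, and the window and threshold are assumed values.

```python
# Coincidence filtering of SPAD events: keep timestamps that have enough
# near-simultaneous companions (a laser return), discard isolated hits (sunlight).
# Window and threshold are assumed, illustrative values.

def coincidence_filter(event_times_ns: list,
                       window_ns: float = 2.0,
                       min_events: int = 4) -> list:
    """Return timestamps with at least `min_events` events (including themselves)
    inside a +/- `window_ns` neighborhood."""
    events = sorted(event_times_ns)
    kept = []
    for t in events:
        neighbours = sum(1 for u in events if abs(u - t) <= window_ns)
        if neighbours >= min_events:
            kept.append(t)
    return kept

# Mostly random "sunlight" timestamps, plus a bunched laser return near 503 ns:
hits = [12.4, 87.1, 213.0, 502.1, 502.6, 503.0, 503.4, 641.9, 880.2]
print(coincidence_filter(hits))  # -> the four clustered events around 503 ns
```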

"Due to the fact that we built one component of LiDAR, we tried to build a camera to see how well the detector works, and the improvements that can be seen in our live videos," explains Ruskowski. "We can implement other laser sources and our customers can see what is good for the different applications."

Elsewhere at Photonics West, Osram Opto Semiconductors participated in sessions about blue laser technology and quantum-dot based LEDs, Strauss notes. The company also presented its latest VCSEL products at the Photonics West Exhibition. Markus Arzberger, the company's General Manager, Product Line Sensors, chaired the Photonics Mobility Forum on the industry stage Wednesday afternoon. "This session highlighted the growing role of optics and photonics in today's autonomous systems marketplace," Strauss says.

Huge demand

Together with Joyson Safety Systems, Osram provides products that enable "Super Cruise," the industry's first true hands-free driving technology for the highway. Osram's infrared LEDs, or IREDs, and visible LEDs are embedded in Joyson Safety Systems' steering wheel in the Cadillac CT6, Strauss explains, allowing the system to monitor driver attentiveness. "Multi-color LEDs are used to alert drivers if they look away from the road too long and to show the vehicle's autonomous status," he says. Osram also cooperates with Rinspeed, a Swiss automobile manufacturer, which "shows how autonomous cars could look in future." "You get access to the car through biometric identification technologies like facial recognition," Strauss says. "Thanks to Human Centric Lighting, the car adjusts the brightness of ambient lighting to help passengers feel more comfortable."

Osram believes that the more its components improve in terms of brightness and reliability, the more they help advance the overall systems needed for autonomous driving. "It is essential that infrared lasers for LiDAR cover a long distance and enable high-resolution pictures for the infrared cameras," Strauss stresses. "In general, the better each component of the complete system gets, the more reliable they become and the faster they can be adopted by customers."


Driver monitoring systems will be important all the way through to level 4 vehicle automation. Credit: Osram Opto Semiconductors

At Montana State, Shaw works at the system level, designing LiDARs for many different applications, most recently using MEMS devices to scan the environment. At Photonics West, he used his knowledge to teach a course called "Introduction to LiDAR for Autonomous Vehicles." "It's a half-day short course that covers the basic principles and physics, as well as the optical layout of LiDAR," he explains. "We discuss the challenges presented by autonomous vehicle LiDAR and how that drives the design and development of new technologies to enable better systems at a lower cost."

Such a course is needed because ADAS are appearing rapidly, Shaw notes. Currently, ultrasonic systems and radar are commonplace in high-end cars. Radar is the best-developed, fully integrated, low-cost solution in ADAS, but it is nowhere near the capability of LiDAR. Shaw believes that LiDAR uptake will grow as full autonomy comes closer to reality. "Behind closed doors, a lot of this is being done very quietly and people are trying to just run faster than each other and then reveal their great product when it comes time," he says. "There's a huge demand for people who can work with the software and hardware of LiDAR systems. But the number of academic programs that actually teach people and give them hands-on experience with designing, building, and using LiDARs is extraordinarily small."

Progress in the field underlines the need for better LiDAR skills in particular. "The driver assistance world has grown very rapidly and is becoming quite mature already," Shaw stresses. "There are a lot of sensors that are being deployed already that are very low cost and very practical, but they're nowhere near the capability of the LiDARs that we're considering as tools to be fully autonomous."

Andy Extance is a freelance science journalist based in the UK. A version of this article originally appeared in the 2020 Photonics West Show Daily.

