Sensing an Autonomous Vehicle Future
The race is on to capture the lucrative autonomous vehicle (AV) market: traditional automakers such as Volvo, Mercedes, BMW, Audi, Toyota, GM, and Ford are vying with new market entrants Apple, Uber, Waymo, Oxbotica, and Baidu. To meet the definition of a fully autonomous vehicle, a car must be able to navigate without human intervention, to a predetermined destination, over roads that have not been adapted for its use. The prize is massive. According to New York-based Kenneth Research, the global self-driving car market will expand at an annual rate of 36.2 percent, reaching global revenue of $173.15 billion by 2023.
Self-driving car systems combine vast amounts of data from image-recognition systems with machine learning and neural networks to drive autonomously. The race toward full autonomy continues, but the current state of sensing technology is hampering commercialization, and sensor problems have been blamed for several accidents in recent years.
In March 2018, an Uber autonomous Volvo SUV struck and killed a woman in Tempe, Arizona, in what was the first reported pedestrian death involving an AV. A year earlier, Uber suspended tests after one of its autonomous vehicles was involved in a multicar collision, although on this occasion there were no serious injuries. In 2016, the California Department of Motor Vehicles ordered all Uber AVs off the road after several were filmed running red lights.
Uber is not the only company to suffer problems with AV technology. Also in 2016, the first death attributed to an AV occurred in Florida when the autopilot sensors on a Tesla Model S failed to detect an 18-wheel truck and trailer; the driver was killed in the collision.
Traditionally, four types of sensors form the AV toolkit: video cameras, radar, ultrasonic sensors, and lidar. Radar uses radio waves to detect objects, while lidar uses light waves. Because of its shorter wavelengths, lidar is more accurate than radar, although radar is preferred where detection distance matters more than an object's exact size and shape. Each sensor type has its limitations: glare distorts video, radar offers poor resolution, ultrasonic sensors work only at short range, and lidar copes badly with poor weather. For an autonomous vehicle to be roadworthy, its perception must be accurate enough to classify any object at a variety of distances.
The answer could be far infrared (FIR) thermal sensors, which give vehicles reliable detection of the road and its surroundings. Numerous companies have been developing FIR camera systems that can see people and objects in extremely challenging conditions. It is a solution that meets the approval of Tesla CEO Elon Musk, who has acknowledged that the right sensor suite is not yet available; FIR could be the answer.
Israel-based startup AdaSky is one such company with an infrared solution. Its Viper thermal sensing camera passively collects heat signatures from nearby objects and converts them into VGA video; computer vision algorithms then detect and classify the objects.
"Unlike other sensing options, thermal sensors do not require any light to accurately detect, segment, and classify objects and pedestrians and are therefore well suited to improve AV systems safety drastically," Raz Peleg, AdaSky sales director, explains. "Far-infrared sensors can deliver reliable, accurate detection in real time and in any environmental condition because they access an additional layer of information that the other sensors do not. While radar and lidar sensors transmit and receive signals, a FIR camera passively collects signals by detecting the thermal energy that radiates from objects. By sensing this infrared wavelength that is far longer than visible light, FIR cameras access a different band of the electromagnetic (EM) spectrum than other sensing technologies do."
Infrared radiation lies just beyond the red end of the visible spectrum, between visible light and microwaves, and is therefore invisible to the human eye. Infrared wavelengths run from 0.75 to 1000 µm and are commonly separated into three regions: near-infrared from 0.75 to 3 µm, mid-infrared from 3 to 8 µm, and far-infrared above 8 µm, although the exact boundaries vary in the literature. The infrared window commonly used in low-cost thermal imagers spans 8–14 µm, also known as LWIR (long-wave infrared). The FIR camera can thus generate a new layer of information, making it an all-weather solution that enables AVs to detect objects that may not otherwise be perceptible to radar, cameras, or lidar.
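The band boundaries above can be expressed as a small lookup. This is an illustrative sketch only; it follows the cutoffs quoted in this article, which, as noted, vary across the literature.

```python
def classify_ir(wavelength_um: float) -> str:
    """Return the infrared region for a wavelength given in micrometers,
    using the band boundaries quoted in the article."""
    if not 0.75 <= wavelength_um <= 1000:
        return "outside infrared"
    if wavelength_um < 3:
        return "near-infrared"
    if wavelength_um < 8:
        return "mid-infrared"
    return "far-infrared"


def in_lwir_window(wavelength_um: float) -> bool:
    """True if the wavelength falls in the 8-14 um LWIR window used by
    low-cost thermal imagers."""
    return 8 <= wavelength_um <= 14


print(classify_ir(10.0), in_lwir_window(10.0))  # far-infrared True
```

Note that the LWIR window sits entirely inside the far-infrared region as defined here, which is why the terms FIR and LWIR are often used interchangeably in automotive contexts.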
"It's also worth noting that FIR's passivity offers another advantage to autonomous vehicles: no interference," Peleg adds. "Because lidar and radar are active, energy-emitting modalities, the lidar and radar installed and functioning on one vehicle may interfere with and upset that of another passing vehicle. Conversely, as a passive technology, FIR can work to detect a vehicle's surroundings without ever upsetting the sensors of other vehicles.
Autonomous vehicle platforms. Photo Credits: Waymo, Tesla, Uber
Creating a Sensor Platform
Incorporating sensors into autonomous or standard road cars is the task of tier one suppliers such as Sweden-based Veoneer, which puts together a complete sensing platform along with the software algorithms required to fuse data from the range of sensors with GPS information. Although these sensor systems will be the brain of future autonomous cars, they are also very much part of today's cars, delivering advanced driver assistance systems. Veoneer has sensor systems on BMW, Audi, Mercedes, Bentley, Rolls-Royce, Peugeot, Cadillac, Porsche, and Lamborghini, and is working with new market entrants as well.
Veoneer's thermal sensing system uses an infrared camera mounted in the front grille that senses heat differences as small as a tenth of a degree to create a highly detailed thermal image of the road ahead. An onboard computer runs custom algorithms to detect animals, pedestrians, and cyclists up to 100 meters ahead of the vehicle, reacting in less than 150 milliseconds to detect and highlight them on an in-car display. Thermal sensing helps drivers see objects three to four times farther than the vehicle's headlight range and improves visibility in fog, smoke, and oncoming headlight glare.
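A back-of-the-envelope calculation puts the 100-meter range and 150-millisecond reaction time in context. The highway speed used here is an assumption for illustration, not a figure from the article.

```python
# Sketch: what the article's range and latency figures mean at speed.
speed_kmh = 100.0            # assumed highway speed (illustrative)
speed_ms = speed_kmh / 3.6   # convert km/h to m/s (~27.8 m/s)

reaction_s = 0.150           # <150 ms system reaction time (from the article)
detection_m = 100.0          # detection range for pedestrians/animals (from the article)

# Distance the vehicle covers while the system reacts (~4.2 m).
distance_during_reaction = speed_ms * reaction_s

# Time until the vehicle reaches an object first seen at detection range (~3.6 s).
warning_time = detection_m / speed_ms

print(f"{distance_during_reaction:.1f} m traveled during reaction")
print(f"{warning_time:.1f} s of warning at {detection_m:.0f} m range")
```

In other words, at highway speed the system's own latency costs only a few meters, while the 100-meter detection range buys several seconds of warning, which is the margin that matters for braking or evasive action.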
"The carmakers that we work with recognize that they need thermal imaging," Stuart Klapper, senior director at Veoneer, explains. "Vison at night is clearly a problem with 69 percent of all pedestrian traffic fatalities occurring at night, over 4,000 a year. Now, as we enter into autonomous vehicles, the problem becomes even greater because when you have a human in the loop, they could react to critical use cases that are not easy to experience if you're in an autonomous vehicle."
This means AVs must have more information and better redundancy to handle all those situations. Night driving is particularly problematic. "Vision cameras do not have enough resolution or clarity at night," Klapper adds. "Shadows on the road and oncoming headlights add to the complexity, making it difficult for a vision camera to be able to detect objects. With radar, sometimes you can get information that there is something ahead, but you cannot classify what that object is. The same is valid with lidar; it is excellent at 360-degree awareness, but there are many vital situations where it cannot classify an object."
The only technology capable of reliably seeing objects at night, especially humans and animals, is a far infrared camera system. There is much discussion about why the Uber vehicle hit the pedestrian at night in Arizona, but one thing most agree on is that a thermal device on the vehicle would have given it a better chance of avoiding the accident.
The biggest challenge facing AV manufacturers right now is safety. "Many of the manufacturers are making systems that are very effective in normal roadway conditions; however, where these systems struggle is what we call corner cases and challenging lighting," says Paul Clayton, general manager of FLIR's OEM and Emerging division.
The current AV sensor suite has a performance gap in low light and darkness. Corner cases are those relatively uncommon but genuine scenarios that human drivers can typically negotiate but that pose significant problems for AV systems. "At the moment, the industry is focusing on solving those complex and less common roadway scenarios, which require more sensor technology and better algorithms," Clayton adds.
The general feeling from the automotive industry is that the existing suite of sensors deployed on autonomous vehicles today has proven to be insufficient for all conditions and roadway scenarios. "That is why automakers and suppliers have begun to examine complementary sensor technology, including thermal cameras, or what we call long-wave infrared cameras," Clayton says.
"FLIR sees thermal imaging as a complementary technology that improves upon the prevailing sensors by offering a parallel and redundant dataset to help improve the automated decision making on the vehicle system while also adding new data that the prevailing sensor system cannot detect—heat energy," Clayton adds. "That heat energy, or infrared radiation, constitutes a different part of the electromagnetic spectrum that the other sensors do not capture, be it lidar, visible, ultrasonic, or radar sensors."
In essence, thermal imaging enables the autonomous car to see warm objects, which is everything warmer than absolute zero (-273.15 degrees Celsius). Thermal cameras allow the AV to see the things we least want to hit, whether the vehicle is in complete darkness, in cluttered city environments, facing direct sun, or facing headlight glare.
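The physics behind that claim can be sketched with the Stefan-Boltzmann law, which is not cited in this article but is the standard relation for thermal emission: every body above absolute zero radiates power proportional to the fourth power of its absolute temperature, which is why a warm pedestrian stands out against cool pavement. The temperatures chosen below are illustrative assumptions.

```python
# Sketch: black-body thermal emission via the Stefan-Boltzmann law.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)


def radiated_power(celsius: float) -> float:
    """Radiated power density (W/m^2) of an ideal black body at the
    given temperature in degrees Celsius."""
    kelvin = celsius + 273.15
    return SIGMA * kelvin ** 4


person = radiated_power(37.0)   # assumed skin-temperature pedestrian
road = radiated_power(10.0)     # assumed cool night-time pavement

print(f"pedestrian: {person:.0f} W/m^2, road: {road:.0f} W/m^2")
```

Even a modest temperature difference yields a clearly measurable contrast in emitted power, and that contrast is present regardless of ambient lighting, which is the core advantage of passive thermal sensing.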
To bring AVs to the mass market, the consensus among developers is that each vehicle will need a multisensor platform. To deliver the complete detection capabilities required, carmakers are leaning toward adopting multiple FIR sensors for the highest possible levels of safety. FIR is seen as the enabling sensor for full autonomy because, in addition to its sensing capabilities, the technology is affordable enough for mass-market deployment. Many within the industry predict it could be the final piece in the AV puzzle, hastening the advent of an autonomous driving future.
Mark Venables is an award-winning technology writer who has covered science, innovation, and transformational technology for leading newspapers and magazines for over 25 years.