Beyond resolution, sensitivity looms large for infrared thermal imaging cameras

01 July 2022
Oscar Angel
[Image: Comparing the performance of a 20 mK camera (top) with a 50 mK camera (bottom).]

Thermal imagers create images from infrared (IR) energy, delivered via digital or analog video outputs, with detail defined by differences in temperature. A camera that detects visible light won’t see thermal energy, and vice versa.

Infrared cameras are made of an array of individual detector elements. Because IR wavelengths are longer than those of visible light, each IR detector element must be correspondingly larger than those on visible light detectors. As a result, a thermal camera usually has lower resolution than a visible light sensor of the same size.

Originally developed for surveillance and military operations, thermal cameras are now widely used for industrial applications such as building inspections, firefighting, autonomous vehicles, automatic emergency braking systems, industrial inspections, scientific research, and more. These cameras come in a variety of form factors, from handheld to unmanned drones, to scientific instruments sent into space.

Engineers developing products or systems incorporating thermal cameras need to have a clear understanding of the key design specifications including scene dynamic range, field of view, resolution, sensitivity, and spectral range. Different cameras excel at different tasks, so engineers need to understand the tradeoffs between different types of thermal camera modules and their impact on product performance. 

One important specification, often overlooked in favor of resolution, is thermal sensitivity, also referred to as the instrument’s noise equivalent temperature difference (NETD). It defines the smallest temperature difference a camera can detect.

A thermal camera’s NETD has a direct impact on image clarity and sharpness: the lower the number, the more sensitive the detector. Integrators and developers should look for manufacturers that provide NETD performance measured at the industry-standard scene temperature of 30 °C.
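To make the definition concrete, NETD can be understood as the camera’s temporal noise divided by its responsivity (signal change per kelvin of scene temperature). The sketch below illustrates that ratio from synthetic frame data; the function name and procedure are illustrative assumptions, not a standardized test method, which in practice involves a calibrated blackbody and a controlled procedure.

```python
import numpy as np

def estimate_netd(frames_t1, frames_t2, t1_c, t2_c):
    """Illustrative NETD estimate (in kelvin) from two stacks of raw
    frames of a uniform blackbody at scene temperatures t1_c and t2_c
    (degrees C). Each stack has shape (num_frames, rows, cols).

    This is a simplified sketch, not a standardized measurement.
    """
    # Responsivity: mean signal change per kelvin of scene temperature.
    responsivity = abs(frames_t2.mean() - frames_t1.mean()) / abs(t2_c - t1_c)

    # Temporal noise: per-pixel standard deviation over time,
    # averaged across the detector array.
    temporal_noise = frames_t1.std(axis=0).mean()

    # NETD: the temperature difference whose signal equals the noise.
    return temporal_noise / responsivity
```

With synthetic frames whose responsivity is 100 counts/K and whose temporal noise is 2 counts, this yields roughly 0.02 K, i.e. 20 mK, the sensitivity class discussed in this article.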

Increased sensitivity is especially important in scenes with low thermal contrast and when operating in challenging environmental conditions such as fog, cloud cover, and rain. Lower-cost thermal cameras with merely adequate sensitivity will produce poor image quality in low-contrast scenes, reduced detection range, and limited situational awareness.

IR cameras with an imaging sensor cooled to cryogenic temperatures provide distinct advantages over uncooled detectors because they reduce noise to a level below that of the scene being imaged.

Improvement in sensitivity comes at a cost, however. Cooled IR cameras are generally larger, heavier, and more power hungry. They are significantly more expensive to buy and subject to mechanical wear and tear that leads to failure and shorter product life. That’s because cryocoolers have moving parts with extremely tight mechanical tolerances that degrade over time, and they rely on helium gas that can slowly leak through seals.

Recent improvements in uncooled thermal sensors, however, have brought sensitivity to better than 20 mK, a drastic improvement over legacy systems that makes them a viable option for many new applications. But it’s important to note that uncooled IR cameras cannot simply replace cooled cameras. Product developers and system integrators need to consider additional requirements regarding imaging speed, spatial resolution, spectral filtering, and more.

Oscar Angel is Product Manager, Uncooled Camera Modules, Teledyne FLIR.

For more information on thermal sensitivity, or to speak with an expert, please see
