Imagery analysis: standards needed for the drone age

Breaking down statements into their component facts, and developing a reporting format that includes the methods used to obtain them, helps improve the quality of image analysis.
21 August 2014
Barbara Grant

Imagine that aerial imagery gathered from an unmanned aerial vehicle (UAV) sensor shows you at the flashpoint of a demonstration gone violent. You are now facing criminal charges, and key to your guilt or innocence is IR imagery depicting some nearby bright flashes. You know that they were sun reflections off broken glass, but prosecutors contend that they are the IR signatures of Molotov cocktails. Their expert (who has never seen Molotov signatures on IR before) has calculated that sun reflections from glass were too weak to be seen by the camera. Your expert concluded otherwise, but you are more than a little apprehensive as you realize that your fate now hinges on scientific statements made in a venue where no standards to judge their quality exist.

The above scenario is less far-fetched than it sounds. It is the result of combining two conditions—the expected proliferation of UAVs in America's skies1 and the lack of standards to judge the quality of an analysis—with some recent history, namely, the controversy over the origin of bright flashes appearing on forward-looking IR (FLIR) imagery captured during the 1993 standoff between religious sect members and federal agents at Waco, TX. (By federal agents I mean individuals acting on behalf of the government. No specific agency affiliation should be assumed.) The dispute's primary technical issue was whether these flashes represented gunfire or some other phenomenon, such as solar reflections from debris. By examining issues pertinent to this controversy, we can gain insight into what is required to improve the quality of imagery analysis.

My approach to the task borrowed from my background in radiometric calibration. Like calibration, analysis is a process. Component parts—facts, in the analysis case—are arranged by the analyst in a logical order and lead to a result. Each statement that an analyst takes as fact contributes an uncertainty to the outcome. By examining significant statements from analysts who had worked on the flash problem,2–6 I developed error categories. These categories allowed me a closer look at what analysts thought to be the facts behind their statements. I also documented how analysts obtained their facts, with observation, modeling, testing, and literature searches among the methods.
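If each fact in the chain carries its own uncertainty, a calibration-style error budget suggests how they might combine. The sketch below is illustrative only: the fractional uncertainties are assumed placeholder values, and the facts are treated as independent so they combine in quadrature, as in a standard radiometric error budget.

```python
import math

def combined_uncertainty(uncertainties):
    """Root-sum-square combination of independent fractional uncertainties,
    the same rule used in radiometric calibration error budgets."""
    return math.sqrt(sum(u ** 2 for u in uncertainties))

# Hypothetical per-fact fractional uncertainties for a three-fact chain
facts = {
    "sunglint observed on imagery": 0.05,
    "resolution estimate from modeling": 0.15,
    "flash duration from literature": 0.10,
}
total = combined_uncertainty(facts.values())  # ~0.19
```

Note that the total is dominated by the fact with the largest uncertainty, which is why documenting how each fact was obtained matters.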

The error categories I identified were: errors resulting from an incorrect calculation or model; errors embedded within analysis logic; and errors arising from an analyst's lack of knowledge about a key phenomenon. An example of the first error type is the statement that sun reflections do not show up on thermal IR imagery. While this error in fact resulted from incomplete modeling, it was also contradicted by the data. Figure 1 shows a metal post, circled, from which a strong glint may be seen in Video 1.7 Even if we did not have the photo for reference, the close alignment of the long shadow from the tower with our viewing direction is a strong indicator of sunglint.
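A back-of-the-envelope Planck calculation shows why the claim fails: a specular reflection preserves the radiance of its source, so even weakly reflective glass redirects the Sun's roughly 5800 K radiance into the mid-wave IR, where it can dwarf a roughly 300 K terrestrial background. The numbers below (4 micron wavelength, 4% reflectance) are illustrative assumptions, not parameters of the sensor used at Waco.

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 sr m)."""
    x = H * C / (wavelength_m * K * temp_k)
    return (2 * H * C ** 2 / wavelength_m ** 5) / math.expm1(x)

WL = 4.0e-6  # mid-wave IR, 4 um
glint = 0.04 * planck_radiance(WL, 5800.0)  # ~4% specular reflectance glass
scene = planck_radiance(WL, 300.0)          # ~300 K terrestrial background
ratio = glint / scene  # glint exceeds background by thousands of times
```

The glint fills only a small fraction of a pixel, which reduces the measured signal, but the radiance margin is so large that visible sunglints on thermal imagery are entirely plausible, consistent with the observation in Figure 1.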


Figure 1. Metal post near swimming pool, Federal Bureau of Investigation photograph of Mt. Carmel complex, 19 April 1993.7

Uncovering the facts behind an embedded error required a few more steps. According to several government analysts, persons had to be viewable near an IR flash for the flash to be a gunfire signature.8, 9 Embedded within this proposition are the supporting facts that the thermal and spatial resolution of the IR video were sufficient to allow persons to be seen. Video 2 refutes these underlying assumptions, showing that persons were viewable only when the thermal contrast was great enough to allow it.10 The video also exhibits insufficient spatial resolution to see individuals who had stopped moving. Figure 2 depicts conditions near a flash in which the likelihood of seeing persons is even lower than on the video. I sought to decouple this error from analysis logic by selecting the reporting format shown in Table 1, which prominently features key facts and information on how they were obtained.
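The spatial-resolution assumption is easy to check arithmetically: the ground footprint of one detector element is approximately the instantaneous field of view (IFOV) times the slant range. The IFOV and range below are hypothetical round numbers chosen for illustration, not the actual parameters of the Waco FLIR.

```python
def ground_sample_distance(ifov_rad, slant_range_m):
    """Ground footprint (m) of one detector element, small-angle approximation."""
    return ifov_rad * slant_range_m

# Hypothetical: a 0.5 mrad IFOV sensor at 2.5 km slant range
gsd = ground_sample_distance(0.5e-3, 2500.0)  # 1.25 m per pixel
# A person roughly 0.5 m across would span less than half a pixel and,
# once motionless and near thermal equilibrium with the background,
# would be difficult or impossible to distinguish.
```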


Figure 2. Bright flash and surrounding area, from Federal Bureau of Investigation FLIR videotape, 19 April 1993.10
Table 1. Sample reporting format incorporating both facts and methods. FLIR: forward-looking IR.

Number | Fact                                                                           | How determined?
1      | Sun reflections may be seen in the IR images                                   | By observation; this correlated with calculations
2.1    | No people are seen near the flashes                                            | By observation
2.2    | Spatial and thermal resolution were sufficient to view persons near flashes    | By observation
3      | Gunfire flashes do not last long enough to have generated the flashes on FLIR  | Literature survey

The most obvious source of error in the Waco controversy was analysts' lack of experience with the primary phenomenon, gunfire, and I created a separate category called ‘phenomenology error’ to address it. I suggested that the practice of retaining discipline specialists—standard in audio and video authentication today—be extended to subject matter in which the analyst has no experience. For most Waco analysts, these subjects included muzzle flash phenomenology, the differences between sniper rifles and combat arms, military versus commercial ammunition, flash suppressants, and environmental influences on flash signatures. The specialist's judgment on the issue would appear in the report with his/her name and credentials documented in the ‘How determined?’ column.

My work thus far11 has focused on breaking down statements used in analysis into their component facts, and developing a reporting format that includes the methods used to obtain them. I am now working on improvements, including developing metrics allowing uncertainties to be quantified. For example, there is less uncertainty in actually observing a sunglint than in calculating that one may be observed. Yet simply adopting reporting standards like those suggested will constitute a step forward. We stand poised on the brink of a new era in imagery collection. We need to be ready with standards for imagery analysis.
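One way such metrics might look in practice: attach an ordinal uncertainty weight to each entry in the 'How determined?' column, so a report can flag which facts rest on the weakest methods. The weights below are placeholders I have assumed for illustration; they are not established values.

```python
# Assumed ordinal weights: direct observation is least uncertain,
# pure modeling most. These numbers are illustrative placeholders.
METHOD_UNCERTAINTY = {
    "observation": 0.05,
    "observation + calculation": 0.10,
    "testing": 0.15,
    "literature survey": 0.20,
    "modeling": 0.25,
}

def least_certain(facts):
    """Return the fact whose supporting method carries the largest weight."""
    return max(facts, key=lambda fact: METHOD_UNCERTAINTY[facts[fact]])

facts = {
    "Sun reflections may be seen in the IR images": "observation + calculation",
    "No people are seen near the flashes": "observation",
    "Gunfire flashes too brief to explain FLIR flashes": "literature survey",
}
weakest = least_certain(facts)
```

A reviewer reading such a report could then direct scrutiny, or a retained discipline specialist, toward the facts with the largest assumed uncertainty.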


Barbara Grant
Lines and Lights Technology
Cupertino, CA

Barbara G. Grant, SPIE senior member, earned her master's in optical sciences from the University of Arizona. She has more than 30 years' professional engineering experience encompassing areas such as radiometric calibration, electro-optical systems, and imagery analysis. She is the author of Field Guide to Radiometry, co-author of The Art of Radiometry, and is working on a book on UAV imaging sensors. She teaches classes to professionals through SPIE, the University of California Irvine Extension, and Georgia Tech Professional Education.


References:
1. Associated Press, Feds announce test sites for drone aircraft, 30 December 2013. https://www.accessnorthga.com/detail.php?n=269353 Accessed 7 August 2014.
2. B. Grant, Feds on Waco: shooting in the dark, 12 April 2003. Online article on failures of analysis at Waco, TX, including an extensive reference list of analysts. http://www.wnd.com/2003/04/18242/ Accessed 7 August 2014.
3. L. M. Klasen, Waco investigation: analysis of FLIR videotapes, Proc. SPIE 4370, pp. 271-285, 2001. doi:10.1117/12.440087
4. D. S. Frankel, Assessment of Waco, Texas FLIR videotape, Proc. SPIE 4370, pp. 286-300, 2001. doi:10.1117/12.440088
5. F. H. Zegel, Infrared signatures of small arms weapons fire, Proc. SPIE 4370, pp. 301-313, 2001. doi:10.1117/12.440089
6. B. G. Grant, D. T. Hardy, Muzzle flash issues related to the Waco FLIR analysis, Proc. SPIE 4370, pp. 314-324, 2001. doi:10.1117/12.440090
7. Federal Bureau of Investigation (FBI) forward-looking IR (FLIR) video showing sunglint from a metal post at Mt. Carmel, 19 April 1993. http://spie.org/documents/newsroom/videos/5594/video1.wmv Accessed 10 August 2014.
8. Vector Data Systems, Imagery analysis report: the events at Waco Texas 19 April 1993. Report prepared for the US District Court for the Western District of Texas and the Office of Special Counsel, 5 May 2000. http://news.findlaw.com/cnn/docs/waco/vector_report.html Accessed 10 August 2014.
9. L. Klasén, Imagery analysis report: the events at Waco Texas 19 April 1993. Report by Vector Data Systems prepared for the US District Court for the Western District of Texas and the Office of Special Counsel, 5 May 2000. http://www.apologeticsindex.org/pdf/klasen.pdf Accessed 10 August 2014.
10. FBI FLIR video showing personnel crossing the Mt. Carmel storm shelter roof after the fire on 19 April 1993. http://spie.org/documents/newsroom/videos/5594/video2.wmv Accessed 10 August 2014.
11. B. Grant, Imagery analysis and the need for standards, Proc. SPIE 9195, 2014. (Invited paper.)