Proceedings Volume 6946

Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications V

View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 19 May 2008
Contents: 4 Sessions, 16 Papers, 0 Presentations
Conference: SPIE Defense and Security Symposium 2008
Volume Number: 6946

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Front Matter: Volume 6946
  • Multispectral ISR Sensors
  • 3D ISR Sensors
  • ISR Processing
Front Matter: Volume 6946
This PDF file contains the front matter associated with SPIE Proceedings Volume 6946, including the Title Page, Copyright information, Table of Contents, and the Conference Committee listing.
Multispectral ISR Sensors
Real-time multispectral data collection, processing, downlink, and display: test and demonstration
Denise Runnels, Scott Peterman, Jonathan Powell, et al.
Radiance Technologies, Inc. has tested and demonstrated real-time collection and on-board processing of multispectral imagery (MSI). The test and demonstration also included a real-time downlink from the aircraft to the ground station and real-time display of the processed data product. The multispectral imagery was collected with a low-cost, low-profile MSI sensor, MANTIS-3T, from PAR Government Systems. The data product was created from the output of a novel spectral algorithm combination that increases the probability of detection and decreases the false alarm rate for specific objects of interest. The display product was a compressed true-color image in which the detected objects were delineated with red pixels. The end-to-end solution, issues encountered and their resolution, and results are discussed.
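The spectral algorithm combination itself is not disclosed in the abstract. Purely as an illustration of the class of processing described, the sketch below applies a standard spectral matched filter to an MSI cube and paints detections red on a true-color composite; the signature, threshold, and array shapes are assumptions rather than details from the paper.

```python
import numpy as np

def spectral_matched_filter(cube, target_sig):
    """Score each pixel of an (H, W, B) multispectral cube against a
    target signature using the classic matched-filter statistic."""
    h, w, b = cube.shape
    x = cube.reshape(-1, b).astype(np.float64)
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(b)  # regularized covariance
    w_mf = np.linalg.solve(cov, np.asarray(target_sig, float) - mu)
    return ((x - mu) @ w_mf).reshape(h, w)

def overlay_detections(rgb, scores, thresh):
    """Delineate pixels whose score exceeds thresh in red on a true-color
    (H, W, 3) uint8 image, as in the display product described above."""
    out = rgb.copy()
    out[scores > thresh] = (255, 0, 0)
    return out
```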
Overcoming adverse weather conditions with a common optical path, multiple sensors, and intelligent image fusion system
Joseph Ng, Michael Piacentino, Brian Caldwell
Mission success is highly dependent on the ability to accomplish surveillance, situation awareness, and target detection and classification, but this is challenging under adverse weather conditions. This paper introduces an engineering prototype that addresses the image collection challenges using a common optical path, multiple sensors, and an intelligent image fusion system, and provides illustrations and sample fusion images. Panavision's advanced wide-spectrum optical design permits a suite of imagers to perform observations through a common optical path with a common field of view, thereby aligning images and facilitating optimized downstream image processing. The adaptable design also supports continuous zoom or Galilean lenses for multiple fields of view. The multiple sensors include: (1) high-definition imaging sensors that are small and have low power consumption and a wide dynamic range; (2) EMCCD sensors that transition from daylight to starlight, even under poor weather conditions, with sensitivity down to 0.00025 lux; and (3) SWIR sensors that, with advances in InGaAs, generate ultra-high-sensitivity images from 1–1.7 μm reflected light and can image through haze and some types of camouflage. The intelligent fusion of multiple sensors provides high-resolution color information with previously unattainable sensitivity and contrast. With the integration of Field-Programmable Gate Arrays (FPGAs) and Application-Specific Integrated Circuits (ASICs), real-time image processing and fusion algorithms can facilitate mission success in a small, low-power package.
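The fusion algorithms themselves run in FPGA/ASIC hardware and are not specified in the abstract. As a minimal software sketch of the general idea, assuming two already co-registered grayscale frames from the common optical path (so no software alignment is needed), the example below weights each band by its local contrast so that the more informative band dominates each region; the window size and normalization are arbitrary choices.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def contrast_weighted_fusion(vis, swir, k=9):
    """Fuse two co-registered frames (floats in [0, 1]) by weighting each
    band with its local variance in a k x k window."""
    def local_var(img):
        m = uniform_filter(img, k)
        return np.maximum(uniform_filter(img * img, k) - m * m, 0.0)
    wv, ws = local_var(vis), local_var(swir)
    w = wv / (wv + ws + 1e-9)          # per-pixel weight for the first band
    return np.clip(w * vis + (1.0 - w) * swir, 0.0, 1.0)
```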
Spectral detection and monitoring of marine mammals
This note presents an airborne spectral imaging system and the methodology used to detect, track, and monitor marine mammal populations. The system is a four-band multispectral imaging system using spectral bands tailored for maritime imaging. This low-cost, low-volume imaging sensor can be deployed on either a small unmanned air vehicle (UAV) or any other cost-efficient aircraft. Results of recent multispectral data collects over marine mammals in the St. Lawrence Seaway are presented. Species present included beluga whales as well as various species of larger baleen whales.
DUSTER: demonstration of an integrated LWIR-VNIR-SAR imaging system
Michael L. Wilson, Dale Linne von Berg, Melvin Kruer, et al.
The Naval Research Laboratory (NRL) and Space Dynamics Laboratory (SDL) are executing a joint effort, DUSTER (Deployable Unmanned System for Targeting, Exploitation, and Reconnaissance), to develop and test a new tactical sensor system specifically designed for Tier II UAVs. The system is composed of two coupled near-real-time sensors: EyePod (a VNIR/LWIR ball gimbal) and NuSAR (an L-band synthetic aperture radar). EyePod consists of a jitter-stabilized LWIR sensor coupled with a dual focal-length optical system and a bore-sighted high-resolution VNIR sensor. The dual focal-length design, coupled with precision pointing and step-stare capabilities, enables EyePod to conduct wide-area survey and high-resolution inspection missions from a single flight pass. NuSAR is being developed with partners Brigham Young University (BYU) and Artemis, Inc., and consists of a wideband L-band SAR capable of large-area survey and embedded real-time image formation. Both sensors employ standard Ethernet interfaces and provide geo-registered NITFS output imagery. In the fall of 2007, field tests were conducted with both sensors; results will be presented.
Near infrared missile warning testbed sensor
D. J. McDermott, R. S. Johnson, J. B. Montgomery, et al.
Multicolor discrimination is one of the most effective ways of improving the performance of infrared missile warning sensors, particularly in heavy-clutter situations. A new tactical airborne multicolor missile warning testbed was developed and fielded as part of a continuing Air Force Research Laboratory (AFRL) initiative focusing on clutter and missile signature measurements for effective missile warning algorithms. The sensor testbed comprises a multi-camera system with 1004x1004 FPAs coupled with optimized spectral filters integrated with the optics; a reduced-form-factor, microprocessor-based video data recording system operating at 48 Hz; and a real-time field-programmable gate array processor for algorithm and video data processing capable of 800 billion multiply/accumulate operations per second. A detailed radiometric calibration procedure was developed to overcome severe photon-limited operating conditions due to the sub-nanometer bandwidth of the spectral filters. This configuration allows the collection and real-time processing of temporally correlated, radiometrically calibrated video data in multiple spectral bands. The testbed was used to collect spectra of false alarm sources and Man-Portable Air Defense System (MANPADS) signatures under a variety of atmospheric and solar illumination conditions. Signatures of approximately 100 missiles have been recorded.
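The calibration procedure itself is not given in the abstract. As a rough sketch of the standard two-point radiometric calibration such a procedure typically builds on, the example below derives per-pixel gain and offset from two uniform reference scenes; the variable names and reference-scene setup are assumptions.

```python
import numpy as np

def two_point_calibration(counts_cold, counts_hot, L_cold, L_hot):
    """Per-pixel gain/offset from two uniform reference scenes at known
    radiances L_cold < L_hot (e.g. two blackbody settings)."""
    gain = (L_hot - L_cold) / (counts_hot - counts_cold)
    offset = L_cold - gain * counts_cold
    return gain, offset

def to_radiance(raw, gain, offset):
    """Convert raw sensor counts to calibrated radiance."""
    return gain * raw.astype(np.float64) + offset
```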
Performance analysis of a multispectral framing camera for detecting mines in the littoral zone and beach zone
BAE Systems Sensor Systems Identification & Surveillance (IS) has developed, under contract with the Office of Naval Research, a multispectral airborne sensor system and processing algorithms capable of detecting mine-like objects in the surf zone and land mines in the beach zone. BAE Systems has used this system in a blind test at a test range established by the Naval Surface Warfare Center - Panama City Division (NSWC-PCD) at Eglin Air Force Base. The airborne and ground subsystems used in this test are described, with graphical illustrations of the detection algorithms. We report on the performance of the system configured to operate with a human operator analyzing data on a ground station. A subsurface mine detection capability is demonstrated for bottom-proud underwater mines in the surf zone and moored mines in shallow water. Surface float detection and proud land mine detection capabilities are also demonstrated. Our analysis shows that this BAE Systems-developed multispectral airborne sensor provides a robust technical foundation for a viable mine countermeasures system, and it would be a valuable asset for use prior to an amphibious assault.
Color in perceptual tracking using low frame rate motion imagery
Darrell Young, Tariq Bakir, Fred Petitti, et al.
A perceptual evaluation compared tracking performance when using color versus panchromatic synthetic imagery at low frame rates. Frame rate was found to affect tracking performance for the panchromatic motion imagery. Color was associated with improved tracking performance at 2 frames per second (FPS), but not at 6 FPS or greater. Respondents' self-estimates of task confidence were found to correlate with measured tracking performance, which supports the use of task confidence as a proxy for task performance in the future development and validation of a motion imagery rating scale.
Mass properties factors in achieving stable imagery from a gimbal mounted camera
Daniel R. Otlowski, Kurt Wiener, Brandon A. Rathbun
Mass properties affect the stability of gimbaled imaging platforms. The relationship between static balance and jitter has, in some measure, been established previously; the definition and role of dynamic unbalance, however, are not as thoroughly understood. As a lead-in to the discussion of dynamic unbalance, we show analytically the jitter resulting from static-unbalance-generated forces, following these forces through the isolation mounting, servo pointing accuracy, and disturbance rejection systems using a single-degree-of-freedom model. We then apply the qualitative results to the discussion of dynamic unbalance, exploring some of the nuances of product of inertia and cross-coupling response. We offer strategies and specific methods for measuring and correcting static unbalance, and principles for reducing dynamic unbalance in gimbals, in order to reduce disturbance response and ultimately improve image resolution. Finally, we test a method for measuring and correcting dynamic unbalance to facilitate future quantitative testing aimed at jitter improvement.
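As a worked illustration of the single-degree-of-freedom model mentioned above (not the authors' actual analysis), the sketch below computes the centrifugal force produced by a static unbalance and the fraction of that force transmitted through an isolation mount; the unbalance, mount frequency, and damping values are made-up numbers.

```python
import numpy as np

def unbalance_force(unbalance_kg_m, omega_rad_s):
    """Centrifugal force F = (m r) * omega^2 from a static unbalance m*r."""
    return unbalance_kg_m * omega_rad_s**2

def transmissibility(omega, omega_n, zeta):
    """Classic force transmissibility of a single-DOF isolator."""
    r = omega / omega_n
    return np.sqrt((1 + (2 * zeta * r) ** 2) /
                   ((1 - r**2) ** 2 + (2 * zeta * r) ** 2))

# Example: a 2 g*mm unbalance spinning at 30 Hz, behind a 10 Hz mount
# with 10% damping (all values hypothetical).
w = 2 * np.pi * 30
F = unbalance_force(2e-6, w)                       # 2 g*mm = 2e-6 kg*m
F_into_gimbal = F * transmissibility(w, 2 * np.pi * 10, 0.10)
```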
3D ISR Sensors
3D rapid mapping
In this paper the performance of passive range measurement imaging using stereo techniques in real-time applications is described. Stereo vision uses multiple images to obtain depth resolution in a similar way to how Synthetic Aperture Radar (SAR) uses multiple measurements to obtain better spatial resolution. The technique has long been used in photogrammetry, but it is shown that, with carefully designed image processing algorithms, the calculations can now be done on, for example, a PC in real time. To obtain high-resolution, quantitative data in the stereo estimation, a mathematical camera model is used. The parameters of the camera model are determined in a calibration rig; in the case of a moving camera, the scene itself can be used to calibrate most of the parameters. After calibration, an ordinary TV camera has an angular resolution comparable to that of a theodolite, but at a much lower price. The paper presents high-resolution 3D imagery from air to ground. The 3D results from stereo calculation of image pairs are stitched together into a large database to form a 3D model of the area covered.
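The geometric relationship behind the passive ranging described above is the pinhole-stereo formula Z = fB/d. The sketch below (an illustrative assumption, not the authors' implementation) applies it to a disparity map from any stereo matcher; for an airborne collection, the baseline B is the distance flown between the two exposures, which is where the SAR analogy in the abstract comes from.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo: range Z = f * B / d, with focal length f in pixels,
    baseline B in meters, and disparity d in pixels. Non-positive
    disparities are masked to avoid division by zero."""
    d = np.asarray(disparity_px, dtype=np.float64)
    z = np.full_like(d, np.inf)
    valid = d > 0
    z[valid] = focal_px * baseline_m / d[valid]
    return z
```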
ISR Processing
A complete passive or imaging-based sensor system for unmanned air vehicle taking off and landing operations
We have successfully developed an innovative, miniaturized, and lightweight PTZ UCAV imager called OmniBird for unmanned air vehicle takeoff and landing operations. OmniBird was developed under SBIR funding from NAVAIR and is designed to fit within 8 in³. The designed zoom capability allows it to acquire focused images of targets at ranges from 10 to 250 feet. The innovative panning mechanism also gives the system a field of view of ±100 degrees. Initial test results show that the integrated optics, camera sensor, and mechanics solution allows OmniBird to stay optically aligned and shock-resistant in harsh environments.
SmartCapture: a compact video capture, encoding and streaming technology for UAVs
Wei Dai, Pankaj Topiwala
Live surveillance video is increasingly in demand on the battlefield to achieve information dominance, a critical DoD doctrine established long ago. With increasing wireless data rates and range, and the development of multi-hop mobile ad hoc networks (MANETs), real-time live video streaming from UAVs to ground stations will advance rapidly. UAVs of every size and shape are currently being formulated and fielded at a breathtaking pace, some of them no bigger than paper planes. State-of-the-art video compression and transmission technology will be needed to achieve real-time transmission of on-board sensor data. This paper demonstrates a highly compact video capture and encoding technology for UAVs: SmartCapture. FastVDO SmartCapture is a USB-based capture and encoder device that ingests NTSC/PAL analog audio/video and outputs compressed MP4-formatted multimedia via USB 2.0. The captured video/audio is compressed using H.264/AAC, today's leading commercial standards. The highly compressed bit streams make live video streaming possible.
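SmartCapture is a hardware device whose internals are not described in the abstract. As a software analogy only, the snippet below reproduces the same capture-encode-stream chain using ffmpeg's libx264 encoder, assuming a Linux V4L2 capture source and a hypothetical ground-station address.

```python
import subprocess

# Capture analog-derived video, encode to H.264 (as SmartCapture does in
# hardware), and stream it live over UDP to a ground station. Assumes
# ffmpeg with libx264 is on the PATH and the source is /dev/video0.
cmd = [
    "ffmpeg",
    "-f", "v4l2", "-i", "/dev/video0",           # frame-grabber source
    "-c:v", "libx264",                           # H.264 encoding
    "-preset", "ultrafast", "-tune", "zerolatency",
    "-f", "mpegts", "udp://192.168.1.50:5000",   # hypothetical ground station
]
subprocess.run(cmd, check=True)
```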
Automatic image exploitation system for small UAVs
N. Heinze, M. Esswein, W. Krüger, et al.
Small UAVs are of growing importance for surveillance and reconnaissance tasks. These UAVs have an endurance of several hours but a payload of only a few kilograms. As a consequence, lightweight sensors and cameras have to be used without a mechanically stabilized, high-precision sensor platform, which would exceed the payload and cost limits. An example of such a system is the German UAV Luna, with optical and IR sensors on board. For such platforms we developed image exploitation algorithms. The algorithms comprise mosaicking, stabilization, image enhancement, video-based moving target indication, and stereo-image generation. Other products are large geo-coded image mosaics, stereo mosaics, and 3D model generation. For test and assessment of these algorithms, the experimental system ABUL has been developed, in which the algorithms are integrated. The ABUL system is used for tests and assessment by military photo interpreters (PIs).
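The abstract names mosaicking and stabilization without specifying the algorithms. A minimal sketch of the standard approach, using OpenCV feature matching with RANSAC homography estimation (an assumption, not necessarily what ABUL uses), is shown below; warping successive frames by accumulated homographies yields a simple mosaic.

```python
import cv2
import numpy as np

def register_pair(prev, curr):
    """Estimate the homography mapping curr onto prev from ORB feature
    matches, rejecting outliers with RANSAC."""
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(prev, None)
    k2, d2 = orb.detectAndCompute(curr, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:500]
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H
```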
VideoQuest: managing large-scale aerial video database through automated content extraction
Hui Cheng, Darren Butler, Thomas Kover, et al.
Today, a large amount of video is collected from aerial platforms. As the volume of aerial video increases, there is an urgent need for effective management and systematic exploitation of it. In this paper, we introduce an aerial video management and exploitation system named VideoQuest. The proposed system manages a large-scale aerial video database through automated video processing and content extraction. The processing and content extraction algorithms include real-time video and metadata enhancement, hierarchical indexing and summarization, moving target indication (MTI), moving object tracking, and event detection. Additionally, VideoQuest allows users to interactively search and browse the large aerial video database based on sensor metadata and content extracted from the video. Using the VideoQuest system, a user can search and retrieve mission-relevant information several orders of magnitude faster than without it.
A content-based retrieval system for UAV-like video and associated metadata
N. E. O'Connor, T. Duffy, P. Ferguson, et al.
In this paper we provide an overview of a content-based retrieval (CBR) system that has been specifically designed for handling UAV video and associated metadata. Our emphasis in designing this system is on managing large quantities of such information and providing intuitive and efficient access mechanisms to this content, rather than on analysis of the video content. The retrieval unit in our system is termed a "trip". At capture time, each trip consists of an MPEG-1 video stream and a set of time-stamped GPS locations. An analysis process automatically selects and associates GPS locations with the video timeline. The indexed trip is then stored in a shared trip repository. The repository forms the backend of an MPEG-21-compliant Web 2.0 application for subsequent querying, browsing, annotation, and video playback. The system interface allows users to search and browse across the entire archive of trips and, depending on their access rights, to annotate other users' trips with additional information. Interaction with the CBR system is via a novel interactive map-based interface. This interface supports content access by time, date, region of interest on the map, previously annotated locations of interest, and combinations of these. To develop such a system and investigate its practical usefulness in real-world scenarios, a significant amount of appropriate data is clearly required. In the absence of a large volume of UAV data with which to work, we have simulated UAV-like data using GPS-tagged video content captured from moving vehicles.
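The analysis step that "automatically selects and associates GPS locations with the video timeline" suggests a simple temporal interpolation. The sketch below, with assumed data layouts, linearly interpolates time-stamped fixes onto per-frame timestamps, clamping at the ends of the track.

```python
import bisect

def associate_gps(frame_times, gps_fixes):
    """Interpolate sorted (t, lat, lon) GPS fixes onto a list of frame
    timestamps (seconds), returning one (lat, lon) per frame."""
    times = [t for t, _, _ in gps_fixes]
    out = []
    for ft in frame_times:
        i = bisect.bisect_left(times, ft)
        if i <= 0:
            out.append(gps_fixes[0][1:])         # before the first fix
        elif i >= len(times):
            out.append(gps_fixes[-1][1:])        # after the last fix
        else:
            (t0, la0, lo0), (t1, la1, lo1) = gps_fixes[i - 1], gps_fixes[i]
            a = (ft - t0) / (t1 - t0)
            out.append((la0 + a * (la1 - la0), lo0 + a * (lo1 - lo0)))
    return out
```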
Real-time aerial video exploitation station for small unmanned aerial vehicles
Jason B. Gregga, Art Pope, Kathy Kielmeyer, et al.
SET Corporation, under contract to the Air Force Research Laboratory, Sensors Directorate, is building a Real-time Aerial Video Exploitation (RAVE) Station for Small Unmanned Aerial Vehicles (SUAVs). Users of SUAVs have in general been underserved by the exploitation community because of the unique challenges of operating in the SUAV environment. SUAVs are often used by small teams without the benefit of dedicated personnel, equipment, and time for exploitation. Thus, effective exploitation tools for these users must be sufficiently automated to keep demands on the team's labor low, with the ability to process video and display results in real time on commonly found ruggedized laptops. The RAVE Station provides video stabilization, mosaicking, moving target indication (MTI), tracking, and target classification, and displays the results in several different display modes. This paper focuses on features of the RAVE Station implementation that make it efficient, low-cost, and easy to use. The software architecture is a pipeline model, allowing each processing module to tap off the pipe and add new information back into the stream, keeping redundancy to a minimum. The software architecture is also open, allowing new algorithms to be developed and plugged in. Frame-to-frame registration is performed by a feature-tracking algorithm that employs RANSAC to discard outlying matches. MTI is performed via a fast and robust three-frame differencing algorithm. The user interface and exploitation functions are simple and easy to learn and use. RAVE is a capable exploitation tool that meets the needs of SUAV users despite their challenging environment.
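For reference, a minimal sketch of the three-frame differencing idea named above is shown below; it assumes the three frames are already registered to a common ground frame (the RANSAC step) and uses an arbitrary threshold.

```python
import numpy as np

def three_frame_mti(f0, f1, f2, thresh=25):
    """Flag a pixel of the middle frame f1 as moving only if it differs
    from BOTH temporal neighbors, which suppresses the ghosting that
    plain two-frame differencing leaves behind. Frames are uint8 and
    already co-registered."""
    d01 = np.abs(f1.astype(np.int16) - f0.astype(np.int16))
    d12 = np.abs(f2.astype(np.int16) - f1.astype(np.int16))
    return (d01 > thresh) & (d12 > thresh)
```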