Proceedings Volume 6394

Unmanned/Unattended Sensors and Sensor Networks III


View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 29 September 2006
Contents: 6 Sessions, 17 Papers, 0 Presentations
Conference: Optics/Photonics in Security and Defence 2006
Volume Number: 6394

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Unmanned Systems Technology
  • Unattended Sensor Systems
  • Active and Passive Image Sensing and Processing
  • Sniper and Mortar Detection Technologies Session
  • Radar and Through-The-Wall Sensor Systems
  • Visible/IR/Fiber Optic Sensor Systems
Unmanned Systems Technology
Cohort: UxV teams in support of operations in complex environments
Defence R&D Canada (DRDC) has been given strategic direction to pursue research that increases the independence and effectiveness of unmanned military vehicles and systems. This led to the creation of the Autonomous Land Systems (ALS) project, which was completed in 2005 with a successful demonstration of semi-autonomous UGVs in open, partially vegetated environments. Cohort is a newly funded project that will work to develop effective UxV teams for urban and complex environments. This paper briefly discusses the state of UGV research at the completion of ALS and other research projects supporting Cohort. The goals and challenges of Cohort are outlined, as well as the research plan, which involves many of DRDC's laboratories across Canada.
A software design approach for heterogeneous systems of unattended sensors, unmanned vehicles, and monitoring stations
William J. Smuda, Grant Gerhart, Man-Tak Shing, et al.
The design and implementation of software for networked systems of diverse physical assets is a continuing challenge to sensor network developers. The problems are often multiplied when adding new elements and when reconfiguring existing systems. For software systems, as for physical systems, explicit architectural descriptions increase system-level comprehension. Coupled with well-defined object-oriented design practices, system extensibility is defined and software reuse and code composition are enabled. Our research is based on a model-driven design architecture. High-level system models are defined in the Unified Modeling Language (UML), the language of the software engineer. However, since most experimental work is done by non-software specialists (electronics engineers, mechanical engineers, and technicians), the model is translated into a graphical, domain-specific model. Components are presented as domain-specific icons, and constraints from the UML model are propagated into the domain model. Domain specialists manipulate the domain model, which then composes the software elements needed at each node to create an aggregate system.
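As a toy illustration of the composition step this abstract describes, the sketch below checks a set of domain-model components against an interface constraint of the kind that could be propagated from a UML model, then flattens them into a per-node software manifest. All names and the constraint rule are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """One domain-model icon: a software element with typed interfaces."""
    name: str
    provides: set = field(default_factory=set)
    requires: set = field(default_factory=set)

def compose_node(components):
    """Return the node's software manifest, or raise if a UML-derived
    constraint (every required interface must be provided) is violated."""
    provided = set().union(*(c.provides for c in components))
    for c in components:
        missing = c.requires - provided
        if missing:
            raise ValueError(f"{c.name} is missing interfaces: {missing}")
    return [c.name for c in components]

# Hypothetical node: a seismic driver feeding a radio uplink.
node = [Component("seismic_driver", provides={"detections"}),
        Component("radio_link", requires={"detections"}, provides={"uplink"})]
print(compose_node(node))   # ['seismic_driver', 'radio_link']
```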
Unattended Sensor Systems
Multifunctional self-sensing microcantilever arrays for unattended detection of chemicals, explosives, and biological agents
B. Rogers, R. Whitten, J. D. Adams
Unattended sensing applications necessitate robust, compact, low-cost, low-power sensor units. The microcantilever-based Self-Sensing Array (SSA) technology developed by Nevada Nanotech Systems, Inc. (NNTS) is a strong candidate for such units. SSA technology is expected to provide the selectivity, sensitivity, durability, low cost, and low power needed for unattended sensors and sensor networks. The sensor combines a variety of coatings with the ability to analyze the electrical and thermal properties of molecules on the cantilevers. This so-called Lab-on-a-Tip™ technology could lead to enhanced chemical-identification capabilities for the trace-detection platform.
Development, implementation, and experimentation of parametric routing protocol for sensor networks
Matthew S. Nassr, Jangeun Jun, Stephan J. Eidenbenz, et al.
The development of a scalable and reliable routing protocol for sensor networks is traced from its theoretical beginnings, through positive simulation results, to verification experiments in large and heavily loaded networks. Design decisions and explanations, as well as implementation hurdles, are presented to give a complete picture of protocol development. Additional software and hardware are required to accurately test the performance of our protocol in field experiments. In addition, the developed protocol is tested in TinyOS on Mica2 motes against well-established routing protocols frequently used in sensor networks. Our protocol outperforms both the standard (MINTRoute) and the trivial (Gossip) approaches in a variety of scenarios.
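The parametric protocol itself is not spelled out in the abstract, but the trivial baseline it is compared against, Gossip, is easy to sketch: each node that receives a packet rebroadcasts it once with probability p. A minimal Monte-Carlo estimate of its delivery ratio (topology and p are illustrative, not from the paper) might look like:

```python
import random

def gossip_delivery(adjacency, source, sink, p, trials=1000):
    """Estimate the fraction of trials in which a gossip flood reaches the sink."""
    delivered = 0
    for _ in range(trials):
        received = {source}
        frontier = [source]                  # nodes that will rebroadcast
        while frontier:                      # the source always transmits
            node = frontier.pop()
            for nbr in adjacency[node]:
                if nbr not in received:
                    received.add(nbr)
                    if random.random() < p:  # gossip: rebroadcast with prob. p
                        frontier.append(nbr)
        delivered += sink in received
    return delivered / trials

# Toy 5-node line topology: 0 - 1 - 2 - 3 - 4
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(gossip_delivery(adj, source=0, sink=4, p=0.8))
```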
Field testing of new unattended small size seismic module for various target detection
General Sensing Systems (GSS) has achieved outstanding and verifiable results in the design and performance of seismic systems with near-zero false alarm rates for the detection of walking and running persons. These results were realized in a number of detection systems, in particular in small-size seismic detection modules. Preliminary testing of these seismic modules under various environmental noise conditions shows that such small unattended modules can be successfully used to detect other targets as well. Potential target sets include light and heavy vehicles, helicopters, aircraft, and ships. This paper describes preliminary results of such target detection and preliminary experimental data on the corresponding detection ranges. We show that the new unattended, small-size detection module demonstrates reliable performance under various environmental conditions.
Active and Passive Image Sensing and Processing
Parallelization of a blind deconvolution algorithm
Charles L. Matson, Kathy J. Borelli
Often it is of interest to deblur imagery in order to obtain higher-resolution images. Deblurring requires knowledge of the blurring function, information that is often not available separately from the blurred imagery. Blind deconvolution algorithms overcome this problem by jointly estimating both the high-resolution image and the blurring function from the blurred imagery. Because blind deconvolution algorithms are iterative in nature, they can take minutes to days to deblur an image, depending on how many frames of data are used and on the platforms on which the algorithms are executed. Here we present our progress in parallelizing a blind deconvolution algorithm to increase its execution speed. This progress includes sub-frame parallelization and a code structure that is not specialized to a specific computer hardware architecture.
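The abstract does not name the specific algorithm, so as a hedged illustration, here is a minimal NumPy sketch of one classic iterative scheme, blind Richardson-Lucy deconvolution, which alternately updates the image and blur estimates; it is the per-iteration FFT work over many frames that parallelization efforts of this kind typically target.

```python
import numpy as np

def fftconv(a, b):
    """Circular convolution of equal-sized float arrays via the FFT."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

def fftcorr(a, b):
    """Circular cross-correlation of equal-sized float arrays via the FFT."""
    return np.real(np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)))

def blind_rl(blurred, n_iter=50, eps=1e-12):
    """Alternate Richardson-Lucy updates of the image and the blur (PSF).
    `blurred` is a single float frame; flat initial guesses for both unknowns."""
    img = np.full_like(blurred, blurred.mean())
    psf = np.full_like(blurred, 1.0 / blurred.size)
    for _ in range(n_iter):
        img = img * fftcorr(psf, blurred / (fftconv(img, psf) + eps))
        psf = psf * fftcorr(img, blurred / (fftconv(img, psf) + eps))
        psf /= psf.sum() + eps          # keep the PSF normalized
    return img, psf
```

Each iteration is dominated by full-frame FFTs, so multi-frame data sets distribute naturally across processors.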
Orthoscopic long-focal-depth 3D integral imaging
Integral imaging systems are imaging devices that provide 3D images of 3D objects. When integral imaging systems work in their standard configuration, the reconstructed images are pseudoscopic; that is, they are reversed in depth. In this paper we present a technique for the formation of real, undistorted, orthoscopic integral images by direct pickup. The technique is based on the use of a proper relay system and a global mapping of the pixels of the elemental-image set. Simulated imaging experiments are presented to support our proposal.
Visible and NIR spectral band combination to produce high security ID tags for automatic identification
Verification of a piece of information and/or authentication of a given object or person are common operations carried out by automatic security systems and can be applied, for instance, to control entrance to restricted areas, access to public buildings, and identification of cardholders. The vulnerability of such security systems may depend on how easily the information used as a piece of identification for verification and authentication can be counterfeited. To protect data against tampering, the signature that identifies an object is usually encrypted to prevent easy recognition at human sight and easy reproduction using conventional devices for imaging or scanning. To make counterfeiting even more difficult, we propose to combine data from the visible and near-infrared (NIR) spectral bands. By doing this, neither the visible content nor the NIR data by itself is sufficient to allow recognition of the signature and thus identification of a given object; only the appropriate combination of both signals permits satisfactory authentication. In addition, the resulting signature is encrypted following a fully-phase encryption technique, and the obtained complex-amplitude distribution is encoded on an ID tag. Spatial multiplexing of the encrypted signature allows us to build a distortion-invariant ID tag, so that remote authentication can be achieved even if the tag is captured under rotation or at different distances. We also explore the possibility of using partial information of the encrypted signature to simplify the ID tag design.
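As a minimal sketch of the fully-phase, double random-phase style of encryption the abstract refers to (the VIS/NIR combination and the distortion-invariant multiplexing are not shown, and all sizes, keys, and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def encrypt(signature, n1, n2):
    """Fully-phase DRPE: encode the normalized signature as a phase image,
    then apply random phase masks in the input and Fourier planes."""
    phase_image = np.exp(1j * np.pi * signature)               # fully-phase encoding
    masked = phase_image * np.exp(2j * np.pi * n1)             # input-plane mask
    spectrum = np.fft.fft2(masked) * np.exp(2j * np.pi * n2)   # Fourier-plane mask
    return np.fft.ifft2(spectrum)                              # complex-amplitude ciphertext

def decrypt(cipher, n1, n2):
    """Undo both masks with the keys, then read the signature off the phase."""
    spectrum = np.fft.fft2(cipher) * np.exp(-2j * np.pi * n2)
    masked = np.fft.ifft2(spectrum) * np.exp(-2j * np.pi * n1)
    return np.angle(masked) / np.pi

sig = rng.random((64, 64))          # stand-in for the combined VIS+NIR signature
n1, n2 = rng.random((2, 64, 64))    # random phase keys
recovered = decrypt(encrypt(sig, n1, n2), n1, n2)
print(np.allclose(recovered, sig))  # True up to numerical precision
```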
High secure authentication by optical multifactor ID tags
The multifactor encryption-authentication technique reinforces optical security by allowing the simultaneous AND-verification of more than one primary image. The method involves double random-phase encoding, fully phase-based encryption, and a combination of a nonlinear joint transform correlator (JTC) and a classical 4f correlator for simultaneous recognition and authentication of multiple images. The encoded signal fulfils the general requirements of invisible content, extreme difficulty in counterfeiting, and real-time automatic verification. Four reference images, double-phase encoded and encrypted in an ID tag, are compared with input images obtained in situ from the person or vehicle whose authentication is sought and from a database. Optical ID tags are satisfactory items for achieving remote and real-time optical authentication. On the one hand, effort has been focused on tag design to provide a positive identification even when the receiver captures the ID tag under some distortion. On the other hand, increasing security demands require ciphering the information prior to including it in the ID tag. A recognition step based on the correlation between the retrieved signature and a stored reference determines the authentication or rejection of the object under surveillance. In this work, we combine optical ID tags with the multifactor authentication procedure. Instead of basing the identification on a unique signature or piece of information, our goal is to authenticate a given person, object, or vehicle by the simultaneous recognition of several factors. Some of them are intrinsic to the person or vehicle under control; other factors act as keys in the authentication step. The information of the whole set of factors is included in the ID tag, and remote identification of all factors is achieved. Such a system is proposed to control the access of people and vehicles to restricted areas where the demand for security is high.
Decision fusion strategy for target recognition in hyperspectral images
Hyperspectral sensors allow a considerable improvement in the performance of a target recognition process to be achieved. This characteristic is particularly interesting in many military and civilian remote-sensing applications, such as automatic target recognition (ATR) and surveillance of wide areas. In this framework, real-time processing of the observed scenario is becoming a key issue, because it permits immediate assessment of the surveyed area. A line-by-line real-time implementation of the widely used Constrained Energy Minimization (CEM) target detector has been presented in the literature. However, experimental results show that the CEM filter sometimes produces false alarms (FAs) corresponding to rare objects whose spectra are angularly very different from the target signature and from the natural background classes in the image. A solution to this problem is presented in this work: the proposed strategy is based on the decision fusion of the CEM and Spectral Angle Mapper (SAM) algorithms. Only those pixels that pass the CEM stage are processed by the SAM algorithm. The second stage allows false alarms to be reduced while preserving most of the target pixels. The fusion strategy is applied to an experimental hyperspectral data set to recognize a known green target. Detection performance is numerically evaluated and compared to that of the classical CEM detector.
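A compact NumPy sketch of the two-stage fusion described above (thresholds and array shapes are illustrative, not from the paper): CEM scores every pixel, and SAM is evaluated only on the pixels that pass the CEM stage.

```python
import numpy as np

def cem_scores(cube, target):
    """CEM detector. cube is (n_pixels, n_bands), target is (n_bands,).
    Filter w = R^-1 d / (d^T R^-1 d); score = w^T x for each pixel x."""
    R = cube.T @ cube / cube.shape[0]            # sample correlation matrix
    Rinv_d = np.linalg.solve(R, target)
    w = Rinv_d / (target @ Rinv_d)
    return cube @ w

def sam_angles(cube, target):
    """Spectral Angle Mapper: angle between each pixel spectrum and the target."""
    cos = (cube @ target) / (np.linalg.norm(cube, axis=1) * np.linalg.norm(target))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def fused_detector(cube, target, cem_thresh=0.5, sam_thresh=0.1):
    """Pixels must pass CEM first; SAM then prunes angularly dissimilar ones."""
    hits = np.zeros(cube.shape[0], dtype=bool)
    cand = cem_scores(cube, target) > cem_thresh
    if cand.any():
        hits[cand] = sam_angles(cube[cand], target) < sam_thresh
    return hits
```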
Sniper and Mortar Detection Technologies Session
Discriminating mortar launch/impact events utilizing acoustic sensors
Myron E. Hohil, Sachi Desai, Amir Morcos
Feature extraction methods based on the discrete wavelet transform and multiresolution analysis facilitate the development of a robust classification algorithm that reliably discriminates between launch and impact mortar events via the acoustic signals produced during these events. Distinct characteristics arise within the different explosive events because impact events emphasize concussive and shrapnel effects, while launch events result from the explosion that expels and propels a mortar round from the gun. The ensuing blast waves are readily characterized by variations in the peak pressure and rise time of the waveform, differences in the ratio of positive to negative pressure amplitude, variations in the prominent frequencies associated with the different blast events, and variations in the overall duration of the resulting waveform. Unique attributes can also be identified that depend upon the properties of the gun tube, the projectile speed at the muzzle, and the explosive/concussive properties associated with the events. In this work, the discrete wavelet transform is used to extract the predominant components of these characteristics from the acoustic signatures of events at ranges of 1 km. Highly reliable discrimination is achieved with a feedforward neural network classifier trained on a feature space derived from the distribution of wavelet coefficients and the higher-frequency details found within different levels of the multiresolution decomposition. We show that the algorithms provide reliable discrimination (>84%) between launch and impact events using data collected during several separate field-test experiments.
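The exact feature space is specific to the paper; as an illustrative stand-in (assuming the PyWavelets package, and with wavelet choice, level count, and statistics chosen by us), the sketch below turns an acoustic event into a fixed-length vector of per-level log-energies and coefficient spreads from a multiresolution decomposition, the kind of vector a small feedforward network can be trained on.

```python
import numpy as np
import pywt  # PyWavelets; an assumption -- the paper does not name a library

def wavelet_features(signal, wavelet="db4", levels=6):
    """Fixed-length feature vector from a multiresolution decomposition:
    one (log-energy, spread) pair per level plus the approximation band."""
    coeffs = pywt.wavedec(signal, wavelet, level=levels)
    feats = []
    for c in coeffs:
        feats.append(np.log(np.sum(c**2) + 1e-12))   # log-energy of the level
        feats.append(np.std(c))                      # spread of coefficients
    return np.array(feats)

# Each labelled acoustic event becomes one feature vector; a small
# feedforward network can then be trained on the stacked vectors.
event = np.random.default_rng(0).standard_normal(4096)  # stand-in waveform
print(wavelet_features(event).shape)   # (14,) = 2 * (6 detail levels + 1 approx)
```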
Implementation of algorithms to discriminate between chemical/biological airbursts and high explosive airbursts
Myron E. Hohil, Sachi Desai, Amir Morcos
The Army is currently developing acoustic sensor systems that will provide extended-range surveillance, detection, and identification for force protection and tactical security. A network of such sensors remotely deployed in conjunction with a central processing node (or gateway) will provide early warning and assessment of enemy threats and near real-time situational awareness to commanders, and may reduce potential hazards to the soldier. In contrast, the current detection of chemical/biological (CB) agents expelled into a battlefield environment is limited to the response of chemical sensors that must be located within close proximity to the CB agent. Since chemical sensors detect hazardous agents through contact, the sensor's range to an airburst is the key limiting factor in identifying a potential CB weapon attack. The associated sensor reporting latencies must be minimized to give sufficient preparation time to field commanders, who must assess whether an attack is about to occur or has occurred and, if so, the type of agent soldiers might be exposed to. The long-range propagation of acoustic blast waves from heavy artillery blasts, which are typical in a battlefield environment, motivates the use of acoustics and other sensor-suite technologies for the early detection and identification of CB threats. Employing disparate sensor technologies means that warning of a potential CB attack can be provided to the soldier more rapidly and from a safer distance than with current conventional methods. Distinct characteristics arise within the different airburst signatures because high-explosive (HE) warheads emphasize concussive and shrapnel effects, while chemical/biological warheads are designed to disperse their contents over immense areas and therefore utilize a slower-burning, less intensive explosion to mix and distribute their contents. Highly reliable discrimination (100%) was demonstrated at the Portable Area Warning Surveillance System (PAWSS) Limited Objective Experiment (LOE), conducted by the Joint Project Manager for Nuclear Biological Contamination Avoidance (JPM NBC CA) and a matrixed team from the Edgewood Chemical and Biological Center (ECBC), at ranges exceeding 3 km. The details of the field-test experiment and the real-time implementation/integration of the stand-alone acoustic sensor system are discussed herein.
Radar and Through-The-Wall Sensor Systems
Non-uniform integration for through-wall-imaging radar
Amir Beeri, Ron Daisy
An innovative approach is introduced herein for three-dimensional (3D) imaging of objects or people hidden behind obstacles such as walls. The Xaver 800, a new micro-power ultra-wideband (UWB) radar, utilizes unique signal and image processing algorithms, enabling real-time acquisition and presentation of high-resolution 3D images of objects hidden behind walls. The Xaver 800 offers the ability to perform life-saving operations with greater success and with smaller risk to the operatives and to those they are trying to protect. Many technical challenges are encountered when working in real-world environments: a tough link budget, distortions, and defocusing when passing through walls of complex construction are only a partial list of the problems an effective through-wall-imaging system must overcome. In addition, real-time operation limits the available integration time to the operational frame rate. Efficient utilization of the available integration time can improve the system's link budget and thereby enhance overall performance. In cases where the link-budget parameters are tough, an efficient integration method can enable a reasonable refresh rate and therefore real-time operation. An approach to compensating for different attenuation sources and signal loss through 'non-uniform integration' is discussed in this paper.
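The paper's actual allocation rule is not public; as a hedged illustration of the non-uniform integration idea, the sketch below splits a fixed per-frame pulse budget across range gates in proportion to the linear power loss each gate suffers, so the most heavily attenuated returns receive the most integration (all numbers are made up).

```python
import numpy as np

def allocate_integration(attenuation_db, total_pulses):
    """Split a fixed pulse budget across range gates so gates with more
    two-way attenuation get more integration (coherent SNR gain ~ N).
    Rounding makes the split approximate; at least one pulse per gate."""
    loss = 10 ** (attenuation_db / 10.0)       # linear power loss per gate
    weights = loss / loss.sum()
    return np.maximum(1, np.round(weights * total_pulses).astype(int))

# Hypothetical gates: deeper gates lose more power through the wall.
atten_db = np.array([3.0, 6.0, 10.0, 15.0])
print(allocate_integration(atten_db, total_pulses=1000))
```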
Visible/IR/Fiber Optic Sensor Systems
A reconfigurable low-cost thermal imager for unattended ground sensors
Paul A. Manning, Nicholas J. Parkinson, Tim S. Phillips, et al.
The cost of thermal imaging technology has, up until now, precluded its widespread use in sensor systems that require sensors to be deployed in very large numbers. This paper describes a route to bringing low-cost 'disposable' thermal imaging into the dismounted military environment. Infrared detectors based on the manufacturing processes used in the production of conventional silicon chips offer a breakthrough in cost compared to other technologies. Despite having modest performance, this technology offers a route towards a very cost-effective thermal imaging sensor for dismounted applications. A flexible detector format, which permits the detector to operate either as a conventional close-packed 2D array or as a faster-update linear array, gives the opportunity for performance optimisation and data reduction at the sensor, important attributes for a remotely deployed sensor with limited power resources. This paper describes a sensor architecture well matched to the cost, power consumption, and performance levels suited to short-range dismounted and networked operations, and demonstrates some of the imaging capability achievable with such a simple (and hence potentially extremely low cost) sensor.
Two-interferometer fiber optic sensor with disturbance localization
M. Kondrat, M. Szustakowski, W. Ciurapinski, et al.
We present investigation results for a new generation of fiber-optic perimeter sensor based on a combined Sagnac and Michelson interferometer configuration. This sensor can detect a potential intruder and determine the intruder's position along a protected zone. We propose a localization method that makes use of the inherent properties of both interferometers. After demodulation of the signals from both interferometers, the amplitude characteristic of the Sagnac interferometer depends on the position of a disturbance along the fiber, while that of the Michelson interferometer does not. Thus, the quotient of the two demodulated characteristics makes it possible to localize the disturbance. The arrangement of a laboratory model of the sensor and its signal-processing scheme are also presented. During research on the laboratory model, it was possible to detect the position of the disturbance with a resolution of about 40 m along the 6-km-long sensor.
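A textbook model (our assumption, not the paper's exact demodulation scheme) reproduces the quotient idea: the counter-propagating waves in a Sagnac loop pass the disturbance point at times differing by (L - 2z)/v, so a sinusoidal disturbance produces a position-dependent response amplitude, while the Michelson response is position-independent; dividing the two isolates the position term.

```python
import numpy as np

L = 6000.0           # sensor length [m], matching the 6-km sensor above
v = 2.0e8            # approximate speed of light in fiber [m/s]
w = 2 * np.pi * 1e4  # disturbance angular frequency [rad/s] (illustrative)

def quotient(z):
    """Sagnac/Michelson amplitude ratio for a disturbance at position z:
    |sin(w*(L - 2z)/(2v))| in this simple model."""
    return np.abs(np.sin(w * (L - 2 * z) / (2 * v)))

def locate(q):
    """Invert the quotient for z on the single-valued branch."""
    return (L - 2 * v * np.arcsin(np.clip(q, 0.0, 1.0)) / w) / 2

z_true = 1500.0
print(locate(quotient(z_true)))   # ~1500 m
```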
Partial polarization characterization based on the Kullback relative entropy
The analysis of polarization and of the depolarizing capabilities of materials is a technique of growing interest in many application domains. This is also the case for defence and security, for which the determination of material characteristics can be of great interest in remote-sensing and control applications. Until now, however, the analysis of polarization properties has been essentially limited to second-order statistical characteristics such as Stokes vectors and/or Mueller matrices. There nevertheless exist other physical properties that cannot be characterized by such second-order statistics. We demonstrate that the Kullback relative entropy leads to a relevant characterization of different properties that cannot be obtained from measurements of Stokes vectors and/or Mueller matrices. For that purpose we show how the Kullback relative entropy, which is a physically meaningful measure of proximity between probability density functions (PDFs), allows one to compare a partially polarized light with different optical states of reference. In particular, for optical waves with Gaussian fluctuations, the standard degree of polarization is a simple function of the Kullback relative entropy between the considered optical light and a totally depolarized light of the same intensity. We demonstrate that one can generalize this relation between partially polarized light and different optical states of reference in order to measure new characteristics such as a degree of anisotropy, a degree of non-Gaussianity, and a degree of non-circularity. We discuss experimental configurations that can be discriminated with these new degrees of partially polarized light.
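As a hedged reconstruction of the 'simple function' mentioned for Gaussian fluctuations: modelling the field as a zero-mean circular complex Gaussian with coherency-matrix eigenvalues $I(1\pm P)/2$, and taking the reference to be totally depolarized light of the same intensity, the Kullback relative entropy between the two PDFs evaluates to

```latex
\mathcal{K}
  = \operatorname{tr}\!\left(\Gamma_0^{-1}\Gamma\right)
    + \ln\frac{\det\Gamma_0}{\det\Gamma} - 2
  = -\ln\!\left(1 - P^2\right),
\qquad\Longrightarrow\qquad
P = \sqrt{1 - e^{-\mathcal{K}}},
```

where $\Gamma$ is the coherency matrix of the light, $\Gamma_0 = (I/2)\,\mathrm{Id}$ that of the depolarized reference, and $P$ the standard degree of polarization. The degrees of anisotropy, non-Gaussianity, and non-circularity mentioned above arise analogously by swapping in other reference states.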
Autonomous vision networking: miniature wireless sensor networks with imaging technology
Gioia Messinger, Giora Goldberg
The recent emergence of integrated PicoRadio technology and the rise of low-power, low-cost, system-on-chip (SOC) CMOS imagers, coupled with the fast evolution of networking protocols and digital signal processing (DSP), have created a unique opportunity to achieve the goal of deploying large-scale, low-cost, intelligent, ultra-low-power distributed wireless sensor networks for visualization of the environment. Of all sensors, vision is the most desired, but its applications in distributed sensor networks have been elusive so far. Not any more. The practicality and viability of ultra-low-power vision networking has been proven, and its applications are countless: from security and chemical analysis to industrial monitoring, asset tracking, and visual recognition, vision networking represents a truly disruptive technology applicable to many industries. The presentation discusses some of the critical components and technologies necessary to make these networks and products affordable and ubiquitous, specifically PicoRadios, CMOS imagers, imaging DSP, networking, and overall wireless sensor network (WSN) system concepts. The paradigm shift from large, centralized, and expensive sensor platforms to small, low-cost, distributed sensor networks is possible due to the emergence and convergence of a few innovative technologies. Avaak has developed a vision network that is aided by other sensors such as motion, acoustic, and magnetic, and plans to deploy it for use in military and commercial applications. In comparison to other sensors, imagers produce large data files that require pre-processing and a certain level of compression before they are transmitted to a network server, in order to minimize the load on the network. Some of the most innovative chemical detectors currently in development are based on sensors that change color or pattern in the presence of the desired analytes; these changes are easily recorded and analyzed by a CMOS imager and an on-board DSP processor. Image processing at the sensor-node level may also be required for applications in security, asset management, and process control. Due to the data bandwidth requirements posed on the network by video sensors, new networking protocols or video extensions to existing standards (e.g., ZigBee) are required. To this end, Avaak has designed and implemented an ultra-low-power networking protocol designed to carry large volumes of data through the network. The low-power wireless sensor nodes to be discussed include a chemical sensor integrated with a CMOS digital camera, a controller, a DSP processor, and a radio transceiver, which enables relaying of an alarm or image message to a central station. In addition to communications, location awareness is very desirable; hence it will later be incorporated into the system in the form of time-of-arrival triangulation via wide-band signaling. While the wireless imaging kernel already exists, specific applications for surveillance and chemical detection are under development by Avaak as part of a program co-funded by ONR and DARPA. Avaak is also designing vision networks for commercial applications, some of which are undergoing initial field tests.
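As a small illustration of the node-side pre-processing the abstract alludes to (block size, threshold, and the change-detection rule are all hypothetical), a vision node might transmit only the image blocks that changed since the previous frame, cutting the load placed on the network:

```python
import numpy as np

def changed_blocks(prev, curr, block=16, thresh=12.0):
    """Yield (row, col, block_data) for blocks whose mean absolute change
    since the previous frame exceeds a threshold."""
    h, w = curr.shape
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            a = curr[r:r+block, c:c+block].astype(np.float32)
            b = prev[r:r+block, c:c+block].astype(np.float32)
            if np.abs(a - b).mean() > thresh:
                yield r, c, curr[r:r+block, c:c+block]

rng = np.random.default_rng(0)
f0 = rng.integers(0, 255, (64, 64), dtype=np.uint8)   # previous frame
f1 = f0.copy()
f1[16:32, 16:32] = 255                                # simulate a moving object
print(sum(1 for _ in changed_blocks(f0, f1)))         # -> 1 block to transmit
```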