Proceedings Volume 9986

Unmanned/Unattended Sensors and Sensor Networks XII


Volume Details

Date Published: 15 December 2016
Contents: 6 Sessions, 10 Papers, 5 Presentations
Conference: SPIE Security + Defence 2016
Volume Number: 9986

Table of Contents

  • Front Matter: Volume 9986
  • Keynote Session I
  • Unmanned/Unattended Air Sensors and Systems
  • Keynote Session II
  • Unmanned/Unattended Ground Sensors and Systems
  • Unmanned/Unattended Sensors and Technologies
Front Matter: Volume 9986
This PDF file contains the front matter associated with SPIE Proceedings Volume 9986 including the Title Page, Copyright information, Table of Contents, Introduction, and Conference Committee listing.
Keynote Session I
Perception and estimation challenges for humanoid robotics: DARPA Robotics Challenge and NASA Valkyrie
Maurice Fallon
This paper describes ongoing work at the University of Edinburgh's Humanoid Robotics Project. The University of Edinburgh has formed a collaboration with the United States' National Aeronautics and Space Administration (NASA) around its R5 humanoid robot, commonly known as Valkyrie. Also involved are MIT, Northeastern University and the Florida Institute for Human and Machine Cognition (IHMC), as part of NASA's Space Robotics Challenge. We outline the state estimation and localization algorithms being developed for Valkyrie.
Unmanned/Unattended Air Sensors and Systems
Multibeam monopulse radar for airborne sense and avoid system
Ashok Gorwara, Pavlo Molchanov
The multibeam monopulse radar for the Airborne Based Sense and Avoid (ABSAA) system concept is the next step in the development of the passive monopulse direction finder proposed by Stephen E. Lipsky in the 1980s. In the proposed system, a multibeam monopulse radar with an array of directional antennas is mounted on a small aircraft or Unmanned Aircraft System (UAS). Radar signals are simultaneously transmitted and received by multiple angle-shifted directional antennas with overlapping antenna patterns covering the entire sky, 360° in both the horizontal and vertical planes. Digitizing the amplitude and phase of the signals in the separate directional antennas relative to reference signals provides high-accuracy, high-resolution range and azimuth measurement and allows real-time recording of the amplitude and phase of signals reflected from non-cooperative aircraft. High-resolution range and azimuth measurement yields minimal tracking errors in both the position and velocity of non-cooperative aircraft, with resolution determined by the sampling frequency of the digitizer. High-speed sampling with a high-accuracy processor clock provides high-resolution phase/time-domain measurement even for directional antennas with a wide field of view (FOV). Fourier transformation (frequency-domain processing) of the received radar signals provides signatures and dramatically increases the probability of detection for non-cooperative aircraft. Steering the transmit power and the integration/correlation period of the received reflected signals for separate antennas (directions) dramatically decreases ground clutter for low-altitude flights. An open architecture and modular construction allow the radar sensor to be combined with Automatic Dependent Surveillance – Broadcast (ADS-B), electro-optic and acoustic sensors.
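As a rough illustration of the monopulse principle the abstract builds on, the following sketch recovers a target bearing from the amplitude ratio of two squinted, overlapping antenna beams. All numbers and the Gaussian beam model are illustrative assumptions, not taken from the paper:

```python
import math

def gaussian_pattern(angle_deg, boresight_deg, beamwidth_deg=40.0):
    """Idealized Gaussian antenna gain pattern (illustrative model only)."""
    return math.exp(-0.5 * ((angle_deg - boresight_deg) / beamwidth_deg) ** 2)

def monopulse_bearing(v_a, v_b, boresight_a, boresight_b, beamwidth_deg=40.0):
    """Amplitude-comparison monopulse: for Gaussian beams, the log-amplitude
    ratio ln(Va/Vb) is linear in the target angle, so it inverts in closed form."""
    k = (boresight_b**2 - boresight_a**2) / (2 * beamwidth_deg**2)
    slope = (boresight_a - boresight_b) / beamwidth_deg**2
    return (math.log(v_a / v_b) - k) / slope

# Simulate a target at 12 deg seen by two beams squinted to -20 and +20 deg
va = gaussian_pattern(12.0, -20.0)
vb = gaussian_pattern(12.0, +20.0)
print(monopulse_bearing(va, vb, -20.0, +20.0))  # recovers ~12.0
```

With real antennas the pattern is not Gaussian and the inversion is done against a measured monopulse slope, but the overlapping-beam comparison is the same idea.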
Using crowd sourcing to combat potentially illegal or dangerous UAV operations
The UAV (Unmanned Aerial Vehicle) industry is growing exponentially, at a pace that policy makers, individual countries and law enforcement agencies are finding difficult to keep up with. The UAV market is large; consequently, the number of UAVs being operated in potentially dangerous situations is substantial and rapidly increasing. The media continually report 'near-miss' incidents between UAVs and commercial aircraft, UAVs breaching security in sensitive areas, and invasions of public privacy.

One major challenge for law enforcement agencies is gaining tangible evidence against potentially dangerous or illegal UAV operators due to the rapidity with which UAV operators are able to enter, fly and exit a scene before authorities can arrive or before they can be located.

DroneALERT, an application available via the Airport-UAV.com website, allows users to capture potentially dangerous or illegal UAV activity using their mobile device as the incident is occurring. A short online DroneALERT Incident Report (DIR) is produced and emailed to the user and the Airport-UAV.com custodians. The DIR can be used to aid authorities in their investigations. It contains details such as images and videos, the location, time and date of the incident, the drone model, and its distance and height.

By analysing the information in the DIR, including photos and video, law enforcement authorities have a high potential to identify the type of UAV used, triangulate the location of the potentially dangerous UAV and its operator, create a timeline of events, identify likely operator exit routes, and determine which regulations were breached. All of this provides crucial evidence for identifying and prosecuting a UAV operator.
Optical flow and inertial navigation system fusion in the UAV navigation
In recent years, navigation based on computing the camera path and the distance to obstacles from the field of image motion velocities (i.e. optical flow, OF) has become highly sought after, particularly for relatively small and even micro unmanned aerial vehicles (UAVs). Video sequences captured by an onboard camera make it possible to calculate the OF with relatively simple algorithms such as Lucas-Kanade. The complete OF is a linear function of the linear and angular velocities of the UAV, which provides an additional means of estimating the navigation parameters. This UAV navigation approach presumes that the onboard camera provides a video sequence of images of the underlying surface, carrying information about the UAV's motion. Navigation parameters are extracted on the basis of exact OF formulas, which yield the observation-process description for estimation based on Kalman filtering. One can expect high accuracy in the estimated parameters (linear and angular velocities) because their number is substantially smaller than the number of measurements (practically, the number of camera pixels).
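A minimal 1-D sketch of the kind of OF/INS fusion the abstract describes: an INS acceleration drives the Kalman prediction and an optical-flow-derived velocity serves as the measurement. The state model, noise values and noiseless measurements are simplifying assumptions for illustration, not the paper's actual formulation:

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
B = np.array([[0.5 * dt**2], [dt]])     # acceleration input matrix
H = np.array([[0.0, 1.0]])              # optical flow observes velocity only
Q = 1e-4 * np.eye(2)                    # process noise (assumed)
R = np.array([[0.05]])                  # OF velocity noise (assumed)

x = np.zeros((2, 1))                    # state: [position, velocity]
P = np.eye(2)

def step(x, P, accel_ins, vel_of):
    # Predict with the INS acceleration
    x = F @ x + B * accel_ins
    P = F @ P @ F.T + Q
    # Update with the optical-flow velocity measurement
    y = np.array([[vel_of]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Constant 1 m/s^2 acceleration; noiseless OF measurements for clarity
for k in range(50):
    true_vel = (k + 1) * dt * 1.0
    x, P = step(x, P, accel_ins=1.0, vel_of=true_vel)
print(float(x[1]))  # velocity estimate approaches 5.0 m/s
```

In the paper's full problem the measurement is the per-pixel flow field, which is linear in the six linear/angular velocity components rather than a single scalar velocity.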
Keynote Session II
Power for sensors; sensors for power
As sensors are increasingly deployed in locations removed from mains power, and increasingly expected to operate for times that are long compared to battery lifetimes, we look to means of "harvesting" or "scavenging" energy from the sensors' operating environments. Whereas many sensors are "parametric" - their interaction with the environment causes a change in one or more of their electrical parameters - many others are true transducers: they perform their sensing function by extracting energy from their environment. These kinds of sensors can thus serve, under suitable operating conditions, both as measuring devices and as power supplies. In this paper we review this background, review the fundamental restrictions on our ability to extract energy from the environment, enumerate and summarize sensing principles that are promising candidates to double as power supplies, and provide several examples that span the range from already off-the-shelf at low cost, to the laboratory-prototype stage, to sufficiently speculative that there might be reasonable doubt as to whether they can work even in principle. Possibilities examined across this spectrum include thermal noise, ambient RF scavenging (briefly), thermoelectricity, piezoelectricity, pyroelectricity, and electrochemistry, especially electrochemistry facilitated by microorganisms.
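To make the "sensor as power supply" idea concrete, here is a back-of-envelope calculation for a thermoelectric element delivering power to a matched load, using the standard matched-load formula P = (S·ΔT)²/(4R). The parameter values are illustrative assumptions, not figures from the paper:

```python
def te_matched_load_power(seebeck_v_per_k, delta_t_k, internal_r_ohm):
    """Maximum power a thermoelectric element delivers to a matched load:
    P = (S * dT)^2 / (4 * R), with S the Seebeck coefficient, dT the
    temperature difference and R the element's internal resistance."""
    v_oc = seebeck_v_per_k * delta_t_k       # open-circuit voltage
    return v_oc**2 / (4 * internal_r_ohm)

# Assumed values: S = 0.2 mV/K, dT = 10 K, R = 5 ohm
p = te_matched_load_power(0.2e-3, 10.0, 5.0)
print(p)  # 2e-07 W, i.e. 0.2 microwatt
```

Sub-microwatt budgets like this are why duty-cycled operation and low-power electronics dominate the design of harvesting-powered unattended sensors.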
Collaborative autonomous sensing with Bayesians in the loop
There is a strong push to develop intelligent unmanned autonomy that complements human reasoning for applications as diverse as wilderness search and rescue, military surveillance, and robotic space exploration. More than just replacing humans for 'dull, dirty and dangerous' work, autonomous agents are expected to cope with a whole host of uncertainties while working closely together with humans in new situations. The robotics revolution firmly established the primacy of Bayesian algorithms for tackling challenging perception, learning and decision-making problems. Since the next frontier of autonomy demands the ability to gather information across stretches of time and space that are beyond the reach of a single autonomous agent, the next generation of Bayesian algorithms must capitalize on opportunities to draw upon the sensing and perception abilities of humans in/on the loop. This work summarizes our recent research toward harnessing 'human sensors' for information-gathering tasks. The basic idea is to allow human end users (i.e. non-experts in robotics, statistics, machine learning, etc.) to directly 'talk to' the information fusion engine and perceptual processes aboard any autonomous agent. Our approach is grounded in rigorous Bayesian modeling and fusion of flexible semantic information derived from user-friendly interfaces, such as natural-language chat and locative hand-drawn sketches. This naturally enables 'plug and play' human sensing with existing probabilistic algorithms for planning and perception, and has been successfully demonstrated with human-robot teams in target localization applications.
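A toy sketch of what fusing a semantic human report into a Bayesian filter can look like: a uniform prior over a grid of target locations is updated with a soft likelihood for the report "the target is north of the landmark". The grid size, the report, and the 0.9/0.1 reliability model are invented for illustration; the paper's semantic models are far richer:

```python
import numpy as np

grid = np.full((10, 10), 1.0 / 100)    # uniform prior over a 10x10 grid

# Soft likelihood of the report given each cell: humans are imperfect
# sensors, so cells south of row 5 keep a small nonzero likelihood.
likelihood = np.where(np.arange(10)[:, None] < 5, 0.9, 0.1)  # broadcasts

posterior = grid * likelihood          # Bayes rule, elementwise
posterior = posterior / posterior.sum()

print(posterior[:5].sum())  # mass north of the line rises from 0.5 to 0.9
```

Because the human report enters as an ordinary likelihood, it composes directly with whatever probabilistic sensor models and planners the agent already runs, which is the 'plug and play' property the abstract refers to.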
Unmanned/Unattended Ground Sensors and Systems
Tunable mechanical monolithic sensors for large band low frequency monitoring and characterization of sites and structures
F. Barone, G. Giordano, F. Acernese, et al.
Among the different mechanical architectures present in the literature, the Watt's linkage is one of the most promising for the implementation of a new class of mechanical accelerometers (horizontal, vertical and angular). In this paper, we present monolithic implementations of uniaxial and triaxial mechanical seismometers and accelerometers based on the UNISA Folded Pendulum mechanical configuration, optimized for the low-frequency characterization of sites (including underground sites) and structures as inertial sensors (seismometers). This mechanical architecture allows the design and implementation of very-large-band monolithic sensors (10⁻⁷ Hz to 10² Hz), whose sensitivities for the most common applications are defined by the noise introduced by their readouts (e.g. < 10⁻¹² m/√Hz with classical LVDT readouts). These unique features, coupled with other relevant properties such as scalability, compactness, lightness, high directivity, frequency tunability (typical resonance frequencies in the band 10⁻¹ Hz to 10² Hz), very high immunity to environmental noise, and low cost, make this class of sensors very effective for the implementation of uniaxial (horizontal and/or vertical) and triaxial seismometers and accelerometers for ground, space and underwater applications, including UHV and cryogenic ones. Typical applications of this class of monolithic sensors are in the fields of earthquake engineering, seismology, geophysics, civil engineering, the characterization of sites (including underground sites) and structures (e.g. buildings, bridges, historical monuments), and, in general, all applications requiring large-band, low-frequency performance coupled with high sensitivity and compactness.
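For intuition on what a "tunable low resonance frequency" means mechanically, the textbook simple-pendulum formula f = √(g/L)/(2π) shows how geometry sets the frequency. The folded-pendulum mechanics in the paper are considerably more involved (tuning reaches well below what a simple pendulum of practical size can); this is only the elementary relation:

```python
import math

def pendulum_freq_hz(length_m, g=9.81):
    """Resonance frequency of an ideal simple pendulum: f = sqrt(g/L) / (2*pi)."""
    return math.sqrt(g / length_m) / (2 * math.pi)

print(round(pendulum_freq_hz(0.25), 3))  # ~1 Hz for a 25 cm pendulum
```

A folded pendulum cancels most of the restoring force between its pendulum and inverted-pendulum arms, which is how compact monolithic sensors reach resonances near the bottom of the quoted 10⁻¹ Hz band.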
Data fusion for target tracking and classification with wireless sensor network
In this paper, we address the problem of multiple ground target tracking and classification with information obtained from an unattended wireless sensor network. A multiple target tracking (MTT) algorithm, taking into account road and vegetation information, is proposed based on a centralized architecture. One of the key issues is how to adapt the classical MTT approach to satisfy embedded-processing constraints. Based on track statistics, the classification algorithm uses the estimated location, velocity and acceleration to help classify targets. The algorithm enables tracking of humans and of vehicles driving both on and off road. We integrate road or trail width and vegetation cover as constraints in the target motion models to improve the performance of tracking under constraint with classification fusion. Our algorithm also employs different dynamic models to cope with target maneuvers. The tracking and classification algorithms are integrated into an operational platform (the fusion node). In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deployed in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of the system is evaluated in a real exercise for an intelligence operation ('hunter hunt' scenario).
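A hypothetical sketch of classifying a track from its kinematic statistics, in the spirit of the abstract's "estimated location, velocity and acceleration" features. The thresholds and decision rule are invented for illustration and are not the paper's classifier:

```python
def classify_track(speeds_mps, accels_mps2):
    """Label a track 'human' or 'vehicle' from estimated speed and
    acceleration histories (illustrative thresholds only)."""
    mean_speed = sum(speeds_mps) / len(speeds_mps)
    max_accel = max(abs(a) for a in accels_mps2)
    # Humans rarely sustain speeds above ~3 m/s; vehicles accelerate harder.
    if mean_speed < 3.0 and max_accel < 2.0:
        return "human"
    return "vehicle"

print(classify_track([1.2, 1.5, 1.1], [0.3, -0.2, 0.1]))     # human
print(classify_track([12.0, 14.5, 13.2], [1.8, 2.5, -1.0]))  # vehicle
```

In the paper's setting these kinematic features come from the MTT track estimates and are fused with the road/vegetation constraints rather than thresholded in isolation.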
Unmanned/Unattended Sensors and Technologies
Laser ablation method for production of surface acoustic wave sensors
Dmitry Lukyanov, Sergey Shevchenko, Alexander Kukaev, et al.
Nowadays, surface acoustic wave (SAW) sensors are produced using photolithography. In the case of inertial sensors, this method suffers from several disadvantages: difficulty in aligning the topologies produced on opposite sides of the wafer, high cost for small-series production, and no possibility of subsequent topology correction. Here, a laser ablation method seems promising. Details of the proposed technique are described in the paper, along with the results of its experimental testing and a discussion.