Proceedings Volume 11159

Electro-Optical and Infrared Systems: Technology and Applications XVI

Duncan L. Hickman, Helge Bürsing
Purchase the printed version of this volume at proceedings.com or access the digital version at SPIE Digital Library.

Volume Details

Date Published: 3 December 2019
Contents: 11 Sessions, 26 Papers, 18 Presentations
Conference: SPIE Security + Defence 2019
Volume Number: 11159

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Front Matter: Volume 11159
  • Sensor Technology and Devices I
  • Sensor Technology and Devices II
  • Systems and Applications I
  • Systems and Applications II
  • European Defence Agency Special Session
  • Modelling and Simulation I
  • Modelling and Simulation II
  • Processing and Analysis
  • Poster Session
  • Erratum
Front Matter: Volume 11159
Front Matter: Volume 11159
This PDF file contains the front matter associated with SPIE Proceedings Volume 11159, including the Title Page, Copyright Information, Table of Contents, Author and Conference Committee lists.
Sensor Technology and Devices I
Silver nanowires: a new nanomaterial with advances for electrical, optical and IR systems
Julia Graubmann, Mariacarla Arduini, Günter Baur, et al.
In the present contribution we introduce silver nanowires, a material with outstanding properties. When silver nanowires are coated onto surfaces, they form a percolating network. The surface resistance of these coatings can be tailored to individual needs by changing the amount of silver nanowires on the surface. The coating formulation can be applied to rigid and flexible substrates, to glass and plastic, and even to curved surfaces. Silver nanowires can be processed using standard coating procedures.

In this way, the material properties of silver, such as excellent electrical conductivity and reflection of electromagnetic radiation, are combined with the advantages of nanotechnology: the resulting coatings are electrically conductive, show remarkable reflectivity for electromagnetic radiation in the infrared wavelength range, and are transparent in the visible spectral range.

In this paper we will demonstrate the potential of silver nanowires as a reflective coating for electromagnetic radiation using the example of low-e coatings. Such low-e coatings can be used e.g. for windows, where a high reflection of incoming IR radiation is necessary to prevent the interior temperature from rising. The reflective properties of silver nanowire based low-e coatings can be individually adjusted in terms of the maximum reflection performance within a specific wavelength interval. Furthermore, the reflection properties of silver nanowires show a wavelength-dependent performance. Consequently, silver nanowires are a promising material for tailorable reflective coatings, especially for electrical, optical and IR systems.
Ghost imaging in the frequency domain with a high brilliance coherent monochromatic source: a novel approach to extend spectroscopy sensitivity beyond detectors limits
Ghost imaging is an active technique that uses a time-varying structured illumination source to image a target without spatially resolved measurements of the light beam that interacts with the target. Traditionally, a beam splitter is used to create two highly correlated beams, such that the signal beam interacts with the target and is then measured by a single-pixel detector, while the reference beam is measured directly by a spatially resolving detector. This approach implements ghost imaging in the space domain, but the temporal and frequency domains can also be addressed1,2, allowing the pertinent information to be extracted. In particular, ghost imaging in the frequency domain has recently been applied to extract spectral information from a target object by means of Fourier Transform Interferometry3,4. In this work we illustrate and discuss the results of interaction-free measurements on an Er3+ doped nonlinear crystal, placed in one arm of an interferometer, obtained by using only non-interacting photons. Our equipment is a wave-guided solid-state device exploiting an integrated quantum photonic circuit that is equivalent to an Asymmetric Nonlinear Mach-Zehnder Interferometer. The experiment was performed using a 250 mW monochromatic 980 nm laser source that excited an Er:LiNbO3 waveguide placed in one of the arms of the asymmetric interferometer. The interferograms were obtained by varying the signal in the time domain using an undoped LiNbO3 waveguide in the opposite branch of the interferometer, and were recorded with a standard Si p-i-n detector fitted with a band-pass filter (975 nm ± 25 nm), thus blocking all photons except the pump ones. The data were analyzed with conventional Fast Fourier Transform techniques. This approach allowed us to recover information in the frequency domain; in particular, despite the monochromatic character of the detected signal, we could recover the full spectroscopy of the energy levels of the Er3+ doped crystal. The role of the converted photons was evidenced by the fact that, when using a radiation source that does not interact with the dopant (a 1320 nm laser), only the line of the source is recovered by the FFT processing of the interferograms. An important aspect to remark is that the obtained spectral distribution also covered the IR part of the spectrum, where the detector used (Si p-i-n) is blind. In this view, the methodology opens up the possibility of extending sensitive spectral measurements to spectral regions where detectors show poor responsivity.
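
The Fourier-transform step at the heart of this scheme can be illustrated with a short numerical sketch: an interferogram synthesized from two hypothetical emission lines is turned back into a spectrum with an FFT. The OPD range, line positions and weights below are illustrative choices, not values from the paper.

```python
import numpy as np

# Hypothetical optical-path-difference (OPD) axis in micrometres.
opd = np.linspace(0.0, 1000.0, 8192)

# Two illustrative emission lines (wavenumbers in 1/um), standing in for the
# Er3+ level structure probed in the paper; values are not from the paper.
wavenumbers = np.array([1.0 / 0.98, 1.0 / 1.53])   # ~980 nm and ~1530 nm
weights = np.array([1.0, 0.6])

# Ideal two-beam interferogram: a sum of cosines over the OPD scan.
interferogram = (weights[:, None]
                 * np.cos(2 * np.pi * wavenumbers[:, None] * opd)).sum(axis=0)

# An FFT of the interferogram recovers the spectral distribution.
spectrum = np.abs(np.fft.rfft(interferogram))
wn_axis = np.fft.rfftfreq(opd.size, d=opd[1] - opd[0])   # wavenumber axis (1/um)

# The dominant recovered component should sit near the stronger input line.
print("dominant recovered wavenumber (1/um): %.3f" % wn_axis[np.argmax(spectrum)])
```
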
Theoretic approach to ghost imaging in the frequency domain performed by means of a high brilliance coherent monochromatic source
Ghost imaging is a novel non-conventional technique that generates high-resolution images by correlating the intensities of two light beams, neither of which independently contains sufficient information about the spatial distribution and shape of the object. The first demonstration of ghost imaging used light in a two-photon state obtained from spontaneous parametric down-conversion. Owing to the entanglement of the source photons, the proposed theory required quantum descriptions for both the optical source and its photo-detection statistics1. However, subsequent experimental and theoretical work2,3 demonstrated that ghost imaging can also be performed with thermalized light, utilizing either CCD detector arrays or photon-counting detectors, thus admitting a semi-classical description employing classical fields and shot-noise-limited detectors. This has generated increasing interest4-6 in establishing a unifying theory that characterizes the fundamental physics of ghost imaging and defines the boundary between the classical and quantum domains. In this view, we exploited recent progress obtained through the application of Fourier Transform techniques to demonstrate ghost imaging in the frequency domain, in order to measure a continuous spectrum using a highly brilliant and coherent monochromatic source. In particular, we demonstrate the application of this ghost imaging technique to broadband spectroscopic measurements by means of interaction-free photon detection. The experimental apparatus and the collected data are described in a dedicated work7. In this paper, we consider the theoretical aspects underlying the proposed spectroscopic technique. Two alternative theoretical models are presented: in one case, a statistical (semi-classical) approach is applied in which the states of the sampling beam are considered, whereas in the other case a pure quantum treatment is carried out, describing the interaction of vacuum states generated by photon-conversion processes. Both theoretical models, though developed with complementary formalisms, lead to equivalent results and offer a physical interpretation of the collected experimental data. The application of these results offers novel perspectives for remote sensing in low-light conditions, or in spectral regions where sensitive detectors are lacking.
Sensor Technology and Devices II
Low-light-level SWIR photodetectors based on the InGaAs material system
F. Rutz, R. Aidam, A. Bächle, et al.
The short-wavelength infrared (SWIR) regime between 1 and 3 μm is of high interest, especially for surveillance, reconnaissance, and remote sensing applications. The availability of high-power, yet eye-safe SWIR laser sources is an important asset enabling scene illumination and the implementation of advanced active imaging concepts like gated viewing (GV) or light detection and ranging (LIDAR). With atmospheric nightglow, a natural but faint source for scene illumination is also available for passive low-light-level imaging in the SWIR region. The most commonly employed material system for realizing SWIR photodetectors is InGaAs with an indium content of 53%. The spectral sensitivity of In0.53Ga0.47As, with its cut-off wavelength of 1.7 μm, covers a wide part of the nightglow spectrum as well as the emission lines of available laser sources at typical telecom wavelengths around 1.55 μm. However, for low-light-level passive SWIR imaging a dark-current density around 10⁻⁹ A/cm² is considered mandatory. While the international state of the art has already achieved this performance at room-temperature operation, today's European state of the art is still lagging behind. The development of InGaAs-based SWIR detectors at Fraunhofer IAF aims at pin as well as avalanche photodiodes (APDs) for imaging applications with 640×512 pixels. While InGaAs APDs play to their strength in GV applications with typically rather short integration times, planar-processed InGaAs/InP pin photodiodes with the lowest possible dark-current and noise characteristics are the detector devices of choice for passive low-light-level detection. Within a few planar-process batches, we approached the European state of the art for the dark-current density of 15-μm-pitch InGaAs pin detectors to within a remaining factor of five. The most recent process run yielded further slightly improved dark-current characteristics on test devices. Recently, we have started the in-house characterization of such focal plane detector arrays hybridized with suitable SWIR read-out integrated circuits.
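
The 10⁻⁹ A/cm² requirement can be put into perspective by converting the dark-current density into dark electrons collected per pixel; the 15 μm pitch is taken from the abstract, while the integration time and unity fill factor are assumptions made for this back-of-the-envelope sketch.

```python
# Back-of-the-envelope conversion of dark-current density to dark electrons per pixel.
Q_E = 1.602e-19          # elementary charge [C]

j_dark = 1e-9            # dark-current density [A/cm^2], target from the abstract
pitch_cm = 15e-4         # 15 um pixel pitch [cm], from the abstract
t_int = 10e-3            # assumed integration time of 10 ms (illustrative)

pixel_area = pitch_cm ** 2                        # assumes a fill factor of 1
dark_electrons = j_dark * pixel_area * t_int / Q_E
dark_shot_noise = dark_electrons ** 0.5           # shot-noise contribution [e- rms]

print(f"dark electrons per pixel: {dark_electrons:.0f}")
print(f"dark shot noise: {dark_shot_noise:.1f} e- rms")
```
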
Sensor for security and safety applications based on a fully integrated monolithic electro-optic programmable micro diffracting device
This work presents and discusses the features of a monolithic Programmable Micro Diffractive Grating (PMDG) fabricated on a lithium niobate substrate, which can be used to synthesize the visible and near-infrared spectra of important analytes, including dangerous materials (chemically aggressive, toxic or explosive gases). The functional core of the device consists of a periodic arrangement (array) of ridge waveguides whose optical delay (phase shift) is controlled electrically via the linear electro-optic (Pockels) effect. By individually biasing each waveguide of the array with a suitable voltage, the collective transparency of the grating can be tailored so that the output far field, at a predetermined diffraction angle, reproduces a spectral distribution of interest. Therefore, this device can serve as a universal reference cell in a correlation spectroscopy set-up, of particular interest for safety and security applications, as it could avoid the direct manipulation of dangerous or explosive materials. Moreover, by using a dual-colour InGaAs detector, the sensing system can process optical spectra covering an extremely wide wavelength band, from the near UV (~380 nm) to the MIR (~2.5 μm). In the present article, a schematic description of the sensing system, together with a detailed description of the PMDG device and its programming, will be provided and compared with experimental data and the corresponding generated synthetic spectra. Examples of simulated synthetic-spectrum generation for some gases of interest for safety and security, together with modelling of the device performance as a function of the design parameters, will also be discussed.
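
A minimal sketch of the programming principle, under simplifying assumptions (uniform amplitude, ideal phase control, far-field approximation): the collective far field of the waveguide array is modeled as the Fourier transform of the aperture with one electro-optically set phase per waveguide. The waveguide count, pitch, wavelength and steering angle are invented for illustration and are not the PMDG's parameters.

```python
import numpy as np

# Illustrative array: 64 ridge waveguides on a 10 um pitch, read out at 1.55 um.
n_wg = 64
pitch = 10e-6                  # [m]
wavelength = 1.55e-6           # [m]

# Per-waveguide phase shifts (in the device these would be set via the Pockels
# effect); a linear ramp steers the main diffraction lobe to 'steer_angle'.
steer_angle = np.deg2rad(2.0)
phases = 2 * np.pi / wavelength * pitch * np.sin(steer_angle) * np.arange(n_wg)

# Far-field (Fraunhofer) intensity ~ |Fourier transform of the aperture field|^2.
aperture = np.exp(1j * phases)                    # unit amplitude per waveguide
far_field = np.fft.fftshift(np.fft.fft(aperture, 4096))
intensity = np.abs(far_field) ** 2 / n_wg ** 2

# Angular axis corresponding to the FFT bins for this pitch and wavelength.
sin_theta = np.fft.fftshift(np.fft.fftfreq(4096)) * wavelength / pitch
angles_deg = np.rad2deg(np.arcsin(sin_theta))

print("main lobe steered to %.2f deg" % angles_deg[np.argmax(intensity)])
```

Programming a full synthetic spectrum would amount to choosing the phase (and, in practice, amplitude) pattern whose far field at the chosen diffraction angle reproduces the target spectral distribution.
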
Ultrathin tunable terahertz absorbers based on electrostatically actuated metamaterial
High performance tunable absorbers for terahertz (THz) frequencies will be crucial in advancing applications such as single-pixel imaging and spectroscopy. Metamaterials provide many new possibilities for manipulating electromagnetic waves at the subwavelength scale. Due to the limited response of natural materials to terahertz radiation, metamaterials in this frequency band are of particular interest.

The realization of a high-performance tunable THz absorber based on microelectromechanical systems (MEMS) is challenging, primarily because of the severe mismatch between the actuation range of most MEMS (on the order of 1-10 microns) and THz wavelengths on the order of 100-1000 microns. Starting from a metamaterial design whose electromagnetic response is extremely position sensitive, we combine meta-atoms with suspended flat membranes that can be driven electrostatically. This is achieved by using near-field coupling of the meta-atoms to create a substantial change in the resonant frequency.

The devices created in this manner are among the best-performing tunable THz absorbers demonstrated to date, with an ultrathin device thickness (~1/50 of the working wavelength), absorption varying between 60% and 80% in the initial state when the membranes remain suspended, and a fast switching speed (~27 μs). In the snap-down state, the resonance shifts by more than 200% of the linewidth γ (14% of the initial resonance frequency), and the absorption modulation measured at the initial resonance can reach 65%.
Systems and Applications I
Fast decay solid-state scintillators for high-speed x-ray imaging
Flash X-ray radiography (FXR) is an electro-optical imaging method and an important diagnostic tool in homeland security for detecting explosive materials or drugs, as well as in terminal ballistics and detonation research, where it is used to register and study high-speed phenomena even under the roughest conditions such as humidity, dust, smoke, debris, and metal. The physical principles of FXR technology and imaging with hard X-rays are presented. In order to take image sequences of high-velocity impacts using FXR with intensified high-speed cameras for image separation, fast solid-state scintillator screens are necessary to convert the X-ray radiation into detectable visible light. The thickness of a scintillator screen is a significant parameter because there is a trade-off between spatial resolution and sensitivity: a low thickness means a high spatial resolution but also a lower sensitivity. In a study we investigated the influence of scintillator thickness on the FXR image quality and on the emission decay of the scintillation response (decay time). Physical parameters such as spatial resolution, signal-to-noise ratio and contrast are used to characterize image quality. High-speed sequences of FXR images at frame rates of up to 50 kfps from experimental investigations of the ballistic impact behavior of various protective components against projectiles are presented, together with applications for military and security agencies. Finally, as a result of a literature study, some applications of X-ray backscatter technology with fast-decay solid-state scintillators in homeland security and border control are shown for detecting suspicious organic materials such as explosives, drugs, and landmines.
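
The image-quality parameters mentioned above (signal-to-noise ratio and contrast) reduce to simple region statistics; a minimal sketch on synthetic data, with the signal and background regions chosen by hand for illustration.

```python
import numpy as np

def snr(signal_roi, background_roi):
    """Signal-to-noise ratio of an X-ray image: contrast over background noise."""
    return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

def michelson_contrast(signal_roi, background_roi):
    """Michelson contrast between a feature and its surround."""
    i_max, i_min = signal_roi.mean(), background_roi.mean()
    return abs(i_max - i_min) / (i_max + i_min)

# Synthetic example: a bright feature on a noisy scintillator image.
rng = np.random.default_rng(0)
background = rng.normal(100.0, 5.0, size=(64, 64))
feature = rng.normal(140.0, 5.0, size=(16, 16))

print("SNR: %.1f" % snr(feature, background))
print("contrast: %.2f" % michelson_contrast(feature, background))
```
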
Influence of phosphor screen color on performance with modern night vision goggles
Modern image intensifiers (I²) feature green or white phosphor screens, and both types are in use in Night Vision Goggles (NVG). It is still an open question whether the phosphor screen color influences operational performance. In this study, a test close to an operational task was used to assess this. Forty-nine soldiers of the German army had to complete a gunfighting course under different illumination settings. All soldiers possessed a visual acuity of LogMAR 0.1 or better and stereoscopic vision of at least 40 seconds of arc. Their task was to place 3 to 5 shots from four different positions on a target chart. Each soldier had to complete the course without NVG in daylight illumination conditions and at night illumination levels with NVG using green and white phosphor I². The conditions were presented in random order, with roughly one third of the soldiers starting with each condition, to average out the learning effect. Performance analyses included the time needed to complete the course (pass-through time) and the accuracy of the shots (hit rate). Additionally, the soldiers gave a self-assessment of their performance for comparison with the objective results.

The analysis of the shooting accuracy showed the homogeneity of the subject group. On average, three shots were placed within an area of 3.66 cm x 3.66 cm (SD: 2.35 cm²), regardless of the task, phosphor color or position in the course. Shooting accuracy with NVG was, regardless of the phosphor color, highly significantly better (p = 0.00034 for green and p = 0.00014 for white) than under daylight conditions. However, this is probably attributable to the use of the aiming-laser equipment and is therefore not a true performance criterion. No significant difference was found between the three groups in the pass-through time (p = 0.89). Likewise, no difference in shooting accuracy was found between the different screen colors of the NVGs (p = 0.56). The soldiers' self-assessment revealed, with high significance (p = 0.001), a preference for the NVG with the white phosphor screen.

The soldiers probably prefer white phosphor because it looks closer to daylight. In contrast to this subjective preference, no significant objective performance differences appeared when using NVG with either white or green I² phosphor.
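
The abstract reports p-values for paired comparisons (e.g. pass-through time with green versus white phosphor) without naming the statistical test, so the sketch below simply illustrates one reasonable choice, a paired Wilcoxon signed-rank test, on invented data for 49 subjects.

```python
import numpy as np
from scipy import stats

# Invented paired pass-through times [s] for 49 hypothetical soldiers;
# white-phosphor runs differ only by random noise (no systematic effect).
rng = np.random.default_rng(1)
t_green = rng.normal(95.0, 10.0, size=49)
t_white = t_green + rng.normal(0.0, 4.0, size=49)

# Paired, non-parametric comparison of the two phosphor conditions.
stat, p_value = stats.wilcoxon(t_green, t_white)
print(f"Wilcoxon statistic = {stat:.1f}, p = {p_value:.2f}")
# A p-value well above 0.05 would mirror the reported lack of a significant
# difference between the phosphor colors.
```
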
The development of a multi-band handheld fusion camera
A longwave infrared (LWIR) handheld surveillance camera has been modified through the addition of a second sensor which provides both visible (RGB) and near-infrared (NIR) image streams. The challenges and constraints imposed on the development of this Handheld Fusion Camera (HHFC) are described, and the approach to the dual- and tri-band image fusion processing schemes is presented. The physical characteristics of the existing camera acted as a major constraint on the HHFC design, with the Size, Weight, and Power (SWaP) requirements restricting the choice of both the additional sensor and the processing engine available within the camera. The primary use of the HHFC is in ground-based security and surveillance operations, which are challenging in terms of the variability of the scene content. Establishing an effective processing architecture is critical to both image interpretability by the user and operational effectiveness. The HHFC allows the user to view different image streams, including enhanced single-band image data as well as both dual- and tri-band fused imagery. Such flexibility allows the user to select the best imagery for their immediate requirements. Power consumption and latency have been minimised by the use of relatively simple arithmetical fusion algorithms combined with an Adaptive Weight Map (AWM) for region-based optimisation. In practice, the potential performance gain is necessarily limited by the required performance robustness, and this trade-off was critical to the HHFC design and the final image processing solution.
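
The abstract mentions arithmetical fusion combined with an Adaptive Weight Map but does not spell out the algorithm, so the following is only a generic local-contrast-weighted blend in that spirit; the function name, window size and weighting rule are placeholders, not the HHFC implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fuse_lwir_vis(lwir, vis, win=15, eps=1e-6):
    """Blend two co-registered bands with a weight map driven by local contrast.

    A generic stand-in for an adaptive weight map, not the paper's algorithm."""
    def local_std(img):
        mean = uniform_filter(img, win)
        mean_sq = uniform_filter(img * img, win)
        return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

    w_lwir = local_std(lwir)
    w_vis = local_std(vis)
    w = w_lwir / (w_lwir + w_vis + eps)          # pixel-wise weight in [0, 1]
    return w * lwir + (1.0 - w) * vis

# Usage with dummy, already normalised [0, 1] frames:
rng = np.random.default_rng(2)
lwir = rng.random((480, 640))
vis = rng.random((480, 640))
fused = fuse_lwir_vis(lwir, vis)
print("fused range:", float(fused.min()), float(fused.max()))
```
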
Comparison of a kaleidoscope-based multi-view infrared system with its TOMBO-based counterpart
A. Mas, G. Druart, M. Vaché, et al.
Snapshot multi-imaging systems are used for a wide range of applications across all spectral ranges. We propose here a study and realization of a multi-view snapshot system using a kaleidoscope in the Long-Wave Infrared (LWIR), compatible with uncooled infrared detectors such as microbolometers. The optical system has a high numerical aperture, a wide range of fields of view and uses a single focal plane array. We will establish here the advantages of this technology over other design strategies; in particular, the kaleidoscope design will be compared with the TOMBO design. Then the optical design rules for every subset of the kaleidoscope architecture will be described and the results of a first demonstrator will be presented. The features of this system will be compared with those of a TOMBO-based system with a front afocal system.
Systems and Applications II
Developing a control architecture for highly accurate multi-axis inertial stabilized platform
Inertially stabilized platforms (ISP) are used in many acquisition, tracking and pointing systems in which the line of sight (LOS) of electro-optical sensors must be kept steady. This is very challenging, especially in long-range Electro-Optical/Infrared (EO/IR) systems, where the LOS is more sensitive to mechanical and electrical noise, aerodynamic forces or base motion effects. Efforts to improve the stability of such systems include various approaches, from control algorithms and feedback/feedforward compensators to dual-stage controllers or six-degrees-of-freedom pivots. In this paper, the authors present several control architectures for a multi-axis ISP system. First, the dynamics of a four-axis gimbaled pedestal are modeled, taking into account the effects of friction, cross-coupling and mechanical limitations. Then, the control loops for stabilization and pointing are designed using a master-slave architecture for each gimbal axis. The pointing accuracy and stabilization level are analyzed and evaluated by simulation and experiment. Finally, by switching the role of each gimbal, an optimal control architecture that performs stabilization at the microradian level over a wide bandwidth is suggested. It is also shown that the proposed methods are effective for other mobile EO/IR systems subject to disturbances at various frequencies.
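
The master-slave arrangement described above (an outer pointing loop commanding an inner gyro-rate stabilization loop) can be sketched per axis as a small discrete-time simulation; the inertia, gains, disturbance and sample rate below are invented for illustration and are not the paper's identified values.

```python
import numpy as np

# One-axis sketch of a master-slave ISP loop: an outer pointing (position) loop
# commands an inner rate-stabilization loop. All parameters are illustrative.
dt = 1e-3                    # 1 kHz control rate
J = 0.02                     # gimbal inertia [kg m^2]
kp_pos, kp_rate, ki_rate = 20.0, 3.0, 40.0

theta, omega, integ = 0.0, 0.0, 0.0      # LOS angle, rate, rate-loop integrator
target = np.deg2rad(1.0)                 # pointing command

for k in range(2000):
    base_disturbance = 0.01 * np.sin(2 * np.pi * 5.0 * k * dt)   # base-motion torque

    # Outer (master) loop: pointing error -> rate command.
    rate_cmd = kp_pos * (target - theta)

    # Inner (slave) loop: gyro rate error -> torque, PI control.
    rate_err = rate_cmd - omega
    integ += rate_err * dt
    torque = kp_rate * rate_err + ki_rate * integ

    # Simple rigid-body gimbal dynamics, explicit Euler integration.
    omega += (torque + base_disturbance) / J * dt
    theta += omega * dt

print("residual pointing error: %.1f urad" % (1e6 * abs(target - theta)))
```
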
European Defence Agency Special Session
ECOMOS software structure and key features
ECOMOS is a multinational effort within the framework of an EDA Project Arrangement. Its aim is to provide a generally accepted and harmonized European computer model for computing nominal Target Acquisition (TA) ranges of optronic imagers operating in the Visible or thermal Infrared (IR). The project involves close co-operation of defense and security industry and public research institutes from five nations: France, Germany, Italy, The Netherlands and Sweden. ECOMOS will use and combine existing European tools, to build up a strong competitive position. In Europe, there are two well-accepted approaches for providing TA performance data: the German TRM (Thermal Range Model) model and the Netherlands TOD (Triangle Orientation Discrimination) method. ECOMOS will include both approaches. The TRM model predicts TA performance analytically, whereas the TOD prediction model utilizes the TOD test method, imaging simulation and a Human Visual System model in order to assess device performance. For the characterization of atmosphere and environment, ECOMOS uses the French model and software MATISSE (Modélisation Avancée de la Terre pour l'Imagerie et la Simulation des Scènes et de leur Environnement). The first software implementation of ECOMOS has been finalized in spring 2019. In this presentation, the key features implemented in the current version are elucidated. In addition, the final ECOMOS software structure as well as an overview of the user guidance within ECOMOS are shown.
The project SPIDVE: study on EO Sensors Performance Improvement in Degraded Visual Environment
In recent years, there has been a huge improvement in the effectiveness of Electro-Optical (EO) systems, due to the availability of large staring-array detectors with higher performance as well as strong processing capability. Consequently, both in homeland surveillance and in military situational awareness, the use of EO systems operating from the visible to the infrared has grown dramatically.
Operations in Degraded Visual Environments (DVE) are frequent during military actions, due to many factors, either natural (poor light, fog, glare, etc.) or intentionally produced (smoke, dust, etc.). Under these conditions the performance of EO sensors is degraded, and with it their effectiveness for Detection, Recognition and Identification (DRI) and for navigation. In general, situational awareness is strongly affected, as is the safety of personnel. Proper techniques are needed to restore (at least partially) the imaging capabilities of EO sensors in DVEs. The project SPIDVE (Study on EO Sensors Performance Improvement in Degraded Visual Environment), promoted by the European Defence Agency (EDA), is focused on the analysis of the impact of adverse visual conditions on EO sensor performance. It starts from an analysis of the state of the art in terms of technology, processing, measurements and modelling methodologies, based on the existing scientific literature, to carry out an assessment of the most promising technologies for image enhancement and restoration in different DVEs.
Particular care is devoted to discussions with the end users (military personnel) to identify the cases of greatest interest for their operations. On this basis the candidate methodologies shall be analyzed in more depth, evaluating their performance with the aim of selecting the most promising one.
Finally, a possible roadmap for new initiatives to exploit and develop the findings shall be defined.
Architectures for radiofrequency and optronics sensors onboard Remotely Piloted Aerial Systems (RPAS)
This work has been developed under the EDA contract no. 16.ESI.OP.137 within the Electro Optical Sensors technologies CapTech in EDA. In the defence domain, the different assets are increasingly required to operate in a multirole, multi-purpose manner within a wide range of possible missions, locations and operational environments. Sensor systems, both radiofrequency (RF) and electro-optic (EO), are in continuous development, and the next generation will feature multifunctional capabilities and increased performance; therefore new architectures should be defined to handle this and to combine the information they offer in the most effective way. This will provide the capability to operate in all-weather, day-and-night, difficult conditions against a broad range of threats immersed in strong clutter and electromagnetic interference (EMI), together with the possibility to adapt quickly to each mission scenario. The main goal is the definition of an Interoperable, Modular, Open and Scalable Architecture (IMOSA) to achieve interoperability among payloads (mainly EO and RF sensors) for Class I Remotely Piloted Aircraft Systems (RPAS), which can also be applied to Class II RPAS. The implementation of this architecture allows a single RPAS to carry a variety of sensors on board regardless of their manufacturer, maximizing sensor data fusion performance, enhancing RPAS capabilities in hostile environments, and improving payload sensor interoperability and integration, together with higher reliability, greater flexibility and lower product life-cycle costs for both manufacturers and end users. This study includes a review of the state of the art of the related technologies, the definition of scenarios, requirements and business cases to justify its implementation, the definition of an architecture and the BBs initially designed, and the generation of a roadmap to implement this concept successfully in the coming years. During the process, a systems engineering methodology based on TOGAF and NAFv3 was applied.
Radiation-induced degradation of optoelectronic sensors
C. Inguimbert, T. Nuns, D. Hervé, et al.
Space systems are exposed to a particularly harsh natural radiation environment, but can also potentially be subject to radiation injected into low Earth orbit by the explosion of a nuclear weapon. The increasing use of optoelectronic components in space systems makes risk assessment regarding radiation effects of growing interest. This paper presents recent results on the degradation of optoelectronic devices in terms of atomic displacement damage. Most of this work has been developed under the EDA contract JIP-ICET2 A-1341-RT-GP within the CapTech 'Technologies for Components and Modules' (TCM) in EDA.
Modelling and Simulation I
Assessing detection performance of night vision VIS/LWIR-fusion
Thermal imagers (TI) and low-light imagers (LLI; e.g. Night Vision Goggles, NVG) are today's technologies for nighttime applications. Both possess their own advantages and disadvantages. LLI are typically ineffective in dark areas, whereas TI also operate in complete darkness. On the other hand, TI often do not provide enough scene detail, hindering situational awareness. Hence, it is natural to combine the two systems for common use. Today such combined systems are available as so-called fusion goggles: a LWIR bolometer optically fused to an image-intensifier-based NVG. Future developments will probably replace the NVG with solid-state night vision technologies, enabling sophisticated image fusion.

Performance assessment and modeling of fused systems is an open problem. The main reason is the strong scene and task dependency of fusion algorithms. The approach followed here is to divide the task into a detection task and a situational-awareness task and to analyze them separately. An experimental common-line-of-sight system consisting of a LWIR bolometer and a low-light CMOS camera was set up for this purpose. Its first use was detection performance assessment. A data collection of a human at different distances and positions was analyzed in a perception experiment with the two original bands and eight different fusion methods. Twenty observers participated in the experiment. Their average detection probability clearly depends on the imagery. Although the resolution in the LWIR is three times worse than that of the visual band, the achieved detection performance is much better. This carries over to the fused imagery as well, with all fusion algorithms giving better performance than the visual band alone. However, all fusion algorithms decrease observer performance, to varying degrees, compared to LWIR alone. This result is in good agreement with a Graph-Based Visual Saliency image analysis. Thus, it seems possible to assess fusion performance for the detection task by saliency calculations.
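
As a rough illustration of how a saliency analysis can be tied to detection performance, the sketch below scores a target region with a simple centre-surround (difference-of-Gaussians) measure; this is only a stand-in for the Graph-Based Visual Saliency model used in the paper, with invented parameters and synthetic data.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def target_saliency(img, target_mask, sigma_center=2.0, sigma_surround=12.0):
    """Crude centre-surround saliency score for a target region.

    A difference-of-Gaussians proxy, not the Graph-Based Visual Saliency model
    cited in the paper; it only illustrates how a saliency figure per band or
    fusion result could be compared against observer detection probability."""
    dog = np.abs(gaussian_filter(img, sigma_center) - gaussian_filter(img, sigma_surround))
    return dog[target_mask].mean() / (dog.mean() + 1e-9)

# Synthetic frame with a small bright "human" patch as the target.
rng = np.random.default_rng(3)
frame = rng.random((240, 320))
frame[100:110, 150:160] += 0.8
mask = np.zeros(frame.shape, dtype=bool)
mask[100:110, 150:160] = True

print("relative target saliency: %.2f" % target_saliency(frame, mask))
```
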
Data collection and preliminary results on turbulence characterisation and mitigation techniques
M. -T. Velluet, C. Bell, J. -F. Daigle, et al.
In the framework of the NATO task group SET-226 on turbulence mitigation techniques for OA systems, a trial was conducted on the premises of RDDC-Valcartier in September 2016, using indoor and outdoor facilities. Image data sets were collected under various turbulence conditions, both controllable (indoor) and natural (outdoor). The imagery from this trial was used in the Grand Challenge, where different experts were asked to process identical input data with state-of-the-art algorithms. The trial also provided a database to validate theoretical and numerical models. The paper gives an overview of the experimental set-up (targets, sensors, turbulence screen generators, etc.) and presents some preliminary results obtained with the collected data in terms of the effectiveness of image processing techniques, new methods for turbulence characterisation, and the modelling of laser beam propagation.
Infrared system simulation of airborne target detection on space-based platform
With the continued occurrence of aircraft crashes, it is very important to achieve the detection of aerial targets from a space-based platform. Many countries have carried out research in this field, but there is still no settled conclusion about the methods and systems for aerial target detection. Meanwhile, satellite experiments are very expensive, and it is impractical to test a detection system by launching satellites several times. Therefore, a system simulation model can be used as the basis for the design of the detection system. In the simulation process, the imaging relationships between the satellite platform, the turntable and the target are calculated from the optical system parameters and detector specifications, and various imaging modes such as scanning and staring are obtained according to the specific parameters of the actual application. The simulation directly represents the actual satellite motion, camera imaging and target motion state, and it greatly shortens the design time of the system in engineering applications. It realistically reproduces the actual operating state and can obtain detection results without an actual satellite launch. Moreover, since its parameters can be changed flexibly according to the actual conditions, it can be applied not only to aerial target detection but can also play an important role in other fields.
Modelling and Simulation II
Kinematic analysis of imaging seekers with roll-over-nod gimbal and a folded electro-optical layout
Kutlu D. Kandemir, Yigit Yazicioglu, Bulent Ozkan
Nod-over-roll is a commonly used gimbal configuration, especially in air-to-air missile seekers, due to its volumetric advantage. It may be further scaled down by implementing a folded optical layout and locating the detector off-gimbal. Yet the concept suffers from an inherent kinematic singularity right at the center of its task space, where the roll axis and the pointing vector coincide. This phenomenon is called the zenith pass problem and has to be solved in real time for proper target tracking. A second drawback of this seeker structure is the image registration problem, which manifests itself as a rotation of the constructed image on the FPA and must be considered when localizing the target. This work focuses on the kinematic analysis of the zenith pass and image registration problems from an air-to-air missile seeker perspective.
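
The zenith pass problem can be made concrete with the inverse kinematics of a roll-nod gimbal: as the line of sight approaches the roll axis, the roll rate needed to follow even a slow, constant-rate target diverges. The sign conventions, miss angle and crossing rate below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def roll_nod_angles(u):
    """Inverse kinematics of a roll-over-nod gimbal with boresight along +x.

    Sign conventions are illustrative; they differ between seeker designs."""
    ux, uy, uz = u
    nod = np.arccos(np.clip(ux, -1.0, 1.0))
    roll = np.arctan2(uy, -uz)
    return roll, nod

# LOS of a target crossing near the roll axis with a small miss angle.
t = np.linspace(-1.0, 1.0, 2001)                 # time [s]
miss = np.deg2rad(0.2)                           # closest approach to the axis
cross = np.deg2rad(10.0) * t                     # constant 10 deg/s crossing
los = np.stack([np.cos(cross) * np.cos(miss),
                np.sin(cross),
                np.cos(cross) * np.sin(miss)], axis=1)

angles = np.array([roll_nod_angles(u) for u in los])
roll_rate = np.abs(np.gradient(np.unwrap(angles[:, 0]), t))

print("LOS crossing rate: 10 deg/s")
print("peak roll rate near the zenith: %.0f deg/s" % np.rad2deg(roll_rate.max()))
```

The roll-rate spike near the singularity is why the zenith pass has to be handled explicitly in real time during tracking.
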
Feedback control method for limiting interfering Gaussian beams in a bistatic substance-on-surface chemical recognizer
Richard Fauconier
In a previous paper, the use of a bistatic optical instrument for substance-on-surface chemical recognition was introduced. The proposed apparatus showed that the bistatic arrangement of a multiwavelength emitter and sensor potentially allows certain unknown optical properties of the interrogated chemical (on a supporting surface) to be measured in real time. These optical variables have long plagued the results of monostatic infrared chemical recognisers with ambiguity, by virtue of being unknown and difficult to measure in real time outside a controlled laboratory setting: (1) unknown optical properties of the supporting surface beneath the chemical layer, (2) unknown thickness and refractive index of the chemical film and (3) unknown angles of incidence and detection. It was previously shown that it is possible (and essential) to limit to a single pair the number of narrow laser beams reaching the bistatic apparatus's detector from its emitter after the beams have impinged on and propagated through the interrogated substance. In this paper, a mechanism and feedback-control method are discussed to accomplish two tasks: (1) limit the number of narrow beams reaching the detector from the emitter to two, and (2) determine the separation between the resulting pair of beams in real time. The beam separation is a real-time variable that is essential for determining the thickness of the unknown substance in the field, thereby removing one cause of false substance identification. The beams discussed are narrow, frequency-modulated Gaussian beams. A generic movable variable-aperture device is described that, when controlled via feedback, can position its aperture and set it to the appropriate size so as to exclude multiple reflections of the incident interrogating beam and thereby limit the number of beams entering the detector to two. The feedback control system is also described with an appropriate set of state-space equations and a prototype for a robust feedback-control methodology.
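
A minimal sketch of the feedback idea described above: drive the aperture centre toward the midpoint of the two transmitted principal beams and its width toward their separation plus a margin, which simultaneously blocks higher-order reflections and yields a beam-separation estimate. The beam positions, gains and margin are invented; this is not the paper's control law.

```python
import numpy as np

# Beam positions on the aperture plane [mm]; the first two are the wanted pair,
# the others represent multiple reflections to be rejected (all values invented).
beam_positions = np.array([0.0, 4.0, 8.0, 12.0])   # kept sorted
margin = 1.0                                       # clearance around the pair [mm]

centre, width = 10.0, 30.0                         # initial, wide-open aperture [mm]
k_c, k_w = 0.4, 0.4                                # proportional feedback gains

for _ in range(50):
    # "Measurement": beams currently transmitted by the aperture.
    inside = beam_positions[np.abs(beam_positions - centre) <= width / 2]
    pair = inside[:2] if inside.size >= 2 else beam_positions[:2]

    # Setpoints derived from the first two transmitted beams.
    centre_ref = pair.mean()
    width_ref = (pair.max() - pair.min()) + 2 * margin

    # Proportional feedback on aperture centre and width.
    centre += k_c * (centre_ref - centre)
    width += k_w * (width_ref - width)

separation = pair.max() - pair.min()
print(f"aperture centre {centre:.2f} mm, width {width:.2f} mm")
print(f"estimated beam separation {separation:.2f} mm")
```
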
Vessel track summarization by viewpoint selection
For military operations, the availability of high-quality imaging information from Electro-Optical/Infrared (EO/IR) sensors is of vital importance. This information can be used for the timely detection and identification of threatening vessels in an environment with a large number of neutral vessels. EO/IR sensors provide imagery of all vessels at different moments in time. It is challenging to interpret the images of the different vessels within a larger region of interest. It is therefore helpful to automatically detect and track vessels, and to save the detections of the vessels, called snapshots, for identification purposes.

Of all available snapshots, only the best and most representative snapshots should be selected for the operator. In this paper, we present two different approaches for snapshot selection from a vessel track. The first is based on directional track information, and the second on the snapshot appearance. We present results for both these methods on IR recordings, containing vessels with different track patterns in a harbor scenario.
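
The track-based (directional) selection can be sketched as keeping one snapshot per occupied heading sector, so the operator sees the vessel from well-separated viewpoints; the sector count and headings below are illustrative, and this is only a simplified stand-in for the method in the paper.

```python
import numpy as np

def select_by_heading(headings_deg, n_sectors=8):
    """Pick one snapshot index per occupied heading sector.

    A simple stand-in for viewpoint-based snapshot selection: the track heading
    at each snapshot decides its sector, and the first snapshot per occupied
    sector is kept."""
    sectors = (np.asarray(headings_deg) % 360) // (360 / n_sectors)
    selected = {}
    for idx, sector in enumerate(sectors):
        selected.setdefault(int(sector), idx)
    return sorted(selected.values())

# Illustrative track: a vessel turning through ~180 degrees in a harbor.
headings = np.linspace(10, 190, 60)       # heading per snapshot [deg]
keep = select_by_heading(headings)
print("kept snapshot indices:", keep)
```
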
Processing and Analysis
Pixel-wise infrared tone remapping for rapid adaptation to high dynamic range variations
Biological vision systems can perform target selection, pattern recognition, and dynamic range adaptation at capability levels far beyond those of human-designed methods. This paper applies a two-stage Biologically-Inspired Vision (BIV) model for image pre-processing and infrared tone remapping, derived from the visual pipeline of the hoverfly. The first stage performs spatially invariant, pixel-wise intensity normalization to intelligently compress scene dynamic range and enhance local contrasts using an adaptive gain control mechanism. The second stage of the model applies adaptive spatio-temporal filtering to reduce redundancy within image sequences. Our experiments demonstrate the strengths of the model on four practical tasks. For large targets, the model acts as a sophisticated edge extractor; the examples show the ability to retrieve the complete structure of a boat from sea clutter, increasing the global contrast factor by 165%. Secondly and thirdly, segmentation is demonstrated for small and weak-signature targets: a filter is applied to track a 2x2-pixel dragonfly without interruption, and a small maritime vessel is extracted as it passes in front of a larger vessel of similar emissivity. Finally, the power of the BIV model to rapidly compress dynamic range and normalize sudden changes in scene luminance is validated by means of incandescent pyrotechnic pellets launched from an aerial platform.
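
The first stage (pixel-wise, adaptive intensity normalization) is often modeled with a divisive, Naka-Rushton-style gain control driven by a temporally low-passed adaptation state; the sketch below uses that generic form with invented constants, rather than the paper's exact hoverfly-derived model.

```python
import numpy as np

def adaptive_gain_control(frames, tau=0.9, eps=1e-6):
    """Pixel-wise dynamic-range compression over an image sequence.

    Naka-Rushton-style divisive normalization with a per-pixel adaptation state
    low-pass filtered over time; a generic sketch of the first processing stage
    described in the abstract, not the exact model."""
    adapt = np.full(frames[0].shape, frames[0].mean())
    out = []
    for frame in frames:
        # Update the adaptation state (temporal low-pass of the input).
        adapt = tau * adapt + (1.0 - tau) * frame
        # Divisive normalization compresses high dynamic range per pixel.
        out.append(frame / (frame + adapt + eps))
    return out

# Usage with a synthetic high-dynamic-range sequence (sudden brightness jump):
rng = np.random.default_rng(4)
seq = [rng.random((120, 160)) * (1.0 if k < 10 else 50.0) for k in range(20)]
normalized = adaptive_gain_control(seq)
print("output range:", float(normalized[-1].min()), float(normalized[-1].max()))
```
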
Poster Session
Improving the stabilization level of ISP system using feedforward compensators
The stabilization of the Line of Sight (LOS) of an Electro-Optical System (EOS) located on a moving platform contributes significantly to image quality. A portion of the perturbation induced by the base motion can be eliminated by a number of algorithms running on powerful processor boards. However, it is difficult to implement these algorithms in embedded systems because of memory-capacity and processing-speed limitations. This paper introduces a method for identifying gimbal parameters and feedforward compensators. The key parameters, including the friction force, the cross-coupling effect and misalignment compensation, are investigated using feedforward compensator theory and verified by practical experiments. Their effectiveness in the stabilization loop is validated through many disturbance scenarios. The results show that a good improvement in the stabilization level of the inertially stabilized platform has been achieved, with a reduction in LOS RMS error of up to 40 percent.
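
The feedforward terms discussed above can be sketched as torque contributions computed from the measured base rate and an identified friction model, added on top of the feedback stabilization command; the friction model and all coefficients below are placeholders rather than the identified values from the paper.

```python
import numpy as np

def feedforward_torque(base_rate, gimbal_rate, params):
    """Feedforward compensation terms for one gimbal axis.

    A generic sketch: Coulomb-plus-viscous friction identified offline and a
    base-rate decoupling term; the coefficients are placeholders, not values
    identified in the paper."""
    coulomb, viscous, coupling = params
    relative_rate = gimbal_rate - base_rate
    friction_ff = coulomb * np.sign(relative_rate) + viscous * relative_rate
    decoupling_ff = coupling * base_rate        # counteracts base-motion coupling
    return friction_ff + decoupling_ff

# Usage inside a stabilization loop (once per control cycle):
params = (0.05, 0.01, 0.02)       # [N m], [N m s/rad], [N m s/rad] (assumed)
u_ff = feedforward_torque(base_rate=0.2, gimbal_rate=0.01, params=params)
print("feedforward torque: %.4f N m" % u_ff)
```
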
A simple method for determining distances by range-gated vision systems with different forms of illuminating pulses
It is proposed to single out two methods for observing objects using range-gated vision systems (RGVS). In the first case the distance between the RGVS and an object is fixed, and observation is performed by changing the time delay between the leading edges of the laser illumination pulses and the gate pulses of the receiving block, i.e. neighboring layers of space along the optical axis are viewed sequentially (the visibility zone shifts). In the second case the time delay is fixed; this corresponds either to the observation of an object moving along the optical axis of the system or to the study of an ensemble of objects in the visibility zone at different distances, including observation along an inclined path. It is shown that such a division of the observation methods has a definite physical justification and will promote systematization and a better understanding of investigation results on RGVS.
The regularities of formation of the range-energy profile (REP) of the visibility zone were investigated for RGVS with illuminating pulses whose shape differs from rectangular (triangular or trapezoidal). It was established that if the illuminating pulse length ΔtL is smaller than or equal to the gate-pulse length ΔtG of the receiving block, then the expressions for the characteristic distances coincide with those for rectangular pulses and can be used to determine distances to objects for non-rectangular pulses. For ΔtL > ΔtG, in the case of triangular illuminating pulses the REP has a bell-like shape, while for trapezoidal illuminating pulses the REP has either a bell-like or a trapezoidal shape; the latter appears when the duration of the upper base of the trapezoidal illuminating pulse exceeds the gate-pulse duration. An empirical method is presented for determining the characteristic distances to the REP maximum and to the boundary points of the plateau area, which can be used to calculate the distance to the object. Using calibration constants, a method was proposed for calculating the distances to objects, and its efficiency was proved experimentally.
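
The characteristic distances for the rectangular case (which, per the abstract, also apply to triangular and trapezoidal pulses when ΔtL ≤ ΔtG) follow directly from the pulse timing; a minimal sketch with assumed timing values.

```python
C = 299_792_458.0        # speed of light [m/s]

def visibility_zone(delay_s, gate_s, pulse_s):
    """Near/far boundaries of the range-gated visibility zone (rectangular case).

    delay_s is measured between the leading edges of the illumination pulse and
    of the receiver gate; per the abstract, the same expressions apply to
    triangular and trapezoidal pulses when the pulse is not longer than the gate."""
    r_near = C * (delay_s - pulse_s) / 2.0       # earliest reflections still gated
    r_far = C * (delay_s + gate_s) / 2.0         # latest reflections still gated
    return max(r_near, 0.0), r_far

# Example with assumed timings: 100 ns pulse, 200 ns gate, 2 us delay.
near, far = visibility_zone(delay_s=2e-6, gate_s=200e-9, pulse_s=100e-9)
print(f"visibility zone: {near:.0f} m to {far:.0f} m")
```
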
Erratum
Kinematic analysis of imaging seekers with roll-over-nod gimbal and a folded electro-optical layout (Erratum)
Kutlu D. Kandemir, Yigit Yazicioglu, Bulent Ozkan
Publisher’s Note: This paper, originally published on 9 October 2019, was replaced with a corrected/revised version on 20 February 2020. If you downloaded the original PDF but are unable to access the revision, please contact SPIE Digital Library Customer Service for assistance.