The Australian Defence Science and Technology Group has developed novel single-photon avalanche diode (SPAD) arrays using silicon-based complementary metal-oxide-semiconductor (CMOS) processes. The first of these was a simple 32x32 pixel array, followed by higher-density arrays developed with our partners. These single-photon detector arrays have inherently low dark currents and we have used them in several Flash LADAR systems, including an innovative design in which the LADAR is cued by an 8-12 micron infrared imager sharing a common aperture. The use of Flash LADAR (rather than scanning) has the advantage that moving targets can be imaged accurately. We have developed modelling and simulation tools for predicting SPAD LADAR performance, and we use processing techniques to suppress ‘background’ counts and resolve targets that are obscured by clutter. In this paper we present some of our initial results in discriminating small (<1 m) targets at ranges out to 10 km. Results from our field experiments include extraction of a 0.5 m object at 10 km and identification of a small flying UAV.
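A minimal sketch of the kind of background-count suppression described above, assuming a simple histogram-and-threshold scheme: photon arrival times from many laser shots are histogrammed, uniform background counts form a flat pedestal, and a return is accepted only when its peak clears the pedestal by a statistical margin. The function name, thresholding rule and parameters are illustrative assumptions, not the DSTO processing chain.

```python
import numpy as np

def range_from_counts(timestamps, bin_width, n_bins, noise_sigma=5.0):
    """Histogram photon arrival times (s) and extract a range (m).

    Uniform background counts form a flat pedestal; the target return
    forms a peak.  The peak is accepted only if it exceeds the pedestal
    by `noise_sigma` estimated Poisson standard deviations (an
    illustrative suppression rule)."""
    hist, edges = np.histogram(timestamps, bins=n_bins,
                               range=(0.0, n_bins * bin_width))
    background = np.median(hist)           # robust pedestal estimate
    sigma = np.sqrt(max(background, 1.0))  # Poisson noise estimate
    peak = hist.argmax()
    if hist[peak] < background + noise_sigma * sigma:
        return None                        # no statistically significant return
    C = 299_792_458.0
    t = 0.5 * (edges[peak] + edges[peak + 1])  # bin-centre time of flight
    return 0.5 * C * t                         # two-way time to range
```

With a peak well above the pedestal this reduces to simple first-return ranging; raising `noise_sigma` trades detection probability against false alarms from clutter.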
This paper describes the development of two key functionalities within the VIRSuite scene simulation program, broadening its scene generation capabilities and increasing the accuracy of its thermal signatures. Firstly, a new LADAR scene generation module has been designed, capable of simulating range imagery for Geiger-mode LADAR in addition to the existing functionality for linear-mode systems. Secondly, a new 3D heat diffusion solver has been developed within the VIRSuite signature prediction module, capable of calculating the temperature distribution in complex three-dimensional objects for enhanced dynamic prediction of thermal signatures. With these enhancements, VIRSuite is now a robust tool for conducting dynamic simulations for missiles with multi-mode seekers.
At DSTO, a real-time scene generation framework, VIRSuite, has been developed in recent years, within which trials data are predominantly used for modelling the radiometric properties of the simulated objects. Since in many cases the data are insufficient, a physics-based simulator capable of predicting the infrared signatures of objects and their backgrounds has been developed as a new VIRSuite module. It includes transient heat conduction within the materials, and boundary conditions that take into account the heat fluxes due to solar radiation, wind convection and radiative transfer. In this paper, an overview is presented, covering both the steady-state and transient performance.
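The boundary-condition treatment described above (solar loading, wind convection and radiative exchange driving transient conduction) can be illustrated with a minimal one-dimensional explicit finite-difference sketch. All material and environmental parameters below are illustrative assumptions, not VIRSuite values:

```python
import numpy as np

# Illustrative steel-like parameters (assumptions, not VIRSuite data)
k, rho, cp = 50.0, 7800.0, 500.0      # conductivity, density, specific heat
alpha = k / (rho * cp)                # thermal diffusivity, m^2/s
h = 15.0                              # wind convection coefficient, W/m^2.K
eps, sigma = 0.9, 5.670374419e-8      # emissivity, Stefan-Boltzmann constant
q_solar = 600.0                       # absorbed solar flux, W/m^2
T_air = T_sky = 288.0                 # ambient air and effective sky temperature, K

L_slab, n = 0.01, 50                  # 1 cm slab discretised into 50 nodes
dx = L_slab / (n - 1)
dt = 0.4 * dx ** 2 / alpha            # step within explicit stability limit
T = np.full(n, 288.0)                 # initial temperature profile, K

for _ in range(20000):                # ~26 s of simulated time
    Tn = T.copy()
    # interior nodes: explicit transient conduction
    T[1:-1] = Tn[1:-1] + alpha * dt / dx ** 2 * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
    # sunlit face: solar + convective + radiative boundary fluxes
    q = q_solar + h * (T_air - Tn[0]) + eps * sigma * (T_sky ** 4 - Tn[0] ** 4)
    T[0] = Tn[0] + dt / (rho * cp * dx) * (q + k * (Tn[1] - Tn[0]) / dx)
    # rear face: insulated (conduction from the interior only)
    T[-1] = Tn[-1] + dt / (rho * cp * dx) * (k * (Tn[-2] - Tn[-1]) / dx)
```

The sunlit face warms fastest while conduction carries heat toward the insulated rear face; a three-dimensional solver generalises the same flux balance over an unstructured surface mesh.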
A new enhanced resistor array projector nonuniformity correction (NUC) process based on the flood method is
presented. It relies on precise characterisation of the projector-camera optical system. The information obtained from
the characterisation procedure is used for the rapid derivation of NUC coefficients in a minimal number of iterations.
The new NUC process benefits from using large format cameras but, in contrast to previous flood methods, it does not
depend critically on 1:1 mapping and can be performed using smaller format cameras. Other benefits are improved
handling of emitter array imperfections and sampling artefacts. The procedure is fast and alleviates camera temporal
effects such as drift. In order to further isolate the correction process from temporal effects, we have implemented a
new multi-point multi-temperature camera calibration procedure that allows the corrections to be applied in real time.
We describe our procedure and discuss other possible NUC improvement strategies.
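The multi-point multi-temperature camera calibration mentioned above can be sketched as a per-pixel piecewise-linear mapping from camera counts to radiance, built from frames recorded against several blackbody reference levels. The function below is a minimal illustration under that assumption (it is not the actual DSTO procedure, and it assumes each pixel's response is monotonic over the calibration range):

```python
import numpy as np

def build_camera_calibration(frames, radiances):
    """Per-pixel piecewise-linear calibration from K blackbody levels.

    frames:    (K, H, W) mean camera counts at K reference levels
    radiances: (K,) known source radiances, ascending
    Returns a function mapping a raw frame to calibrated radiance.
    Assumes each pixel's counts increase monotonically with radiance."""
    frames = np.asarray(frames, dtype=float)
    radiances = np.asarray(radiances, dtype=float)

    def correct(raw):
        out = np.empty(raw.shape, dtype=float)
        H, W = raw.shape
        for i in range(H):
            for j in range(W):
                # interpolate within this pixel's own response curve,
                # removing its fixed-pattern gain and offset
                out[i, j] = np.interp(raw[i, j], frames[:, i, j], radiances)
        return out
    return correct
```

Because the lookup is precomputed per pixel, the correction itself is cheap enough to apply in real time, which is the property the calibration procedure above exploits.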
Continuing interest exists in the development of cost-effective synthetic environments for testing Laser Detection and
Ranging (ladar) sensors. In this paper we describe a PC-based system for real-time ladar scene simulation of ships and
small boats in a dynamic maritime environment. In particular, we describe the techniques employed to generate range
imagery accompanied by passive radiance imagery. Our ladar scene generation system is an evolutionary extension of
the VIRSuite infrared scene simulation program and includes all previous features such as ocean wave simulation, the
physically-realistic representation of boat and ship dynamics, wake generation and simulation of whitecaps, spray,
wake trails and foam. A terrain simulation extension is also under development. In this paper we outline the
development, capabilities and limitations of the VIRSuite extensions.
We describe recent improvements in our maritime scene generation program and the extension in capabilities that has
been achieved. The motion of multiple boats under independent control can now be simulated, as well as large ship
motion. The effects we simulate include ocean surfaces in different sea states, the physically-realistic representation
of boat and ship dynamics, wake generation and generation of surface effects including whitecaps, spray, wake trails
and foam. We describe our graphical user interface tools, the underlying phenomena that they control and their
application in enabling versatile real-time maritime scene simulation.
With the development and widespread availability of computer graphics cards, complex infrared scenes can now be
readily generated for application in real-time hardware-in-the-loop simulations. It is important that the best efforts are
made to ensure that the scenes are radiometrically valid, to the level where the operation of the imaging infrared unit-under-test can be properly emulated. In this paper we describe the techniques we employ to ensure radiometric
validity within our real-time aircraft and boat simulation applications of current interest.
Hardware-in-the-Loop (HWIL) simulation is becoming increasingly important for cost-effective testing of imaging
infrared systems. DSTO is developing real-time scene generation and image processing capabilities within its HWIL
simulation programs, based on the application of COTS desktop PCs equipped with Graphics Processing Unit (GPU)
cards, and including limited use of Field Programmable Gate Arrays (FPGAs). GPUs and FPGAs are high-performance
parallel computing machines but are fundamentally different types of hardware. To determine which hardware type
should be used to implement a real-time solution of a given application, a methodology is required to expose the
concurrency within the problem and to structure the problem in a way that can be mapped to the hardware types. In this
paper we use parallel programming patterns to compare the architectures of recent generation GPUs and FPGAs. We
demonstrate the decomposition of a parallel application and its implementation on GPU and FPGA hardware, and present the results.
We assess the issues that need to be addressed to ensure that a resistor array infrared projector is capable of validly
simulating the real world. These include control of the additional sources of blurring and aliasing arising from the
presence of the projector and its associated scene generation system, nonuniformity correction, busbar robbing,
spurious back reflections and narcissus. In particular, we reconfirm that a 2 × 2 projector/unit-under-test pixel
mapping ratio offers a good compromise for controlling the additional blurring and aliasing, and furthermore, we
demonstrate achievement of projector nonuniformity noise equivalent temperature differences (NETDs) in the 20 mK range.
We describe the extension of our real-time scene generation software VIRSuite to include the dynamic simulation of
small boats and their wakes within an ocean environment. Extensive use has been made of the programmability
available in the current generation of GPUs. We have demonstrated that real-time simulation is feasible, even
including such complexities as dynamical calculation of the boat motion, wake generation and calculation of an FFT-generated ocean surface.
Methods to correct for atmospheric degradation of imagery and improve the "seeing" of a telescope are well known in astronomy but, to date, have rarely been applied to more earthly matters such as surveillance. The intrinsically more complicated visual fields, the dominance of low-altitude distortion effects, the requirement to process large volumes of data in near real-time, the inability to pre-select ideal sites and the desirability of ruggedness and portability all combine to pose a significant challenge.
Field Programmable Gate Array (FPGA) technology has advanced to the point where modern devices contain hundreds of thousands of logic gates, multiple "hard" processors and multi-gigabit serial communication links. Such devices present an ideal platform to tackle the demands of surveillance image processing.
We report a rugged, lightweight system which allows multiple FPGA "modules" to be added together in order to quickly and easily reallocate computing resources. The devices communicate via 2.5 Gbps serial links and process image data in a streaming fashion, reducing as much data as possible on-the-fly in order to present a minimised load to storage and/or communication devices.
To maximise the benefit of such a system we have devised an open protocol for FPGA-based image processing called "OpenStream". This allows image processing cores to be quickly and easily added into or removed from the data stream and harnesses the benefits of code-reuse and standardisation. It further allows image processing tasks to be easily partitioned across multiple, heterogeneous FPGA domains and permits a designer the flexibility to allocate cores to the most appropriate FPGA. OpenStream is the infrastructure to facilitate rapid, graphical development of FPGA-based image processing algorithms, especially when they must be partitioned across multiple FPGAs. Ultimately it will provide a means to automatically allocate and connect resources across FPGA domains in a manner analogous to the way logic synthesis tools allocate and connect resources within an FPGA.
Resistor array infrared projector nonuniformity correction (NUC) is currently limited in fidelity. In the flood
technique a fundamental limitation has been the inevitable presence of Moire fringes. In this paper, an advanced NUC
procedure is described in which the Moire patterns are successfully subtracted, leading to improved levels of residual
nonuniformity. It is shown that, irrespective of the projection technology, the Moire fringes exist at the unit-under-test
image plane, where they appear in general as sampling noise. Their control through choice of mapping ratio is discussed.
Research leading towards the continued improvement in resistor array infrared projector nonuniformity correction
(NUC) is reported, particularly at low drive levels relevant to thermal imager and FLIR test and evaluation
applications. Moire fringes have been successfully compensated, as has the checkerboard effect seen in earlier flood
NUC measurements. With these improvements, the residual nonuniformity associated with the random spatial noise
has been reduced successfully to the 0.1-0.2% rms level, equivalent to 20-60 mK noise equivalent temperature
differences. The random noise is accompanied, however, by a low spatial frequency fixed pattern, currently
unexplained but possibly attributable to busbar robbing in the electronic backplane.
Resistor array infrared projectors offer the unique potential of simultaneously covering both a wide apparent temperature range and providing fine temperature resolution at low output levels. The temperature resolution capability may not be realized, however, if the projector error sources are not controlled; for example, residual nonuniformity after nonuniformity correction (NUC) procedures have been applied, temporal noise in analog drive voltages and quantization at several points in the projection system, all of which may introduce errors larger than the desired resolution. In this paper the temperature resolution limits are assessed in general. In particular, the quantization errors are assessed and the post-NUC residual nonuniformity levels required for achievement of fine temperature resolution are calculated.
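As a worked illustration of the quantization argument, consider an N-bit drive that is linear in radiance across the projector's apparent temperature range. The broadband Stefan-Boltzmann approximation and all numbers below are simplifying assumptions (a band-limited Planck integral would be used in practice):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2.K^4

def quantization_netd(t_low, t_high, bits, t_eval):
    """Equivalent temperature step of one DAC code at t_eval, assuming
    drive linear in total (Stefan-Boltzmann) radiance over
    [t_low, t_high].  Broadband simplification for illustration only."""
    span = SIGMA * (t_high ** 4 - t_low ** 4)  # radiance span, W/m^2
    dq = span / (2 ** bits - 1)                # one-code radiance step
    dLdT = 4 * SIGMA * t_eval ** 3             # local radiance slope
    return dq / dLdT                           # temperature step, K

# A 16-bit drive spanning 280-700 K, evaluated at a 300 K scene level,
# gives a step of roughly 30 mK -- already comparable to the desired
# low-drive resolution, before nonuniformity and noise are added.
```

The example shows why quantization alone can dominate the error budget at low drive levels unless the drive mapping or bit depth is chosen with the full apparent temperature range in mind.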
Results from sparse grid and flood nonuniformity correction (NUC) obtained using the DSTO Primary Infrared Scene
Projection at 1:1 mapping ratio are reported. Residual nonuniformities in the 0.5-1.0% range are currently being
achieved, the flood results equating to noise equivalent temperature differences in the 50-100 mK range within the low
drive thermal imager and FLIR simulation region. The NUC techniques and results are discussed in the light of both
their present applicability and scope for further improvement.
Results from application of the sparse grid nonuniformity correction procedure within the DSTO resistor array Primary Infrared Scene Projection system are reported. In particular, the techniques used to cover the full dynamic range and to combat camera drift are described. The effectiveness of the projector NUC procedure is assessed and discussed in terms of the scope for further improvement.
Array nonuniformity is the dominant factor limiting the temperature resolution of the current generation of emissive dynamic infrared scene projectors. Over the past five years or so, numerous papers have been presented on the measurement of array nonuniformities and on the design and implementation of efficient nonuniformity correction (NUC) techniques, and considerable progress has been made towards achieving the desired NUC goals. A number of factors, however, still limit the achievement of fine temperature resolution within emissive infrared projection systems: further improvements are needed to reach residual nonuniformity levels low enough to satisfy the demanding requirements of low-NETD thermal imaging systems. In particular, the NUC camera has a strong influence on the effectiveness of the projector NUC procedure. In this paper we describe an alternative method for collecting projector NUC data that relies on the use of several integration times, together with multiple calibration points for correcting the camera nonuniformities; the method is designed to improve the accuracy of the projector NUC procedure.
Proc. SPIE 5408, Technologies for Synthetic Environments: Hardware-in-the-Loop Testing IX
KEYWORDS: Nonuniformity corrections, Cameras, Resistance, Field programmable gate arrays, Control systems, Data processing, Projection systems, Infrared radiation, Field effect transistors, Algorithm development
The new generation PC-based array control electronics (PACE) system for emissive infrared projector real-time scene data processing has opened the potential for the development of more complex real-time nonuniformity correction (RNUC) algorithms than were formerly possible. In this paper, emitter array response data are analyzed in order to identify the underlying physical processes and to identify the form of the RNUC algorithm they suggest. It is shown that although the PACE system is capable of processing the algorithm, the development of a practical RNUC processor would seem to be limited by the complexities that underlie the observed variability in emitter response.
The DSTO Primary Infrared Scene Projection (PIRSP) system has been used to investigate the practical application of the emitter array flood nonuniformity correction (NUC) technique. In the first instance the measurements have been limited to the special case of unity mapping ratio. The methods for achieving unity mapping at sub-pixel registration are described; in particular, the use of Moire fringes for accurately measuring the optical distortion across the field-of-view and for attaining the optimal mapping condition. Application of the flood NUC technique within the PIRSP system is discussed in terms of its convergence limitations. The latter include the presence of spatial and temporal camera noise, optical distortion, the mixing of neighbouring pixel information due to the finite point spread function and radiance-to-voltage transformation errors.
A new infrared projector emitter response curve-fitting procedure suitable for generating nonuniformity coefficients capable of being applied in existing real-time processing architectures is introduced. The procedure has been developed through detailed analysis of a Honeywell Multi-Spectral Scene Projector (MSSP) sparse array data set, combined with an appreciation of the underlying physical processes that lead to the generation of infrared radiance.
This paper provides an improved method of measuring the modulation transfer function (MTF) in undersampled systems. We show that the currently used canted slit 2D FFT method is limited because interference between the aliased Fourier components and the side-peaks in the non-aliased signal significantly influences the MTF measurements at spatial frequencies larger than the Nyquist frequency. In our new approach, the effective temperature of the slit illumination varies along the slit, with the intensity profile chosen to minimize the interference between the aliased and non-aliased signal components. The accuracy of the measurement procedure has been improved to the point where the main limitation is the temporal and the fixed pattern noise of the IR camera. Experimental results confirming the accuracy at frequencies both below and above the Nyquist frequency are presented.
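For background, the conventional canted-slit procedure that the paper improves upon can be sketched as follows: a slit tilted slightly from the pixel columns samples the line spread function (LSF) at a different sub-pixel phase in each row, and re-binning by phase yields an oversampled LSF whose FFT magnitude estimates the MTF beyond the Nyquist frequency. The function and parameters below are illustrative assumptions (the variable-intensity slit of the new method is not reproduced):

```python
import numpy as np

def mtf_from_canted_slit(image, slope, oversample=4):
    """Baseline canted-slit MTF estimate from a slit image.

    slope: sub-pixel slit displacement per row (small tilt assumed).
    Returns (frequencies in cycles/pixel, MTF magnitude)."""
    H, W = image.shape
    rows, cols = np.mgrid[0:H, 0:W]
    x = (cols - rows * slope).ravel()      # position relative to slit axis
    v = image.ravel()
    bins = np.round(x * oversample).astype(int)
    bins -= bins.min()
    counts = np.bincount(bins)
    sums = np.bincount(bins, weights=v)
    lsf = sums / np.maximum(counts, 1)     # phase-binned, oversampled LSF
    lsf /= lsf.sum()                       # normalise so MTF(0) = 1
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(lsf.size, d=1.0 / oversample)
    return freqs, mtf
```

In this baseline scheme, any aliased components folded into the slit image interfere with the side-peaks of the non-aliased signal, which is precisely the above-Nyquist limitation the shaped-illumination method addresses.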
Proc. SPIE 4717, Technologies for Synthetic Environments: Hardware-in-the-Loop Testing VII
KEYWORDS: Nonuniformity corrections, Error analysis, Computer simulations, Data processing, Signal processing, Black bodies, Projection systems, Infrared radiation, Field effect transistors, Temperature metrology
An alternative class of infrared projector real-time nonuniformity correction processor is introduced, based on the concept that the fundamental role of the processor is to reverse each of the projector processing steps as the input DAC voltage word is converted into infrared signal radiance output. The design is developed by assessment of the sequence of processes occurring within the projector and is tested by simulation. It is shown that there is potential for high fidelity nonuniformity correction across the infrared dynamic range without the need for the introduction of curve-fitting breakpoints.
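The reverse-processing concept can be sketched as a per-emitter inversion: rather than fitting correction coefficients with breakpoints, each emitter's measured voltage-to-radiance response is inverted directly, so a commanded radiance maps back to the drive voltage that produces it. The helper below is an illustrative sketch under that reading (the names and the interpolation scheme are assumptions):

```python
import numpy as np

def make_reverse_nuc(volts, responses):
    """Build a radiance-to-voltage command function per emitter.

    volts:     (K,) drive voltages, ascending
    responses: (K, H, W) measured radiance at each drive voltage
               (assumed monotonic in voltage for every emitter)."""
    volts = np.asarray(volts, dtype=float)
    responses = np.asarray(responses, dtype=float)

    def command(desired):
        """Map a desired radiance image (H, W) to drive voltages."""
        H, W = desired.shape
        v = np.empty((H, W))
        for i in range(H):
            for j in range(W):
                # invert this emitter's own response by interpolation,
                # with no global curve-fit breakpoints
                v[i, j] = np.interp(desired[i, j],
                                    responses[:, i, j], volts)
        return v
    return command
```

Reversing the measured response directly is what allows, in principle, high-fidelity correction across the full dynamic range without breakpoint artefacts.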
The search for optimal IR scene projection nonuniformity correction procedures reported in earlier papers is continued. In this paper the application of the flood nonuniformity correction procedure described earlier is extended to the case where only a sublattice of projector pixels is lit, enabling nonuniformity correction for the practically interesting case of greater-than-unity mapping ratios.
The search for optimal infrared scene projection nonuniformity correction procedures reported in earlier papers is continued. In this paper the application of the specialized flood nonuniformity correction algorithm described earlier is extended to the more practical case where the pixel-to-pixel mapping is imperfect.
The one-dimensional nonuniformity scene generation method presented in an earlier paper is extended to the two-dimensional case of real interest. It is shown that the algorithm applied to the one-dimensional case is extendable although its speed of convergence is reduced in two dimensions because of the increased mixing of nonuniformity information. Alternative nonuniformity correction algorithms are developed and compared and it is demonstrated that by utilizing an estimate of the point spread function the scene correction efficiency can be substantially improved.
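The value of a point spread function estimate noted above can be sketched with a Van Cittert-style iteration: the camera observes the emitter gain map blurred by the system PSF, and each update pushes the estimate in the direction of the unexplained residual. This is an illustrative stand-in for the algorithms compared in the paper; the kernel, array sizes and iteration count are assumptions.

```python
import numpy as np

def blur(img, psf):
    # circular convolution via FFT; psf is image-sized, centred on the origin
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))

def origin_psf(shape, kernel):
    """Embed a small blur kernel, centred at the array origin, into an
    image-sized array suitable for circular convolution."""
    psf = np.zeros(shape)
    kh, kw = kernel.shape
    for di in range(kh):
        for dj in range(kw):
            psf[(di - kh // 2) % shape[0],
                (dj - kw // 2) % shape[1]] += kernel[di, dj]
    return psf

def estimate_gains(measured, psf, n_iter=200):
    """Van Cittert-style deblurring: recover the gain map g from the
    observation measured = blur(g, psf) by iterating on the residual."""
    g = measured.copy()
    for _ in range(n_iter):
        g = g + (measured - blur(g, psf))
    return g
```

Without the PSF estimate the blurred observation itself would be taken as the gain map, mixing neighbouring-pixel nonuniformity information; the residual iteration undoes that mixing and so converges to a much better correction.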