In recent hardware-in-the-loop tests conducted in a cryogenic chamber, a dual-band sensor observed radiometric anomalies for extended targets. To understand the radiometric errors associated with the infrared projection arrays, systematic measurements were performed at both cryogenic and ambient temperatures. Air Force Research Laboratory (AFRL) engineers have previously investigated an artifact in these arrays called "busbar robbing," but those observations were of square blocks of emitters and did not characterize the radiometric accuracy of extended targets in a dynamic engagement scenario. It was found that when numerous emitters are turned on in a contiguous pattern, rather than scattered over the array, the busbar robbing effect causes the actual emitter outputs to differ from the outputs measured when fewer pixels are driven to the same commanded level. When the driven pattern is elongated (has a pronounced aspect ratio), the magnitude of the effect depends on how the pattern is aligned with the emitter axes. The results of these experiments quantify the radiometric error that can be expected from resistor array projectors in end-game scenarios, when a target becomes extended, at both ambient and cryogenic temperatures.
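The abstract does not detail the busbar robbing mechanism; the sketch below is only an illustrative toy model, assuming the effect behaves like a voltage (IR) drop across a shared busbar resistance that grows with the number of simultaneously driven emitters on that bus. All names and values are hypothetical.

```python
import numpy as np

# Toy model (assumption): each driven emitter draws current through a shared
# busbar resistance, so the voltage actually reaching an emitter droops as more
# emitters on the same bus are turned on. All values below are hypothetical.
V_SUPPLY = 5.0        # commanded drive voltage [V]
R_BUS = 0.05          # effective shared busbar resistance [ohm]
I_EMITTER = 0.001     # nominal current per driven emitter [A]

def emitter_voltage(n_driven_on_bus: int) -> float:
    """Voltage delivered to each emitter when n emitters share one busbar."""
    ir_drop = n_driven_on_bus * I_EMITTER * R_BUS
    return V_SUPPLY - ir_drop

def relative_output(n_driven_on_bus: int) -> float:
    """Radiant output relative to a sparsely driven pattern (P ~ V^2 / R)."""
    return (emitter_voltage(n_driven_on_bus) / V_SUPPLY) ** 2

# A contiguous block loads a few busbars heavily; a scattered pattern spreads
# the same number of emitters over many busbars, so the droop is much smaller.
print(relative_output(10))     # nearly 1.0
print(relative_output(5000))   # noticeably below 1.0
```

In this picture, an elongated pattern aligned along a busbar concentrates the load on few bus lines, while the same pattern rotated across the busbars spreads it out, which is consistent with the orientation dependence noted above.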
The nonuniformity correction (NUC) of a resistor array is typically performed with a high-grade infrared (IR) camera in the approximate waveband of a sensor under test (SUT). The array emitter outputs, and therefore the response nonuniformity, are a complicated function of the spectral band. In this paper, we study the performance obtained when measuring and NUCing the projector in one spectral band, then using the projector for testing in a different band. This is a practical necessity, since a test facility typically cannot own cameras for NUCing a projector in the wavebands of all test articles. We show that some aspects of the NUC can be reliably 'converted' or adjusted from one spectral band to another. But there are several different mechanisms that contribute to the response nonuniformity, and their dependence on the spectral band is different. We present several studies showing the results of measuring the nonuniformity in one band and operating the projector in a different band.
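The abstract does not specify how a NUC measured in one band is carried to another; the sketch below shows one plausible approach, assuming each emitter's calibrated radiance is rescaled by a band-to-band conversion factor computed from a graybody model of the emitter at its apparent temperature. The function names, band limits, and graybody assumption are illustrative, not the paper's method.

```python
import numpy as np

def planck_radiance(wavelength_um: np.ndarray, temp_k: float) -> np.ndarray:
    """Spectral radiance [W / (m^2 sr um)] of a blackbody (Planck's law)."""
    c1 = 1.191042e8   # W um^4 / (m^2 sr)
    c2 = 1.4387752e4  # um K
    return c1 / (wavelength_um**5 * (np.exp(c2 / (wavelength_um * temp_k)) - 1.0))

def in_band_radiance(temp_k: float, band_um: tuple[float, float]) -> float:
    """In-band radiance of a graybody, integrated over the given band."""
    wl = np.linspace(band_um[0], band_um[1], 512)
    return float(np.trapz(planck_radiance(wl, temp_k), wl))

def convert_nuc_table(radiance_band_a: np.ndarray,
                      apparent_temp_k: np.ndarray,
                      band_a: tuple[float, float],
                      band_b: tuple[float, float]) -> np.ndarray:
    """Rescale per-emitter NUC radiances measured in band A for use in band B,
    assuming each emitter radiates like a graybody at its apparent temperature."""
    scale = np.vectorize(
        lambda t: in_band_radiance(t, band_b) / in_band_radiance(t, band_a)
    )(apparent_temp_k)
    return radiance_band_a * scale
```

A single graybody scale factor captures only the purely spectral part of the conversion; as the abstract notes, other nonuniformity mechanisms depend on band in different ways and would not be corrected by such a rescale.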
In seekers that never resolve targets spatially, it may be adequate to calibrate only with sources of known aperture irradiance. In modern missile interceptors, the target becomes spatially resolved at close range, and the seeker's ability to accurately measure the radiance at different positions in the scene is also important. Thus, it is necessary to calibrate the seekers with extended sources of known radiance. The aperture irradiance is given by the radiance integrated over the angular extent of the target in the scene; accurate radiance calibrations combined with accurate spatial presentation of the targets therefore produce accurate irradiances. The accuracy of the scene radiance is also important in generating synthetic imagery for testing seeker conceptual designs and seeker algorithms, and for hardware-in-the-loop testing with imaging projection systems. The routine procedure at the Air Force Research Laboratory Munitions Directorate (AFRL/MNGG) is to model and project the detailed spatial and radiometric content of the scenes. Hence, accurate depiction of the radiance in the scene is important. AFRL/MNGG calibrates the complete projection system (synthetic image generator and scene projector) with extended sources of known radiance, not unresolved sources of known irradiance. This paper demonstrates that accurate radiance calibrations and accurate spatial rendering do provide accurate aperture irradiances in the projection systems. In recent tests conducted by AFRL/MNGG, the projection system was calibrated in terms of radiance, and the aperture irradiances were determined both as observed in the synthetic images that drove the projection system and in the images of the projection system measured by the unit under test. The aperture irradiances were compared with the known truth data, and the errors were determined. This paper presents results of analyzing the errors associated with the observed aperture irradiances.
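The relationship stated above (aperture irradiance as the radiance integrated over the target's angular extent) can be sketched numerically as follows; for a target of roughly uniform radiance this reduces to E ≈ L · Ω. The image array, pixel field of view, and radiance values below are hypothetical.

```python
import numpy as np

def aperture_irradiance(radiance_img: np.ndarray, pixel_ifov_rad: float) -> float:
    """Aperture irradiance [W/m^2] from a calibrated radiance image
    [W/(m^2 sr)]: integrate the radiance over the solid angle subtended by
    the target, approximated as a sum over target pixels times the per-pixel
    solid angle (valid for small instantaneous fields of view)."""
    pixel_solid_angle = pixel_ifov_rad ** 2   # sr per pixel (small-angle approx.)
    return float(np.sum(radiance_img) * pixel_solid_angle)

# Hypothetical example: a 10x10-pixel extended target of uniform radiance.
target = np.full((10, 10), 50.0)              # W/(m^2 sr)
print(aperture_irradiance(target, 100e-6))    # 50 * 100 * (1e-4)^2 = 5e-5 W/m^2
```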
One proven technique for nonuniformity correction (NUC) of a resistor array infrared scene projector requires careful measurement of the output-versus-input response for every emitter in a large array. In previous papers, we have discussed methods and results for accomplishing the projector NUC. Two difficulties that may limit the NUC results are residual nonuniformity in the calibration sensor and nonlinearity in the calibration sensor's response to scene radiance. These effects introduce errors in the measurement of the projector elements' output, which lead to residual nonuniformity. In this paper we describe a recent effort to mitigate both of these problems using a procedure that combines sensor nonuniformity correction and sensor calibration, detector by detector, so that these problems do not contaminate the projector NUC. By measuring a set of blackbody flood-field images at a dozen or so different temperatures, the individual detector output-versus-input radiance responses can be measured. As in the projector NUC, we use a curve-fitting routine to model the response of each detector. Using this set of response curves, a post-processing algorithm corrects and calibrates the images measured by the sensor. We have used this approach to reduce several sensor error sources by a factor of 10 to 100. The resulting processing is applied, as one step in the projector NUC, to correct and calibrate all of the sensor images used in that NUC. The procedure appears to be useful for any application where sensor nonuniformity or response nonlinearities are significant.
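A minimal sketch of the per-detector calibration described above, assuming a polynomial fit of each detector's response against the known flood-field blackbody radiances, followed by application of that fit to convert raw counts to radiance. The polynomial order and array shapes are assumptions.

```python
import numpy as np

def fit_detector_responses(flood_counts: np.ndarray,
                           flood_radiances: np.ndarray,
                           order: int = 2) -> np.ndarray:
    """Fit each detector's radiance-versus-counts response.

    flood_counts:    (n_temps, rows, cols) raw counts from flood-field
                     blackbody images at a dozen or so temperatures.
    flood_radiances: (n_temps,) known in-band radiance at each temperature.
    Returns (rows, cols, order+1) polynomial coefficients mapping counts to
    radiance, detector by detector.
    """
    n_temps, rows, cols = flood_counts.shape
    coeffs = np.empty((rows, cols, order + 1))
    for r in range(rows):
        for c in range(cols):
            # Fitting radiance as a function of counts means that applying the
            # polynomial later both linearizes and flat-fields the sensor.
            coeffs[r, c] = np.polyfit(flood_counts[:, r, c], flood_radiances, order)
    return coeffs

def calibrate_image(raw_counts: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    """Convert a raw image to calibrated radiance using the per-detector fits."""
    rows, cols, _ = coeffs.shape
    out = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            out[r, c] = np.polyval(coeffs[r, c], raw_counts[r, c])
    return out
```

Every sensor image used in the projector NUC would be passed through `calibrate_image` first, so residual sensor nonuniformity and nonlinearity do not contaminate the measured emitter responses.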
In some of its infrared projection systems, the Kinetic Kill Vehicle Hardware-In-the-Loop Simulator (KHILS) facility uses two 512 x 512 Wideband Infrared Scene Projector (WISP) resistor arrays to stimulate two different camera wavebands at the same time. The images from the two arrays are combined with a dichroic beam combiner, allowing the two camera bands to be independently stimulated. In early tests it was observed that the projector bands were not completely independent. When one array was projecting, the projected pattern could be seen in the opposite camera band. This effect is caused by spectral “crosstalk” in the camera/projector system. The purpose of this study was to build a mathematical model of the crosstalk, validate the model with measurements of a 2-color projection system, and then use the model as a tool to determine the spectral characteristics of filters that would reduce the crosstalk. Measurements of the crosstalk were made in the KHILS 2-color projector with two different 2-color cameras. The KHILS Quantum Well Infrared Photodetector (QWIP) Mid-Wave (MW)/Long-Wave (LW) camera and the Army Research Laboratory HgCdTe (HCT) MW/LW camera were used in the tests. The model was used to analyze the measurements, thus validating the model at the same time. The model was then used to describe conceptual designs of new 2-color projection configurations, enabling a prediction of crosstalk in the system, and selection of filters that would eliminate the crosstalk.
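The paper's crosstalk model is not reproduced in the abstract; the sketch below illustrates one common way such a model is formulated, assuming the coupling is captured by spectral overlap integrals of projector emission and camera response in each band. Candidate blocking filters can be folded into the projector curves to predict their effect. All spectral curves here are hypothetical placeholders.

```python
import numpy as np

def band_coupling_matrix(wl_um: np.ndarray,
                         projector_emission: list[np.ndarray],
                         camera_response: list[np.ndarray]) -> np.ndarray:
    """2x2 coupling matrix C, where C[i, j] is the response of camera band i
    to projector array j, computed as a spectral overlap integral. Ideally C
    is diagonal; the off-diagonal terms are the crosstalk."""
    c = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            c[i, j] = np.trapz(camera_response[i] * projector_emission[j], wl_um)
    return c

# Hypothetical spectra: MW bands near 3-5 um, LW bands near 8-12 um, with
# small out-of-band leakage that produces the observed crosstalk.
wl = np.linspace(2.0, 14.0, 600)
mw = np.exp(-0.5 * ((wl - 4.0) / 0.8) ** 2)
lw = np.exp(-0.5 * ((wl - 10.0) / 1.5) ** 2)
C = band_coupling_matrix(wl, [mw + 0.02 * lw, lw + 0.02 * mw], [mw, lw])
print(C / C.max())  # off-diagonal entries indicate the relative crosstalk
```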
Infrared projection systems based on resistor arrays typically produce radiometric outputs with wavelengths that range from less than 3 microns to more than 12 microns. This makes it possible to test infrared sensors with spectral responsivity anywhere in this range. Two resistor-array projectors optically folded together can stimulate the two bands of a 2-color sensor. If the wavebands of the sensor are separated well enough, it is possible to fold the projected images together with a dichroic beam combiner (perhaps also using spectral filters in front of each resistor array) so that each resistor array independently stimulates one band of the sensor. If the wavebands are independently stimulated, it is simple to perform radiometric calibrations of both projector wavebands. In some sensors, the wavebands overlap strongly, and driving one of the resistor arrays stimulates both bands of the unit-under-test (UUT). This "coupling" of the two bands causes errors in the radiance levels measured by the sensor if the projector bands are calibrated one at a time. If the coupling between the bands is known, it is possible to preprocess the driving images to effectively decouple the bands. This requires performing transformations that read both driving images (one in each of the two bands) and judiciously adjust both projectors to give the desired radiance in both bands. With this transformation included, the projection system acts as if the bands were decoupled - varying one input radiance at a time only produces a change in the corresponding band of the sensor. This paper describes techniques that have been developed to perform radiometric calibrations of spectrally coupled, 2-color projector/sensor systems. Also presented are results of tests performed to demonstrate the performance of the calibration techniques. Possible hardware and algorithms for performing the transformation in real time are also presented.
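A minimal sketch of the decoupling idea described above, assuming the band coupling can be represented by a 2x2 matrix measured during calibration: the desired band radiances are multiplied by the matrix inverse to obtain the radiances commanded on the two resistor arrays. The coupling values and function names are assumptions, not the paper's actual algorithm.

```python
import numpy as np

def decouple_drive_images(desired_band1: np.ndarray,
                          desired_band2: np.ndarray,
                          coupling: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Preprocess the two driving images so each sensor band sees only its
    intended radiance. 'coupling' is a 2x2 matrix: sensed = coupling @ driven.
    Inverting it gives the radiances to command on each resistor array."""
    inv = np.linalg.inv(coupling)
    drive1 = inv[0, 0] * desired_band1 + inv[0, 1] * desired_band2
    drive2 = inv[1, 0] * desired_band1 + inv[1, 1] * desired_band2
    # Resistor arrays cannot emit negative radiance; clip (and flag) if needed.
    return np.clip(drive1, 0.0, None), np.clip(drive2, 0.0, None)

# Hypothetical coupling: each UUT band picks up 15% of the opposite array.
C = np.array([[1.0, 0.15],
              [0.15, 1.0]])
b1 = np.full((512, 512), 10.0)   # desired radiance in band 1
b2 = np.full((512, 512), 4.0)    # desired radiance in band 2
d1, d2 = decouple_drive_images(b1, b2, C)
print(C @ np.array([d1[0, 0], d2[0, 0]]))  # recovers ~[10.0, 4.0]
```

Because the transformation is a small fixed matrix multiply per pixel, it is the kind of operation that could plausibly be performed in real time in the image-generation hardware, as the abstract suggests.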
An unexpected effect was observed in a data set recently measured at the Kinetic Kill Vehicle Hardware-in-the-loop Simulator (KHILS) facility. A KHILS projector was driven to illuminate a contiguous block of emitters, with all other emitters turned off. This scene was measured with a two-color IR sensor. A sequence of 100 images was recorded, and certain statistics were computed from the image sequence. After measuring and analyzing these images, a "border" with a particularly large standard deviation was observed around the bright rectangular region. The pixels on the border of the region were much noisier than those either inside or outside of the bright region. Although several explanations were possible, the most likely seemed to be a small vibration of either the sensor or the projector. The sensor, for example, uses a mechanical cryo-cooler, which produces a vibration that can be felt by hand. Further analyses revealed an erratic motion of the position of objects in the image with an amplitude of a few tenths of the detector pitch. This small motion is sufficient to produce large fluctuations in the image pixel values in regions that have a large radiance gradient - such as the border of the bright block. These results suggest that the standard deviation of a "block image" sequence is easy to compute and will show the characteristic effect in the presence of image motion as small as a fraction of the detector pitch.
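A minimal sketch of the diagnostic described above, assuming a stack of recorded frames: compute the per-pixel temporal standard deviation and look for the bright "border" that small image motion produces along strong radiance gradients. The simulated frames, noise levels, and the whole-pixel jitter used here (a stand-in for the sub-pixel motion in the paper) are hypothetical.

```python
import numpy as np

def temporal_std_map(frames: np.ndarray) -> np.ndarray:
    """Per-pixel standard deviation of an image sequence (n_frames, rows, cols).
    Jitter of the scene or sensor shows up as a border of elevated standard
    deviation along edges with a large radiance gradient."""
    return frames.std(axis=0)

# Hypothetical sequence: 100 frames of a bright block on a dark background,
# with the block's position jittered from frame to frame.
rng = np.random.default_rng(0)
frames = np.zeros((100, 64, 64))
frames[:, 20:44, 20:44] = 1000.0
frames += rng.normal(0.0, 2.0, frames.shape)            # temporal sensor noise
for k in range(100):
    frames[k] = np.roll(frames[k], rng.integers(-1, 2), axis=1)
std_map = temporal_std_map(frames)
print(std_map[32, 32], std_map[32, 20])  # interior vs. border: border std is far larger
```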