Standard OpenGL-based rendering has sampling limitations. By default, these rendering systems point-sample rendered pixels. For highly resolved objects, this sampling is adequate to represent the object accurately, but when the object has a relatively small projected area on the order of a few pixels, the object intensity is corrupted by aliasing. Hardware anti-aliasing such as multisampling provides minimal relief by offering 4, 8, or 16 samples within a single pixel. However, for hardware-in-the-loop (HITL) scene generation, where accurate energy conservation of unresolved sub-pixel objects must be maintained, standard hardware anti-aliasing is insufficient. Zoom anti-aliasing (ZAA) has proven to be a viable solution for rendering objects that would otherwise be grossly under-sampled. Past techniques focused on processing the zoom window pixels on the CPU because the graphics processing unit (GPU) was not general-purpose enough to support the zoom window processing. This is no longer the case, however, given the capabilities of modern graphics processors. This paper presents a modern GPU-based zoom window approach and compares the results to a classic CPU-based approach.
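The energy-conserving core of a zoom-window approach can be sketched in a few lines (illustrative Python, not the paper's implementation): the sub-pixel object is rendered into an oversampled zoom window, which is then box-filtered back to sensor resolution so that total energy is preserved regardless of the object's sub-pixel position.

```python
import numpy as np

def render_gaussian_source(n_pix, zoom, center, total_energy, sigma):
    """Render a sub-pixel Gaussian source into an oversampled zoom window."""
    n = n_pix * zoom
    ys, xs = np.mgrid[0:n, 0:n]
    xs = (xs + 0.5) / zoom              # sample positions in final-pixel units
    ys = (ys + 0.5) / zoom
    r2 = (xs - center[0]) ** 2 + (ys - center[1]) ** 2
    img = np.exp(-r2 / (2.0 * sigma ** 2))
    return img * (total_energy / img.sum())   # normalize so total energy is exact

def zoom_antialias(zoom_img, zoom):
    """Box-filter the zoom window down to final resolution, summing energy."""
    n = zoom_img.shape[0] // zoom
    return zoom_img.reshape(n, zoom, n, zoom).sum(axis=(1, 3))

hi_res = render_gaussian_source(8, zoom=16, center=(3.3, 4.7),
                                total_energy=1.0, sigma=0.4)
final = zoom_antialias(hi_res, zoom=16)    # 8x8 image, total energy still 1.0
```

A single point sample per pixel would instead make the reported intensity swing with the source's sub-pixel position; the summing downsample removes that dependence.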
The KHILS Vacuum Cold Chamber (KVACC) has formed the basis for a comprehensive test capability for newly developed dual-band infrared sensors. Since initial delivery in 1995, the KVACC chamber and its support systems have undergone a number of upgrades, maturing into a valuable test asset and technology demonstrator for missile defense systems. Many leading-edge test technologies have been consolidated during the past several years, demonstrating the level of fidelity achievable in tomorrow's missile test facilities. These technologies include resistive array scene projectors, sub-pixel non-linear spatial calibration and coupled two-dimensional radiometric calibration techniques, reconfigurable FPGA-based calibration electronics, dual-band beam-combination and collimation optics, a closed-cycle multi-chamber cryo-vacuum environment, personal computer (PC) based scene generation systems, and a surrounding class-1000 clean room environment. The purpose of this paper is to describe this unique combination of technologies and the capability it represents to the hardware-in-the-loop community.
Infrared detectors operating in two or more wavebands can be used to obtain emissivity-area, temperature, and related parameters. While the cameras themselves may not collect data in the two bands simultaneously in space or time, the algorithms used to calculate such parameters rely on spatial and temporal alignment of the true optical data in the two bands. When such systems are tested in a hardware-in-the-loop (HWIL) environment, this requirement for alignment is in turn imposed on the projection systems used for testing. As has been discussed in previous presentations to this forum, optical distortion and misalignment can lead to significant band-to-band and band-to-truth simulation errors. This paper will address the potential impact of techniques to remove these errors on typical two-color estimation algorithms, as well as improvements obtained using distortion removal techniques applied to HWIL data collected at the Kinetic Kill Vehicle Hardware-in-the-Loop Simulator (KHILS) facility.
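To illustrate why band-to-band alignment matters for such algorithms, a minimal two-color temperature estimator can be sketched (illustrative Python; the band edges and graybody assumption are ours, not those of any specific seeker). For a graybody, the emissivity-area product cancels in the ratio of in-band radiances, so temperature can be recovered from the ratio alone; misregistered pixels mix radiances from different scene points and corrupt that ratio.

```python
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann (SI)

def band_radiance(T, lo_um, hi_um, n=400):
    """In-band blackbody radiance (W/m^2/sr) via trapezoidal Planck integration."""
    lam = np.linspace(lo_um, hi_um, n) * 1e-6
    B = 2.0 * H * C**2 / lam**5 / (np.exp(H * C / (lam * K * T)) - 1.0)
    return float(np.sum(0.5 * (B[1:] + B[:-1]) * np.diff(lam)))

def two_color_temperature(L_mw, L_lw, band_mw=(3.0, 5.0), band_lw=(8.0, 12.0)):
    """Recover graybody temperature from the MW/LW radiance ratio by bisection.
    The MW/LW ratio rises monotonically with T, and emissivity-area cancels."""
    target = L_mw / L_lw
    lo, hi = 150.0, 2000.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if band_radiance(mid, *band_mw) / band_radiance(mid, *band_lw) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# a 0.6-emissivity source at 500 K: T is recovered with emissivity unknown
eps = 0.6
T_est = two_color_temperature(eps * band_radiance(500.0, 3.0, 5.0),
                              eps * band_radiance(500.0, 8.0, 12.0))
```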
A revolution is underway within commercial PC video graphics, driven mainly by the 3-D gaming community and its demands for customizable lighting effects and realistic, visually appealing 3-D rendering. This revolution is bringing about a configurable transformation and lighting (T&L) engine within modern PC video graphics hardware. The results of these technological advancements will profoundly impact the way computer-based rendering is done. Although PC graphics hardware continues to change rapidly, it has evolved to the point where it can be made to address most of the Hardware-In-the-Loop (HWIL) scene generation demands that historically could be accomplished only on costly graphics workstations. With the ability to control how operations are performed within the hardware rendering process, it is possible to implement customized per-pixel spatial and lighting effects. To illustrate how these capabilities can be applied to solve certain HWIL scene generation problems, a graphics hardware approach will be implemented to demonstrate a method of achieving increased monochrome intensity resolution and a user-defined spatial distortion. There is great potential in modern graphics hardware. The limits are becoming less a function of the hardware capabilities and more a function of the ability of engineers and scientists to exploit the functionality of this rapidly advancing hardware rendering technology.
As discussed in a previous paper to this forum, optical components such as collimators that are part of many infrared projection systems can lead to significant distortions in the sensed position of projected objects versus their true position. The previous paper discussed the removal of these distortions in a single waveband through a polynomial correction process. This correction was applied during post-processing of the data from the infrared camera-under-test. This paper extends the correction technique to two-color infrared projection. The extension of the technique allows the distortions in the individual bands to be corrected, as well as providing for alignment of the two color channels at the aperture of the camera-under-test. The co-alignment of the two color channels is obtained through the application of the distortion removal function to the object position data prior to object projection.
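A simplified version of such a polynomial correction can be sketched as follows (illustrative Python; the polynomial order and the synthetic distortion model are our assumptions, not the actual KHILS fit). A grid of known true positions and their sensed counterparts is used to fit a low-order 2-D polynomial mapping, which can then be applied to object position data before projection.

```python
import numpy as np

def poly_basis(x, y):
    """Third-order 2-D polynomial basis used for the distortion fit."""
    return np.stack([np.ones_like(x), x, y,
                     x**2, x*y, y**2,
                     x**3, x**2*y, x*y**2, y**3], axis=-1)

def fit_correction(sensed, true):
    """Least-squares fit of the sensed -> true position mapping."""
    A = poly_basis(sensed[:, 0], sensed[:, 1])
    cx, *_ = np.linalg.lstsq(A, true[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(A, true[:, 1], rcond=None)
    return cx, cy

def apply_correction(points, cx, cy):
    A = poly_basis(points[:, 0], points[:, 1])
    return np.stack([A @ cx, A @ cy], axis=-1)

# synthetic barrel-like distortion on a 9x9 calibration grid (normalized coords)
g = np.linspace(-1.0, 1.0, 9)
true = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)
r2 = (true ** 2).sum(axis=1, keepdims=True)
sensed = true * (1.0 + 0.02 * r2)          # illustrative distortion model
cx, cy = fit_correction(sensed, true)
residual = np.abs(apply_correction(sensed, cx, cy) - true).max()
```

The same fitted coefficients can pre-distort commanded object positions so the two color channels arrive co-aligned at the aperture of the camera-under-test.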
Although the sparse grid Non-Uniformity Correction (NUC) technique can accurately correct individual emitters on a resistor array, it is not a good solution for all projection applications. Due to busbar robbing, the sparse grid NUC breaks down when a large number of emitters are turned on simultaneously. For this case, a more appropriate NUC data collection method is needed. This method involves measuring the entire resistor array at once with a 1:1 mapping between the projector and NUC sensor. Then busbar effects, measured during the NUC data collection, can be accounted for and corrected. This paper presents details pertaining to the flood NUC technique and results. This NUC system is implemented at the Kinetic Kill Vehicle Hardware In the Loop Simulator (KHILS) at Eglin AFB, Florida.
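At its core, applying a flood-collected NUC amounts to inverting each emitter's measured response curve, so that the command sent to an emitter is the one that produces the desired output through that emitter's own transfer function. The sketch below (illustrative Python with a synthetic linear response; the real collection also folds in the measured busbar effects) shows the per-emitter inversion by interpolation:

```python
import numpy as np

def apply_nuc(desired, measured, commands):
    """Per-emitter inverse interpolation: for each emitter, find the command
    whose flood-measured output matches the desired output."""
    out = np.empty_like(desired)
    for i in range(desired.shape[0]):
        for j in range(desired.shape[1]):
            out[i, j] = np.interp(desired[i, j], measured[:, i, j], commands)
    return out

# synthetic flood data: each emitter has its own gain/offset (illustrative only)
rng = np.random.default_rng(0)
gain = rng.uniform(0.8, 1.2, (4, 4))
offset = rng.uniform(0.0, 0.1, (4, 4))
commands = np.linspace(0.0, 1.0, 11)
measured = gain * commands[:, None, None] + offset     # shape (11, 4, 4)

cmd = apply_nuc(np.full((4, 4), 0.5), measured, commands)
achieved = gain * cmd + offset     # every emitter now produces ~0.5
```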
The Kinetic Kill Vehicle Hardware-In-the-Loop Simulator, located at Eglin AFB, has developed the capability to perform broadband 2-color testing of guided missile seekers in both ambient and cryogenic environments. The 2-color capability is provided by optically combining two 512 X 512 resistor arrays and projecting through all-reflective optical systems. This capability has raised the following questions: 'How would a resistor array, designed to work at ambient conditions, perform when operated in a cryogenic environment?' and 'How would a resistor array that was non-uniformity corrected (NUC) at ambient conditions perform when the NUC is applied to the array in a cryogenic environment?' The authors will attempt to address these questions by performing several measurements on a Wideband Infrared Scene Projector (WISP) Phase III resistor array in both ambient and cryogenic conditions. The WISP array performance will be defined in terms of temporal response, spatial non-uniformity, radiometric and thermal resolution, and radiometric and thermal transfer function.
In the past year, Honeywell has developed a 512 X 512 snapshot scene projector containing pixels with very high radiance efficiency. The array can operate in both snapshot and raster modes. The array pixels have near-blackbody characteristics, high radiance outputs, broadband performance, and high speed. IR measurements and performance of these pixels will be described. In addition, a vacuum probe station, which makes it possible to select the best die for packaging and delivery based on wafer-level radiance screening, has been developed and is in operation. This system, as well as other improvements, will be described. Finally, a review of the status of the present projectors and plans for future arrays is included.
Phase 3 WISP arrays and BRITE arrays are currently being used extensively in many projection systems in many different facilities. These arrays have not been annealed at the factory, and previous tests with the arrays have revealed instabilities in the radiometric output when the arrays are driven at higher voltages. In some applications, the instabilities can be avoided by operating the arrays at lower voltages. In many KHILS applications, it is desirable to drive the arrays with the highest possible voltages to simulate hot missile targets. In one KHILS application (the KHILS VAcuum Cold Chamber, KVACC), the arrays are cooled to near cryogenic temperatures and then driven to high voltages. At lower substrate temperatures, the characteristic responses of the emitters change. Thus, it is important that the response and the stability of the radiometric output of the arrays be well understood for various substrate temperatures, and that the arrays either be annealed or operated below the voltage where the emitters begin to anneal. KHILS has investigated annealing procedures in the past, but there was concern that the annealing procedures themselves -- driving the arrays at high voltages for long times -- would damage the arrays. In order to understand the performance of the arrays better, and to reduce risks associated with driving the arrays at high voltages and operating the arrays at low substrate temperatures, a systematic measurement program was initiated. The radiometric output of new Phase 3 WISP arrays was accurately measured as a function of voltage and time. Arrays designated for testing were driven to the higher voltages and the radiometric output was measured for as long as two hours. Curves indicative of the annealing were observed, and it was determined that the maximum stable output without annealing was about 500 K (MWIR apparent temperature). Blocks of emitters were annealed and tested again. 
It was determined that stable output of as much as 680 K could be obtained with annealed emitters. KHILS personnel worked with Honeywell Technology Center (HTC) to establish annealing procedures that could be done by HTC in the future. Conclusions to date are that once the emitters are sufficiently annealed, their output does not change further with time, except for some small transient effects that will be discussed in the paper.
The third generation of the Wide-band Infrared Scene Projector (WISP) resistor arrays has been delivered to the Air Force Research Laboratory's Kinetic Kill Vehicle Hardware-in-the-Loop Simulation facility. A critical parameter in determining the extent to which the thermal arrays simulate the real world is the radiometric and thermal resolution. The measured resolution depends upon several factors, including the input data word resolution, drive electronics resolution, system noise factors, and the measurement sensor. Several measurements were made to quantify the noise components of the WISP array and the measurement sensor to determine the limiting factor for the measurements. Due to the nonlinear transfer function between the command voltage and the projected radiance, measurements were made at several input levels to determine how the resolution varies as a function of command voltage level. Measurements were performed both with and without the spatial non-uniformity correction (NUC) applied to determine the impact of the NUC on the radiometric resolution. Based on the results of these measurements, the resolution of the WISP arrays is defined in both radiometric and thermal units.
The Wideband Infrared Scene Projector (WISP) has been undergoing development for the Kinetic-Kill Vehicle Hardware-in-the-Loop Simulator facility at Eglin AFB, Florida. In order to perform realistic tests of an infrared seeker, the radiometric output of the WISP system must produce the same response in the seeker as the real scene. To ensure this radiometric realism, calibration procedures must be established and followed. This paper describes calibration procedures that have been used in recent tests. The procedures require knowledge of the camera spectral response in the seeker under test. The camera is set up to operate over the desired range of observable radiances. The camera is then nonuniformity corrected (NUCed) and calibrated with an extended blackbody. The camera drift rates are characterized and, as necessary, the camera is reNUCed and recalibrated. The camera is then set up to observe the WISP system, and calibration measurements are made of the camera/WISP system.
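In the simplest linear-response case, the extended-blackbody calibration step reduces to fitting a gain and offset relating in-band radiance to camera counts and then inverting that fit (illustrative Python; the numbers and the linearity assumption are synthetic, not measured KHILS data):

```python
import numpy as np

# synthetic extended-blackbody calibration points, generated from an assumed
# linear camera response counts = gain * L + offset (illustrative values only)
radiance = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # in-band radiance, W/m^2/sr
counts = 400.0 * radiance + 12.0                 # synthetic camera output

gain, offset = np.polyfit(radiance, counts, 1)   # recover the linear response

def counts_to_radiance(c):
    """Invert the calibration to report the radiance seen by the camera."""
    return (c - offset) / gain

L = counts_to_radiance(1612.0)   # -> 4.0 with this synthetic response
```

A real camera-under-test generally needs more calibration points, a nonlinearity check, and periodic refits to track the drift rates mentioned above.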
LADAR (Laser Detection and Ranging), as its name implies, uses laser-ranging technology to provide information regarding target and/or background signatures. When fielded in systems, LADAR can provide ranging information to on-board algorithms that in turn may utilize the information to analyze target type and range. Real-time closed-loop simulation of LADAR seekers in a hardware-in-the-loop (HWIL) facility can provide a nondestructive testing environment to evaluate a system's capability and thereby reduce program risk and cost. However, in LADAR systems many factors can influence the quality of the data obtained, and thus have a significant impact on algorithm performance. It is therefore important to take these factors into consideration when attempting to simulate LADAR data for digital or HWIL testing. The factors considered in this paper include weak or noisy detectors, multiple returns, and weapon body dynamics. Various computer techniques that may be employed to simulate these factors will be analyzed to determine their merit for use in real-time simulations.
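Two of the effects named above, weak or noisy detectors and multiple returns, are straightforward to emulate on an ideal range image. The sketch below (illustrative Python; the thresholds and noise levels are assumptions, not measured seeker parameters) drops returns below an intensity threshold, adds Gaussian range noise, and resolves multiple returns by keeping the nearest:

```python
import numpy as np

def degrade_ladar(range_img, intensity, rng, sigma_r=0.05, drop_thresh=0.1):
    """Add Gaussian range noise; flag weak returns as dropouts (NaN)."""
    noisy = range_img + rng.normal(0.0, sigma_r, range_img.shape)
    noisy[intensity < drop_thresh] = np.nan   # weak/noisy detector: no valid return
    return noisy

def first_return(ranges_per_pulse):
    """Multiple returns per pulse: keep the nearest (first) return."""
    return np.min(ranges_per_pulse, axis=-1)

rng = np.random.default_rng(1)
ideal = np.full((8, 8), 100.0)            # flat target at 100 m
inten = rng.uniform(0.0, 1.0, (8, 8))     # per-detector return strength
out = degrade_ladar(ideal, inten, rng)
multi = first_return(np.array([[101.2, 100.4], [99.8, 100.9]]))
```

For real-time use the same operations vectorize over the full frame, so cost stays constant per frame regardless of how many detectors drop out.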
For more than a decade, there has been considerable discussion about using different IR bands for the detection of low contrast military targets. Theory predicts that a target can have little to no contrast against the background in one IR band while having a discernible signature in another IR band. A significant amount of effort has been invested towards establishing hardware that is capable of simultaneously imaging in two IR bands to take advantage of this phenomenon. Focal plane arrays (FPA) are starting to materialize with this simultaneous two-color imaging capability. The Kinetic Kill Vehicle Hardware-in-the-loop Simulator (KHILS) team of the Air Force Research Laboratory and the Guided Weapons Evaluation Facility (GWEF), both at Eglin AFB, FL, have spent the last 10 years developing the ability to project dynamic IR scenes to imaging IR seekers. Through the Wideband Infrared Scene Projector (WISP) program, the capability to project two simultaneous IR scenes to a dual color seeker has been established at KHILS. WISP utilizes resistor arrays to produce the IR energy. Resistor arrays are not ideal blackbodies. The projection of two IR colors with resistor arrays, therefore, requires two optically coupled arrays. This paper documents the first demonstration of two-color simultaneous projection at KHILS. Agema cameras were used for the measurements. The Agema's HgCdTe detector has responsivity from 4 to 14 microns. A blackbody and two IR filters (MWIR = 4.2 to 7.4 microns, LWIR = 7.7 to 13 microns) were used to calibrate the Agema in two bands. Each filter was placed in front of the blackbody one at a time, and the temperature of the blackbody was stepped up in incremental amounts. The output counts from the Agema were recorded at each temperature. This calibration process established the radiance to Agema output count curves for the two bands. The WISP optical system utilizes a dichroic beam combiner to optically couple the two resistor arrays.
The transmission path of the beam combiner provided the LWIR (6.75 to 12 microns), while the reflective path produced the MWIR (3 to 6.5 microns). Each resistor array was individually projected into the Agema through the beam combiner at incremental output levels. Once again, the Agema's output counts were recorded at each resistor array output level. These projections established the resistor array output to Agema count curves for the MWIR and LWIR resistor arrays. Using the radiance to Agema counts curves, the MWIR and LWIR resistor array output to radiance curves were established. With the calibration curves established, a two-color movie was projected and compared to the generated movie radiance values. By taking care to correctly account for the spectral qualities of the Agema camera, the calibration filters, and the dichroic beam combiner, the projections matched the theoretical calculations. In the near future, a Lockheed Martin Multiple Quantum Well camera with true two-color IR capability will be tested.
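The curve-chaining at the heart of this calibration can be sketched directly (illustrative Python with synthetic monotonic curves; the functional forms are placeholders, not WISP measurements): the blackbody calibration supplies radiance-to-counts, the array projection supplies command-to-counts, and inverse interpolation through the first curve yields command-to-radiance for each band.

```python
import numpy as np

# (1) blackbody calibration curve: in-band radiance -> camera counts (synthetic)
cal_radiance = np.linspace(0.1, 10.0, 50)
cal_counts = 300.0 * cal_radiance ** 0.9 + 50.0

# (2) array projection curve: resistor-array command level -> camera counts (synthetic)
cmd_levels = np.linspace(0.0, 1.0, 20)
cmd_counts = 90.0 + 2300.0 * cmd_levels ** 2

# chain the curves: command -> counts -> radiance, inverting curve (1) by interpolation
cmd_radiance = np.interp(cmd_counts, cal_counts, cal_radiance)

def radiance_for_command(c):
    """Radiance the array is expected to project at command level c."""
    return np.interp(c, cmd_levels, cmd_radiance)
```

This is done once per band; comparing a projected movie against its generated radiance values then amounts to evaluating `radiance_for_command` on the commanded frames.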
An optical signal injector (OSI) system has been developed for use in the hardware-in-the-loop (HWIL) testing of laser radar (LADAR) seekers. The OSI, in conjunction with a scene generator, generates optical signals simulating the return signals of a LADAR seeker and delivers them to a Unit Under Test. The signals produced by the OSI represent range and intensity (reflectivity) data of a target scene from a given HWIL scenario. The OSI has a modular architecture to allow for easy modification (e.g., operating wavelength, number of optical channels) and is primarily composed of commercial off-the-shelf components to improve reliability and reduce cost. Presented here is a description of the OSI and its capabilities.