Image Intensifier Tube (IIT) technology remains a critical component of the warfighter's arsenal. However, even after
six decades of fielded systems, most IIT inspections still rely on human judgment and round-robin
calibration techniques. We report on the Automated Intensifier Measurement System (AIMS), a NIST-traceable,
calibratable, machine vision system developed to produce automated, quantifiable, reproducible results on eight of the
major IIT inspections: (1) Useful Diameter, (2) Modulation Transfer Function, (3) Gross Distortion, (4) Shear
Distortion, (5) Bright Spot, (6) Dark Spot, (7) Gain, and (8) Uniformity. The overall architecture of the system and a
description of the algorithms required for each test are presented. Translation from the anthropocentric MIL-PRF-A3256363D(CR) OMNI VII Military Specification to measurable quantities (with appropriate uncertainties) is
described. The NIST-traceable system uncertainties associated with each measurement are reported; in all cases AIMS
measures the quantities associated with the above tests to greater precision than current industry practice. Issues with the
current industry standard equipment and testing methods are also identified. Future work, which will include additional
inspections, is discussed.
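The MTF inspection in particular lends itself to automation. As a rough sketch of the standard edge-based approach (a generic method, not necessarily the AIMS algorithm; the function name `mtf_from_edge` and the synthetic edge below are illustrative assumptions), the edge spread function is differentiated to obtain the line spread function, whose Fourier magnitude, normalized at zero frequency, gives the MTF:

```python
import numpy as np

def mtf_from_edge(esf):
    """Estimate an MTF curve from a 1-D edge spread function (ESF).

    Differentiating the ESF yields the line spread function (LSF);
    the magnitude of its Fourier transform, normalized to unity at
    zero frequency, is the MTF.
    """
    lsf = np.diff(esf)                  # ESF -> LSF
    lsf *= np.hanning(lsf.size)         # taper to suppress edge noise
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]                 # normalize: MTF(0) = 1

# Synthetic soft edge standing in for an imaged edge target
x = np.linspace(-5.0, 5.0, 256)
esf = 0.5 * (1.0 + np.tanh(x))
mtf = mtf_from_edge(esf)                # unity at DC, rolls off with frequency
```

In practice the edge would be extracted from the intensified image and oversampled before this step; the sketch shows only the core ESF-to-MTF computation.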
The Image Intensifier Tube (IIT) is the most critical component within a night vision device. Acquisition, production, test and evaluation of image intensifier tubes can be greatly enhanced by the application of machine vision technology. The Navy, Air Force and Army have invested over $2,000,000 in the development of a machine vision-based test set known as the Automated Intensifier Measurement System (AIMS). This paper describes the methodologies employed in the AIMS to measure Modulation Transfer Function (MTF), Dark Spots, Bright Spots, Shear Distortion, and Gross Distortion.
In this paper, we present the Video Flashlight System and 3D Visualization Display for providing total battlefield situation awareness by integrating a blanket of ground and aerial video cameras and UGS data within a 3D model of the site. The system enables visualization of an integrated view of a scene, combining video and sensor data from multiple cameras and UGS. Users can move seamlessly in space -- monitoring a site from an aerial view, then flying down to examine suspicious activity up close. The system detects moving objects from all cameras and provides an integrated view of all motion in the monitored zones. Users can click on a moving object to get a zoomed-in view or updated data from the sensor. The aerial and ground videos are geo-registered to a world coordinate system, and GPS-located UGS data are correctly positioned on the 3D display. Finally, the system can be used to track multiple objects from camera to camera, to make measurements such as velocity, and to fuse data from other emplaced sensors into the displayed scene.
It is extremely difficult and expensive to determine the flight attitude and aimpoint of maneuvering miniature air vehicles from ground-based fixed or tracking photography. Telemetry alone cannot provide sufficient information bandwidth on 'what' the ground tracking is seeing and consequently 'why' it did or did not function properly. Additionally, it is anticipated that 'smart' and 'brilliant' guided vehicles now in development will require a high-resolution imaging support system to determine which target and which part of a ground feature is being used for navigation or targeting. Other requirements include support of sub-component separation from developmental supersonic vehicles, where clean separation from the container is not determinable from ground-based film systems and film cameras do not survive vehicle breakup and impact. Hence, the requirement is to develop and demonstrate an imaging support system for development and testing that can provide the flight vehicle developer/analyst with imagery (combined with miniature telemetry sources) sufficient to recreate the trajectory, terminal navigation, and flight-termination events. This project is a development and demonstration of a real-time, launch-rated, shuttered electronic imager, transmitter, and analysis system. This effort demonstrated boresighted imagery from inside small flight vehicles for post-flight analysis of trajectory, and capture of ground imagery during randomly triggered vehicle functions. The initial studies for this capability were accomplished by the Experimental Dynamics Section of the Air Force Wright Laboratory, Armament Directorate, Eglin AFB, Florida, and the Telemetry Support Branch of the Army Material Research and Development Center at Picatinny Arsenal, New Jersey.
It has been determined that at a 1/10,000-second exposure time, new ultra-miniature CCD sensors have sufficient sensitivity to image key ground target features without blur, thereby providing data for trajectory, timing, and advanced sensor development. This system will be used for ground-tracking data reduction in support of small air vehicle and munition testing. It will provide a means of integrating the imagery and telemetry data from the item with ground-based photographic support. The technique we have designed will exploit off-the-shelf software and analysis components. A differential GPS survey instrument will establish a photogrammetric calibration grid throughout the range and reference targets along the flight path. Images from the on-board sensor will be used to calibrate the ortho-rectification model in the analysis software. The projectile images will be transmitted and recorded on several tape recorders to ensure complete capture of each video field. The images will be combined with a non-linear video editor into a time-correlated record. Each correlated video field will be written to video disk. The files will be converted to a DMA-compatible format and then analyzed to determine the projectile's altitude, attitude, and position in space. The resulting data file will be used to create a photomosaic of the ground the projectile flew over and the targets it saw. The data will then be transformed to a trajectory file and used to generate a graphic overlay that merges digital photo data of the range with the actual images captured. The plan is to superimpose the flight path of the projectile, the path of the weapon's aimpoint, and annotation of each internal sequence event. With the tools used to produce state-of-the-art computer graphics, we now think it will be possible to reconstruct the test event from the viewpoint of the warhead, the target, and a 'God's-Eye' view looking over the shoulder of the projectile.
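At its core, the photogrammetric step above relates surveyed world coordinates to pixel coordinates in the on-board imagery. A minimal pinhole-camera sketch (illustrative only; the function name `project` and all numeric values are assumptions, not the fielded system's calibration model):

```python
import numpy as np

def project(point_world, cam_pos, R, f, cx, cy):
    """Project a surveyed 3-D world point into pixel coordinates
    using a simple pinhole model (no lens distortion).

    R rotates world-frame vectors into the camera frame; f is the
    focal length in pixels; (cx, cy) is the principal point.
    """
    p = R @ (np.asarray(point_world) - cam_pos)  # world -> camera frame
    return np.array([f * p[0] / p[2] + cx,       # perspective divide
                     f * p[1] / p[2] + cy])

# A reference target 10 m ahead of a camera at the origin, looking +Z:
# offset 1 m to the right maps to 100 pixels right of the principal point
pix = project([1.0, 0.0, 10.0], np.zeros(3), np.eye(3), 1000.0, 320.0, 240.0)
```

Given enough surveyed reference targets, the same relation is inverted to solve for the camera's position and attitude, which is the basis of the trajectory reconstruction described above.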
Objectivity, measurement accuracy, and repeatability are compromised whenever the human vision system is involved in assessing the performance of optical and electro-optical components. One example of this is found in the inspection of image intensifier tubes (IITs). This paper discusses the use of an automated intensifier measurement system (AIMS) that was developed to overcome the drawbacks of the subjective evaluation methods currently in use. The AIMS offers significant improvement over visual inspection techniques typically performed by a human operator. The AIMS quantifies an IIT's performance in a number of categories including resolution, geometric image distortion, output brightness uniformity (opaque and bright spots), and image format; measurement accuracy is traceable to NIST. Innovative image-processing techniques allow precise characterization of the entire IIT using an off-the-shelf CCD that can capture the complete intensified image without employing scanning or magnifying techniques. The system also performs a self-test to ensure correct setup. Because all of the necessary setup adjustments and measurements are performed under software control, subjectivity is eliminated for the first time and fully objective measurements can be made. System performance results are discussed.
Combining electro-optic tunable bandpass filters with solid-state cameras for multispectral imaging in remote-sensing applications is appealing, and has been attempted at the prototype level by a number of organizations. Several system design issues must be carefully considered for a commercial or military system that is both quantitative and versatile. This paper addresses selected electro-optical design issues for an imaging spectrometer based on a tunable filter, including selection of filter parameters, selection of the imaging sensor, optical configuration, and calibration.
Experimental results are reported that measure the variation of SNR performance with input image size for selected Generation 2 microchannel-plate intensified CCDs (ICCDs). The evidence shows that SNR performance can begin to decrease when the image size drops below 300 microns. A generalized theoretical model for ICCD SNR performance is also presented.
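For context, a common textbook form of the intensified-CCD SNR budget (a generic model with assumed parameter names, not the paper's specific formulation) combines photocathode shot noise, MCP excess noise, and CCD read noise:

```python
import math

def iccd_snr(n_pe, gain, noise_factor, read_noise_e):
    """Generic shot-noise-plus-read-noise SNR model for an ICCD.

    n_pe         -- photoelectrons per pixel per frame at the photocathode
    gain         -- mean electron gain through the MCP/phosphor chain
    noise_factor -- MCP excess-noise factor (>= 1)
    read_noise_e -- CCD read noise, electrons RMS
    """
    signal = gain * n_pe
    noise = math.sqrt((gain * noise_factor) ** 2 * n_pe + read_noise_e ** 2)
    return signal / noise

# At high gain the read noise is negligible and SNR -> sqrt(n_pe) / noise_factor
high_gain = iccd_snr(100, 1e4, 2.0, 50.0)   # ~ sqrt(100) / 2 = 5.0
low_gain = iccd_snr(100, 1.0, 1.0, 50.0)    # read-noise limited, well below 5
```

In this form the SNR depends on the photoelectron count per pixel, which is one plausible route by which shrinking the input image below a few hundred microns could degrade measured SNR.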