Modeling a space-borne imaging system is key to predicting mission utility and exploring the sensor design trade space. It is important to capture critical real-world phenomena in the model as accurately as possible to optimize design parameters. As a step toward optimal design of spectral imaging systems, this work presents simulation techniques used to model a panchromatic imaging system and predict well-known image quality metrics over a range of values of a key optical design variable: the effective focal length (EFL). We designed a flat desert scene that included 7% and 15% reflectance panels and generated simulated images for a range of EFLs. The panels were used to calculate the sensor signal-to-noise ratio (SNR). The simulation incorporated a summer atmosphere with a collection time and geometry set to produce zenith solar and nadir collection angles. Platform motion and altitude, with a given integration time for known detector parameters, were also incorporated to produce images as the EFL changed. A point spread function (PSF) of a typical optical system, scaled according to the EFL for a constant aperture diameter, was incorporated to capture the change in optical resolution. The PSF, together with smear from the platform motion and integration time, introduced realistic image blur, enabling the relative edge response (RER) of the system to be estimated from the simulated images. The simulation used the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model, which incorporates ray-tracing techniques and physics-based radiation propagation modules. The SNR, RER, and ground sample distance (GSD) calculated from the synthetic images were used to estimate the National Imagery Interpretability Rating Scale (NIIRS) rating for a range of EFLs. This process, which accounted for the compensating effects of resolution, sampling, and noise as the EFL changed, yielded an optimum EFL.
This technique is currently being expanded to assess trades for sensor design in order to optimize optical payload designs for multispectral and hyperspectral imaging systems.
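The mapping from SNR, RER, and GSD to a NIIRS rating described above is commonly performed with the General Image Quality Equation (GIQE). The sketch below uses the published GIQE-4 form; the detector pitch, altitude, and the (EFL, RER, SNR) triples in the scan are hypothetical placeholders, not values from this study, and illustrate only how an interior optimum EFL can emerge from the competing effects of sampling, blur, and noise.

```python
import math

def giqe4_niirs(gsd_in, rer, snr, height_overshoot=1.0, noise_gain=1.0):
    """GIQE-4 NIIRS estimate.

    gsd_in: geometric-mean ground sample distance in inches
    rer: geometric-mean relative edge response
    snr: signal-to-noise ratio
    height_overshoot (H) and noise_gain (G) characterize edge overshoot
    and MTF-compensation noise gain (defaults are placeholders).
    """
    a, b = (3.32, 1.559) if rer >= 0.9 else (3.16, 2.817)
    return (10.251 - a * math.log10(gsd_in) + b * math.log10(rer)
            - 0.656 * height_overshoot - 0.344 * noise_gain / snr)

def gsd_inches(efl_m, pitch_um=10.0, altitude_km=500.0):
    """Nadir GSD for a hypothetical detector pitch and platform altitude."""
    return altitude_km * 1e3 * pitch_um * 1e-6 / efl_m / 0.0254

# Hypothetical trade scan: a longer EFL shrinks GSD but, at fixed aperture,
# the diffraction PSF spans more pixels (lower RER) and the SNR drops.
candidates = [(1.0, 0.95, 120.0),   # (EFL m, RER, SNR) -- assumed values
              (2.0, 0.80, 60.0),
              (4.0, 0.30, 15.0)]
best = max((giqe4_niirs(gsd_inches(efl), rer, snr), efl)
           for efl, rer, snr in candidates)
print(best)  # (best NIIRS, optimum EFL) -- here the middle EFL wins
```

The point of the scan is that NIIRS is not monotonic in EFL: past some focal length, the RER and SNR penalties outweigh the GSD gain, which is the compensating behavior the simulation was built to capture.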
Typically a regression approach is applied to identify the constituents present in a hyperspectral image,
and the task of species identification amounts to choosing the best regression model. Common model selection
approaches (stepwise and criterion-based methods) have well-known multiple-comparisons problems; they do
not allow the user to control the experiment-wise error rate, nor to include scene-specific knowledge
in the inference process.
A Bayesian model selection technique called Gibbs Variable Selection (GVS) that better handles these issues is
presented and implemented via Markov chain Monte Carlo (MCMC). GVS can be used to simultaneously conduct
inference on the optical path depth and the probability of inclusion in a pixel for each species in a library. This
method flexibly accommodates an analyst's prior knowledge of the species present in a scene, as well as mixtures
of species of arbitrary complexity. A series of automated diagnostic measures is developed to monitor
convergence of the Markov chains without operator intervention. The method is compared against traditional
regression approaches to model selection, and results from LWIR data collected by the Airborne Hyperspectral Imager
(AHI) are presented. Finally, the applicability of this identification framework to scenarios such as
persistent surveillance is discussed.
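The indicator-variable Gibbs update at the heart of GVS can be illustrated on a toy linear mixing problem. The sketch below is not the paper's implementation: it uses a collapsed update in which each coefficient is integrated out analytically (a close relative of GVS's pseudo-prior construction), and the library size, spectra, noise level, and priors are all made up for the demonstration. Each sweep samples, for every library member, an inclusion indicator from its conditional posterior; the fraction of post-burn-in sweeps in which a species is included estimates its posterior probability of presence.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "library" and mixed pixel (assumed, for illustration only):
# columns of X are library spectra; the pixel truly contains species 0 and 2.
n_bands, n_species = 100, 6
X = rng.normal(size=(n_bands, n_species))
true_beta = np.array([1.0, 0.0, 0.8, 0.0, 0.0, 0.0])
y = X @ true_beta + rng.normal(scale=0.05, size=n_bands)

sigma2, tau2, prior_p = 0.05**2, 1.0, 0.5   # noise var, slab var, prior incl. prob.
beta = np.zeros(n_species)                  # abundance-like coefficients
gamma = np.zeros(n_species, dtype=bool)     # inclusion indicators
incl_counts = np.zeros(n_species)
n_sweeps, burn = 1000, 200

for sweep in range(n_sweeps):
    for j in range(n_species):
        xj = X[:, j]
        # Residual with species j's current contribution removed.
        r = y - X @ (gamma * beta) + (beta[j] * xj if gamma[j] else 0.0)
        v = 1.0 / (xj @ xj / sigma2 + 1.0 / tau2)
        mu = v * (xj @ r) / sigma2
        # Collapsed log Bayes factor for gamma_j = 1 vs 0 (beta_j integrated out).
        log_bf = 0.5 * np.log(v / tau2) + 0.5 * mu**2 / v
        prob = 1.0 / (1.0 + (1 - prior_p) / prior_p * np.exp(-log_bf))
        gamma[j] = rng.random() < prob
        beta[j] = rng.normal(mu, np.sqrt(v)) if gamma[j] else 0.0
    if sweep >= burn:
        incl_counts += gamma

incl_prob = incl_counts / (n_sweeps - burn)
print(np.round(incl_prob, 2))  # high for species 0 and 2, low elsewhere
```

In the full GVS setting described above, the analyst's prior knowledge enters through per-species `prior_p` values, and the chain's convergence would be monitored by automated diagnostics rather than a fixed sweep count.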
We describe a new approach for performing pseudo-imaging of point energy sources from spectral-temporal sensor data collected using a rotating-prism spectrometer. Pseudo-imaging, which involves the automatic localization, spectrum estimation, and identification of energetic sources, can be difficult for dim sources, for noisy images, for data containing multiple sources spaced so closely that their signatures overlap, or when sources move during data collection. The new approach is specifically designed for these difficult cases. It is developed within an iterative, maximum-entropy framework that incorporates an efficient optimization over the space of all model parameters and all mappings between image pixels and sources or clutter. The optimized set of parameters is then used for detection, localization, tracking, and identification of the multiple sources in the data. The paper includes results computed from experimental data.
Optical sensors aboard space vehicles designed to perform seeker functions need to generate multispectral images in the mid-wave infrared (MWIR) and long-wave infrared (LWIR) spectral regions in order to investigate and classify man-made space objects and to distinguish them from the interfering scene clutter. The spectral imager part of the sensor collects spectral signatures of the observed objects to extract information on surface emissivity and target temperature, both important parameters for object-discrimination algorithms. The Adaptive Spectral Imager described in this paper fulfills two functions simultaneously: one output produces instantaneous two-dimensional polychromatic imagery for object acquisition and tracking, while the other produces multispectral images for object discrimination and classification. The spectral and temporal resolution of the data produced by the spectral imager are adjustable in real time, making it possible to achieve an optimum tradeoff between different sensing functions and to match dynamic monitoring requirements during a mission. The system has high optical collection efficiency, with output data rates limited only by the readout speed of the detector array. The instrument has no macro-scale moving parts and can be built in a robust, small-volume, lightweight package suitable for integration with space vehicles. The technology is also applicable to multispectral imaging in diverse areas such as surveillance, agriculture, process control, and biomedical imaging, and can be adapted for use in any spectral domain from the ultraviolet (UV) to the LWIR region.