Multispectral imaging is a useful tool for planetary scientists only if the sensor is sufficiently sensitive to address the scientific questions of interest. In this paper, we demonstrate a quantitative relationship between spectroscopic imaging sensor noise and geologic interpretation of the planetary surface being imaged. By linking surface properties (e.g., chemistry, mineralogy, particle size) to spectra using radiative transfer theory, we determine the relationship between sensor noise and the surface properties that dictate the geologic interpretation of the surface. This relationship can be applied to both 1) past mission data with known sensor performance, to determine uncertainty in the scientific interpretation of the data, and 2) future mission planning of signal-to-noise requirements to meet specific scientific goals. We use past (NASA’s Clementine), present (ESA’s SMART-1), and future (JAXA’s SELENE) lunar missions as explicit examples.
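The sketch below is not the radiative transfer treatment used in the paper; it is a minimal Monte Carlo illustration, with invented spectrum values, of how a given sensor SNR translates into uncertainty on a simple spectral parameter (a 1 µm band depth) on which a geologic interpretation might rest.

```python
# Illustrative sketch (not the paper's radiative-transfer model): propagate
# sensor noise into uncertainty on a 1-um band depth, a proxy for how finely
# mineral abundance or particle size could be distinguished at a given SNR.
import numpy as np

rng = np.random.default_rng(0)

def band_depth(reflectance, i_shoulder1, i_center, i_shoulder2):
    """Band depth = 1 - R_center / continuum, with a linear continuum."""
    continuum = 0.5 * (reflectance[i_shoulder1] + reflectance[i_shoulder2])
    return 1.0 - reflectance[i_center] / continuum

# Hypothetical noiseless spectrum with a 1-um absorption (indices are arbitrary).
true_spectrum = np.array([0.30, 0.28, 0.22, 0.27, 0.31])
i1, ic, i2 = 0, 2, 4
true_depth = band_depth(true_spectrum, i1, ic, i2)

for snr in (50, 100, 200, 400):          # candidate sensor SNR requirements
    sigma = true_spectrum / snr           # per-band noise for this SNR
    trials = true_spectrum + rng.normal(0.0, sigma, size=(10000, true_spectrum.size))
    depths = np.array([band_depth(t, i1, ic, i2) for t in trials])
    print(f"SNR {snr:4d}: band depth {true_depth:.3f} +/- {depths.std():.4f}")
```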
Unmixing hyperspectral images inherently transfers error from the original hyperspectral image to the unmixed fraction plane image. In essence, by reducing the entire information content of an image down to a handful of representative spectra, a significant amount of information is lost. In an image with low spectral diversity that obeys the linear mixture model (such as a simple geologic scene), this loss is negligible. However, inherent problems arise in unmixing a hyperspectral image when the actual number of spectrally distinct items in the image exceeds the resolving ability of an unmixing algorithm given sensor noise. This process is demonstrated here with a simple statistical analysis. Stepwise unmixing, in which a subset of end-members is used to unmix each pixel, provides a means of mitigating this error. The simplest case of stepwise unmixing, constrained unmixing, is statistically examined here. This approach provides a significant reduction in unmixed image error with a corresponding increase in goodness of fit. Some suggestions for future algorithms are presented.
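As a hedged illustration of the ideas above (not the paper's statistical analysis), the sketch below unmixes a synthetic pixel under the linear mixture model with the full end-member set and then with every two-end-member subset, keeping the subset with the smallest residual; all spectra and noise levels are invented.

```python
# Minimal sketch of the linear mixture model and a subset-constrained unmixing
# step, assuming synthetic endmembers; not the statistical analysis from the paper.
import numpy as np
from scipy.optimize import nnls
from itertools import combinations

rng = np.random.default_rng(1)
n_bands, n_end = 50, 4
E = rng.uniform(0.1, 0.9, size=(n_bands, n_end))     # endmember spectra (columns)

# A pixel that actually contains only two of the four endmembers, plus noise.
f_true = np.array([0.7, 0.3, 0.0, 0.0])
pixel = E @ f_true + rng.normal(0.0, 0.005, n_bands)

# Full unmixing: all endmembers offered to every pixel.
f_full, res_full = nnls(E, pixel)

# Stepwise-style unmixing: try every 2-endmember subset, keep the best-fitting one.
best = min(
    (nnls(E[:, list(s)], pixel) + (s,) for s in combinations(range(n_end), 2)),
    key=lambda t: t[1],                               # smallest residual norm
)
f_sub, res_sub, subset = best
print("full-set fractions:", np.round(f_full, 3), "residual:", round(res_full, 4))
print("best subset", subset, "fractions:", np.round(f_sub, 3), "residual:", round(res_sub, 4))
```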
Hyperspectral images can be conveniently and quickly interpreted by detecting spectral endmembers present in the image and unmixing the image in terms of those endmembers. However, the spectral diversity common in hyperspectral images leads to high errors in the unmixing process by increasing the likelihood that spectral anomalies will be detected as endmembers. We have developed an algorithm to detect target-like spectral anomalies in the image that are likely to interfere detrimentally with the endmember detection process. The hyperspectral image is preprocessed by detecting target-like spectra and masking them from the subsequent endmember detection analysis. By partitioning target-like spectra out of the scene, a set of spectral endmembers is detected that can be used to unmix the image more accurately. The vast majority of data in the original image can be interpreted in terms of these detected spectral endmembers. The few spectra that represent the bulk of the spectral diversity in the scene can then be interpreted individually.
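The sketch below is an assumed stand-in for the anomaly-masking step: an RX-style Mahalanobis detector flags target-like spectra and masks them before endmember detection. The paper's actual detector and threshold are not specified here, and the scene is synthetic.

```python
# Hedged sketch: flag target-like (anomalous) spectra with an RX-style detector
# and mask them before endmember detection; the paper's actual detector and
# threshold are not reproduced here.
import numpy as np

def rx_scores(cube_2d):
    """Mahalanobis distance of each spectrum from the scene mean (RX detector)."""
    mu = cube_2d.mean(axis=0)
    cov = np.cov(cube_2d, rowvar=False)
    inv = np.linalg.pinv(cov)
    d = cube_2d - mu
    return np.einsum("ij,jk,ik->i", d, inv, d)

rng = np.random.default_rng(2)
background = rng.normal(0.4, 0.02, size=(995, 30))    # spectrally bland background
targets = rng.normal(0.8, 0.02, size=(5, 30))         # a few target-like spectra
scene = np.vstack([background, targets])

scores = rx_scores(scene)
mask = scores > np.percentile(scores, 99)             # assumed 1% anomaly budget
print("pixels masked as target-like:", int(mask.sum()))
background_only = scene[~mask]                         # passed on to endmember detection
```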
The University of Hawaii's Efficient Materials Mapping program aims to automatically and rapidly produce material maps from hyperspectral scenes. The program combines an end-member determination algorithm and a material identification algorithm to produce context maps in real time without user intervention. The material identification algorithm is a combination of a spectral database and analytic code; each spectrum in the library is augmented with computer-readable diagnostic instructions. At present, the material library consists of over three hundred different spectra, generally geological materials from the USGS digital spectral library; however, selected spectra from other libraries have been incorporated. Our method has been applied to an AVIRIS scene taken over Kaneohe Bay, Hawaii. This scene contains large expanses of ocean as well as developed and undeveloped land, thus providing a good test bed for the program. The results of applying this methodology were verified by ground truth where possible by a team equipped with a handheld spectrometer. Algorithm-derived archetypical end-member locations were matched well by the material identification database; however, the end-member determination itself operated sub-optimally on this scene. These results will guide progress with respect to the continued development of this program.
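As a simplified illustration of the identification step (the actual program attaches computer-readable diagnostic instructions to each library spectrum, which are not shown here), the sketch below matches a measured spectrum against a small, invented library using the spectral angle.

```python
# Illustrative sketch only: matches a pixel spectrum to the closest entry of a
# tiny hypothetical library by spectral angle. Library values are made up and
# are not USGS data.
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectra; smaller means a closer match."""
    return np.arccos(np.clip(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)), -1.0, 1.0))

library = {                                   # hypothetical entries for illustration
    "seawater":   np.array([0.02, 0.03, 0.02, 0.01, 0.01]),
    "vegetation": np.array([0.05, 0.08, 0.06, 0.45, 0.50]),
    "basalt":     np.array([0.06, 0.07, 0.08, 0.09, 0.10]),
}

measured = np.array([0.05, 0.09, 0.07, 0.43, 0.48])   # unknown pixel spectrum
best = min(library, key=lambda name: spectral_angle(measured, library[name]))
print("best library match:", best)
```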
The MUlti Sensor Trial 2000 experiment was a multi-platform remote sensing deployment in Cairns, Australia. Included in the deployment were both visible and infrared airborne hyperspectral imagers. The University of Hawaii's Airborne Hyperspectral Imager represented the thermal infrared portion of the data collect. The ability to discriminate various targets using the thermal infrared was explored. Subsequent data processing involved separating targets from clutter using matched filters. In addition, a preliminary atmospheric correction algorithm was developed based on the ISIS algorithm used in SEBASS.
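A minimal sketch of the matched-filter step described above, assuming the standard covariance-whitened formulation; the trial's actual filter settings and the ISIS-based atmospheric correction are not reproduced, and the scene here is synthetic.

```python
# Minimal matched-filter sketch (assumed formulation): score each pixel against
# a known target spectrum, whitened by the background covariance, so targets
# stand out from clutter.
import numpy as np

def matched_filter(cube_2d, target):
    mu = cube_2d.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(cube_2d, rowvar=False))
    w = cov_inv @ (target - mu)
    w /= (target - mu) @ cov_inv @ (target - mu)       # unit response at the target
    return (cube_2d - mu) @ w

rng = np.random.default_rng(3)
clutter = rng.normal(0.3, 0.03, size=(2000, 40))
target_spectrum = np.full(40, 0.3) + 0.1 * np.sin(np.linspace(0, np.pi, 40))
scene = np.vstack([clutter, target_spectrum + rng.normal(0, 0.03, 40)])

scores = matched_filter(scene, target_spectrum)
print("top-scoring pixel index:", int(scores.argmax()))  # expected: the injected target
```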
The AHI sensor consists of a long-wave infrared pushbroom hyperspectral imager and a boresighted 3-color visible high-resolution CCD linescan camera. The system used a background suppression system to achieve good noise characteristics (less than 1 µflick NESR). Work with AHI has shown the utility of the long-wave infrared in a variety of applications. The AHI system has been used successfully in the detection of buried land mines using infrared absorption features of disturbed soil. Recently, the AHI has been used to examine the feasibility of active and passive hyperspectral imaging under outdoor and laboratory conditions at three ranges. In addition, the AHI was flown over a coral reef ecosystem on the Hawaiian island of Molokai to study fresh water intrusion into coral reef ecosystems. Theoretical calculations have been done to propose extensions to the AHI design in order to produce an instrument with a higher signal-to-noise ratio.
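As a back-of-envelope companion to the quoted noise figure, the sketch below converts a 1 µflick NESR into an approximate SNR against a 300 K blackbody scene at a few LWIR wavelengths; the scene temperature and wavelengths are assumptions for illustration, not AHI design values.

```python
# Back-of-envelope sketch: convert the quoted NESR (~1 microflick) into an SNR
# against a 300 K blackbody scene in the LWIR. Scene temperature and wavelengths
# are assumptions, not AHI design numbers.
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck, light speed, Boltzmann (SI)

def radiance_microflicks(wavelength_um, temp_k):
    """Blackbody spectral radiance in microflicks (uW / cm^2 / sr / um)."""
    lam = wavelength_um * 1e-6
    b = (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * temp_k))  # W/(m^2 sr m)
    return b * 1e-6 / 1e4 * 1e6   # per um, per cm^2, then to micro-units

nesr_microflicks = 1.0                       # quoted noise-equivalent spectral radiance
for wl in (8.0, 10.0, 12.0):                 # representative LWIR wavelengths
    snr = radiance_microflicks(wl, 300.0) / nesr_microflicks
    print(f"{wl:4.1f} um: SNR ~ {snr:.0f}")
```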
Hyperspectral data rates and volumes challenge analysis approaches that are not highly automated and efficient. Derived products from hyperspectral data, presented in physically meaningful units, offer added value to analysts who are not spectral or statistical experts. The Efficient Materials Mapping project involves developing an approach that is efficient in terms of both processing time and analyzed data volume and that produces outputs in terms of surface chemical or material composition. Our approach will exploit the redundancy typically inherent in hyperspectral data of natural scenes to reduce data volume. This data volume reduction is combined with an automated approach to extract chemical information from spectral data. The result will be a method to produce maps of chemical quantities that can be readily interpreted by analysts specializing in characteristics of terrains and targets rather than photons and spectra.
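The sketch below is only a hedged illustration of the redundancy argument: a synthetic scene built from a few underlying spectra is nearly low rank, so a handful of principal components capture almost all of its variance. The project's actual reduction method is not stated in the abstract.

```python
# Hedged sketch of the redundancy idea: natural hyperspectral scenes are close
# to low rank, so a few principal components carry most of the variance.
import numpy as np

rng = np.random.default_rng(4)
# Synthetic scene: 5000 pixels, 100 bands, built from only 3 underlying spectra.
basis = rng.uniform(0.1, 0.9, size=(3, 100))
abundances = rng.dirichlet(np.ones(3), size=5000)
cube = abundances @ basis + rng.normal(0.0, 0.002, size=(5000, 100))

centered = cube - cube.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
var = s**2 / (s**2).sum()
print("variance captured by first 3 components: %.4f" % var[:3].sum())
# Keeping ~3 components instead of 100 bands is the kind of volume reduction meant.
```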