Realistic matched filter performance prediction for hyperspectral target detection
Abstract
The linear matched filter is widely used in hyperspectral target detection applications. Its detection performance is typically evaluated using Gaussian (normal) probability density models. However, it is well known that hyperspectral backgrounds exhibit "long-tail" behavior that normal distributions cannot model accurately. In this work, we model the distribution of hyperspectral backgrounds with a more accurate model based on elliptically contoured multivariate t distributions. For the target class we retain a normal model, because for detection applications it is the body of the target distribution that matters. We show that the heavy tails of the background distribution degrade matched filter performance. Consequently, performance predictions based exclusively on normal models, which ignore the long tails, can be inaccurate and overly optimistic. The proposed technique, which allows easy control of the heaviness of the tails, is a useful tool for evaluating target detection performance against backgrounds with both light and heavy tails.
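The following is a minimal illustrative sketch, not the paper's code: it applies a linear matched filter to synthetic background data drawn from a Gaussian model and from an elliptically contoured multivariate t model, and shows how the heavy tails inflate the realized false-alarm rate relative to the Gaussian prediction. The dimensions, covariance, degrees of freedom, and nominal false-alarm rate are all assumed values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper):
# p spectral bands, N background pixels, target signature s.
p, N = 30, 200_000
s = rng.normal(size=p)

# Background covariance (identity here for simplicity; real clutter is correlated).
Sigma = np.eye(p)

def matched_filter(x, s, Sigma):
    """Linear matched filter statistic y = s^T Sigma^{-1} x (unnormalized)."""
    w = np.linalg.solve(Sigma, s)
    return x @ w

# Gaussian background samples.
x_gauss = rng.multivariate_normal(np.zeros(p), Sigma, size=N)

# Elliptically contoured multivariate t background: the same Gaussian draws
# scaled by sqrt(nu / chi2(nu)); smaller nu gives heavier tails.
nu = 5.0
chi2 = rng.chisquare(nu, size=N)
x_t = x_gauss * np.sqrt(nu / chi2)[:, None]

y_gauss = matched_filter(x_gauss, s, Sigma)
y_t = matched_filter(x_t, s, Sigma)

# Set the threshold from the Gaussian model at a nominal 1e-4 false-alarm rate,
# then measure the realized false-alarm rate under the heavy-tailed background.
thr = np.quantile(y_gauss, 1 - 1e-4)
pfa_gauss = np.mean(y_gauss > thr)
pfa_t = np.mean(y_t > thr)
print(f"Nominal Pfa (Gaussian): {pfa_gauss:.1e}; realized Pfa (t, nu={nu}): {pfa_t:.1e}")
```

With a heavy-tailed background (small nu), the realized false-alarm rate at the Gaussian-derived threshold is markedly higher than the nominal value, which is the sense in which normal-model predictions are overly optimistic.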
©(2005) Society of Photo-Optical Instrumentation Engineers (SPIE)
Dimitris G. Manolakis, "Realistic matched filter performance prediction for hyperspectral target detection," Optical Engineering 44(11), 116401 (1 November 2005). https://doi.org/10.1117/1.2125487
Published: 1 November 2005
CITATIONS
Cited by 17 scholarly publications.
KEYWORDS: Target detection, Statistical analysis, Sensors, Performance modeling, Hyperspectral target detection, Data modeling, Signal to noise ratio