Using multivariate data analysis to estimate the classification error rates and separability between sets of data samples is
a useful tool for understanding the characteristics of data sets. By understanding the classifiability and separability of the
data, one can better direct the appropriate resources and effort to achieve the desired performance. The following report
describes our procedure for estimating the separability of given data sets. The multivariate tools described in this paper
include intrinsic dimensionality estimation, Bayes error estimation, and the Friedman-Rafsky test.
These analysis techniques are based on previous work used to evaluate data for synthetic aperture radar (SAR) automatic
target recognition (ATR), but the current work is unique in the methods used to analyze large dimensionality sets with a
small number of samples. The results of this report show that our procedure can quantitatively measure the separability
between two data sets in both the measurement and feature spaces, using the Bayes error estimator procedure and the
Friedman-Rafsky test, respectively. Our procedure, which includes the error estimation and the Friedman-Rafsky test, is
applied here to SAR data but can serve as an effective way to measure the classifiability of many other multidimensional
data sets.
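As a concrete illustration of the Friedman-Rafsky test referred to above, the sketch below computes the multivariate runs statistic: pool the two samples, build a minimum spanning tree (MST) on the pooled set, and count the MST edges that join points from different samples. A small statistic (few cross-sample edges) indicates separable samples. This is a minimal SciPy-based sketch, not the authors' implementation; the function name `friedman_rafsky_runs` is ours.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def friedman_rafsky_runs(x, y):
    """Friedman-Rafsky multivariate runs statistic for samples x, y,
    each shaped (n_samples, n_features): the number of MST edges that
    join the two samples, plus one (the number of mono-sample subtrees
    left after cutting those edges)."""
    pooled = np.vstack([x, y])
    labels = np.r_[np.zeros(len(x)), np.ones(len(y))]
    dist = squareform(pdist(pooled))          # pairwise Euclidean distances
    mst = minimum_spanning_tree(dist).tocoo() # N-1 edges of the pooled MST
    cross = sum(labels[i] != labels[j] for i, j in zip(mst.row, mst.col))
    return cross + 1
```

Under the null hypothesis that both samples come from the same distribution, the expected statistic is roughly 2mn/(m+n) + 1 for sample sizes m and n; values far below that suggest the samples are separable.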
Current mine detection research indicates that no single sensor or single look from a sensor will detect mines/minefields in a real-time manner at a performance level suitable for a forward maneuver unit. Hence, the integrated development of detectors and fusion algorithms is of primary importance. A problem in this development process has been the evaluation of these algorithms with relatively small data sets, leading to anecdotal and frequently overtrained results. These anecdotal results are often unreliable and conflicting among various sensors and algorithms. Consequently, the physical phenomena that ought to be exploited and the performance benefits of this exploitation are often ambiguous. The Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate has collected large amounts of multisensor data such that statistically significant evaluations of detection and fusion algorithms can be obtained. Even with these large data sets, care must be taken in algorithm design and data processing to achieve statistically significant performance results for combined detectors and fusion algorithms. This paper discusses statistically significant detection and combined multilook fusion results for the Ellipse Detector (ED) and the Piecewise Level Fusion Algorithm (PLFA). These statistically significant performance results are characterized by ROC curves that have been obtained through processing this multilook data for the high-resolution SAR data of the Veridian X-Band radar. We discuss the implications of these results on mine detection and the importance of statistical significance, sample size, ground truth, and algorithm design in performance evaluation.
The ATR community has a strong and growing interest in ATR systems that adapt to changing circumstances and is developing means to solve these dynamic and difficult ATR problems. To facilitate this research, the AFRL COMPASE and SDMS organizations have developed an AdaptSAPS framework for developing and assessing such adaptive ATR systems. This framework, in the form of AdaptSAPS Version 1.0, provides MATLAB code, organized procedures, and an organized database for adaptive ATR systems.
SAIC is applying their Ellipse Detector (ED) to this framework to validate the AdaptSAPS procedures and to test the AdaptSAPS database. The ED previously has shown utility on a variety of sensors and ATR problems. Although computationally efficient, the ED is more complex and much more powerful than simpler detectors such as a two parameter CFAR. However, the ED is not currently implemented as an adaptive ATR.
In this paper, we show the utility of the AdaptSAPS framework for developing and assessing a non-trivial adaptive ATR by embedding the SAIC ED in the AdaptSAPS framework. We point out the strong points and weak points of AdaptSAPS Version 1.0 and recommend enhancements for future versions. In particular, we comment on AdaptSAPS as delivered, the current missions and databases in AdaptSAPS, and the current performance measures in AdaptSAPS.
Current research in minefield detection indicates that operationally no single sensor technology will likely be capable of detecting mines/minefields in a real-time manner and at a performance level suitable for a forward maneuver unit. Minefield detection involves a particularly wide range of operating scenarios and environmental conditions, which requires deployment of complementary sensor suites. Consequently, the NVESD sponsored Signal Processing and Algorithm Development for Robust Mine Detection (SPAD) Program is currently focusing on the development of computationally efficient and robust detection algorithms applicable to a variety of sensors and on the development of a robust decision level fusion algorithm that exploits these detectors. One SPAD detection technique, called the Ellipse Detector, has been previously reported in the open literature. We briefly report on the continued robust performance of this detector on some new sensor output. We also report on another robust detector developed for sensors that produce output not suitable for the Ellipse Detector. However, the focus of this paper is on the SPAD decision level fusion algorithm, called the Piecewise Level Fusion Algorithm (PLFA). We emphasize the robustness and flexibility of the PLFA architecture by describing its performance and results for both multisensor and multilook fusion.
Proc. SPIE 4394, Detection and Remediation Technologies for Mines and Minelike Targets VI
KEYWORDS: Target detection, Detection and tracking algorithms, Sensors, Image sensors, Signal processing, Laser Doppler velocimetry, Mining, Algorithm development, Land mines, General packet radio service
Current minefield detection research indicates that operationally no single sensor technology will likely be capable of detecting mines/minefields in a real-time manner and at a performance level suitable for a forward maneuver unit. Minefield detection involves a particularly wide range of operating scenarios and environmental conditions, which requires deployment of complementary sensor suites. We have focused, therefore, on the development of a computationally efficient and robust detection algorithm that exploits robust image processing techniques centered on meaningful target feature sets applicable to a variety of imaging sensors. This paper presents the detection technique, emphasizing its robust architecture, and provides performance results for image data generated by complementary sensors. The paper also briefly discusses the application of this detector as a component of fusion architectures for processing returns from diverse imaging sensors, including multi-channel image data from disparate sensors.
The development of ATR performance characterization tools is very important for the design, evaluation and optimization of ATR systems. One possible approach for characterizing ATR performance is to develop measures of the degree of separability of the different target classes based on the available multi-dimensional image measurements. One such measure is the Bayes error, which is the minimum probability of misclassification. Bayes error estimates have previously been obtained using Parzen window techniques on real aperture, high range resolution, radar data sets and on simulated synthetic aperture radar (SAR) images. This report extends these results to real MSTAR SAR data. Our results show that the Parzen window technique is a good method for estimating the Bayes error for such large dimensional data sets. However, in order to apply non-parametric error estimation techniques, feature reduction is needed. A discussion of the relationship between feature reduction and non-parametric estimation is included in this paper. The results of multimodal Parzen estimation on MSTAR images are also described. The tools used to produce the Bayes error estimates have been modified to produce Neyman-Pearson criterion estimates as well. Receiver Operating Characteristic curves are presented to illustrate non-parametric Neyman-Pearson error estimation on MSTAR images.
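The Parzen window approach to Bayes error estimation referred to above can be illustrated with a minimal sketch: estimate each class-conditional density with a Gaussian kernel, apply the Bayes decision rule, and count the misclassified samples. This is a simplified resubstitution-style illustration, assuming equal priors by default; the paper's actual procedure, including kernel-size selection and bias handling, is more involved, and the function name `parzen_bayes_error` is ours.

```python
import numpy as np
from scipy.stats import gaussian_kde

def parzen_bayes_error(x1, x2, p1=0.5):
    """Parzen-window estimate of the Bayes error between two classes.
    x1, x2: arrays shaped (n_samples, n_features)."""
    k1 = gaussian_kde(x1.T)  # kernel density estimate for class 1
    k2 = gaussian_kde(x2.T)  # kernel density estimate for class 2
    # A sample is misclassified when the Bayes rule favors the other class.
    err1 = np.mean(p1 * k1(x1.T) < (1 - p1) * k2(x1.T))
    err2 = np.mean((1 - p1) * k2(x2.T) < p1 * k1(x2.T))
    return p1 * err1 + (1 - p1) * err2
```

In high-dimensional settings, feature reduction is applied first, as the abstract notes, because kernel density estimates degrade rapidly with dimensionality.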
The Gaussian form of the Bhattacharyya distance measure is being used by some in the automatic target recognition (ATR) community to select features and to estimate an upper performance bound for ATR algorithms. One reason for the popularity of this measure is that it is readily computed. This paper shows through both empirical and analytic results the inadequacy of this metric. Empirical results are obtained by processing ADTS field data through both the Gaussian form of the Bhattacharyya distance and a nonparametric error estimation scheme. Analytic results are obtained by deriving the Gaussian form of the Bhattacharyya distance metric for distributions other than Gaussian. These results show that the Gaussian form of the Bhattacharyya distance cannot be trusted to provide a reliable upper performance bound. Additional empirical and analytic results, obtained with a nonparametric performance estimator, show that when the data are transformed to be more Gaussian, the Bhattacharyya metric gives better performance estimates. The transformations discussed are the power transform and a mode seeker that decomposes the data into Gaussian modes. One conclusion is that tools can and should be developed that improve the utility of the Bhattacharyya metric, mainly because they provide useful information about the distribution of the data. The major conclusion is that even with these tools, nonparametric error estimation techniques are superior: the nonparametric performance bounds are more reliable, and the proper use of the Bhattacharyya metric depends upon considerable knowledge of the data distributions.
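For reference, the Gaussian form of the Bhattacharyya distance discussed above, and the error bound derived from it, are straightforward to compute. The sketch below implements the standard textbook formulas rather than the paper's code; the function names are ours.

```python
import numpy as np

def bhattacharyya_gaussian(mu1, cov1, mu2, cov2):
    """Gaussian form of the Bhattacharyya distance:
    B = (1/8)(m2-m1)^T S^{-1} (m2-m1) + (1/2) ln(|S| / sqrt(|S1||S2|)),
    where S = (S1 + S2) / 2."""
    mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
    cov1, cov2 = np.asarray(cov1, float), np.asarray(cov2, float)
    cov = 0.5 * (cov1 + cov2)
    diff = mu2 - mu1
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    term2 = 0.5 * np.log(np.linalg.det(cov)
                         / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

def bayes_error_bound(mu1, cov1, mu2, cov2, p1=0.5):
    """Bhattacharyya upper bound on the Bayes error:
    Pe <= sqrt(P1 * P2) * exp(-B)."""
    b = bhattacharyya_gaussian(mu1, cov1, mu2, cov2)
    return np.sqrt(p1 * (1.0 - p1)) * np.exp(-b)
```

The bound is exact only under the Gaussian assumption; as the abstract argues, for non-Gaussian data it can be badly misleading, which is why the nonparametric estimates are preferred.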
In this paper, we discuss a new fusion architecture, including some preliminary results on field data. The architecture consists of a new decision level fusion algorithm, the piecewise level fusion algorithm (PLFA), integrated with a new expert system based user assistant that adjusts PLFA parameters to optimize for a user desired classification performance. This architecture is applicable for both multisensor and multilook fusion. The user specifies classification performance by inputting entries for a desired confusion matrix at the fusion center. The intelligent assistant suggests input alternatives to reach the performance goal based on previously supplied user inputs and on performance specifications of the individual sensors. If deadlock results, i.e., the goal is not attainable because of conflicting user inputs, the assistant will inform the user. As the user and assistant interact, the assistant calculates the parameters necessary to automatically adjust the PLFA for the required performance. These parameters and calculations are hidden from the user. That is, the architecture is designed so that user inputs are intuitive for an unskilled operator. The implementation of this adaptable fusion architecture is due to the relatively simple structure of the PLFA and the expert system heuristic rules. We briefly describe the PLFA structure and operation, illustrate some expert system rules, and discuss preliminary performance of the entire architecture, including a sample dialogue between the user and the intelligent assistant. We conclude this paper with a discussion of future extensions to this architecture that include replacing human interactions with dynamic learning techniques.
The application and utility of multivariate data analysis techniques to synthetic aperture radar (SAR) automatic target recognizer (ATR) analysis and design is demonstrated on synthetically generated image data sets from the Xpatch scattering prediction code. The multivariate techniques and tools demonstrated include sampling interval estimation, intrinsic dimensionality estimation, nonparametric Bayes error estimation for performance evaluation, and estimation of the number of Gaussian modes that approximate the data sets. The utility of these techniques and tools to SAR ATR analysis and design are elucidated through quantitative results and discussions. The analysis techniques and tools discussed are enhancements of earlier ones that have been successfully applied to data sets consisting of a small number of samples of moderate dimensionality. References are given to those earlier reports that describe these methods, their theory, and earlier results. This paper focuses on the analysis and results of the enhanced methods and tools as applied to SAR data sets consisting of a small number of samples of large dimensionality. A considerable synergy of these combined multivariate statistical tools and image simulation tools is demonstrated. A general and powerful methodology for the quantification and evaluation of SAR ATR designs based upon a combination of these analysis and simulation tools is proposed.
Improved performance evaluation results for complex data sets, in the Bayes error estimation sense, are shown by first decomposing the data sets into approximately normal modes before input to the error estimator. More specifically, the utility of a particular nonparametric Bayes error estimator, the Parzen error estimator, is generalized to multimodal data sets by preprocessing the data through a mode seeker before input to the error estimator. The utility of the mode seeker and the Parzen error estimator for data analysis and performance evaluation is demonstrated on a field collected radar data set.
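The idea of decomposing each class into approximately normal modes before error estimation can be sketched as follows: split each class into modes, fit a Gaussian per mode, and plug the resulting mixture densities into the Bayes decision rule. Here k-means stands in for the paper's mode seeker, and the function names are ours; this illustrates the general preprocessing idea rather than the authors' estimator.

```python
import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.stats import multivariate_normal

def mixture_density(x, samples, n_modes=2):
    """Approximate a possibly multimodal class density: partition the
    class samples into modes with k-means (a stand-in for a mode seeker),
    fit one Gaussian per mode, and sum the weighted component densities."""
    _, labels = kmeans2(samples, n_modes, minit='++', seed=0)
    dens = np.zeros(len(x))
    for k in range(n_modes):
        mode = samples[labels == k]
        weight = len(mode) / len(samples)
        dens += weight * multivariate_normal.pdf(x, mode.mean(0), np.cov(mode.T))
    return dens

def modal_bayes_error(x1, x2, n_modes=2, p1=0.5):
    """Count training samples misclassified by the Bayes rule applied
    to the per-class mixture densities."""
    e1 = np.mean(p1 * mixture_density(x1, x1, n_modes)
                 < (1 - p1) * mixture_density(x1, x2, n_modes))
    e2 = np.mean((1 - p1) * mixture_density(x2, x2, n_modes)
                 < p1 * mixture_density(x2, x1, n_modes))
    return p1 * e1 + (1 - p1) * e2
```

A single Gaussian fit to a bimodal class smears its density across the gap between the modes; decomposing first, as the abstract describes, keeps each component density locally accurate and so improves the error estimate.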