Hazard learning algorithms employing ground penetrating radar (GPR) data for discrimination, detection, and classification suffer from a pernicious robustness problem: models trained on a particular physical region using a given sensor (antenna system) typically do not transfer effectively to different regions interrogated with different sensors. We implement a novel training paradigm using region-based stratified cross-validation that improves learning induction across disparate data sets. We test this training paradigm on a novel deep neural network (DNN) architecture and report empirical results from training and testing on data collected from multiple sites. Furthermore, we discuss the relationship between penalty loss and evaluation metrics.
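The abstract does not specify the splitting scheme in detail, but the core idea of region-based cross-validation, that no physical region contributes samples to both the training and test folds, can be sketched as a simple leave-one-region-out splitter. The function name and the toy region labels below are illustrative, not from the paper:

```python
import numpy as np

def region_folds(regions):
    """Yield (train_idx, test_idx) leave-one-region-out splits.

    regions: 1D array-like of region labels, one per alarm/sample.
    Every sample from the held-out region goes to the test fold,
    so train and test never share a collection site.
    """
    regions = np.asarray(regions)
    for r in np.unique(regions):
        test_idx = np.where(regions == r)[0]
        train_idx = np.where(regions != r)[0]
        yield train_idx, test_idx

# Hypothetical example: five alarms collected at three sites.
folds = list(region_folds([0, 0, 1, 1, 2]))
```

Stratification over target class within each region could be layered on top of this, but the grouping constraint shown here is what prevents a model from being scored on data from a site it has already seen.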
Ground penetrating radar (GPR) based detection systems have used a variety of features and machine learning methods to identify buried hazards and distinguish them from clutter and other objects. In this study, we describe a new feature extraction method based on Kolmogorov complexity and information theory. In particular, a three-dimensional subset of GPR data centered at the alarm location is partitioned into two-dimensional non-overlapping cells. Each cell is then compressed using gzip, and a feature vector is formed from the file sizes of the compressed cells. Finally, an SVM classifier is trained on the compression features. The proposed method is applied to data acquired from outdoor test sites containing over 3800 buried hazards, including nonmetal and low-metal targets. Performance is measured using ROC curves and compared against four algorithms based on geometric features, which are fused with a powered geometric mean method. The compression-based algorithm outperforms the other individual methods. We also tested different fusion algorithms involving combinations of these five algorithms. The best combination, the product of the compression algorithm and two of the others, dominates the current state-of-the-art solution by a significant margin.
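The compression-based feature extraction described above can be sketched as follows. The cell size and the function name are assumptions for illustration; the abstract specifies only that 2D non-overlapping cells are gzip-compressed and their compressed sizes form the feature vector:

```python
import gzip
import numpy as np

def compression_features(volume, cell_shape=(8, 8)):
    """Form a feature vector from gzip-compressed sizes of 2D cells.

    volume: 3D array (e.g. depth x cross-track x along-track) of GPR
    samples centered at the alarm location. Each 2D slice is partitioned
    into non-overlapping cells of cell_shape; the gzip-compressed byte
    length of each cell is one feature (a proxy for its complexity).
    """
    ch, cw = cell_shape
    feats = []
    for slice2d in volume:
        h, w = slice2d.shape
        for i in range(0, h - ch + 1, ch):
            for j in range(0, w - cw + 1, cw):
                cell = np.ascontiguousarray(slice2d[i:i + ch, j:j + cw])
                feats.append(len(gzip.compress(cell.tobytes())))
    return np.asarray(feats, dtype=float)
```

The resulting vector would then be fed to an SVM (e.g. `sklearn.svm.SVC`) for training; highly regular background cells compress well, while structured hyperbolic target responses tend to yield larger compressed sizes, which is what makes the feature discriminative.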