This paper describes an unsupervised machine learning methodology capable of target tracking and background suppression via a novel dual-model approach. “Jekyll” produces a video bit-mask estimating the locations of moving objects, and “Hyde” outputs a pseudo-background frame to subtract from the original input image sequence. Both models were trained with a custom-modified cross-entropy loss. Simulated data were used to compare the performance of Jekyll and Hyde against a more traditional supervised machine learning approach. The results of these comparisons show that the unsupervised methods are competitive in output quality with supervised techniques, without the associated cost of acquiring labeled training data.
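The abstract above describes combining the two model outputs: a motion bit-mask (Jekyll) and a pseudo-background frame (Hyde) that is subtracted from the input. A minimal sketch of that combination step, assuming normalized single-channel frames; the function name and array layout are illustrative, not the paper's actual implementation:

```python
import numpy as np

def suppress_background(frame, pseudo_background, mask):
    """Combine a Jekyll-style motion mask with a Hyde-style
    pseudo-background to isolate moving objects.

    frame, pseudo_background: float arrays in [0, 1], same shape.
    mask: binary array of the same shape (1 = estimated moving object).
    """
    # Background-subtracted frame, clipped back to valid intensities
    residual = np.clip(frame - pseudo_background, 0.0, 1.0)
    # Keep only the pixels the bit-mask flags as moving
    return residual * mask

# Toy 4x4 example: one bright "object" pixel on a uniform gray background
frame = np.full((4, 4), 0.5)
frame[1, 2] = 1.0
background = np.full((4, 4), 0.5)
mask = np.zeros((4, 4))
mask[1, 2] = 1.0

foreground = suppress_background(frame, background, mask)
```

In this toy case only the flagged pixel survives, with intensity equal to its excess over the background estimate.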
The Ball Aerospace Pipeline Damage Prevention Radar (PDPR) project evaluated the use of airborne synthetic aperture radar (SAR) to detect vehicles and equipment located within buried pipeline right-of-way areas but obscured from visual detection. The project included the configuration of a commercial dual-band SAR/EO system for airborne operations, hardware and software modifications to optimize SAR change detection processing, and the execution of multiple flight tests to characterize SAR performance for the detection of equipment obscured by vegetation. Flight tests were conducted in 2016 and 2017 using X-band, Ku-band, and ultra-wideband (UWB) SAR in urban and rural environments. Targets in the open showed close to 100% detection performance, while covered-target results depended on the amount of vegetative canopy. Detection "through" vegetation was generally better using the UWB system, but vegetation gaps frequently allowed higher spatial resolution detections with the Ku-band system. While large equipment was frequently identifiable in the Ku-band SAR images, having coincident EO imagery proved critical for context and for automated deep-learning-based object identification. The detection performance difference between open and covered conditions clearly illustrates how a collection plan that optimizes open viewing conditions increases the overall probability of detection. This research was performed in response to the Damage Prevention topic under Technology Development in the Pipeline Safety Research and Development Announcement DTPH5615RA00001.
Spectral Fingerprint Identification (SFI) incorporates feature finding, text matching, and data fusion techniques for fast whole-cube material identification. In operation, the SFI algorithm translates spectral data into a feature space where fast text matching can be performed between all pixels in a data cube and a preprocessed SFI spectral library. Data fusion of the resulting feature matches creates a listing of materials likely contained in a data cube at both the whole-pixel and subpixel levels. The SFI methodology was implemented in a prototype Opticks plug-in capable of both standalone and Windows-based cluster processing.
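The key idea above is translating spectra into a text-like feature space so that library lookup becomes fast exact string matching. A simplified sketch under assumed details: here each spectrum is quantized into a short character string of band-to-band slopes, which is not the actual SFI encoding; the threshold and library entries are illustrative.

```python
import numpy as np

def spectral_fingerprint(spectrum, rel_tol=0.05):
    """Quantize band-to-band slopes into a character string
    ('u' up, 'd' down, 'f' flat) so spectra can be compared by
    fast exact text matching. A stand-in for the SFI feature
    translation; the flatness tolerance is illustrative.
    """
    diffs = np.diff(spectrum)
    tol = rel_tol * (spectrum.max() - spectrum.min() + 1e-12)
    return "".join(
        "u" if d > tol else "d" if d < -tol else "f" for d in diffs
    )

# Hypothetical preprocessed library: fingerprint string -> material name
library = {
    spectral_fingerprint(np.array([0.1, 0.4, 0.8, 0.7, 0.3])): "material_A",
    spectral_fingerprint(np.array([0.9, 0.6, 0.4, 0.5, 0.7])): "material_B",
}

# A noisy observation of material_A still maps to the same string
pixel = np.array([0.12, 0.41, 0.79, 0.68, 0.31])
match = library.get(spectral_fingerprint(pixel), "no match")
```

Because matching reduces to dictionary lookup on short strings, scanning every pixel of a cube against a large preprocessed library stays fast.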
The ARTEMIS hyperspectral sensor will be the first spaceborne hyperspectral sensor with an on-board real-time processing capability. The ARTEMIS real-time processor uses both anomaly and material detection algorithms to locate materials of potential interest. To satisfy the real-time processing timelines, the collected data must be reduced from hundreds of bands to around 64 bins, where a bin can be a single band or the average of a set of bands. A signature optimization study compared various binning algorithms by analyzing both the detection characteristics and the discrimination performance before and after spectral binning.
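The binning step described above (hundreds of bands reduced to roughly 64 bins, each bin a single band or an average of bands) can be sketched as follows. This is one simple contiguous-averaging scheme, not any specific algorithm from the ARTEMIS study; the function name and cube layout are assumptions.

```python
import numpy as np

def bin_spectrum(cube, n_bins=64):
    """Reduce a (rows, cols, bands) hyperspectral cube to n_bins by
    averaging contiguous groups of bands. Assumes bands >= n_bins so
    every bin is non-empty. One of many possible binning schemes.
    """
    bands = cube.shape[-1]
    # Bin edges partition the band axis into n_bins contiguous groups
    edges = np.linspace(0, bands, n_bins + 1).astype(int)
    return np.stack(
        [cube[..., lo:hi].mean(axis=-1) for lo, hi in zip(edges[:-1], edges[1:])],
        axis=-1,
    )

cube = np.random.rand(8, 8, 256)   # toy cube with 256 spectral bands
binned = bin_spectrum(cube, n_bins=64)  # each bin averages 4 bands here
```

Comparing detection and discrimination performance before and after such a reduction is exactly the kind of trade the signature optimization study evaluates: fewer bins mean faster real-time processing but coarser spectral detail.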