Deep convolutional neural network target classification for underwater synthetic aperture sonar imagery
10 May 2019
Abstract
In underwater synthetic aperture sonar (SAS) imagery, there is a need for accurate target recognition algorithms. Automated detection of underwater objects has many applications, not least of which is the safe extraction of dangerous explosives. In this paper, we discuss experiments on a deep learning approach to binary classification of target and non-target SAS image tiles. Using a fused anomaly detector, the pixels in each SAS image were narrowed down to regions of interest (ROIs), from which small target-sized tiles were extracted; this tile data set was created prior to the work described in this paper. Our objective is to carry out extensive tests of the classification accuracy of deep convolutional neural networks (CNNs) using location-based cross validation. Here we discuss the results of varying network architectures, hyperparameters, loss functions, and activation functions, in conjunction with an analysis of training and testing set configuration. We also analyze these network setups in depth, rather than comparing classification accuracy alone. The approach is tested on a collection of SAS imagery.
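The abstract's location-based cross validation implies that train and test tiles must never come from the same survey location, so that accuracy reflects generalization to unseen sites rather than memorization of seafloor context. A minimal sketch of such a splitter is below; the `(tile_id, site)` pairing and the site labels are hypothetical stand-ins for the paper's actual data organization, which the abstract does not specify.

```python
from collections import defaultdict

def location_folds(tiles):
    """Build leave-one-location-out cross-validation folds.

    `tiles` is a list of (tile_id, site) pairs, where `site` is a
    hypothetical label for the survey location a tile came from.
    Each fold holds out every tile from exactly one site, so no
    location ever appears in both the training and test sets.
    Returns a list of (train_indices, test_indices) tuples.
    """
    by_site = defaultdict(list)
    for idx, (_tile_id, site) in enumerate(tiles):
        by_site[site].append(idx)

    folds = []
    sites = sorted(by_site)
    for held_out in sites:
        test = by_site[held_out]
        train = [i for s in sites if s != held_out for i in by_site[s]]
        folds.append((train, test))
    return folds
```

For each fold, a CNN would be trained on the `train` tiles and scored on the `test` tiles from the held-out location; averaging across folds gives the location-based accuracy estimate the abstract describes.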
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
A. Galusha, J. Dale, J. M. Keller, and A. Zare "Deep convolutional neural network target classification for underwater synthetic aperture sonar imagery", Proc. SPIE 11012, Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XXIV, 1101205 (10 May 2019); https://doi.org/10.1117/12.2519521
PROCEEDINGS: 11 PAGES + PRESENTATION
