Identifying low-profile objects from low-light UAS imagery using cascading deep learning (14 May 2019)
Unmanned aircraft systems (UAS) have gained utility in the Navy for many purposes, including facility needs, security, and intelligence, surveillance, and reconnaissance (ISR). UAS surveys can be employed in place of personnel to reduce safety risks, but they generate significant quantities of data that often require manual review. Research and development of automated methods to identify targets of interest in this type of imagery can provide multiple benefits, including increased efficiency, decreased cost, and potentially saved lives through identification of hazards or threats. This paper presents a methodology to efficiently and effectively identify cryptic target objects in UAS imagery. The approach involves flying and processing airborne imagery in low-light conditions to find low-profile objects (i.e., birds) in beach and desert-like environments. The object classification algorithms combat the low-light conditions and the low-profile nature of the objects of interest using cascading models and a tailored deep convolutional neural network (CNN) architecture. The models were able to identify and count endangered birds (California least terns) and nesting sites on beaches from UAS survey data, achieving negative/positive classification accuracies on candidate images upwards of 97% and an F1 score for detection of 0.837.
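The cascading approach described above can be illustrated with a minimal sketch: a cheap first stage prunes obvious negatives from candidate image patches, and only the survivors reach the more expensive second-stage classifier (a tailored CNN in the paper; a simple stand-in scorer here). All function names, thresholds, and scoring criteria below are hypothetical placeholders, not the authors' actual models.

```python
import numpy as np

def stage1_filter(patch, contrast_threshold=0.05):
    """Cheap prefilter: reject low-contrast patches unlikely to contain
    a low-profile object in low-light imagery (hypothetical criterion)."""
    return patch.std() > contrast_threshold

def stage2_score(patch):
    """Stand-in for the second-stage CNN classifier; here just the mean
    intensity, purely to illustrate the cascade's control flow."""
    return float(patch.mean())

def cascade_detect(patches, score_threshold=0.5):
    """Run the two-stage cascade; return indices of positive patches."""
    positives = []
    for i, patch in enumerate(patches):
        if not stage1_filter(patch):               # stage 1: fast rejection
            continue
        if stage2_score(patch) > score_threshold:  # stage 2: costly model
            positives.append(i)
    return positives

rng = np.random.default_rng(0)
flat = np.full((8, 8), 0.2)               # low-contrast background patch
textured = rng.uniform(0.4, 1.0, (8, 8))  # high-contrast candidate patch
print(cascade_detect([flat, textured]))   # flat is pruned at stage 1
```

The point of the cascade is that the inexpensive first stage runs on every candidate, so the costly model only ever sees the small fraction of patches that pass the prefilter.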
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Lucas A. Overbey, Jean Pan, Jamie R. Lyle, Georgianna Campbell, Alan Jaegar, Ryan Jaegar, Todd Van Epps, and Martin Ruane "Identifying low-profile objects from low-light UAS imagery using cascading deep learning", Proc. SPIE 11001, Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XXX, 1100116 (14 May 2019);
