Seascape: a due-diligence framework for algorithm acquisition
28 October 2022
Abstract
Any program tasked with the evaluation and acquisition of algorithms for use in deployed scenarios must have an impartial, repeatable, and auditable means of benchmarking both candidate and fielded algorithms. Success in this endeavor requires a body of representative sensor data; data labels indicating the proper algorithmic response to the data, as adjudicated by subject matter experts; a means of executing algorithms under review against the data; and the ability to automatically score and report algorithm performance. Each of these capabilities should be constructed in support of program and mission goals. By curating and maintaining data, labels, tests, and scoring methodology, a program can understand and continually improve the relationship between benchmarked and fielded performance of acquired algorithms. A system supporting these program needs, deployed in an environment with sufficient computational power and the necessary security controls, is a powerful tool for ensuring due diligence in the evaluation and acquisition of mission-critical algorithms. This paper describes the Seascape system and its place in such a process.
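The abstract identifies four capabilities: curated labeled data, a means of executing candidate algorithms against it, and automatic scoring and reporting. A minimal sketch of that benchmarking loop is shown below. This is purely illustrative and not part of the Seascape system itself; all names (`LabeledSample`, `score_algorithm`, `benchmark`) are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class LabeledSample:
    """One sensor record paired with its SME-adjudicated label."""
    data: List[float]
    label: int

def score_algorithm(algorithm: Callable[[List[float]], int],
                    samples: List[LabeledSample]) -> float:
    """Execute an algorithm under review against the data and
    return the fraction of labels it reproduces (accuracy)."""
    correct = sum(1 for s in samples if algorithm(s.data) == s.label)
    return correct / len(samples)

def benchmark(candidates: Dict[str, Callable[[List[float]], int]],
              samples: List[LabeledSample]) -> Dict[str, float]:
    """Score every candidate against the same curated data set,
    yielding an impartial, repeatable comparison."""
    return {name: score_algorithm(alg, samples)
            for name, alg in candidates.items()}

# Toy example: two detectors benchmarked on three labeled samples.
samples = [
    LabeledSample([0.9], 1),
    LabeledSample([0.2], 0),
    LabeledSample([0.8], 1),
]
candidates = {
    "threshold_detector": lambda x: int(x[0] > 0.5),
    "always_positive": lambda x: 1,
}
print(benchmark(candidates, samples))
```

In practice the data, labels, and scoring methodology would be versioned and audited, so that the same benchmark run can be repeated and its results traced.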
Conference Presentation
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Christopher Pitts, Forest Danford, Emily Moore, William Marchetto, Henry Qiu, Leon Ross, and Todd Pitts "Seascape: a due-diligence framework for algorithm acquisition", Proc. SPIE 12276, Artificial Intelligence and Machine Learning in Defense Applications IV, 122760D (28 October 2022); https://doi.org/10.1117/12.2643193
KEYWORDS: Algorithm development; Data storage; Data modeling; Computing systems; Detection and tracking algorithms; Machine learning; Data acquisition