Reducing variability in the output of artificial neural networks through output calibration (8 March 2007)
In this study we developed a novel method for reducing the variability in the outputs of different artificial neural network (ANN) configurations that have the same overall performance as measured by the area under their receiver operating characteristic (ROC) curves. This variability can lead to inaccuracies in the interpretation of results when the outputs are employed as classification predictors. We extended a method previously proposed for reducing the variability in the performance of a classifier across data sets from different institutions to the outputs of ANN configurations. Our approach is based on histogram shaping: the outputs of every ANN configuration are transformed so that their histogram resembles the output histogram of a baseline ANN configuration. We tested the effectiveness of the technique using synthetic data generated from two two-dimensional isotropic Gaussian distributions and 100 ANN configurations. The proposed output calibration technique significantly reduced the median standard deviation of the ANN outputs from 0.010 before calibration to 0.006 after calibration. The standard deviation of the sensitivity of the 100 ANN configurations at the same decision threshold decreased significantly from 0.005 before calibration to 0.003 after calibration. Similarly, the standard deviation of their specificity values decreased significantly from 0.016 before calibration to 0.006 after calibration.
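The histogram-shaping step described above can be sketched as an empirical quantile mapping: each configuration's output is converted to its rank (empirical CDF value) within that configuration's own output distribution, then mapped through the inverse empirical CDF of the baseline configuration's outputs. This is a minimal illustrative sketch, not the authors' implementation; the function name `calibrate_outputs` and the use of linear interpolation between empirical quantiles are assumptions.

```python
import numpy as np

def calibrate_outputs(outputs, baseline_outputs):
    """Reshape the histogram of `outputs` to resemble that of
    `baseline_outputs` via empirical quantile mapping.

    This is an illustrative sketch of histogram shaping, not the
    paper's exact procedure.
    """
    outputs = np.asarray(outputs, dtype=float)
    baseline_outputs = np.asarray(baseline_outputs, dtype=float)

    # Empirical CDF of this configuration's own outputs: map each
    # output value to a quantile in (0, 1].
    sorted_out = np.sort(outputs)
    cdf = np.arange(1, sorted_out.size + 1) / sorted_out.size
    quantiles = np.interp(outputs, sorted_out, cdf)

    # Inverse empirical CDF of the baseline configuration's outputs:
    # map each quantile back to a value on the baseline's scale.
    sorted_base = np.sort(baseline_outputs)
    base_cdf = np.arange(1, sorted_base.size + 1) / sorted_base.size
    return np.interp(quantiles, base_cdf, sorted_base)
```

Because both interpolation steps are monotone, the mapping preserves the rank order of a configuration's outputs, so its ROC curve (and hence the area under it) is unchanged while its output histogram is pulled toward the baseline's.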
© (2007) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Shalini Gupta, Wendy C. Kan, Tiffany C. Lin, and Mia K. Markey "Reducing variability in the output of artificial neural networks through output calibration", Proc. SPIE 6515, Medical Imaging 2007: Image Perception, Observer Performance, and Technology Assessment, 65151E (8 March 2007);
