Deep neural networks (DNNs) have made rapid progress in medical image analysis, and it is now common to see algorithms proposed in the literature that claim to meet or exceed clinician performance. However, DNNs typically operate as black-box systems. Deploying these algorithms for safety-critical tasks, such as medical applications, is therefore challenging without methods to characterize their robustness in a generalized setting. Furthermore, DNNs are known to be sensitive to small changes in input data and may rely on spurious correlations in the training samples, which precludes them from generalizing to data with a different distribution. We previously proposed an attribute-ranking algorithm that uses mutual information to rank discrete data attributes by how informative they are about DNN performance. In this study we leverage this algorithm in a novel way to determine whether the data attributes that affect DNN performance are clinically relevant, and thereby whether the DNN predictions are robust. We demonstrate the applicability of this method on melanoma classification with a DNN trained on the publicly available HAM10000 dataset, achieving 0.855 AUC on the held-out HAM10000 test set. Our analysis identifies that image saturation, which is not a clinically relevant feature, is highly indicative both of whether an image shows melanoma in the HAM10000 training data and of whether the DNN prediction is correct. Further testing reveals that when the classifier is evaluated on the SIIM-ISIC Melanoma dataset, where the correlation between image saturation and melanoma is absent, it achieves an AUC of only 0.591, confirming that the DNN is not robust.
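The core quantity in the attribute-ranking approach described above is the mutual information between a discrete attribute and an indicator of model behavior. The following is a minimal, self-contained sketch of that computation; the example data (binned image saturation versus prediction correctness) are hypothetical and illustrative, not taken from the paper, and this is not the authors' implementation.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information I(X;Y) in bits between two discrete sequences."""
    n = len(xs)
    px = Counter(xs)            # marginal counts of X
    py = Counter(ys)            # marginal counts of Y
    pxy = Counter(zip(xs, ys))  # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) )
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Hypothetical example: binned image saturation vs. whether the DNN was correct
saturation_bin = ["low", "low", "high", "high", "low", "high", "high", "low"]
correct = [1, 1, 0, 0, 1, 0, 1, 1]
score = mutual_information(saturation_bin, correct)
```

Ranking attributes then amounts to computing this score for each discrete attribute and sorting; attributes with high mutual information against correctness (but no clinical meaning, such as saturation here) flag potential spurious correlations.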
Purpose: Surgery involves modifying anatomy to achieve a goal. Reconstructing anatomy can facilitate surgical care through surgical planning, real-time decision support, or anticipating outcomes. Tool motion is a rich source of data that can be used to quantify anatomy. Our work develops and validates a method for reconstructing the nasal septum from unstructured motion of the Cottle elevator during the elevation phase of septoplasty surgery, without the need to explicitly delineate the surface of the septum.
Approach: The proposed method uses iterative closest point registration to initially register a template septum to the tool motion. Subsequently, statistical shape modeling with iterative most likely oriented point registration is used to fit the reconstructed septum to the Cottle tip position and orientation during flap elevation. Regularization of the shape model and transformation is incorporated. The proposed methods were validated on 10 septoplasty surgeries performed on cadavers by operators of varying experience level. Preoperative CT images of the cadaver septa were segmented as ground truth.
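The initial registration step above relies on iterative closest point (ICP). As a rough illustration of that component only, here is a minimal 2D ICP sketch (brute-force nearest-neighbour matching plus a Kabsch rigid-transform solve); it is not the authors' implementation and omits the shape-model fitting, orientation terms, and regularization the paper describes.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # correct for reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20):
    """Basic ICP: alternate nearest-neighbour matching and rigid alignment."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest-neighbour correspondences
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur
```

In the paper's setting, `src` would be Cottle tip positions and `dst` samples of the template septum surface; real implementations also use a spatial index (e.g., a k-d tree) rather than the brute-force distance matrix shown here.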
Results: We estimated reconstruction error as the difference between the projections of the Cottle tip onto the surface of the reconstructed septum and onto the ground-truth septum segmented from the CT image. Given the optimal regularization parameters, we found a translational difference of 2.74 (2.06 to 2.81) mm and a rotational difference of 8.95 (7.11 to 10.55) deg between the reconstructed septum and the ground-truth septum [median (interquartile range)].
Conclusions: Accurate reconstruction of the nasal septum can be achieved from tool tracking data during septoplasty surgery on cadavers. This enables understanding of the septal anatomy without the need for traditional medical imaging. This result may be used to facilitate surgical planning, intraoperative care, or skills assessment.