Surgical excision for basal cell carcinoma (BCC) is a common treatment to remove affected areas of skin. Minimizing positive margins around excised tissue is essential for successful treatment. Residual cancer cells may necessitate repeat surgery; however, detecting remaining cancer can be challenging and time-consuming. Using chemical signal data acquired while tissue is excised with a cautery tool, the iKnife system can discriminate between healthy and cancerous tissue, but it lacks spatial information, making it difficult to navigate back to suspicious margins. Intraoperative videos of BCC excision allow cautery locations to be tracked, providing the sites of potential positive margins. We propose a deep learning approach using convolutional neural networks to recognize phases in the videos and subsequently track the cautery location, comparing supervised and semi-supervised localization methods. Phase recognition was used as a preprocessing step to classify frames as showing either the surgery or the start/stop of iKnife data acquisition; only frames classified as surgery were used for cautery localization. Fourteen videos were recorded during BCC excisions with iKnife data collection. On unseen test data (2 videos, 1,832 frames), the phase recognition model achieved an overall accuracy of 86%. Tool localization achieved a mean average precision of 0.98 and 0.96 for the supervised and semi-supervised methods, respectively, at an intersection-over-union threshold of 0.5. Combining intraoperative phase data with tool tracking provides surgeons with spatial information about the cautery tool's location around suspicious regions, potentially improving their ability to navigate back to the area of concern.
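To make the two-stage pipeline concrete, the sketch below gates each video frame through a phase-recognition CNN and runs a cautery detector only on frames classified as surgery. This is a minimal illustration under stated assumptions, not the authors' implementation: the architectures (ResNet-18 for phase recognition, Faster R-CNN standing in for the supervised/semi-supervised localizers), the label set, and the function names are all hypothetical.

```python
# Minimal sketch of the two-stage pipeline: a phase classifier decides
# whether a frame shows the surgery; only those frames are passed to a
# cautery-tool detector. Architectures and labels are assumptions.
import torch
import torchvision

PHASES = ["surgery", "acquisition_start_stop"]  # assumed label set

# Phase-recognition CNN: a 2-way frame classifier. Random weights here;
# in practice it would be trained on labeled intraoperative frames.
phase_net = torchvision.models.resnet18(weights=None, num_classes=len(PHASES))
phase_net.eval()

# Cautery localizer: a generic detector standing in for the paper's
# supervised/semi-supervised methods (random weights, no downloads).
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(
    weights=None, weights_backbone=None, num_classes=2  # background + tool
)
detector.eval()

@torch.no_grad()
def localize_cautery(frame: torch.Tensor):
    """Return cautery bounding boxes for one RGB frame (3, H, W) scaled
    to [0, 1], or None when the frame is not classified as surgery."""
    logits = phase_net(frame.unsqueeze(0))
    if PHASES[logits.argmax(dim=1).item()] != "surgery":
        return None  # drop start/stop frames, as in the preprocessing step
    return detector([frame])[0]["boxes"]  # (N, 4) boxes in pixel coords

# Usage on a dummy 480x640 frame:
boxes = localize_cautery(torch.rand(3, 480, 640))
```

In the workflow the abstract describes, boxes from surgery frames could then be associated with the iKnife readings acquired at the same moments, attaching a video location to each suspicious chemical signal; the reported mean average precision is computed by counting a predicted box as correct when its intersection over union with the ground-truth box is at least 0.5.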