Automated detection of aggressive prostate cancer on Magnetic Resonance Imaging (MRI) can help guide targeted biopsies and reduce unnecessary invasive biopsies. However, automated methods of prostate cancer detection often have a sensitivity-specificity trade-off (high sensitivity with low specificity, or vice versa), making them unsuitable for clinical use. Here, we study the utility of integrating prior information about the zonal distribution of prostate cancers with a radiology-pathology fusion model to reliably identify aggressive and indolent prostate cancers on MRI. Our approach has two steps: 1) training a radiology-pathology fusion model that learns pathomic MRI biomarkers (MRI features correlated with pathology features) and uses them to selectively identify aggressive and indolent cancers, and 2) post-processing the predictions using zonal priors in a novel optimized Bayes' decision framework. We compare this approach with alternatives that incorporate zonal priors during training. We use a cohort of 74 radical prostatectomy patients as our training set, and two cohorts of 30 radical prostatectomy patients and 53 biopsy patients as our test sets. Our rad-path-zonal fusion approach achieves cancer lesion-level sensitivities of 0.77±0.29 and 0.79±0.38, and specificities of 0.79±0.23 and 0.62±0.27, on the two test sets respectively, compared to baseline sensitivities of 0.91±0.27 and 0.94±0.21 and specificities of 0.39±0.33 and 0.14±0.19, verifying its utility in balancing sensitivity and specificity of lesion detection.
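The abstract does not spell out the optimized Bayes' decision framework, but the general idea of post-processing predictions with zonal priors can be illustrated with a minimal sketch. The snippet below assumes a voxelwise cancer probability map and an integer zone label map; the function and parameter names are hypothetical, and the zone-specific priors would be estimated from training data rather than hard-coded.

```python
import numpy as np

def apply_zonal_prior(pred_prob, zone_mask, zonal_prior):
    """Re-weight voxelwise cancer probabilities with zone-specific priors
    and binarize via a Bayes' decision (pick the class with the larger
    unnormalized posterior).

    pred_prob:   (H, W) model probability map for cancer
    zone_mask:   (H, W) integer zone labels, e.g. 0 = background,
                 1 = peripheral zone, 2 = transition zone
    zonal_prior: dict mapping zone label -> prior probability of cancer
                 in that zone (hypothetical values, estimated from data)
    """
    prior = np.zeros_like(pred_prob)
    for zone, p in zonal_prior.items():
        prior[zone_mask == zone] = p
    # Unnormalized posteriors for the cancer and benign classes.
    post_cancer = pred_prob * prior
    post_benign = (1.0 - pred_prob) * (1.0 - prior)
    return post_cancer > post_benign

# Example usage with illustrative priors reflecting that most prostate
# cancers arise in the peripheral zone:
# mask = apply_zonal_prior(prob_map, zones, {1: 0.7, 2: 0.3})
```

Because the prior is applied only at decision time, the trained fusion model is left untouched; this is what distinguishes the post-processing approach from the compared alternatives that incorporate zonal priors during training.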
The use of magnetic resonance-ultrasound fusion targeted biopsy improves the diagnosis of aggressive prostate cancer. Fusion of ultrasound and magnetic resonance images (MRI) requires accurate prostate segmentations. In this paper, we developed a 2.5-dimensional deep learning model, ProGNet, to segment the prostate on T2-weighted MRI. ProGNet is an optimized U-Net model that weights three adjacent slices in each MRI sequence to segment the prostate in a 2.5D context. We trained ProGNet on 529 cases in which experts annotated the whole gland (WG) on axial T2-weighted MRI prior to targeted prostate biopsy. In 132 cases, experts also annotated the central gland (CG) on MRI. After five-fold cross-validation, we found that for WG segmentation, ProGNet had a mean Dice similarity coefficient (DSC) of 0.91±0.02, sensitivity of 0.89±0.03, specificity of 0.97±0.00, and accuracy of 0.95±0.01. For CG segmentation, ProGNet achieved a mean DSC of 0.86±0.01, sensitivity of 0.84±0.03, specificity of 0.99±0.01, and accuracy of 0.96±0.01. We then tested the generalizability of the model on the 60-case NCI-ISBI 2013 challenge dataset and on a local, independent 61-case test set. We achieved DSCs of 0.81±0.02 and 0.72±0.02 for WG and CG segmentation on the NCI-ISBI 2013 challenge dataset, and 0.83±0.01 and 0.75±0.01 for WG and CG segmentation on the local dataset. Model performance was excellent, outperforming state-of-the-art U-Net and holistically-nested edge detector (HED) networks on all three datasets.
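The abstract does not detail how ProGNet weights the three adjacent slices, but a common way to give a 2D U-Net 2.5D context is to stack each axial slice with its two neighbors as input channels. The sketch below illustrates that construction under this assumption; the function name is hypothetical and edge slices are handled by repetition.

```python
import numpy as np

def make_25d_inputs(volume):
    """Build 3-channel inputs (previous, current, next slice) for each
    axial slice, so a 2D segmentation network sees 2.5D context.

    volume:  (D, H, W) T2-weighted MRI volume
    returns: (D, 3, H, W) array of 3-slice stacks
    """
    # Pad the volume by repeating the first and last slices so that
    # edge slices still have two neighbors.
    padded = np.concatenate([volume[:1], volume, volume[-1:]], axis=0)
    stacks = np.stack(
        [padded[i:i + 3] for i in range(volume.shape[0])], axis=0
    )
    return stacks

# Example usage: feed each (3, H, W) stack to the network to predict
# the segmentation of the center slice.
# inputs = make_25d_inputs(t2_volume)
```

This design keeps the network itself 2D (cheap to train on modest cohorts such as the 529 cases here) while letting through-plane context inform each slice's prediction.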