Treatment planning is an essential step in radiation therapy (RT). It requires solving a complex inverse optimization problem through time-consuming, trial-and-error adjustment of parameters to meet clinical objectives. Fast and robust treatment plan optimization is therefore important for effective RT of cancer patients. We propose an artificial intelligence (AI)-based RT planning strategy that uses deep-Q reinforcement learning (RL) to optimize machine parameters automatically by finding an optimal machine control policy. The system takes the RT planning CT and contours as input and uses dual deep-Q networks trained to control the dose rate and multi-leaf collimator positions based on the current dose distribution and machine parameter state. The Q-value is computed as the discounted cumulative cost based on dose objectives and is minimized by experience-replay RL to determine the policy. The proposed approach was applied to prostate cancer RT planning and validated on 10 prostate cancer cases. Dose distributions generated by RL were compared to conformal arcs and clinical intensity modulated radiotherapy (IMRT) plans. RL generated RT plans with target and normal tissue doses comparable to clinical plans: mean±SD doses of 83.1±1.7 Gy, 39.9±10.0 Gy, and 39.6±13.9 Gy for the planning target volume (PTV), rectum, and bladder, respectively (vs 84.4±1.0 Gy, 41.8±15.0 Gy, and 50.6±11.4 Gy for clinical IMRT). This preliminary study demonstrates the potential of an RL approach to enable rapid AI-based RT plan optimization, significantly reducing the time and burden required for RT planning.
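The cost-minimizing Q-learning machinery described above can be sketched in miniature. This is a toy, tabular stand-in for the dual deep-Q networks, with hypothetical states, actions, and hyperparameters; the key point it illustrates is that the Q-value is a discounted cumulative cost, so updates move toward the *minimum* over next actions, and training samples transitions from an experience-replay buffer:

```python
import random

GAMMA = 0.9   # discount factor on future cost (illustrative value)
ALPHA = 0.5   # learning rate (illustrative value)

def q_update(Q, state, action, cost, next_state, actions):
    """One Bellman update toward target = cost + GAMMA * min_a' Q(next_state, a')."""
    best_next = min(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += ALPHA * (cost + GAMMA * best_next - Q[(state, action)])

def replay(Q, buffer, actions, batch_size=4, rng=random):
    """Experience replay: update on a random minibatch of stored transitions."""
    for s, a, c, s2 in rng.sample(buffer, min(batch_size, len(buffer))):
        q_update(Q, s, a, c, s2, actions)
```

In the paper's setting, the table `Q` is replaced by deep networks over the dose distribution and machine-parameter state, and the actions are dose-rate and multi-leaf collimator adjustments.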
High-dose-rate interstitial gynecologic brachytherapy requires multiple needles to be inserted into the tumor and surrounding area, avoiding nearby healthy organs-at-risk (OARs), including the bladder and rectum. We propose the use of a 360° three-dimensional (3D) transvaginal ultrasound (TVUS) guidance system for visualization of needles and report on the implementation of two automatic needle segmentation algorithms to aid the localization of needles intraoperatively. Two-dimensional (2D) needle segmentation, allowing for immediate adjustments to needle trajectories to mitigate needle deflection and avoid OARs, was implemented in near real-time using a method based on a convolutional neural network with a U-Net architecture trained on a dataset of 2D ultrasound images from multiple applications with needle-like structures. In 18 unseen TVUS images, the median position difference [95% confidence interval] was 0.27 [0.20, 0.68] mm and the mean angular difference was 0.50 [0.27, 1.16]° between manually and algorithmically segmented needles. Automatic needle segmentation was performed in 3D TVUS images using an algorithm leveraging the randomized 3D Hough transform. All needles were accurately localized in a proof-of-concept image with a median position difference of 0.79 [0.62, 0.93] mm and median angular difference of 0.46 [0.31, 0.62]°, when compared to manual segmentations. Further investigation into the robustness of the algorithm to complex cases containing large shadowing, air, or reverberation artefacts is ongoing. Intraoperative automatic needle segmentation in interstitial gynecologic brachytherapy has the potential to improve implant quality and enables 3D ultrasound to be used for treatment planning, eliminating the requirement for post-insertion CT scans.
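The randomized 3D Hough transform behind the 3D needle localization can be illustrated with a minimal sketch: repeatedly sample two points, form the candidate line through them, and keep the line supported by the most inliers. The point cloud, trial count, and distance tolerance below are assumptions for illustration; the clinical algorithm operates on voxel data and has additional structure:

```python
import numpy as np

def randomized_hough_line(points, n_trials=300, tol=0.5, rng=None):
    """Return ((point, unit_direction), votes) for the best-supported 3D line."""
    rng = np.random.default_rng(rng)
    points = np.asarray(points, float)
    best_line, best_votes = None, -1
    for _ in range(n_trials):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, d = points[i], points[j] - points[i]
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d /= norm
        # perpendicular distance of every point to the candidate line
        v = points - p
        dist = np.linalg.norm(v - np.outer(v @ d, d), axis=1)
        votes = int((dist < tol).sum())
        if votes > best_votes:
            best_votes, best_line = votes, (p, d)
    return best_line, best_votes
```

Randomization keeps the method fast relative to an exhaustive Hough accumulator over all line parameters, which matters for intraoperative use.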
Modern image-guided cervical cancer brachytherapy involves the insertion of hollow applicators in the uterus and surrounding the cervix to deliver a radioactive source. These applicators are imaged and manually digitized following insertion for treatment planning. We present an algorithm to automatically digitize these applicators using MRI for cervical cancer brachytherapy planning. Applicators were digitized in vivo using T2-weighted MR images (1.5 T) from 21 brachytherapy fractions in 9 patients. The model-to-image registration algorithm, implemented in C++, uses a 2D matched filter to identify the applicator center and a 3D surface model to identify local position by maximizing the image intensity gradient normal to the surface. Surface models were produced using training MR images. Errors in the algorithm results were calculated as the 3D distances of the applicator tip and center from those identified manually. A model based on manufacturer data was also used for applicator digitization to assess algorithm sensitivity to surface model variation. The algorithm correctly identified the applicator in 20 out of 21 images with a mean execution time of 2.5 s. Mean±SD error following digitization using the MRI and manufacturer-based surface models was 1.2±0.6 mm and 1.3±0.7 mm for the tandem tip (p = 0.52), and 1.4±0.9 mm and 1.3±0.7 mm for the ring center (p = 0.61). The algorithm requires no manual initialization and gives consistent results across surface models, showing promise for clinical implementation.
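The 2D matched-filter step can be sketched as a mean-subtracted template cross-correlation whose peak gives the applicator center estimate. Everything here is a simplified assumption for illustration (a brute-force scan over a synthetic image); the published algorithm additionally refines the position with the 3D surface model and gradient maximization:

```python
import numpy as np

def matched_filter_peak(image, template):
    """Slide a mean-subtracted template over the image; return the (row, col)
    center of the best-matching window."""
    th = template - template.mean()
    H, W = image.shape
    h, w = template.shape
    best, best_pos = -np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            patch = image[r:r + h, c:c + w]
            score = float(((patch - patch.mean()) * th).sum())
            if score > best:
                best, best_pos = score, (r + h // 2, c + w // 2)
    return best_pos
```

In practice this scan would be done with FFT-based correlation for speed; the brute-force loop is kept here only for clarity.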
Background: High-dose-rate brachytherapy (HDR-BT) is a prostate cancer treatment option involving the insertion of hollow needles into the gland through the perineum to deliver a radioactive source. Conventional needle imaging involves indexing a trans-rectal ultrasound (TRUS) probe in the superior/inferior (S/I) direction, using the axial transducer to produce an image set for organ segmentation. These images have limited resolution in the needle insertion direction (S/I), so the sagittal transducer is used to identify needle tips, requiring a manual registration with the axial view. This registration introduces a source of uncertainty in the final segmentations and subsequent treatment plan. Our lab has developed a device enabling 3D-TRUS guided insertions with high S/I spatial resolution, eliminating the need to align axial and sagittal views.
Purpose: To compare HDR-BT needle tip localization accuracy between 2D and 3D-TRUS.
Methods: 5 prostate cancer patients underwent conventional 2D TRUS guided HDR-BT, during which 3D images were also acquired for post-operative registration and segmentation. Needle end-length measurements were taken, providing a gold standard for insertion depths.
Results: 73 needles were analyzed from all 5 patients. Needle tip position differences between imaging techniques were largest in the S/I direction, with mean±SD of -2.5±4.0 mm. End-length measurements indicated that 3D TRUS provided a statistically significantly lower mean±SD insertion depth error of -0.2±3.4 mm, versus 2.3±3.7 mm with 2D guidance (p < .001).
Conclusions: 3D TRUS may provide more accurate HDR-BT needle localization than conventional 2D TRUS guidance for the majority of HDR-BT needles.
Eli Gibson, Mena Gaed, Thomas Hrinivich, José Gómez, Madeleine Moussa, Cesare Romagnoli, Jonathan Mandel, Matthew Bastian-Jordan, Derek Cool, Suha Ghoul, Stephen Pautler, Joseph Chin, Cathie Crukley, Glenn Bauman, Aaron Fenster, Aaron Ward
KEYWORDS: Tumors, Tissues, Image registration, Cancer, Magnetic resonance imaging, Prostate cancer, Image fusion, In vivo imaging, 3D image reconstruction, 3D image processing
Purpose: Multiparametric magnetic resonance imaging (MPMRI) supports detection and staging of prostate cancer, but the image characteristics needed for tumor boundary delineation to support focal therapy have not been widely investigated. We quantified the detectability (image contrast between tumor and non-cancerous contralateral tissue) and the localizability (image contrast between tumor and non-cancerous neighboring tissue) of Gleason score 7 (GS7) peripheral zone (PZ) tumors on MPMRI using tumor contours mapped from histology using accurate 2D–3D registration.
Methods: MPMRI [comprising T2-weighted (T2W), dynamic-contrast-enhanced (DCE), apparent diffusion coefficient (ADC) and contrast transfer coefficient images] and post-prostatectomy digitized histology images were acquired for 6 subjects. Histology contouring and grading (approved by a genitourinary pathologist) identified 7 GS7 PZ tumors. Contours were mapped to MPMRI images using semi-automated registration algorithms (combined target registration error: 2 mm). For each focus, three measurements of mean ± standard deviation of image intensity were taken on each image: tumor tissue (mT±sT), non-cancerous PZ tissue < 5 mm from the tumor (mN±sN), and non-cancerous contralateral PZ tissue (mC±sC). Detectability [D, defined as mT-mC normalized by sT and sC added in quadrature] and localizability [L, defined as mT-mN normalized by sT and sN added in quadrature] were quantified for each focus on each image.
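The detectability and localizability definitions above translate directly into code: both are the same normalized contrast, differing only in the reference region. The helper below computes that contrast from two intensity samples (the values in the test are synthetic, purely for illustration):

```python
import numpy as np

def contrast(tumor, reference):
    """(mean_T - mean_R) / sqrt(std_T^2 + std_R^2) for two intensity samples.
    Detectability D uses contralateral PZ as reference; localizability L uses
    neighboring non-cancerous PZ tissue (< 5 mm from the tumor)."""
    tumor = np.asarray(tumor, float)
    reference = np.asarray(reference, float)
    return (tumor.mean() - reference.mean()) / np.hypot(tumor.std(), reference.std())
```

The quadrature denominator means |D| ≥ 1 roughly corresponds to a mean intensity separation exceeding the combined spread of the two tissue samples.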
Results: T2W images showed the strongest detectability, although detectability |D|≥1 was observed on either ADC or DCE images, or both, for all foci. Localizability on all modalities was variable; however, ADC images showed localizability |L|≥1 for 3 foci.
Conclusions: Delineation of GS7 PZ tumors on individual MPMRI images faces challenges; however, images may contain complementary information, suggesting a role for fusion of information across MPMRI images for delineation.
Purpose: T2-weighted and diffusion-weighted magnetic resonance imaging (MRI) show promise in isolating prostate tumours. Dynamic contrast enhanced (DCE)-MRI has also been employed as a component in multi-parametric tumour detection schemes. Model-based parameters such as Ktrans are conventionally used to characterize DCE images and require the arterial contrast agent (CR) concentration. A robust parameter map that does not depend on arterial input may be more useful for target volume delineation. We present a dimensionless parameter (Wio) that characterizes CR wash-in and washout rates without requiring the arterial CR concentration. Wio is compared to Ktrans in terms of its ability to discriminate cancer in the prostate, as demonstrated via comparison with histology.
Methods: Three subjects underwent DCE-MRI using gadolinium contrast and 7 s imaging temporal resolution. A pathologist identified cancer on whole-mount histology specimens, and slides were deformably registered to MR images. The ability of Wio maps to discriminate cancer was determined through receiver operating characteristic (ROC) curve analysis.
Results: There is a trend that Wio shows greater area under the ROC curve (AUC) than Ktrans, with median AUC values of 0.74 and 0.69, respectively, but the difference was not statistically significant based on a Wilcoxon signed-rank test (p = 0.13).
Conclusions: Preliminary results indicate that Wio shows potential as a tool for Ktrans QA, showing similar ability to discriminate cancer in the prostate as Ktrans without requiring the arterial CR concentration.
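The ROC analysis step above can be sketched with a rank-based AUC computation via the Mann-Whitney identity: AUC is the probability that a randomly chosen cancer voxel scores higher on the parameter map than a randomly chosen benign voxel. The scores below are synthetic stand-ins; Wio itself is the authors' wash-in/washout parameter and is not reproduced here:

```python
import numpy as np

def auc(scores, labels):
    """AUC via the Mann-Whitney identity: fraction of (cancer, benign) voxel
    pairs where the cancer voxel's parameter value is higher (ties count 0.5)."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))
```

The pairwise comparison is O(n_pos * n_neg); a rank-sum formulation scales better for full parameter maps, but the identity is clearest in this form.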