KEYWORDS: Data modeling, Computed tomography, Monte Carlo methods, Image quality, Data acquisition, Reliability, Deep convolutional neural networks, Algorithm development
Convolutional neural network (CNN)-based material decomposition has the potential to improve image quality (visual appearance) and the quantitative accuracy of material maps. Most methods use deterministic CNNs with a mean-square-error loss to provide point estimates of mass densities. Point estimates can be over-confident, as the reliability of CNNs is frequently compromised by bias and two major uncertainties – data and model uncertainties, originating from noise in inputs and train-test data dissimilarity, respectively. Also, mean-square-error loss lacks explicit control of uncertainty and bias. To tackle these problems, a Bayesian dual-task CNN (BDT-CNN) with explicit penalization of uncertainty and bias was developed. It is a probabilistic CNN that concurrently conducts material classification and quantification and allows for pixel-wise modeling of bias, data uncertainty, and model uncertainty. The CNN was trained with images of physical and simulated tissue-mimicking inserts at varying mass densities. Hydroxyapatite (nominal density 400 mg/cc) and blood (nominal density 1095 mg/cc) inserts were placed in different-sized body phantoms (30–45 cm) and used to evaluate mean-absolute-bias (MAB) in predicted mass densities across different images at routine and half-routine dose. Patient CT exams were collected to assess the generalizability of BDT-CNN in the presence of anatomical background. Noise insertion was used to simulate patient exams at half- and quarter-routine dose. A deterministic dual-task CNN was used as the baseline. In phantoms, BDT-CNN improved the consistency of insert delineation, especially at edges, and reduced overall bias (average MAB for hydroxyapatite: BDT-CNN 5.4 mgHA/cc, baseline 11.0 mgHA/cc; for blood: BDT-CNN 8.9 mgBlood/cc, baseline 14.0 mgBlood/cc). In patient images, BDT-CNN improved detail preservation, lesion conspicuity, and structural consistency across different dose levels.
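The abstract does not specify the loss or the uncertainty estimators used by BDT-CNN, but the quantities it names have common textbook forms: data (aleatoric) uncertainty is often modeled with a heteroscedastic Gaussian negative log-likelihood, model (epistemic) uncertainty with the spread across stochastic forward passes, and MAB as the mean absolute deviation from the nominal density. The sketch below illustrates those generic forms per pixel; all function names are hypothetical and not from the paper.

```python
import math

def gaussian_nll(y_true, mu, log_var):
    """Heteroscedastic Gaussian negative log-likelihood for one pixel.

    exp(log_var) is the predicted variance, a common proxy for data
    (aleatoric) uncertainty; the additive log_var term penalizes the
    network for inflating variance to hide large residuals.
    """
    return 0.5 * (math.exp(-log_var) * (y_true - mu) ** 2 + log_var)

def stochastic_pass_stats(predictions):
    """Mean and variance across T stochastic forward passes for one pixel.

    The variance across passes (e.g., with Monte Carlo dropout) is a
    common proxy for model (epistemic) uncertainty.
    """
    t = len(predictions)
    mean = sum(predictions) / t
    var = sum((p - mean) ** 2 for p in predictions) / t
    return mean, var

def mean_absolute_bias(predicted_densities, nominal_density):
    """MAB of predicted mass densities against the nominal density."""
    n = len(predicted_densities)
    return sum(abs(p - nominal_density) for p in predicted_densities) / n
```

For example, predictions of 405, 395, and 410 mg/cc against the 400 mg/cc hydroxyapatite nominal give an MAB of about 6.7 mg/cc, the same style of figure as the reported 5.4 mgHA/cc.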
Eye-tracking techniques can be used to understand the visual search process in diagnostic radiology. Nonetheless, most prior eye-tracking studies in CT involved only single cross-sectional images or video playback of the reconstructed volume, while applying strong constraints to reader-image interactivity, yielding a disconnect between the experimental setup and clinical reality. To overcome this limitation, we developed an eye-tracking system that integrates eye-tracking hardware with in-house-built image viewing software. This system enabled recording of radiologists' real-time eye movement and interactivity with the displayed images in clinically relevant tasks. In this work, the system implementation was demonstrated, and the spatial accuracy of the eye-tracking data was evaluated using digital phantom images and a patient CT angiography exam. The measured offset between targets and gaze points was comparable to that of many prior eye-tracking systems (median offset: ~0.8° visual angle for the phantom; ~0.7–1.3° for the patient CTA). Further, the eye-tracking system was used to record radiologists' visual search in a liver lesion detection task with contrast-enhanced abdominal CT. From the measured data, several variables were found to correlate with radiologists' sensitivity; e.g., the mean sensitivity of readers with longer interpretation times was higher than that of the other readers (88 ± 3% vs 78 ± 10%; p < 0.001). In summary, the proposed eye-tracking system has the potential to provide high-quality data for characterizing radiologists' visual-search process in clinical CT tasks.
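The abstract reports target-to-gaze offsets in degrees of visual angle but does not give the conversion used. A standard way to obtain such a figure is to measure the on-screen target-to-gaze distance (pixels times pixel pitch) and convert it to the angle subtended at the eye for a known viewing distance. The sketch below shows that generic geometry; the function names and the example viewing distance are illustrative assumptions, not values from the study.

```python
import math

def gaze_offset_mm(target_px, gaze_px, pixel_pitch_mm):
    """Euclidean target-to-gaze distance on the display, in millimetres.

    target_px, gaze_px: (x, y) screen coordinates in pixels.
    pixel_pitch_mm: physical size of one pixel (assumed square).
    """
    dx = (gaze_px[0] - target_px[0]) * pixel_pitch_mm
    dy = (gaze_px[1] - target_px[1]) * pixel_pitch_mm
    return math.hypot(dx, dy)

def visual_angle_deg(offset_mm, viewing_distance_mm):
    """Visual angle (degrees) subtended by an on-screen offset.

    Uses the exact small-target formula: theta = 2 * atan(s / (2d)).
    """
    return math.degrees(2.0 * math.atan(offset_mm / (2.0 * viewing_distance_mm)))
```

At an assumed viewing distance of 650 mm, an on-screen offset of roughly 9 mm subtends about 0.8° of visual angle, the order of magnitude of the phantom accuracy reported above.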