Automated breast ultrasound (ABUS) is a 3D imaging technique that is rapidly emerging as a safe and relatively inexpensive modality for screening women with dense breasts. However, reading ABUS examinations is a very time-consuming task, since radiologists need to manually identify suspicious findings in all the ABUS volumes available for each patient. Image analysis techniques that automatically link findings across volumes are required to speed up the clinical workflow and make ABUS screening more efficient. In this study, we propose an automated system that, given a location in the ABUS volume being inspected (source), finds the corresponding location in a target volume. The target volume can be a different view of the same study or the same view from a prior examination. The algorithm was evaluated using 118 linkages between suspicious abnormalities annotated in a dataset of ABUS images of 27 patients participating in a high-risk screening program. The distance between the predicted location and the center of the annotated lesion in the target volume was computed for evaluation. The mean ± stdev and median distance errors achieved by the presented algorithm for linkages between volumes of the same study were 7.75±6.71 mm and 5.16 mm, respectively. The performance was 9.54±7.87 mm and 8.00 mm (mean ± stdev and median) for linkages between volumes from current and prior examinations. The proposed approach has the potential to minimize user interaction for finding correspondences among ABUS volumes.
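The abstract does not spell out the correspondence model itself, so the following Python sketch only illustrates the reported evaluation metric, together with a simplified nipple-offset transfer that stands in for the actual location-prediction step; the nipple-centered coordinate assumption and all function names are hypothetical.

import numpy as np

def transfer_location(src_point_mm, src_nipple_mm, tgt_nipple_mm):
    # Map a source location into the target volume by preserving its
    # offset from the nipple. This is a simplified stand-in, not the
    # paper's full correspondence model.
    return np.asarray(tgt_nipple_mm) + (
        np.asarray(src_point_mm) - np.asarray(src_nipple_mm))

def distance_error_mm(predicted_mm, annotated_center_mm):
    # Evaluation metric from the abstract: Euclidean distance between
    # the predicted location and the annotated lesion center.
    return float(np.linalg.norm(
        np.asarray(predicted_mm) - np.asarray(annotated_center_mm)))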
We investigated the benefits of incorporating texture features into an existing computer-aided diagnosis (CAD) system for classifying benign and malignant lesions in automated three-dimensional breast ultrasound images. The existing system takes into account 11 different features describing different lesion properties; however, it does not include texture features. In this work, we expand the system with texture features based on local binary patterns, gray-level co-occurrence matrices, and Gabor filters computed from each lesion to be diagnosed. To deal with the resulting large number of features, we propose a set of feature-oriented classifiers, each combining one group of texture features into a single likelihood, resulting in three additional features used for the final classification. The classification was performed using support vector machine classifiers, and the evaluation was done with 10-fold cross-validation on a dataset containing 424 lesions (239 benign and 185 malignant). We compared the classification performance of the CAD system with and without texture features. The area under the receiver operating characteristic curve increased from 0.90 to 0.91 after adding texture features (p<0.001).
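As a rough sketch of the three texture-feature groups and the feature-oriented classifier scheme described above, the Python code below uses scikit-image and scikit-learn; the specific parameters (LBP radius, GLCM distances, Gabor frequencies) are illustrative assumptions, and a single 2D slice stands in for the paper's 3D lesions.

import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops
from skimage.filters import gabor
from sklearn.svm import SVC

def texture_features(patch):
    # patch: 2D uint8 lesion patch.
    # Local binary patterns: histogram of the 10 uniform patterns (P=8).
    lbp = local_binary_pattern(patch, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    # Gray-level co-occurrence matrix statistics.
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    glcm_feats = np.array([graycoprops(glcm, p).mean()
                           for p in ("contrast", "homogeneity",
                                     "energy", "correlation")])
    # Gabor filter responses at a few frequencies.
    gabor_feats = []
    for freq in (0.1, 0.2, 0.4):
        real, _ = gabor(patch, frequency=freq)
        gabor_feats += [real.mean(), real.std()]
    return lbp_hist, glcm_feats, np.array(gabor_feats)

def group_likelihoods(train_groups, train_labels, test_groups):
    # One SVM per texture-feature group; each group is collapsed into a
    # single malignancy likelihood, yielding one extra feature per group
    # for the final classifier, as in the feature-oriented scheme above.
    columns = []
    for X_train, X_test in zip(train_groups, test_groups):
        clf = SVC(probability=True).fit(X_train, train_labels)
        columns.append(clf.predict_proba(X_test)[:, 1])
    return np.column_stack(columns)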
Recent studies have demonstrated that applying automated breast ultrasound in addition to mammography in women with dense breasts can lead to additional detection of small, early-stage breast cancers that are occult in the corresponding mammograms. In this paper, we propose a fully automatic method for detecting the nipple location in 3D ultrasound breast images acquired with automated breast ultrasound systems. The nipple location is a valuable landmark for reporting the position of possible abnormalities in a breast and for guiding image registration. To detect the nipple location, all images were first normalized. Subsequently, features were extracted in a multiscale approach, and classification experiments were performed using a gentle boost classifier to identify the nipple location. The method was applied to a dataset of 100 patients with 294 different 3D ultrasound views from Siemens and U-Systems acquisition systems. Our database is a representative sample of cases obtained in clinical practice at four medical centers. The automatic method could accurately locate the nipple in 90% of AP (anterior-posterior) views and in 79% of the other views.
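A minimal sketch of the pipeline described above, in Python: the multiscale features (smoothed intensity and gradient magnitude) are illustrative choices rather than the paper's exact feature set, and scikit-learn's AdaBoost is used as a stand-in because gentle boost is not available there.

import numpy as np
from scipy import ndimage
from sklearn.ensemble import AdaBoostClassifier

def multiscale_voxel_features(volume, scales=(1.0, 2.0, 4.0)):
    # Per-voxel features at several scales (assumed feature set).
    columns = []
    for sigma in scales:
        columns.append(ndimage.gaussian_filter(volume, sigma=sigma).ravel())
        columns.append(
            ndimage.gaussian_gradient_magnitude(volume, sigma=sigma).ravel())
    return np.column_stack(columns)

# AdaBoost stands in for the gentle boost classifier; y_train marks
# voxels near the annotated nipple in the training volumes.
clf = AdaBoostClassifier(n_estimators=100)
# clf.fit(X_train, y_train)
# scores = clf.predict_proba(multiscale_voxel_features(volume))[:, 1]
# nipple_voxel = np.unravel_index(
#     np.argmax(scores.reshape(volume.shape)), volume.shape)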
KEYWORDS: Image segmentation, Breast, Ultrasonography, 3D image processing, Chest, Computer aided diagnosis and therapy, Tissues, 3D modeling, Image classification, Cancer
Computer-aided detection (CAD) systems are expected to improve the effectiveness and efficiency of radiologists in reading automated 3D breast ultrasound (ABUS) images. One challenging task in developing CAD is reducing the large number of false positives, many of which originate from acoustic shadowing caused by ribs. Determining the location of the chestwall in ABUS is therefore necessary in CAD systems to remove these false positives. Additionally, it can serve as an anatomical landmark for inter- and intra-modal image registration. In this work, we extended our previously developed chestwall segmentation method, which fits a cylinder to automatically detected rib-surface points, by minimizing a cost function that incorporates a region cost term computed from a thoracic volume classifier to improve segmentation accuracy. We examined the performance on a dataset of 52 images on which our previously developed method fails. Using the region-based cost, the average mean distance of the annotated points to the segmented chest wall decreased from 7.57±2.76 mm to 6.22±2.86 mm.
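The cost structure described above can be sketched in Python as follows; the cylinder parameterization and the region_cost callable are assumptions standing in for the paper's thoracic-volume-classifier term, not its exact formulation.

import numpy as np
from scipy.optimize import minimize

def cylinder_distance(points, params):
    # Unsigned distance of 3D points to a cylinder given by an axis
    # point (px, py, pz), axis direction angles (theta, phi), and radius r.
    px, py, pz, theta, phi, r = params
    axis = np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])
    v = points - np.array([px, py, pz])
    radial = v - np.outer(v @ axis, axis)  # component perpendicular to axis
    return np.abs(np.linalg.norm(radial, axis=1) - r)

def cost(params, rib_points, region_cost, alpha=1.0):
    # Surface-fit term on the detected rib-surface points plus the
    # region term (region_cost is a hypothetical classifier-derived term).
    return cylinder_distance(rib_points, params).mean() \
        + alpha * region_cost(params)

# result = minimize(cost, x0=initial_params,
#                   args=(rib_points, region_cost), method="Nelder-Mead")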
Screening with automated 3D breast ultrasound (ABUS) is gaining popularity. However, the acquisition of the multiple views required to cover an entire breast makes radiologic reading time-consuming. Linking lesions across views can facilitate the reading process. In this paper, we propose a method to automatically predict the position of a lesion in a target ABUS view, given its location in a source ABUS view. We combine features describing the lesion location with respect to the nipple, the transducer, and the chestwall with features describing lesion properties such as intensity, spiculation, blobness, contrast, and lesion likelihood. Using a grid search strategy, the location of the lesion is predicted in the target view. Our method achieved an error of 15.64±16.13 mm, which is small enough to help locate the lesion with minor additional interaction.
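The grid search step might look like the Python sketch below; score_fn is a hypothetical callable assumed to combine the location features (distances to nipple, transducer, and chestwall) and the lesion-property features into a single match score.

import numpy as np

def predict_lesion_location(target_volume, score_fn, step=5):
    # Coarse grid search over the target view: evaluate the match score
    # on a regular voxel grid and return the best-scoring position.
    best_pos, best_score = None, -np.inf
    for z in range(0, target_volume.shape[0], step):
        for y in range(0, target_volume.shape[1], step):
            for x in range(0, target_volume.shape[2], step):
                s = score_fn(target_volume, (z, y, x))
                if s > best_score:
                    best_pos, best_score = (z, y, x), s
    return best_pos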
Automated 3D breast ultrasound (ABUS) is a novel imaging modality in which motorized scans of the breasts are made with a wide transducer through a membrane under modest compression. The technology has gained high interest and may become widely used in screening of dense breasts, where the sensitivity of mammography is poor. ABUS has a high sensitivity for detecting solid breast lesions. However, reading ABUS images is time-consuming, and subtle abnormalities may be missed. Therefore, we are developing a computer-aided detection (CAD) system to help reduce reading time and errors. In the multi-stage system we propose, segmentations of the breast and nipple are performed, providing landmarks for the detection algorithm. Subsequently, voxel features characterizing coronal spiculation patterns, blobness, contrast, and locations with respect to landmarks are extracted. Using an ensemble of classifiers, a likelihood map indicating potential malignancies is computed. Local maxima in the likelihood map are then detected and form a set of candidate lesions in each view. These candidates are further processed in a second detection stage, which includes region segmentation, feature extraction, and a final classification. Region segmentation is performed using a 3D spiral-scanning dynamic programming method. Region features include descriptors of shape, acoustic behavior, and texture. Performance was determined using a 78-patient dataset with 93 images, including 50 malignant lesions, with 10-fold cross-validation. FROC analysis showed that the system obtains a lesion sensitivity of 60% and 70% at 2 and 4 false positives per image, respectively.
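The candidate-generation step (local maxima in the likelihood map) can be sketched in a few lines of Python; the threshold and neighborhood size below are illustrative assumptions, not values from the paper.

import numpy as np
from scipy import ndimage

def candidate_lesions(likelihood, min_likelihood=0.5, neighborhood=5):
    # Local maxima of the 3D likelihood map that exceed a threshold
    # become the lesion candidates passed to the second detection stage.
    is_max = likelihood == ndimage.maximum_filter(likelihood,
                                                  size=neighborhood)
    return np.argwhere(is_max & (likelihood > min_likelihood))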
In this paper we investigated the classification of malignant and benign lesions in automated 3D breast ultrasound (ABUS). As a new imaging modality, ABUS overcomes drawbacks of 2D hand-held ultrasound (US) such as its operator dependence and limited capability of visualizing the breast in 3D. The classification method we present includes a 3D lesion segmentation stage based on dynamic programming, which effectively deals with the limited visibility of lesion boundaries due to shadowing and speckle. A novel aspect of ABUS imaging, in which the breast is compressed by means of a dedicated membrane, is the presence of spiculation in coronal planes perpendicular to the transducer. Spiculation patterns, or architectural distortion, are characteristic of malignant lesions. Therefore, we compute a spiculation measure in coronal planes and combine this with more traditional US features related to lesion shape, margin, posterior acoustic behavior, and echo pattern; in our work, however, the latter features are defined in 3D. Classification experiments were performed with a dataset of 40 lesions, including 20 cancers. Linear discriminant analysis (LDA) was used in combination with leave-one-patient-out cross-validation and feature selection in each training cycle. We found that spiculation and margin contrast were the most discriminative features and that these features were most often chosen during feature selection. An Az value of 0.86 was obtained by merging all features, while an Az value of 0.91 was obtained with feature selection.
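The evaluation scheme described above (LDA with leave-one-patient-out cross-validation and per-fold feature selection, scored by Az, i.e., the area under the ROC curve) can be sketched with scikit-learn; the forward-selection setup and the number of selected features are assumptions for illustration.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import roc_auc_score

def leave_one_patient_out_az(X, y, patient_ids, n_features=3):
    # Feature selection is repeated inside each training fold, as in the
    # paper, so the held-out patient never influences the selected features.
    scores = np.zeros(len(y))
    for train, test in LeaveOneGroupOut().split(X, y, groups=patient_ids):
        selector = SequentialFeatureSelector(
            LinearDiscriminantAnalysis(), n_features_to_select=n_features)
        selector.fit(X[train], y[train])
        lda = LinearDiscriminantAnalysis().fit(
            selector.transform(X[train]), y[train])
        scores[test] = lda.decision_function(selector.transform(X[test]))
    return roc_auc_score(y, scores)  # the Az value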