In breast cancer screening for high-risk women, follow-up magnetic resonance imaging (MRI) studies are acquired at intervals ranging from several months to a few years. Prior MRI studies can provide additional clinical value when the current one is examined and thus have the potential to increase the sensitivity and specificity of screening. To establish a spatial correlation between suspicious findings in current and prior studies, a reliable alignment method between follow-up studies is desirable. However, the long time interval, different scanners and imaging protocols, and varying breast compression can result in large deformations, which challenge the registration process.
In this work, we present a fast and robust spatial alignment framework that combines automated breast segmentation and current-prior registration techniques in a multi-level fashion. First, fully automatic breast segmentation is applied to extract the breast masks, which are used to obtain an initial affine transform. Then, a non-rigid registration algorithm is applied that uses normalized gradient fields as the similarity measure together with curvature regularization. A total of 29 subjects and 58 breast MR images were collected for performance assessment. To evaluate global registration accuracy, volume overlap and boundary surface distance metrics were calculated, resulting in an average Dice Similarity Coefficient (DSC) of 0.96 and a root mean square distance (RMSD) of 1.64 mm. In addition, to measure local registration accuracy, a radiologist annotated 10 pairs of markers per subject in the current and prior studies, representing corresponding anatomical locations. The average distance error of the marker pairs dropped from 67.37 mm to 10.86 mm after registration.
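As an illustration of the global evaluation metric used above, the Dice Similarity Coefficient of two binary breast masks can be computed in a few lines. The masks below are synthetic toy data standing in for the segmentations described in the abstract:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice Similarity Coefficient between two binary masks:
    2 * |A ∩ B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0  # two empty masks agree perfectly
    return 2.0 * np.logical_and(a, b).sum() / total

# toy example: two overlapping square "breast masks" (60x60 pixels each)
current = np.zeros((100, 100), dtype=bool)
prior = np.zeros((100, 100), dtype=bool)
current[20:80, 20:80] = True
prior[25:85, 25:85] = True
print(dice_coefficient(current, prior))  # 2*3025/7200 ≈ 0.84
```

The RMSD between boundary surfaces and the marker-pair distance error follow the same pattern: collect point-wise distances and aggregate.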
Johannes Lotz, Judith Berger, Benedikt Müller, Kai Breuhahn, Niels Grabe, Stefan Heldmann, André Homeyer, Bernd Lahrmann, Hendrik Laue, Janine Olesch, Michael Schwier, Oliver Sedlaczek, Arne Warth
Much insight into metabolic interactions, tissue growth, and tissue organization can be gained by analyzing differently stained histological serial sections. One opportunity unavailable to classic histology is the three-dimensional (3D) examination and computer-aided analysis of tissue samples. In this case, registration is needed to re-establish the spatial correspondence between adjacent slides that is lost during the sectioning process. Furthermore, sectioning introduces various distortions such as cuts, folding, tearing, and local deformations into the tissue, which need to be corrected in order to exploit the additional information arising from the analysis of neighboring slide images. In this paper, we present a novel image-registration-based method for reconstructing a 3D tissue block that implements a zooming strategy around a user-defined point of interest. We efficiently align consecutive slides at increasingly fine resolutions, down to the cell level. We use a two-step approach: after a macroscopic, coarse alignment of the slides as preprocessing, a nonlinear, elastic registration is performed to correct local, non-uniform deformations. Driven by the optimization of the normalized gradient field (NGF) distance measure, our method is suitable for differently stained and thus multi-modal slides. We applied our method to ultra-thin serial sections (2 μm) of a human lung tumor. In total, 170 slides, stained alternately with four different stains, were registered. Thorough visual inspection of virtual cuts through the reconstructed block, perpendicular to the cutting plane, shows accurate alignment of vessels and other tissue structures. This observation is confirmed by a quantitative analysis. Using nonlinear image registration, our method corrects locally varying deformations in tissue structures and exceeds the limitations of globally linear transformations.
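The NGF distance measure that drives the registration can be sketched in a few lines. The sketch below follows the common edge-parameter formulation of NGF (averaging one minus the squared dot product of regularized gradient directions); the images are synthetic stand-ins, not histology data:

```python
import numpy as np

def ngf_distance(img_r, img_t, eta=0.005):
    """Normalized gradient field (NGF) distance between two 2D images.

    The edge parameter eta damps noise gradients; the measure averages
    1 - (normalized-gradient dot product)^2 over the image, so it depends
    only on local edge orientation, not on absolute intensities -- which
    is what makes it usable across differently stained (multi-modal)
    slides.
    """
    grx, gry = np.gradient(np.asarray(img_r, dtype=float))
    gtx, gty = np.gradient(np.asarray(img_t, dtype=float))
    nr = np.sqrt(grx**2 + gry**2 + eta**2)
    nt = np.sqrt(gtx**2 + gty**2 + eta**2)
    dot = (grx * gtx + gry * gty) / (nr * nt)
    return float(np.mean(1.0 - dot**2))

# same edge structure at triple the contrast scores much lower (better)
# than a structurally different image (the transposed ramp)
ramp = np.tile(np.linspace(0, 1, 64), (64, 1))
print(ngf_distance(ramp, 3.0 * ramp))  # small: aligned edges
print(ngf_distance(ramp, ramp.T))     # 1.0: orthogonal edges
```

Intensity rescaling leaves the distance nearly unchanged, which is exactly the invariance needed when adjacent slides carry different stains.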
In this work, a fully automated detection method for the arterial input function (AIF) and venous output function (VOF) in 4D computed tomography (4D-CT) data is presented, based on unsupervised classification of time-intensity curves (TICs). Bone and air voxels are first masked out by thresholding the baseline measurement. The TIC of each remaining voxel is converted to a time-concentration curve (TCC) by subtracting the baseline value from the TIC. Then, an unsupervised K-means classifier is applied to every TCC whose area under the curve (AUC) is larger than 95% of the maximum AUC over all TCCs. The result is three clusters: two yield average TCCs for artery and vein voxels in the brain, respectively, while the third generally represents a vessel outside the brain. The algorithm was applied to 4D-CT data of five patients who were scanned on suspicion of ischemic stroke. For all five patients, the algorithm yields a reasonable classification of arteries and veins as well as reasonable and reproducible AIFs and VOFs. To our knowledge, this is the first application of an unsupervised classification method to automatically identify arteries and veins in 4D-CT data. These preliminary results show the feasibility of K-means clustering for artery-vein detection in 4D-CT patient data.
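The pipeline (AUC pre-selection followed by K-means on the TCCs) can be sketched on synthetic curves. The curve shapes, noise levels, and the relaxed AUC cut-off below are illustrative choices for a small toy sample, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 40, 40)

def bolus(t0, scale):
    """Illustrative gamma-variate-like enhancement curve (toy model only)."""
    tt = np.clip(t - t0, 0, None)
    return scale * tt**2 * np.exp(-tt / 4.0)

# synthetic TCCs: arteries enhance early, veins late, an extracranial
# vessel in between with lower amplitude
arteries = np.array([bolus(5, 1.0) + rng.normal(0, 1, t.size) for _ in range(30)])
veins = np.array([bolus(12, 0.9) + rng.normal(0, 1, t.size) for _ in range(30)])
outside = np.array([bolus(8, 0.6) + rng.normal(0, 1, t.size) for _ in range(30)])
tccs = np.vstack([arteries, veins, outside])

# AUC pre-selection as in the abstract; the 95%-of-max cut-off is relaxed
# here so this small synthetic sample keeps enough curves to cluster
auc = np.clip(tccs, 0, None).sum(axis=1)
candidates = tccs[auc > 0.5 * auc.max()]

def kmeans(data, k=3, iters=50, seed=0):
    """Minimal Lloyd's-algorithm k-means on the curve vectors."""
    r = np.random.default_rng(seed)
    centers = data[r.choice(len(data), k, replace=False)].copy()
    for _ in range(iters):
        dist = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return labels, centers

labels, centers = kmeans(candidates)
# the cluster whose mean curve peaks earliest is the arterial candidate (AIF)
aif = centers[np.argmin(centers.argmax(axis=1))]
```

A production implementation would cluster many thousands of voxel curves and add the anatomical plausibility checks described in the abstract; the sketch only shows the core classification step.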
Breast cancer diagnosis based on magnetic resonance imaging (breast MRI) is increasingly being accepted as an additional diagnostic tool to mammography and ultrasound, with distinct clinical indications.1 Its capability to detect and differentiate lesion types with high sensitivity and specificity is countered by the fact that visual human assessment of breast MRI requires extensive experience. Moreover, the lack of evaluation standards causes diagnostic results to vary even among experts. The most important MR acquisition technique is dynamic contrast-enhanced (DCE) MR imaging, since different lesion types accumulate contrast material (CM) differently. The wash-in and wash-out characteristics, as well as the morphologic characteristics recorded and assessed from MR images, therefore allow benign lesions to be differentiated from malignant ones. In this work, we propose to calculate second-order statistical features (Haralick textures) for given lesions based on subtraction and 4D images and on parameter maps. The lesions are classified with a linear classification scheme as probably malignant or probably benign. The method and model were developed on 104 histologically graded lesions (69 malignant and 35 benign). The area under the ROC curve obtained is 0.91, which is already comparable to the performance of a trained radiologist.
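Second-order (Haralick) texture features are derived from a gray-level co-occurrence matrix (GLCM). A minimal sketch, using a single pixel offset and synthetic patches rather than lesion images, might look like this; a full feature set would aggregate several offsets and angles and then feed a linear classifier:

```python
import numpy as np

def glcm(img, levels=8):
    """Normalized gray-level co-occurrence matrix for offset (0, 1).

    `img` is assumed scaled to [0, 1]; intensities are quantized into
    `levels` bins, then horizontal neighbor pairs are counted.
    """
    q = np.clip((img * levels).astype(int), 0, levels - 1)
    m = np.zeros((levels, levels))
    np.add.at(m, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    return m / m.sum()

def contrast(p):
    """Haralick contrast: sum over i,j of p(i,j) * (i-j)^2."""
    i, j = np.indices(p.shape)
    return float(np.sum(p * (i - j) ** 2))

def energy(p):
    """Haralick energy (angular second moment): sum over i,j of p(i,j)^2."""
    return float(np.sum(p**2))

# a smooth gradient patch has low contrast and high energy;
# a random-noise patch shows the opposite pattern
rng = np.random.default_rng(1)
smooth = np.tile(np.linspace(0, 1, 32), (32, 1))
noisy = rng.random((32, 32))
features_smooth = [contrast(glcm(smooth)), energy(glcm(smooth))]
features_noisy = [contrast(glcm(noisy)), energy(glcm(noisy))]
```

Feature vectors of this kind, computed per lesion on subtraction images and parameter maps, are what a linear classification scheme would separate into the probably benign and probably malignant groups.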
The automatic segmentation of relevant structures such as the skin edge, chest wall, or nipple in dynamic contrast-enhanced MR imaging (DCE MRI) of the breast provides additional information for computer-aided diagnosis (CAD) systems. Automatic reporting using BI-RADS criteria benefits from information about the location of those structures: lesion positions can be described automatically relative to such reference structures for reporting purposes. Furthermore, this information can support data reduction for computationally expensive preprocessing such as registration, or the visualization of only the segments of current interest. In this paper, a novel automatic method is presented for determining the air-breast boundary (the skin edge), for approximating the chest wall, and for locating the nipples. The method consists of several steps that build on top of each other: automatic threshold computation yields the air-breast boundary, which is then analyzed to determine the location of the nipple; finally, the results of both steps are the starting point for the approximation of the chest wall. The proposed process was evaluated on a large data set of DCE MRI recorded with T1-weighted sequences and yielded reasonable results in all cases.
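The automatic threshold computation for the air-breast boundary could, for instance, use Otsu's method; the abstract does not name the algorithm, so this numpy sketch on a synthetic slice is only one plausible instantiation:

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Otsu's automatic threshold: pick the intensity that maximizes
    the between-class variance of the histogram."""
    hist, edges = np.histogram(img, bins=bins)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)              # weight of the dark class
    w1 = 1.0 - w0                     # weight of the bright class
    mu = np.cumsum(hist * centers)    # cumulative first moment
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mu_total * w0 - mu) ** 2 / (w0 * w1)
    var_between[~np.isfinite(var_between)] = 0.0
    return centers[np.argmax(var_between)]

# toy slice: dark air (left half) next to bright breast tissue (right half)
rng = np.random.default_rng(2)
img = np.concatenate([rng.normal(10, 3, (64, 32)),
                      rng.normal(100, 10, (64, 32))], axis=1)
t = otsu_threshold(img)
mask = img > t  # foreground mask; its outer contour is the skin edge
```

Tracing the boundary of `mask` yields the air-breast interface; its curvature extrema are one way a nipple candidate could then be located.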