Existing methods for automated breast ultrasound lesion detection and recognition tend to be based on multi-stage processing, such as preprocessing, filtering/denoising, segmentation and classification, where the performance of each stage depends on the stages before it. To improve on the current state of the art, we propose an end-to-end deep learning approach to breast ultrasound lesion detection and recognition. We implemented a popular semantic segmentation framework, i.e. the Fully Convolutional Network (FCN-AlexNet), for our experiment. To overcome data deficiency, we used a model pre-trained on ImageNet together with transfer learning. We validated our results on two datasets, which consist of a total of 113 malignant and 356 benign lesions. We assessed the performance of the model using the following split: 70% for training, 10% for validation, and 20% for testing. The results show that our proposed method performed better on benign lesions, with a Dice score of 0.6879, compared to malignant lesions, with a Dice score of 0.5525. When considering the number of images with Dice score > 0.5, 79% of the benign lesions were successfully segmented and correctly recognised, while 65% of the malignant lesions were successfully segmented and correctly recognised. This paper provides the first end-to-end solution for breast ultrasound lesion recognition. The future challenges for the proposed approach are to obtain additional datasets and to customise the deep learning framework to improve its accuracy.
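The segmentation quality above is reported as a Dice score between a predicted lesion mask and the ground-truth annotation. As a minimal illustrative sketch (not the authors' implementation), the per-image Dice coefficient on binary masks can be computed as:

```python
import numpy as np

def dice_score(pred, target):
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|) on binary masks."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    denom = pred.sum() + target.sum()
    if denom == 0:
        # Both masks empty: conventionally a perfect match.
        return 1.0
    return 2.0 * np.logical_and(pred, target).sum() / denom
```

A threshold such as Dice > 0.5, as used in the paper, then counts an image as successfully segmented.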
Manual identification of capillaries in transverse muscle sections is laborious and time-consuming. Although the process of classifying a structure as a capillary is facilitated by (immuno)histochemical staining methods, human judgement is still required in a significant number of cases. This is mainly because not all capillaries stain equally strongly: they may have an elongated appearance, and/or staining artefacts may lead to false identifications. Here we propose two automated methods of capillary detection: a novel image processing approach and an existing machine learning approach that has previously been used to detect nuclei-shaped objects. The robustness of the proposed methods was tested on two sets of differently stained muscle sections. On average, the image processing approach scored a True Positive Rate of 0.817 and a harmonic mean (F1 measure) of 0.804, whilst the machine learning approach scored a True Positive Rate of 0.843 and an F1 measure of 0.846. Both proposed methods are thus able to mimic most of the manual capillary detection, but further improvements are required for practical applications.
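The True Positive Rate and F1 measure quoted above follow the standard detection-metric definitions. As a small sketch (the counting of matches against manual annotations is the paper's own procedure and is not reproduced here), the metrics can be derived from true-positive, false-positive and false-negative counts as:

```python
def detection_metrics(tp, fp, fn):
    """Compute TPR (recall), precision, and F1 from detection counts."""
    tpr = tp / (tp + fn) if (tp + fn) else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    # F1 is the harmonic mean of precision and recall.
    f1 = 2 * precision * tpr / (precision + tpr) if (precision + tpr) else 0.0
    return tpr, precision, f1
```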
Automatic segmentation of anatomic structures in magnetic resonance thigh scans can be a challenging task due to the potential lack of precisely defined muscle boundaries and to intensity inhomogeneity (bias field) across an image. In this paper, we demonstrate a combined framework of atlas construction and image registration methods to propagate the desired region of interest (ROI) from an atlas image to the targeted MRI thigh scans for segmentation of the quadriceps muscles, femur cortical layer and bone marrow. The proposed system employs a semi-automatic segmentation method on an initial image in one dataset (from a series of images). The segmented initial image is then used as an atlas image to automate the segmentation of the other images in the MRI scan (3-D space). The process includes ROI labeling, atlas construction and registration, and a morphological transform that establishes pixel correspondences (in terms of feature and intensity values) between the atlas (template) image and the targeted image, based on the prior atlas information and non-rigid image registration methods.
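The core of the atlas-based step is warping the atlas ROI labels onto each target slice using the correspondences found by registration. As a minimal sketch under assumed inputs (a 2-D atlas label map and a hypothetical dense displacement field produced by a non-rigid registration, neither of which is specified in the abstract), label propagation with nearest-neighbour sampling can look like:

```python
import numpy as np

def propagate_labels(atlas_labels, displacement):
    """Warp an atlas label map onto a target grid.

    atlas_labels: 2-D integer array of ROI labels (e.g. quadriceps, cortical
                  layer, bone marrow).
    displacement: (H, W, 2) array mapping each target pixel back to a source
                  location in the atlas (assumed registration output).
    Nearest-neighbour sampling preserves the discrete label values.
    """
    h, w = atlas_labels.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_y = np.clip(np.round(ys + displacement[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + displacement[..., 1]).astype(int), 0, w - 1)
    return atlas_labels[src_y, src_x]
```

With a zero displacement field the labels are returned unchanged, which makes the identity case a convenient sanity check.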