With the ever-growing incidence of skin cancer and limited healthcare resources, a reliable computer-assisted diagnostic system is needed to assist dermatologists with lesion diagnosis. Skin lesion segmentation on dermoscopic images can be an effective tool for distinguishing benign from malignant skin lesions. The dermoscopic images in public skin lesion datasets are collected from various sources around the world, and the color of lesions in these images can depend strongly on the light source. In this work, we provide new insight into the effect of color constancy algorithms on skin lesion segmentation with a deep learning algorithm. We pre-process the ISIC 2017 Challenge segmentation dataset using different color constancy algorithms and study the effect on a popular semantic segmentation algorithm, i.e. Fully Convolutional Networks. We evaluate the results with two metrics, i.e. the Dice Similarity Coefficient and the Jaccard Similarity Index. Overall, our experiments showed improvements in semantic segmentation of skin lesions when the images were pre-processed with color constancy algorithms. Further, we investigate the effect of these algorithms on different types of lesions (Naevi, Melanoma and Seborrhoeic Keratosis). We found that pre-processing with color constancy algorithms improved the segmentation results on Naevi and Seborrhoeic Keratosis, but not Melanoma. Future work will investigate an adaptive color constancy algorithm that could further improve segmentation results.
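The abstract does not name the specific color constancy algorithms used. Shades of Gray (which reduces to Gray World for p = 1) is a common choice in the skin lesion literature, so the minimal sketch below uses it purely as a representative pre-processing step; the function name shades_of_gray and the default p = 6 are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def shades_of_gray(image, p=6):
    """Shades of Gray color constancy for an RGB image (uint8, H x W x 3).

    The illuminant is estimated per channel with a Minkowski p-norm mean;
    each channel is then rescaled so the estimated illuminant maps to
    neutral gray (a von Kries-style diagonal correction).
    """
    img = image.astype(np.float64)
    illuminant = np.power(np.mean(np.power(img, p), axis=(0, 1)), 1.0 / p)
    illuminant /= np.linalg.norm(illuminant)      # unit-norm illuminant estimate
    gain = 1.0 / (np.sqrt(3.0) * illuminant)      # per-channel correction gain
    corrected = img * gain
    return np.clip(corrected, 0, 255).astype(np.uint8)
```

In a pipeline of this kind, each dermoscopic image would pass through such a correction before being fed to the segmentation network for training and inference.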
The skin is the largest organ in the human body. There is a high prevalence of skin diseases and a scarcity of dermatologists, the experts in diagnosing and managing them, which makes computer-aided diagnosis (CAD) of skin disease an important field of research. Many patients present with a skin lesion of concern and need to know whether it is benign or malignant. Lesion diagnosis is currently performed by dermatologists, who take a history and examine the lesion and the entire body surface with the aid of a dermatoscope. Automatic lesion segmentation, together with computer-based evaluation of the symmetry or asymmetry of structures and colors, may classify a lesion as likely benign or likely malignant. We explored a deep learning approach called Deep Extreme Cut (DEXTR) and used the Faster-RCNN-InceptionV2 network to determine the extreme points (left-most, right-most, top and bottom pixels) of a lesion. We trained on the ISIC 2017 challenge images and achieved a Jaccard index of 82.2% on the ISIC 2017 test set and 85.8% on the PH2 dataset. The proposed method outperformed the winning algorithm of the challenge by 5.7% in Jaccard index.
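DEXTR conditions the segmentation network on the four extreme points of the object of interest. The sketch below shows how such points can be read off a binary lesion mask (for example, to simulate annotator clicks from ground-truth masks during training); in the paper itself the points are predicted by a Faster-RCNN-InceptionV2 detector, so this helper is an illustrative assumption rather than the authors' exact pipeline.

```python
import numpy as np

def extreme_points(mask):
    """Return the (x, y) coordinates of the left-most, right-most, top and
    bottom foreground pixels of a binary lesion mask."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("mask contains no foreground pixels")
    left   = (xs.min(), ys[xs.argmin()])
    right  = (xs.max(), ys[xs.argmax()])
    top    = (xs[ys.argmin()], ys.min())
    bottom = (xs[ys.argmax()], ys.max())
    return left, right, top, bottom
```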
In multistage processing for automated breast ultrasound lesion recognition, each stage depends on the performance of the stages before it. To improve the current state of the art, we propose end-to-end deep learning approaches using fully convolutional networks (FCNs), namely FCN-AlexNet, FCN-32s, FCN-16s and FCN-8s, for semantic segmentation of breast lesions. We use models pretrained on ImageNet and transfer learning to overcome the issue of data deficiency. We evaluate our results on two datasets, which together consist of 113 malignant and 356 benign lesions. To assess performance, we conduct fivefold cross-validation using the following split: 70% of the data for training, 10% for validation and 20% for testing. The results show that our proposed method performed better on benign lesions, with a top mean Dice score of 0.7626 with FCN-16s, than on malignant lesions, with a top mean Dice score of 0.5484 with FCN-8s. When considering the number of images with a Dice score > 0.5, 89.6% of the benign lesions were successfully segmented and correctly recognised, whereas 60.6% of the malignant lesions were successfully segmented and correctly recognised. We conclude the paper by addressing the future challenges of the work.
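Both reported figures follow from per-image Dice scores: the mean Dice per model, and the fraction of lesions whose Dice exceeds 0.5 (counted as successfully segmented and recognised). A minimal sketch of those two summaries, assuming binary NumPy masks for predictions and ground truth:

```python
import numpy as np

def dice_score(pred, target, eps=1e-7):
    """Dice Similarity Coefficient between two binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def summarise(preds, targets, threshold=0.5):
    """Mean Dice and the fraction of lesions with Dice above a threshold."""
    scores = np.array([dice_score(p, t) for p, t in zip(preds, targets)])
    return scores.mean(), (scores > threshold).mean()
```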
Existing methods for automated breast ultrasound lesion detection and recognition tend to be based on multi-stage processing, such as preprocessing, filtering/denoising, segmentation and classification, where the performance of each stage depends on the stages before it. To improve the current state of the art, we propose an end-to-end approach to breast ultrasound lesion detection and recognition using deep learning. We implemented a popular semantic segmentation framework, i.e. a Fully Convolutional Network (FCN-AlexNet), for our experiment. To overcome data deficiency, we used a model pre-trained on ImageNet and transfer learning. We validated our results on two datasets, which together consist of 113 malignant and 356 benign lesions. We assessed the performance of the model using the following split: 70% of the data for training, 10% for validation and 20% for testing. The results show that our proposed method performed better on benign lesions, with a Dice score of 0.6879, than on malignant lesions, with a Dice score of 0.5525. When considering the number of images with a Dice score > 0.5, 79% of the benign lesions were successfully segmented and correctly recognised, while 65% of the malignant lesions were successfully segmented and correctly recognised. This paper provides the first end-to-end solution for breast ultrasound lesion recognition. The future challenges for the proposed approach are to obtain additional datasets and customise the deep learning framework to improve its accuracy.
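The transfer-learning step amounts to loading a segmentation network whose backbone is pretrained on a large dataset and retraining only its head for the two classes of interest (background vs. lesion). FCN-AlexNet is not distributed with torchvision, so the sketch below uses torchvision's FCN-ResNet50 as a stand-in to illustrate the idea; it is not the exact model used in the paper.

```python
import torch.nn as nn
from torchvision.models.segmentation import fcn_resnet50

# Pretrained FCN; only the segmentation heads are replaced so that the
# backbone's pretrained features can be reused (transfer learning).
# FCN-ResNet50 stands in for FCN-AlexNet, which torchvision does not ship.
model = fcn_resnet50(weights="DEFAULT")
model.classifier[4] = nn.Conv2d(512, 2, kernel_size=1)      # background / lesion
model.aux_classifier[4] = nn.Conv2d(256, 2, kernel_size=1)  # auxiliary head

# Forward pass: logits = model(images)["out"] has shape (N, 2, H, W).
```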