Goyal, M., Oakley, A., Bansal, P., Dancey, D. and Yap, M. H. (ORCID: https://orcid.org/0000-0001-7681-4287) (2020) Skin Lesion Segmentation in Dermoscopic Images with Ensemble Deep Learning Methods. IEEE Access, 8, pp. 4171-4181. ISSN 2169-3536
Published Version, available under a Creative Commons Attribution license.
Abstract
© 2013 IEEE. Early detection of skin cancer, particularly melanoma, is crucial to enable advanced treatment. Due to the rapid growth in the number of skin cancers, there is a growing need for computerised analysis of skin lesions. The state-of-the-art publicly available datasets for skin lesions are often accompanied by a very limited amount of segmentation ground truth labelling. Moreover, the available segmentation datasets contain noisy expert annotations, reflecting the fact that precise annotations of skin lesion boundaries are laborious and expensive. Lesion boundary segmentation is vital to locating the lesion accurately in dermoscopic images and to diagnosing different skin lesion types. In this work, we propose fully automated deep learning ensemble methods to achieve high sensitivity and high specificity in lesion boundary segmentation. We trained ensemble methods based on Mask R-CNN and DeepLabv3+ on the ISIC-2017 segmentation training set and evaluated the performance of the ensemble networks on the ISIC-2017 testing set and the PH2 dataset. Our results show that the proposed ensemble methods segmented the skin lesions with a sensitivity of 89.93% and a specificity of 97.94% on the ISIC-2017 testing set. The proposed ensemble method Ensemble-A outperformed FrCN, FCNs, U-Net, and SegNet in sensitivity by 4.4%, 8.8%, 22.7%, and 9.8%, respectively. Furthermore, the proposed ensemble method Ensemble-S achieved a specificity of 97.98% for clinically benign cases, 97.30% for melanoma cases, and 98.58% for seborrhoeic keratosis cases on the ISIC-2017 testing set, exhibiting better performance than FrCN, FCNs, U-Net, and SegNet.
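The abstract does not specify how the outputs of Mask R-CNN and DeepLabv3+ are fused. As a minimal, hedged sketch (an illustrative assumption, not the paper's actual method), one common way to ensemble two segmentation networks is a pixel-wise combination of their binary masks: a union tends to favour sensitivity (in the spirit of a sensitivity-oriented ensemble such as Ensemble-A), while an intersection tends to favour specificity (in the spirit of a specificity-oriented ensemble such as Ensemble-S). The function name `ensemble_masks` and the mode names below are hypothetical.

```python
import numpy as np

def ensemble_masks(mask_a: np.ndarray, mask_b: np.ndarray, mode: str = "union") -> np.ndarray:
    """Fuse two binary lesion masks from two segmentation models.

    Hypothetical sketch: mode="union" keeps pixels predicted as lesion by
    either model (favours sensitivity); mode="intersection" keeps only pixels
    both models agree on (favours specificity).
    """
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    if mode == "union":
        return (a | b).astype(np.uint8)
    if mode == "intersection":
        return (a & b).astype(np.uint8)
    raise ValueError(f"unknown mode: {mode}")

# Example with toy 2x2 masks: the union marks 3 pixels, the intersection 1.
m1 = np.array([[1, 0], [1, 0]], dtype=np.uint8)
m2 = np.array([[1, 1], [0, 0]], dtype=np.uint8)
print(ensemble_masks(m1, m2, "union"))         # [[1 1] [1 0]]
print(ensemble_masks(m1, m2, "intersection"))  # [[1 0] [0 0]]
```

The trade-off the sketch illustrates matches the abstract's framing: a more inclusive fusion rule raises sensitivity at some cost to specificity, and vice versa.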