
Extraction of skin lesions from non-dermoscopic images for surgical excision of melanoma

Overview of attention for article published in International Journal of Computer Assisted Radiology and Surgery, March 2017

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • Good Attention Score compared to outputs of the same age (72nd percentile)
  • High Attention Score compared to outputs of the same age and source (90th percentile)

Mentioned by

15 X users

Citations

64 Dimensions

Readers on

92 Mendeley
1 CiteULike
Title
Extraction of skin lesions from non-dermoscopic images for surgical excision of melanoma
Published in
International Journal of Computer Assisted Radiology and Surgery, March 2017
DOI 10.1007/s11548-017-1567-8
Pubmed ID
Authors

M. Hossein Jafari, Ebrahim Nasr-Esfahani, Nader Karimi, S. M. Reza Soroushmehr, Shadrokh Samavi, Kayvan Najarian

Abstract

Computerized prescreening of suspicious moles and lesions for malignancy is of great importance for assessing the need for, and the priority of, surgical removal. Detection can be performed on images captured by standard cameras, which are preferable because of their low cost and wide availability. One important step in computerized evaluation is accurate detection of the lesion region, i.e., segmentation of the image into two regions: lesion and normal skin. In this paper, a new method based on deep neural networks is proposed for accurate extraction of the lesion region. The input image is preprocessed, and its patches are then fed to a convolutional neural network. The local texture and global structure of the patches are processed in order to assign each pixel to the lesion or normal class. A method for effective selection of training patches is proposed for more accurate detection of the lesion border. Our results indicate that the proposed method reaches an accuracy of 98.7% and a sensitivity of 95.2% in segmenting lesion regions on a dataset of clinical images. Qualitative and quantitative experimental results demonstrate that our method outperforms other state-of-the-art algorithms in the literature.
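
The abstract describes a patch-based pipeline: the image is preprocessed, patches are extracted around each pixel, and a CNN uses both local texture and global structure to classify the center pixel as lesion or normal skin. The sketch below illustrates that general idea only; it is not the authors' architecture, and the two-branch design, patch sizes, and layer widths are illustrative assumptions (PyTorch).

```python
# Minimal sketch of a patch-based, two-branch CNN pixel classifier.
# Illustrative only: the paper's actual architecture, patch sizes,
# and preprocessing are not reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoBranchPatchNet(nn.Module):
    """Classify the center pixel of a patch as lesion (1) or normal skin (0)."""

    def __init__(self):
        super().__init__()
        # Local branch: small patch capturing fine texture around the pixel.
        self.local = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Global branch: larger patch capturing the overall lesion structure.
        self.global_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Fuse both branches and predict lesion vs. normal skin.
        self.classifier = nn.Linear(64, 2)

    def forward(self, local_patch, global_patch):
        f_local = self.local(local_patch).flatten(1)            # (N, 32)
        f_global = self.global_branch(global_patch).flatten(1)  # (N, 32)
        return self.classifier(torch.cat([f_local, f_global], dim=1))

# Example: a batch of 8 local (17x17) and global (49x49) RGB patches.
model = TwoBranchPatchNet()
local_patches = torch.randn(8, 3, 17, 17)
global_patches = torch.randn(8, 3, 49, 49)
logits = model(local_patches, global_patches)       # (8, 2)
lesion_probs = F.softmax(logits, dim=1)[:, 1]        # probability of "lesion"
```

In a sketch like this, the paper's idea of effective training-patch selection would amount to oversampling patches centered near the lesion border when building the training set, so the classifier sees enough of the hardest cases.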

X Demographics

The data shown below were collected from the profiles of the 15 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 92 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Unknown 92 100%

Demographic breakdown

Readers by professional status Count As %
Student > Ph. D. Student 21 23%
Student > Bachelor 11 12%
Student > Master 9 10%
Researcher 7 8%
Student > Doctoral Student 6 7%
Other 13 14%
Unknown 25 27%
Readers by discipline Count As %
Computer Science 26 28%
Medicine and Dentistry 13 14%
Engineering 13 14%
Physics and Astronomy 3 3%
Biochemistry, Genetics and Molecular Biology 3 3%
Other 6 7%
Unknown 28 30%
Attention Score in Context

This research output has an Altmetric Attention Score of 7. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 04 April 2017.
All research outputs: #4,940,425 of 24,059,832 outputs
Outputs from International Journal of Computer Assisted Radiology and Surgery: #86 of 912 outputs
Outputs of similar age: #84,274 of 312,487 outputs
Outputs of similar age from International Journal of Computer Assisted Radiology and Surgery: #3 of 21 outputs
Altmetric has tracked 24,059,832 research outputs across all sources so far. Compared to these, this one has done well and is in the 79th percentile: it's in the top 25% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 912 research outputs from this source. They receive a mean Attention Score of 3.1. This one has done particularly well, scoring higher than 90% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 312,487 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 72% of its contemporaries.
We're also able to compare this research output to 21 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 90% of its contemporaries.
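
The percentile figures above follow roughly from the rank within each comparison group, i.e. percentile ≈ (1 − rank / group size) × 100. This is only an approximation of how such figures could be derived; Altmetric's exact rounding and tie handling may differ, and the similar-age, same-source figure does not match this naive formula exactly.

```python
# Rough illustration of mapping a rank within a comparison group to a
# percentile; Altmetric's exact rounding/tie handling may differ.
def approx_percentile(rank: int, group_size: int) -> float:
    """Percent of outputs in the group that this output outranks."""
    return (1 - rank / group_size) * 100

print(approx_percentile(4_940_425, 24_059_832))  # ~79.5 -> "79th percentile"
print(approx_percentile(86, 912))                # ~90.6 -> "higher than 90% of its peers"
print(approx_percentile(84_274, 312_487))        # ~73.0 -> reported as the 72nd percentile
```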