
Deep learning-based detection of motion artifacts in probe-based confocal laser endomicroscopy images

Overview of attention for article published in International Journal of Computer Assisted Radiology and Surgery, August 2018

About this Attention Score

  • Average Attention Score compared to outputs of the same age and source

Mentioned by

1 X user

Citations

41 Dimensions

Readers on

53 Mendeley
Title
Deep learning-based detection of motion artifacts in probe-based confocal laser endomicroscopy images
Published in
International Journal of Computer Assisted Radiology and Surgery, August 2018
DOI 10.1007/s11548-018-1836-1
Pubmed ID
Authors

Marc Aubreville, Maike Stoeve, Nicolai Oetter, Miguel Goncalves, Christian Knipfer, Helmut Neumann, Christopher Bohr, Florian Stelzle, Andreas Maier

Abstract

Probe-based confocal laser endomicroscopy (pCLE) is a subcellular in vivo imaging technique capable of producing images that enable diagnosis of malignant structural modifications in epithelial tissue. Images acquired with pCLE are, however, often tainted by significant artifacts that impair diagnosis. This is especially detrimental for automated image analysis, which is why such images are often excluded from recognition pipelines. We present an approach for the automatic detection of motion artifacts in pCLE images and apply this methodology to a data set of 15,000 images of epithelial tissue acquired in the oral cavity and the vocal folds. The approach is based on transfer learning from intermediate endpoints within a pre-trained Inception v3 network with tailored preprocessing. For detection within the non-rectangular pCLE images, we perform pooling within the activation maps of the network and evaluate this at different network depths. The proposed method achieved area-under-the-ROC-curve values of 0.92, compared to 0.80 for the best feature-based machine learning approach, and an overall accuracy of 94.8%, a significant improvement over traditional machine learning approaches with state-of-the-art features.
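
The core technical idea in the abstract, truncating a pre-trained Inception v3 at an intermediate endpoint and pooling its activation maps only over the round pCLE field of view, can be sketched roughly as follows. This is an illustrative approximation rather than the authors' implementation: the chosen endpoint ('mixed5'), the mask construction, the Keras/TensorFlow framework, and the downstream classifier are all assumptions made for the sketch.

```python
# Illustrative sketch only: approximates transfer learning from an intermediate
# Inception v3 endpoint with pooling restricted to the non-rectangular pCLE
# field of view. Layer name, mask handling, and classifier head are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.applications.inception_v3 import preprocess_input

# Truncate the pre-trained network at an intermediate endpoint.
base = InceptionV3(weights="imagenet", include_top=False)
feature_extractor = tf.keras.Model(inputs=base.input,
                                   outputs=base.get_layer("mixed5").output)

def masked_average_pool(feature_map, fov_mask):
    """Average a (H, W, C) feature map over the circular field of view only.

    fov_mask is a (H, W) binary array, 1 inside the pCLE image region and
    0 in the padded corners.
    """
    m = fov_mask[..., None].astype(np.float32)               # (H, W, 1)
    pooled = (feature_map * m).sum(axis=(0, 1)) / max(m.sum(), 1.0)
    return pooled                                            # (C,) descriptor

def describe(image_299x299x3, fov_mask):
    """Compute one fixed-length descriptor per pCLE frame."""
    x = preprocess_input(image_299x299x3.astype(np.float32))[None]  # add batch dim
    fmap = feature_extractor(x, training=False).numpy()[0]          # (H, W, C)
    # Downsample the full-resolution mask to the feature-map grid.
    small = tf.image.resize(fov_mask[..., None].astype(np.float32),
                            fmap.shape[:2], method="nearest").numpy()[..., 0]
    return masked_average_pool(fmap, small)

# The pooled descriptors could then feed a small binary classifier
# (artifact vs. artifact-free), e.g. scikit-learn's LogisticRegression.
```

Since the paper evaluates detection at different network depths, a sketch like this would sweep the endpoint over the network's inception blocks ('mixed0' through 'mixed10' in the Keras naming) and compare the resulting descriptors with the downstream classifier.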

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 53 Mendeley readers of this research output.

Geographical breakdown

Country Count As %
Unknown 53 100%

Demographic breakdown

Readers by professional status Count As %
Student > Master 11 21%
Researcher 7 13%
Student > Ph. D. Student 7 13%
Student > Bachelor 4 8%
Student > Doctoral Student 4 8%
Other 6 11%
Unknown 14 26%

Readers by discipline Count As %
Computer Science 13 25%
Medicine and Dentistry 10 19%
Engineering 4 8%
Agricultural and Biological Sciences 2 4%
Nursing and Health Professions 1 2%
Other 5 9%
Unknown 18 34%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 07 August 2018.
All research outputs: #20,318,407 of 24,980,180 outputs
Outputs from International Journal of Computer Assisted Radiology and Surgery: #692 of 939 outputs
Outputs of similar age: #260,641 of 336,128 outputs
Outputs of similar age from International Journal of Computer Assisted Radiology and Surgery: #11 of 18 outputs
Altmetric has tracked 24,980,180 research outputs across all sources so far. This one is in the 10th percentile – i.e., 10% of other outputs scored the same or lower than it.
So far Altmetric has tracked 939 research outputs from this source. They receive a mean Attention Score of 3.2. This one is in the 19th percentile – i.e., 19% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 336,128 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 12th percentile – i.e., 12% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 18 others from the same source and published within six weeks on either side of this one. This one is in the 44th percentile – i.e., 44% of its contemporaries scored the same or lower than it.