
User-Guided Segmentation of Multi-modality Medical Imaging Datasets with ITK-SNAP

Overview of attention for article published in Neuroinformatics, June 2018

Mentioned by
2 X users

Citations
102 Dimensions

Readers on
127 Mendeley
Title
User-Guided Segmentation of Multi-modality Medical Imaging Datasets with ITK-SNAP
Published in
Neuroinformatics, June 2018
DOI 10.1007/s12021-018-9385-x
Pubmed ID
Authors

Paul A. Yushkevich, Artem Pashchinskiy, Ipek Oguz, Suyash Mohan, J. Eric Schmitt, Joel M. Stein, Dženan Zukić, Jared Vicory, Matthew McCormick, Natalie Yushkevich, Nadav Schwartz, Yang Gao, Guido Gerig

Abstract

ITK-SNAP is an interactive software tool for manual and semi-automatic segmentation of 3D medical images. This paper summarizes major new features added to ITK-SNAP over the last decade. The main focus of the paper is on new features that support semi-automatic segmentation of multi-modality imaging datasets, such as MRI scans acquired using different contrast mechanisms (e.g., T1, T2, FLAIR). The new functionality uses decision forest classifiers trained interactively by the user to transform multiple input image volumes into a foreground/background probability map; this map is then input as the data term to the active contour evolution algorithm, which yields regularized surface representations of the segmented objects of interest. The new functionality is evaluated in the context of high-grade and low-grade glioma segmentation by three expert neuroradiologists and a non-expert on a reference dataset from the MICCAI 2013 Multi-Modal Brain Tumor Segmentation Challenge (BRATS). The accuracy of semi-automatic segmentation is competitive with the top specialized brain tumor segmentation methods evaluated in the BRATS challenge: most results obtained in ITK-SNAP are more accurate, relative to the BRATS reference manual segmentation, than those of the second-best performer in the BRATS challenge, and all results are more accurate than those of the fourth-best performer. Segmentation time is reduced relative to manual segmentation by a factor of 2.5 to 5, depending on the rater. Additional experiments in interactive placenta segmentation in 3D fetal ultrasound illustrate the generalizability of the new functionality to a different problem domain.
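The classification step described in the abstract can be illustrated with a minimal sketch. This is not ITK-SNAP's actual C++ implementation: it uses scikit-learn's RandomForestClassifier as a stand-in for the interactively trained decision forest, and the function name foreground_probability_map, the label conventions, and the synthetic arrays are hypothetical. It shows how user-labeled voxels from several co-registered modalities can be turned into the kind of foreground probability map that would then serve as the data term for the active contour stage.

```python
# Illustrative sketch only: decision-forest voxel classification across
# co-registered modalities, producing a foreground probability map.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def foreground_probability_map(modalities, labels):
    """modalities: list of co-registered 3D arrays (e.g., T1, T2, FLAIR).
    labels: 3D int array, 1 = user-marked foreground scribble,
    2 = user-marked background scribble, 0 = unlabeled (hypothetical convention)."""
    # One feature vector per voxel: its intensity in each modality.
    features = np.stack([m.ravel() for m in modalities], axis=1)
    marked = labels.ravel() > 0
    clf = RandomForestClassifier(n_estimators=50)
    clf.fit(features[marked], labels.ravel()[marked])
    # Probability of the foreground class (label 1) for every voxel.
    fg_col = list(clf.classes_).index(1)
    prob = clf.predict_proba(features)[:, fg_col]
    return prob.reshape(modalities[0].shape)

if __name__ == "__main__":
    # Synthetic stand-ins for three MRI contrasts and a few user scribbles.
    rng = np.random.default_rng(0)
    t1, t2, flair = (rng.normal(size=(32, 32, 32)) for _ in range(3))
    labels = np.zeros_like(t1, dtype=int)
    labels[5:8, 5:8, 5:8] = 1        # foreground scribbles
    labels[20:23, 20:23, 20:23] = 2  # background scribbles
    pmap = foreground_probability_map([t1, t2, flair], labels)
    print(pmap.shape, float(pmap.min()), float(pmap.max()))
```

In the workflow the paper describes, a map like this becomes the data term of the active contour evolution, which regularizes the result into smooth surface representations of the objects of interest.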

X Demographics

The data shown below were collected from the profiles of 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 127 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 127 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Student > Ph.D. Student | 16 | 13%
Researcher | 15 | 12%
Student > Bachelor | 15 | 12%
Student > Master | 10 | 8%
Professor > Associate Professor | 8 | 6%
Other | 22 | 17%
Unknown | 41 | 32%

Readers by discipline | Count | As %
Medicine and Dentistry | 28 | 22%
Engineering | 16 | 13%
Neuroscience | 9 | 7%
Computer Science | 8 | 6%
Agricultural and Biological Sciences | 3 | 2%
Other | 11 | 9%
Unknown | 52 | 41%
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 22 August 2021.
All research outputs: #18,640,437 of 23,092,602 outputs
Outputs from Neuroinformatics: #323 of 407 outputs
Outputs of similar age: #254,277 of 329,163 outputs
Outputs of similar age from Neuroinformatics: #6 of 9 outputs
Altmetric has tracked 23,092,602 research outputs across all sources so far. This one is in the 11th percentile – i.e., 11% of other outputs scored the same or lower than it.
So far Altmetric has tracked 407 research outputs from this source. They receive a mean Attention Score of 4.5. This one is in the 12th percentile – i.e., 12% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 329,163 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 12th percentile – i.e., 12% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 9 others from the same source and published within six weeks on either side of this one. This one has scored higher than 3 of them.