
Designing next-generation platforms for evaluating scientific output: what scientists can learn from the social web

Overview of attention for article published in Frontiers in Computational Neuroscience, January 2012

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#26 of 1,475)
  • High Attention Score compared to outputs of the same age (98th percentile)
  • High Attention Score compared to outputs of the same age and source (91st percentile)

Mentioned by

  • 3 blogs
  • 90 X users
  • 1 peer review site
  • 1 Facebook page
  • 2 Google+ users

Readers on

  • 89 Mendeley
  • 1 CiteULike
Title
Designing next-generation platforms for evaluating scientific output: what scientists can learn from the social web
Published in
Frontiers in Computational Neuroscience, January 2012
DOI 10.3389/fncom.2012.00072
Pubmed ID
Authors

Tal Yarkoni

Abstract

Traditional pre-publication peer review of scientific output is a slow, inefficient, and unreliable process. Efforts to replace or supplement traditional evaluation models with open evaluation platforms that leverage advances in information technology are slowly gaining traction, but remain in the early stages of design and implementation. Here I discuss a number of considerations relevant to the development of such platforms. I focus particular attention on three core elements that next-generation evaluation platforms should strive to emphasize, including (1) open and transparent access to accumulated evaluation data, (2) personalized and highly customizable performance metrics, and (3) appropriate short-term incentivization of the userbase. Because all of these elements have already been successfully implemented on a large scale in hundreds of existing social web applications, I argue that development of new scientific evaluation platforms should proceed largely by adapting existing techniques rather than engineering entirely new evaluation mechanisms. Successful implementation of open evaluation platforms has the potential to substantially advance both the pace and the quality of scientific publication and evaluation, and the scientific community has a vested interest in shifting toward such models as soon as possible.

X Demographics

The data shown below were collected from the profiles of 90 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 89 Mendeley readers of this research output.

Geographical breakdown

Country           Count   As %
United Kingdom        5     6%
United States         3     3%
Chile                 1     1%
South Africa          1     1%
Italy                 1     1%
Germany               1     1%
Belgium               1     1%
Mexico                1     1%
Greece                1     1%
Other                 1     1%
Unknown              73    82%

Demographic breakdown

Readers by professional status          Count   As %
Researcher                                 19    21%
Student > Ph.D. Student                    14    16%
Student > Master                           13    15%
Professor > Associate Professor             9    10%
Other                                       6     7%
Other                                      20    22%
Unknown                                     8     9%

Readers by discipline                   Count   As %
Psychology                                 31    35%
Agricultural and Biological Sciences       10    11%
Computer Science                            8     9%
Neuroscience                                5     6%
Medicine and Dentistry                      5     6%
Other                                      18    20%
Unknown                                    12    13%
Attention Score in Context

This research output has an Altmetric Attention Score of 76. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 28 May 2022.
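Altmetric describes the Attention Score as, in essence, a weighted count of the mentions collected for an output, with different source types contributing different amounts. The short Python sketch below illustrates only that general shape, using the mention counts from the "Mentioned by" panel above; the weights are hypothetical placeholders rather than Altmetric's actual values, so it will not reproduce the score of 76 reported here.

    # Illustrative only: a weighted count of mentions, the general shape of an
    # attention score. The weights are hypothetical placeholders, not Altmetric's
    # actual values, so this will not reproduce the reported score of 76.

    HYPOTHETICAL_WEIGHTS = {
        "blogs": 5.0,
        "x_users": 1.0,
        "peer_review_sites": 1.0,
        "facebook_pages": 0.25,
        "googleplus_users": 1.0,
    }

    # Mention counts taken from the "Mentioned by" panel above.
    mentions = {
        "blogs": 3,
        "x_users": 90,
        "peer_review_sites": 1,
        "facebook_pages": 1,
        "googleplus_users": 2,
    }

    score = sum(HYPOTHETICAL_WEIGHTS[src] * n for src, n in mentions.items())
    print(f"Illustrative weighted mention count: {score:g}")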
  • All research outputs: #576,610 of 25,759,158 outputs
  • Outputs from Frontiers in Computational Neuroscience: #26 of 1,475 outputs
  • Outputs of similar age: #3,019 of 251,832 outputs
  • Outputs of similar age from Frontiers in Computational Neuroscience: #6 of 69 outputs
Altmetric has tracked 25,759,158 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 97th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,475 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 7.1. This one has done particularly well, scoring higher than 98% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 251,832 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 98% of its contemporaries.
We're also able to compare this research output to 69 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 91% of its contemporaries.
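The percentiles quoted in this section follow directly from the ranks and pool sizes listed above. Here is a minimal Python sketch, assuming the percentile is simply the share of the comparison pool that this output outscores, truncated to a whole number (the truncation is an assumption about Altmetric's rounding, not something stated on this page).

    import math

    def percentile_rank(rank: int, total: int) -> int:
        # Share of the comparison pool that this output outscores,
        # truncated to a whole percentile.
        return math.floor(100 * (total - rank) / total)

    # Ranks and pool sizes taken from the context panel above.
    contexts = {
        "All research outputs": (576_610, 25_759_158),
        "Outputs from Frontiers in Computational Neuroscience": (26, 1_475),
        "Outputs of similar age": (3_019, 251_832),
        "Outputs of similar age from the same source": (6, 69),
    }

    for label, (rank, total) in contexts.items():
        print(label, percentile_rank(rank, total))
    # Prints 97, 98, 98, and 91, matching the percentiles quoted above.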