
Tracking Replicability as a Method of Post-Publication Open Evaluation

Overview of attention for an article published in Frontiers in Computational Neuroscience, January 2012

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#31 of 1,475)
  • High Attention Score compared to outputs of the same age (98th percentile)
  • High Attention Score compared to outputs of the same age and source (87th percentile)

Mentioned by

  • 1 news outlet
  • 4 blogs
  • 30 X users
  • 1 Google+ user

Readers on

  • 140 Mendeley
  • 1 CiteULike
Title
Tracking Replicability as a Method of Post-Publication Open Evaluation
Published in
Frontiers in Computational Neuroscience, January 2012
DOI 10.3389/fncom.2012.00008
Authors

Joshua K. Hartshorne, Adena Schachner

Abstract

Recent reports have suggested that many published results are unreliable. To increase the reliability and accuracy of published papers, multiple changes have been proposed, such as changes in statistical methods. We support such reforms. However, we believe that the incentive structure of scientific publishing must change for such reforms to be successful. Under the current system, the quality of individual scientists is judged on the basis of their number of publications and citations, with journals similarly judged via numbers of citations. Neither of these measures takes into account the replicability of the published findings, as false or controversial results are often particularly widely cited. We propose tracking replications as a means of post-publication evaluation, both to help researchers identify reliable findings and to incentivize the publication of reliable results. Tracking replications requires a database linking published studies that replicate one another. As any such database is limited by the number of replication attempts published, we propose establishing an open-access journal dedicated to publishing replication attempts. Data quality of both the database and the affiliated journal would be ensured through a combination of crowd-sourcing and peer review. As reports in the database are aggregated, ultimately it will be possible to calculate replicability scores, which may be used alongside citation counts to evaluate the quality of work published in individual journals. In this paper, we lay out a detailed description of how this system could be implemented, including mechanisms for compiling the information, ensuring data quality, and incentivizing the research community to participate.
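
The abstract describes the aggregation step only in outline: replication reports accumulate in a database and are eventually rolled up into per-study replicability scores. Below is a minimal sketch of that roll-up, assuming (these are our assumptions, not the paper's) that a report reduces to a (study, succeeded) pair and that a score is simply the share of successful attempts; the paper leaves the exact formula open.

    from collections import defaultdict

    def replicability_scores(reports):
        """Aggregate (study_id, succeeded) reports into per-study scores.

        The score used here, successful attempts / total attempts, is a
        placeholder; the paper does not fix a scoring formula.
        """
        tally = defaultdict(lambda: [0, 0])  # study_id -> [successes, attempts]
        for study_id, succeeded in reports:
            tally[study_id][1] += 1
            if succeeded:
                tally[study_id][0] += 1
        return {sid: s / n for sid, (s, n) in tally.items()}

    # Hypothetical reports: two attempts on one study, one on another.
    reports = [("doi:10.1000/xyz", True),
               ("doi:10.1000/xyz", False),
               ("doi:10.1000/abc", True)]
    print(replicability_scores(reports))
    # {'doi:10.1000/xyz': 0.5, 'doi:10.1000/abc': 1.0}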

X Demographics

These demographics were compiled from the profiles of the 30 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 140 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
United States        2     1%
Spain                1    <1%
Germany              1    <1%
Switzerland          1    <1%
Unknown            135    96%

Demographic breakdown

Readers by professional status     Count   As %
Student > Ph.D. Student               15    11%
Student > Master                      11     8%
Researcher                             9     6%
Student > Bachelor                     8     6%
Professor > Associate Professor        7     5%
Other                                 16    11%
Unknown                               74    53%

Readers by discipline                    Count   As %
Psychology                                  22    16%
Agricultural and Biological Sciences         6     4%
Medicine and Dentistry                       6     4%
Computer Science                             5     4%
Business, Management and Accounting          3     2%
Other                                       22    16%
Unknown                                     76    54%
Attention Score in Context

This research output has an Altmetric Attention Score of 57. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 13 October 2017.
All research outputs: #761,761 of 25,759,158 outputs
Outputs from Frontiers in Computational Neuroscience: #31 of 1,475 outputs
Outputs of similar age: #4,135 of 251,832 outputs
Outputs of similar age from Frontiers in Computational Neuroscience: #9 of 70 outputs
Altmetric has tracked 25,759,158 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 97th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,475 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 7.0. This one has done particularly well, scoring higher than 97% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 251,832 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 98% of its contemporaries.
We're also able to compare this research output to 70 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 87% of its contemporaries.
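
The four percentile figures above follow directly from the ranks and totals listed earlier. A quick sketch of that arithmetic (our reconstruction, not Altmetric's code; it assumes the percentage is floored rather than rounded, which matches all four quoted values):

    import math

    def percentile_standing(rank, total):
        """Share of outputs this one outscores, floored to a whole percent."""
        return math.floor((1 - rank / total) * 100)

    comparisons = [
        ("All research outputs", 761_761, 25_759_158),  # -> 97
        ("Outputs from this source", 31, 1_475),        # -> 97
        ("Outputs of similar age", 4_135, 251_832),     # -> 98
        ("Similar age, same source", 9, 70),            # -> 87
    ]

    for label, rank, total in comparisons:
        print(f"{label}: {percentile_standing(rank, total)}th percentile")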