
Letting the daylight in: Reviewing the reviewers and other ways to maximize transparency in science

Overview of attention for article published in Frontiers in Computational Neuroscience, January 2012

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • Among the highest-scoring outputs from this source (#22 of 1,475)
  • High Attention Score compared to outputs of the same age (99th percentile)
  • High Attention Score compared to outputs of the same age and source (94th percentile)

Mentioned by

  • News: 1 news outlet
  • Blogs: 3 blogs
  • X: 101 X users
  • Google+: 2 Google+ users
  • Reddit: 1 Redditor

Readers on

  • Mendeley: 98 readers
  • CiteULike: 3 readers
Title
Letting the daylight in: Reviewing the reviewers and other ways to maximize transparency in science
Published in
Frontiers in Computational Neuroscience, January 2012
DOI 10.3389/fncom.2012.00020
Pubmed ID
Authors

Jelte M. Wicherts, Rogier A. Kievit, Marjan Bakker, Denny Borsboom

Abstract

With the emergence of online publishing, opportunities to maximize transparency of scientific research have grown considerably. However, these possibilities are still only marginally used. We argue for the implementation of (1) peer-reviewed peer review, (2) transparent editorial hierarchies, and (3) online data publication. First, peer-reviewed peer review entails a community-wide review system in which reviews are published online and rated by peers. This ensures accountability of reviewers, thereby increasing academic quality of reviews. Second, reviewers who write many highly regarded reviews may move to higher editorial positions. Third, online publication of data ensures the possibility of independent verification of inferential claims in published papers. This counters statistical errors and overly positive reporting of statistical results. We illustrate the benefits of these strategies by discussing an example in which the classical publication system has gone awry, namely controversial IQ research. We argue that this case would have likely been avoided using more transparent publication practices. We argue that the proposed system leads to better reviews, meritocratic editorial hierarchies, and a higher degree of replicability of statistical analyses.

X Demographics

The data shown below were collected from the profiles of 101 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 98 Mendeley readers of this research output.

Geographical breakdown

Country          Count   As %
Netherlands          4     4%
Germany              2     2%
Switzerland          2     2%
Canada               2     2%
United States        2     2%
United Kingdom       1     1%
Italy                1     1%
Belgium              1     1%
Mexico               1     1%
Other                2     2%
Unknown             80    82%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph.D. Student             20    20%
Researcher                          17    17%
Other                               11    11%
Student > Master                    10    10%
Professor                            9     9%
Other                               24    24%
Unknown                              7     7%
Readers by discipline                  Count   As %
Psychology                               29    30%
Computer Science                         12    12%
Agricultural and Biological Sciences     10    10%
Medicine and Dentistry                    8     8%
Social Sciences                           7     7%
Other                                    16    16%
Unknown                                  16    16%
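
The "As %" figures in the reader tables above are consistent with each count taken as a share of the 98 Mendeley readers, rounded to the nearest whole percent. As an illustrative sketch of that arithmetic (an assumption on our part, not Altmetric's published method), in Python:

    # Illustrative sketch: the "As %" values appear to be count / total readers,
    # rounded to the nearest whole percent. TOTAL_READERS comes from the page above.
    TOTAL_READERS = 98

    def as_percent(count: int, total: int = TOTAL_READERS) -> int:
        """Return count as a whole-number percentage of total."""
        return round(100 * count / total)

    # Spot checks against the tables above
    assert as_percent(29) == 30   # Psychology readers
    assert as_percent(80) == 82   # readers with unknown country
    assert as_percent(20) == 20   # Ph.D. students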
Attention Score in Context

This research output has an Altmetric Attention Score of 97. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 11 June 2023.
  • All research outputs: #445,335 of 25,793,330 outputs
  • Outputs from Frontiers in Computational Neuroscience: #22 of 1,475 outputs
  • Outputs of similar age: #2,264 of 251,977 outputs
  • Outputs of similar age from Frontiers in Computational Neuroscience: #4 of 69 outputs
Altmetric has tracked 25,793,330 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 98th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,475 research outputs from this source. They typically receive a little more attention than average, with a mean Attention Score of 7.1. This one has done particularly well, scoring higher than 98% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 251,977 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 99% of its contemporaries.
We're also able to compare this research output to 69 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 94% of its contemporaries.
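
Altmetric does not state its exact percentile formula on this page, but the percentiles quoted above match a simple reconstruction: the fraction of tracked outputs ranked below this one, floored to a whole number. The sketch below is a hedged illustration under that assumption (the function name is ours, not Altmetric's):

    import math

    def percentile_from_rank(rank: int, total: int) -> int:
        """Whole-number percentile implied by a rank among `total` outputs (assumed formula)."""
        return math.floor(100 * (1 - rank / total))

    # Checks against the ranks and percentiles reported above
    assert percentile_from_rank(445_335, 25_793_330) == 98   # all research outputs
    assert percentile_from_rank(22, 1_475) == 98             # this journal
    assert percentile_from_rank(2_264, 251_977) == 99        # outputs of similar age
    assert percentile_from_rank(4, 69) == 94                 # similar age, same journal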