
Reliability of Reviewer Ratings in the Manuscript Peer Review Process: An Opportunity for Improvement

Overview of attention for an article published in Accountability in Research, June 2013

Mentioned by
1 X user

Citations
9 (Dimensions)

Readers on Mendeley
24
Title
Reliability of Reviewer Ratings in the Manuscript Peer Review Process: An Opportunity for Improvement
Published in
Accountability in Research, June 2013
DOI 10.1080/08989621.2013.804345
PubMed ID
Authors

Adedayo A. Onitilo, Jessica M. Engel, Sherry A. Salzman-Scott, Rachel V. Stankowski, Suhail A. R. Doi

Abstract

Accountability to authors and readers cannot exist without proper peer review practices. Thus, the information a journal seeks from its peer reviewers and how it makes use of this information is paramount. Disagreement amongst peer reviewers can be considerable, resulting in very diverse comments to authors. Incorporating a clear scoring system for key concrete items and requiring referees to provide justification for scores may ensure that reviewers contribute in a consistently fair and effective manner. This article evaluates information collected from reviewers and proposes an example of a system that aims to improve accountability, while having the potential to make it easier for reviewers to perform a more objective review.
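The abstract's central claim is that unstructured reviewer ratings can disagree substantially. As an illustration only, and not the scoring system the authors propose, one common way to quantify disagreement between two reviewers is a chance-corrected agreement statistic such as Cohen's kappa; the reviewer data below are hypothetical.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on the same items."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement: product of each rater's marginal frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(freq_a) | set(freq_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return 1.0 if p_e == 1.0 else (p_o - p_e) / (1 - p_e)

# Hypothetical recommendations from two reviewers on six manuscripts.
reviewer_1 = ["accept", "revise", "reject", "revise", "accept", "reject"]
reviewer_2 = ["revise", "revise", "reject", "accept", "accept", "reject"]
print(f"Cohen's kappa: {cohens_kappa(reviewer_1, reviewer_2):.2f}")  # -> 0.50
```

A kappa near zero indicates agreement no better than chance, which is the kind of unreliability a structured, justified scoring system aims to reduce.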

X Demographics

The data shown below were collected from the profile of 1 X user who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 24 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Poland         1     4%
Unknown       23    96%

Demographic breakdown

Readers by professional status    Count   As %
Other                                 5    21%
Researcher                            4    17%
Student > Bachelor                    3    13%
Lecturer                              2     8%
Librarian                             2     8%
Other                                 5    21%
Unknown                               3    13%
Readers by discipline                   Count   As %
Business, Management and Accounting         3    13%
Agricultural and Biological Sciences        3    13%
Arts and Humanities                         2     8%
Nursing and Health Professions              2     8%
Computer Science                            2     8%
Other                                       8    33%
Unknown                                     4    17%
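In both the geographical and demographic breakdowns, the "As %" column is each category's count divided by the 24 Mendeley readers, expressed as a whole percentage. A short sketch of that arithmetic using the discipline counts above; the half-up rounding is an assumption that happens to reproduce the listed figures (e.g., 3/24 = 12.5% shown as 13%):

```python
# Reconstruct the "As %" column for the "Readers by discipline" table.
# Counts come from the table above; 24 is the total Mendeley readership.
counts = {
    "Business, Management and Accounting": 3,
    "Agricultural and Biological Sciences": 3,
    "Arts and Humanities": 2,
    "Nursing and Health Professions": 2,
    "Computer Science": 2,
    "Other": 8,
    "Unknown": 4,
}
total = sum(counts.values())  # 24 readers in total
for discipline, count in counts.items():
    percent = int(100 * count / total + 0.5)  # round half up (assumed rule)
    print(f"{discipline}: {count} ({percent}%)")
```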
Attention Score in Context

This research output has an Altmetric Attention Score of 1. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 03 July 2013.
All research outputs: #20,655,488 of 25,371,288 outputs
Outputs from Accountability in Research: #390 of 437 outputs
Outputs of similar age: #158,933 of 208,847 outputs
Outputs of similar age from Accountability in Research: #4 of 4 outputs
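This page does not spell out how the Attention Score itself is computed, but it is, broadly, a weighted count of mentions by source. As a rough illustration only, and not Altmetric's actual algorithm, a sketch of that idea follows; the source names and weights are invented.

```python
# Illustration of a weighted mention count. The source names and weights
# below are invented for this sketch; they are not Altmetric's published
# algorithm.
SOURCE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "x_post": 1.0,
    "facebook": 0.25,
}

def attention_score(mention_sources):
    """Sum the per-source weights over a list of mention sources."""
    return sum(SOURCE_WEIGHTS.get(source, 0.0) for source in mention_sources)

# The page above reports a score of 1 from a single X post; with these
# invented weights the sketch happens to reproduce that.
print(attention_score(["x_post"]))  # -> 1.0
```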
Altmetric has tracked 25,371,288 research outputs across all sources so far. This one is in the 10th percentile – i.e., 10% of other outputs scored the same or lower than it.
So far Altmetric has tracked 437 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 17.3. This one is in the 2nd percentile – i.e., 2% of its peers scored the same or lower than it.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 208,847 tracked outputs that were published within six weeks on either side of this one in any source. This one is in the 11th percentile – i.e., 11% of its contemporaries scored the same or lower than it.
We're also able to compare this research output to 4 others from the same source and published within six weeks on either side of this one.
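Each of the percentile statements above follows the same rule: the share of comparable outputs whose Attention Score is the same as or lower than this one's, with "similar age" meaning publication within six weeks on either side. A minimal sketch of that comparison, using made-up dates and scores rather than Altmetric's tracked data:

```python
from datetime import date, timedelta

def percentile_rank(score, other_scores):
    """Percent of other outputs scoring the same as or lower than `score`."""
    if not other_scores:
        return 100.0
    same_or_lower = sum(s <= score for s in other_scores)
    return 100.0 * same_or_lower / len(other_scores)

# Hypothetical tracked outputs: (publication date, Attention Score).
outputs = [
    (date(2013, 5, 20), 0.25),
    (date(2013, 6, 10), 1.0),
    (date(2013, 6, 25), 4.5),
    (date(2013, 7, 30), 12.0),  # outside the six-week window below
    (date(2013, 9, 15), 0.5),   # outside the six-week window below
]

this_pub_date, this_score = date(2013, 6, 15), 1.0
window = timedelta(weeks=6)

# "Outputs of similar age": anything published within six weeks either side.
contemporaries = [
    score for pub_date, score in outputs
    if abs(pub_date - this_pub_date) <= window
]
print(f"Similar-age percentile: {percentile_rank(this_score, contemporaries):.0f}%")
```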