
How peer-review constrains cognition: on the frontline in the knowledge sector

Overview of attention for article published in Frontiers in Psychology, November 2015

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (89th percentile)
  • High Attention Score compared to outputs of the same age and source (85th percentile)

Mentioned by

28 X users

Readers on

44 Mendeley readers
Title: How peer-review constrains cognition: on the frontline in the knowledge sector
Published in: Frontiers in Psychology, November 2015
DOI: 10.3389/fpsyg.2015.01706
Authors: Stephen J. Cowley

Abstract

Peer-review is neither reliable, fair, nor a valid basis for predicting 'impact': as quality control, peer-review is not fit for purpose. Endorsing the consensus, I offer a reframing: while a normative social process, peer-review also shapes the writing of a scientific paper. In so far as 'cognition' describes enabling conditions for flexible behavior, the practices of peer-review thus constrain knowledge-making. To pursue cognitive functions of peer-review, however, manuscripts must be seen as 'symbolizations', replicable patterns that use technologically enabled activity. On this bio-cognitive view, peer-review constrains knowledge-making by writers, editors, reviewers. Authors are prompted to recursively re-aggregate symbolizations to present what are deemed acceptable knowledge claims. How, then, can recursive re-embodiment be explored? In illustration, I sketch how the paper's own content came to be re-aggregated: agonistic review drove reformatting of argument structure, changes in rhetorical ploys and careful choice of wordings. For this reason, the paper's knowledge-claims can be traced to human activity that occurs in distributed cognitive systems. Peer-review is on the frontline in the knowledge sector in that it delimits what can count as knowing. Its systemic nature is therefore crucial to not only discipline-centered 'real' science but also its 'post-academic' counterparts.

X Demographics

The data shown below were collected from the profiles of 28 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 44 Mendeley readers of this research output.

Geographical breakdown

Country | Count | As %
Unknown | 44 | 100%

Demographic breakdown

Readers by professional status | Count | As %
Researcher | 7 | 16%
Student > Master | 6 | 14%
Other | 5 | 11%
Student > Ph.D. Student | 5 | 11%
Student > Doctoral Student | 4 | 9%
Other | 12 | 27%
Unknown | 5 | 11%
Readers by discipline | Count | As %
Social Sciences | 6 | 14%
Business, Management and Accounting | 4 | 9%
Engineering | 3 | 7%
Nursing and Health Professions | 3 | 7%
Agricultural and Biological Sciences | 3 | 7%
Other | 17 | 39%
Unknown | 8 | 18%
Attention Score in Context

This research output has an Altmetric Attention Score of 16. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 16 March 2024.
All research outputs: #2,322,858 of 25,721,020 outputs
Outputs from Frontiers in Psychology: #4,666 of 34,761 outputs
Outputs of similar age: #32,480 of 297,265 outputs
Outputs of similar age from Frontiers in Psychology: #71 of 491 outputs
Altmetric has tracked 25,721,020 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 90th percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 34,761 research outputs from this source. They typically receive a lot more attention than average, with a mean Attention Score of 13.4. This one has done well, scoring higher than 86% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 297,265 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 89% of its contemporaries.
We're also able to compare this research output to 491 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 85% of its contemporaries.