
Are all “research fields” equal? Rethinking practice for the use of data from crowdsourcing market places

Overview of attention for article published in Behavior Research Methods, August 2016

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (90th percentile)
  • High Attention Score compared to outputs of the same age and source (89th percentile)

Mentioned by

  • 1 blog
  • 18 X users

Citations

  • 73 Dimensions

Readers on

  • 145 Mendeley

Title
Are all “research fields” equal? Rethinking practice for the use of data from crowdsourcing market places
Published in
Behavior Research Methods, August 2016
DOI 10.3758/s13428-016-0789-y
Pubmed ID
Authors

Ilka H. Gleibs

Abstract

New technologies like large-scale social media sites (e.g., Facebook and Twitter) and crowdsourcing services (e.g., Amazon Mechanical Turk, Crowdflower, Clickworker) are impacting social science research and providing many new and interesting avenues for research. The use of these new technologies for research has not been without challenges, and a recently published psychological study on Facebook has led to a widespread discussion of the ethics of conducting large-scale experiments online. Surprisingly little has been said about the ethics of conducting research using commercial crowdsourcing marketplaces. In this article, I focus on the question of which ethical questions are raised by data collection with crowdsourcing tools. I briefly draw on the implications of Internet research more generally, and then focus on the specific challenges that research with crowdsourcing tools faces. I identify fair pay and the related issue of respect for autonomy, as well as problems with the power dynamic between researcher and participant, which has implications for withdrawal without prejudice, as the major ethical challenges of crowdsourced data. Furthermore, I wish to draw attention to how we can develop a "best practice" for researchers using crowdsourcing tools.

X Demographics

The data shown below were collected from the profiles of the 18 X users who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for the 145 Mendeley readers of this research output.

Geographical breakdown

Country         Count   As %
Japan               2     1%
United States       1    <1%
Brazil              1    <1%
Unknown           141    97%

Demographic breakdown

Readers by professional status   Count   As %
Student > Ph. D. Student            27    19%
Student > Master                    22    15%
Student > Bachelor                  18    12%
Researcher                          13     9%
Student > Doctoral Student          10     7%
Other                               25    17%
Unknown                             30    21%

Readers by discipline                    Count   As %
Psychology                                  38    26%
Social Sciences                             19    13%
Business, Management and Accounting         13     9%
Computer Science                             8     6%
Economics, Econometrics and Finance          6     4%
Other                                       22    15%
Unknown                                     39    27%
Attention Score in Context

This research output has an Altmetric Attention Score of 20. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 06 July 2018.
All research outputs: #1,846,035 of 25,374,917 outputs
Outputs from Behavior Research Methods: #181 of 2,525 outputs
Outputs of similar age: #33,323 of 369,331 outputs
Outputs of similar age from Behavior Research Methods: #5 of 49 outputs

Altmetric has tracked 25,374,917 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 92nd percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,525 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 8.2. This one has done particularly well, scoring higher than 92% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 369,331 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 90% of its contemporaries.
We're also able to compare this research output to 49 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 89% of its contemporaries.
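
As a rough illustration of the comparison described above, the sketch below ranks a score against an age-matched cohort by computing the share of cohort scores it falls above. This is a hypothetical Python example with made-up numbers, not Altmetric's actual implementation or data.

# A minimal, hypothetical sketch of the percentile-style comparison described
# above; not Altmetric's actual code, API, or data.
def percentile_rank(score, cohort_scores):
    """Return the percentage of cohort scores that fall strictly below `score`."""
    if not cohort_scores:
        raise ValueError("cohort_scores must be non-empty")
    below = sum(1 for s in cohort_scores if s < score)
    return 100.0 * below / len(cohort_scores)

# Made-up Attention Scores for a small cohort of outputs published within six
# weeks of this one; 20 is this article's reported score.
cohort = [1, 2, 2, 3, 5, 8, 8.2, 9, 12, 20]
print(percentile_rank(20, cohort))  # 90.0 -> "higher than 90% of its contemporaries"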