The kinds of attention that scholarly articles receive often tell interesting stories. In the “Interactions” weekly series of blog posts, we look at how intertwining conversations and differing views of the general public, scientists, medical professionals, and science communicators contribute to the overall impact of a scholarly article.
Capturing the nuances of digital attention
The number of citations that an article receives has long been used as a standard measure of research impact. However, citation counts cannot capture the nature of the attention an article receives (such as whether the response is positive or negative), which is also critical for assessing a paper’s true impact within the research community and society at large. This week, a highly-mentioned article in the Altmetric database was one written by Urisman et al. and published in PLOS Pathogens in 2006, entitled “Identification of a Novel Gammaretrovirus in Prostate Tumors of Patients Homozygous for R462Q RNASEL Variant”. Having been cited 490 times as of 28 September (according to Google Scholar), it would appear that this paper made an important contribution to scientific understanding, but in fact, this was not the case. This was a landmark paper in a way – but not in the sense that the authors might have hoped for. Delving into the article’s data within the Altmetric Explorer revealed that all of the discussions were actually focused on the recent retraction of the paper by the editors of PLOS Pathogens.
PLoS Pathogens has just retracted the very first paper about XMRV, a 2006 study that linked virus to prostate cancer. bit.ly/S75wxk
— Martin Enserink (@martinenserink) September 19, 2012
Example: the notorious XMRV
The Urisman et al. paper was the first to characterise a mouse retrovirus called xenotropic murine leukemia virus-related virus, or XMRV, as a human pathogen. Most importantly, the paper identified a link between the development of human prostate cancer tumours and infection by XMRV. This seemed like a truly significant finding, and one that could even inspire new prostate cancer therapies. But the story that began with the Urisman et al. prostate cancer paper soon became entangled with another unrelated disease. In 2007, Robert Silverman, one of the authors on the Urisman et al. paper, met an immunologist, Judy Mikovits of the Whittemore Peterson Institute for Neuro-Immune Disease (Reno, Nevada, USA), and provided her with reagents to test for XMRV. Subsequently, in 2009, the Mikovits group published a paper (by Lombardi et al.) in Science, which found evidence of XMRV infection in blood samples of patients with chronic fatigue syndrome, a condition with no known cause and a reputation for being seriously neglected by the medical community.
The immediate responses from chronic fatigue syndrome patients were intense, and Mikovits was praised, not only for providing medical validation for the condition, but also for instilling hope for a cure. The findings of the XMRV and chronic fatigue syndrome study even prompted the American Red Cross to refuse blood donations from people with chronic fatigue syndrome, out of fears that XMRV could be transmitted between humans. Hype grew amongst patients and the general public about XMRV infection as the next big blood-borne epidemic, spawning online advocacy groups such as the highly-active Facebook page, XMRV Global Action. Yet many scientists expressed their reservations about the associations between XMRV and chronic fatigue syndrome, and attempts to replicate the findings of the Lombardi et al. study were unsuccessful. Throughout this time, scientists continued to search for an explanation of the discrepant results, and sought to determine whether XMRV was indeed a cause for concern in humans at all.
Eventually, through numerous independent studies, it became clear that XMRV really was just a mouse virus – in fact, it had never infected a human before, and was only detectable in human samples due to contamination with mouse DNA. The authors of the Urisman et al. and the Lombardi et al. papers had made a colossal mistake. The Lombardi et al. paper was retracted by Science in 2011, despite the objections of Mikovits (the principal investigator), who stubbornly defended her findings up until this week, when both PLOS ONE and mBio published articles disproving prior XMRV links to prostate cancer and chronic fatigue syndrome, respectively. The PLOS ONE article (co-authored by many of the original authors on the Urisman et al. paper) definitively confirmed that contamination had falsely implicated XMRV in prostate cancer, meticulously identifying how the initial contamination had occurred. Both the PLOS ONE and mBio articles were published on 18 September, and the editors of PLOS Pathogens issued their retraction of the original 2006 prostate cancer paper by Urisman et al. on the same day.
“… Science has lost confidence in the Report and the validity of its conclusions… We are therefore editorially retracting the Report. We regret the time and resources that the scientific community has devoted to unsuccessful attempts to replicate these results.”
— Bruce Alberts, Editor-in-Chief of Science (23 December 2011), referring to the decision to retract the Lombardi et al. (2009) paper
XMRV, the notorious virus that never infected a single human being, quite literally went viral online, sparking a massive 6-year-long search for the truth by multiple laboratories that burned through millions of dollars in research funding along the way. Nobody could have guessed that the initial “discoveries” of XMRV’s disease links would ignite such a raging conflict between members of the scientific, publishing, and patient communities, triggering a dramatic series of events that would even lead to Judy Mikovits’ dismissal from the Whittemore Peterson Institute for Neuro-Immune Disease, as well as her arrest for the theft of lab notebooks (the charges were dropped in June 2012). (View a timeline of the events leading up to the present here.)
Measuring impact and the quality of research
Quickly surveying the broad context into which a scholarly article fits requires a method for accurately assessing research impact. Citation counts, which obscure context and the nature of individual reactions, cannot always provide meaningful information about the value of a particular research article. The story of XMRV, as told through numerous primary research articles, news articles, blog posts, and discussions on social media, was built entirely using the Altmetric Explorer. By providing not only a score, but also the contexts in which the data appear, the Altmetric Explorer is able to circumvent many limitations of traditional measures of impact and present a more comprehensive view of the conversations surrounding scholarly research. As online buzz about research articles on Twitter and other forms of social media becomes increasingly prevalent, consumers of scientific information no longer have to rely on a simple citation count to assess the quality of research.