Altmetric Blog

Interactions: Three Little Words

Jean Liu, 17th October 2012

The kinds of attention that scholarly articles receive often tell interesting stories. In the “Interactions” weekly series of blog posts, we look at how intertwining conversations and differing views of the general public, scientists, medical professionals, and science communicators contribute to the overall impact of a scholarly article.

The first wave: making a “big” splash

After a hot research article is first published in a scientific journal, a wave of digital attention sweeps across global social media networks. This wave is sometimes propelled by science communication outlets such as news sites and blogs, which often boil down research findings into concise, catchy statements to capture readers' interest. However, in certain cases, the desire to capture the gist of a study in 140 characters or fewer results in the misinterpretation of the original findings. A mistaken statement, so compelling that it begs to be shared with others, can trigger waves of online attention: initially, members of the public react to and spread the statement, but this phase can be quickly followed by introspection and analysis by scientists themselves.

For instance, take the following sentence, which was quoted repeatedly by many members of the general public in tweets about an article entitled “The weight of nations: an estimation of adult human biomass” (published on 18 June 2012 in BMC Public Health) [1]:

“North America has 6% of the world population but 34% of biomass.”

This erroneous statement, which conjures up a sensational image of enormous, gluttonous North Americans contentedly stuffing supersized hamburgers into their mouths, prompted a flurry of re-tweets after it first appeared in June.

Wow. US takes up 6% of world’s population, but 34% of its human weight (biomass). Go for a run or something.…

— Dom Lovatt (@dom83) June 18, 2012

The misinterpretation originated from a careless omission of three critical words in the following sentence, which appeared in the BMC Public Health article’s abstract:

“North America has 6% of the world population but 34% of biomass due to obesity.”

The deletion of the words “due to obesity” completely altered the meaning of the statement, which was meant to convey that North America accounts for 34% of the biomass attributable to obesity (i.e., total global excess body weight), not 34% of total human biomass.


The second wave: the double-take

A day after the BMC Public Health article was published, a post (“ugh!!!”) on the blog Permutations complained about the constant misquotes of the study, but this did little to stop online mentions of the misleading statement. Even a blog post (“The Couch Potato Goes Global”) which appeared a month later on the New York Times site was guilty of repeating the bad statistic. After the New York Times blog post came out, a second wave of online buzz from scientists began in response to all the misinformation that had been spread.

Always read the abstract of a paper to check what a media report says. For example versus

— Robert Wilson (@PlanktonMath) July 18, 2012

The New York Times blog post eventually came clean as well:

“Corrections: An earlier version of this article misstated that a third of global biomass exists in North America; it is a third of global biomass due to obesity, not biomass over all.”


Context and information from multiple sources can counter misinformation

The case of the statistical misquote highlights a problem in the way that science findings are discussed online. When science communicators distil research results into short, snappy sentences, occasional misinterpretations can spread quickly and cause much confusion among their followers. How, then, can a reader decide whether the buzz about a paper was related to the value of the research itself, or to a misinterpretation of the findings? If you’re not an expert in the subject, you’ll have to place stories in some kind of context, preferably by reading through the primary research article, some related articles, and various news sources. The “alternative” measures of impact that track online reactions in real time can become skewed by the propagation of misinformation, although in normal circumstances they may be more useful than traditional research metrics. Ultimately, to see past a sensationalised viewpoint and get to the truth, readers must draw on as many resources as possible to assess a particular story’s value over time.



1. Walpole, S.C. et al. The weight of nations: an estimation of adult human biomass. BMC Public Health 12, 439 (2012).
