Altmetrics in action: examples and feedback from data-savvy researchers

For this post in our researcher blog series, we decided to examine some interesting examples of researchers integrating Altmetric data into their workflows. We also reached out to the six academics who created these examples, to review their usage and collect some feedback (positive or otherwise) about their experience of using the data.

Three out of the six researchers we talked to said they had first heard about Altmetric from other academics, either on Twitter or through internal communication within a university. This is interesting as it suggests that one academic’s knowledge of a tool can potentially change the behaviors of their peers, whether that’s all the other researchers in their faculty, or all their followers on social media. In last month’s blog post we briefly talked about how academics can use Twitter and altmetrics to keep up-to-date with research trends. However, perhaps it’s also worth noting that conversation surrounding research management tools can itself start a trend, and that being active in such networks can open up a whole world of informative and time-saving possibilities for researchers. Two other academics said they recalled seeing the data displayed on publishers’ websites, and this had inspired them to learn more.

 

How have researchers used Altmetric data online?


Egon Willighagen’s ORCID profile with Altmetric publications data added


James Grecian’s online list of publications with badges embedded


Steve Davis’ publications page with Altmetric logo and links to “Selected Press” stories extracted from the Altmetric data

All the researchers we spoke to had investigated the Altmetric data for their articles, and a few had taken further steps to embed the data in their online profiles. Looking at these examples, it’s clear that adding the data to publication pages or merging it with a unique ORCID ID can add another dimension to your digital identity as a researcher. Because the colours for sources such as Twitter and Facebook correspond to the platforms’ logo colours, the donut is both an aesthetically pleasing infographic and a way of showcasing the online attention papers have received in one easy click. For Steve Davis, adding the data to his online profile saved him time and allowed him to display attention from a range of sources simultaneously, rather than through manual recording: “I have in the past curated my own list of news outlets and blogs that have covered my work in order to track what kind of reaction it was getting”. Davis also said the data prompted him to think about social media monitoring, which he had not been doing previously. James Grecian said he found Altmetric provided “a useful way to curate and link to the media articles that mention my research”, and also commented that he found the badges easy to use. Egon Willighagen found the Altmetric/ORCID app saved time as he “did not have to look up the statistics manually…it uses my ORCID profile to get the DOIs and aggregate stuff”.

 

What did the researchers discover as a result of using the data?

Researchers who had used the bookmarklet said the data had helped them in other ways. Vincent Scalfani (assistant professor and science librarian at the University of Alabama) said he had used the bookmarklet to view the attention for a short autobiographical commentary he released about his childhood and career progression, and was pleasantly surprised by how much attention it had received. He theorized: “as this article is more of a commentary and not a research article, it will likely not receive many (or any at all) official literature citations”. This is a fascinating insight as it testifies to the idea that altmetrics can offer an indication of real-time attention beyond citation counts. It’s also interesting to think about how this feeds into the idea that altmetrics can capture the impact of more reflective personal pieces as well as write-ups of scientific investigations. Thus, for Vincent, the data solved the problem of awareness by providing him with readership information for a non-traditional research output. Steve Portugal of Royal Holloway University “used the data in grant applications as additional evidence for the general interest and impact” to strengthen his academic profile in the eyes of funders. Steve reported that the data meant he could showcase his impact and better understand the value of online communication channels such as Twitter when trying to promote research and grow an online network. He said one of the reasons why social media sites like Twitter are valuable for researchers is that “explaining your work to a general audience is an excellent exercise in communication skills”. Brian D. Earp (University of Oxford) added Altmetric statistics to his online CV, in the hope that “those who look at my CV will get a sense that my papers are in fact being read and downloaded, and how they’re doing along those dimensions compared to other papers in the same journal, or of a similar age etc”. For Earp then, the data became a benchmarking and contextualisation tool as well as a way of showcasing impact.

 

Further use cases: how did the researchers use the data to target an audience? 

In the image opposite, you can see Jean Peccoud’s attractive and sophisticated laboratory website, which uses the Altmetric API as a way of “providing dynamic and valuable content to the visitor of my site”. Jean said the data helped him “figure out what to read”, by allowing him to gauge the level of online attention for papers. Although we never try to suggest that the Altmetric data provides a measure of quality, this use case is interesting: Jean used the data to develop a reading list, then added it to one of his web pages, thereby allowing his peers to discover the data for themselves. It was interesting that although the researchers gave a range of responses when asked how the data had helped them, their answers were quite consistent when they described the initial aims they had in mind when they started to play around with the information. Egon Willighagen summarised his motive for using Altmetric as follows: “getting insight on how my research (not just papers, but including those) is used and what people say about it is important to me, and helps me make strategic decisions on where my research funding chances lie”. All six of them said they wanted a way of showcasing the attention their papers had received, whether to get the attention of grant funders (Steve Portugal), the notice of a tenure committee (Vincent Scalfani), or the general interest of people visiting their online publication lists.
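To give a flavour of how this kind of integration might look, here’s a minimal sketch in Python against our free public API (https://api.altmetric.com/v1/doi/<doi>). The DOIs are placeholders, and the response fields used (“score” and “title”) should be double-checked against the current API documentation rather than treated as definitive.

import requests

# Placeholder DOIs -- swap in the papers you actually want on your reading list.
DOIS = [
    "10.1234/placeholder.one",
    "10.1234/placeholder.two",
]

def attention(doi):
    """Return the Altmetric record for a DOI, or None if it isn't being tracked."""
    response = requests.get("https://api.altmetric.com/v1/doi/" + doi)
    return response.json() if response.status_code == 200 else None

# Build a simple "what should I read first?" list, ordered by Altmetric score.
records = [r for r in (attention(doi) for doi in DOIS) if r]
for record in sorted(records, key=lambda r: r.get("score", 0), reverse=True):
    print("{:>8.1f}  {}".format(record.get("score", 0), record.get("title", "(no title)")))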

 

Other feedback

We also wanted to hear about the downsides, so we asked the researchers one more thing: whether they felt there were any limitations with the data or things they hoped to do with it in future. One of the really interesting responses came from Jean Peccoud: “the score sometimes feels a little like black magic”, while Steve Davis mentioned “the unpredictability of what the service picks up and what it doesn’t”. Vincent Scalfani also said that more documentation on this issue might be beneficial for researchers, posing the idea of “an overview to explain what is being measured, what the data sources are, and how Altmetric fits into measuring research impact”. In the next post in this series, we intend to lay out the logic behind the score and sources more clearly, so hopefully we’ll be able to address some of those questions then. For now, anyone can visit our support resources pages for some slightly more detailed information on how we collect and rank mentions from different places.

In summary then, these responses can tell us a lot about how researchers are using Altmetric right now, and about how they would like to see the tools and support resources develop in future. The feedback suggests the data is useful because you can see who has been talking about your research, and then take ownership of this information by adding it to important documentation and incorporating it into your digital identity. It also seems that researchers welcome the realisation that their research doesn’t exist in a vacuum, and that they can compare their Altmetric statistics with those of similar articles in the same academic discipline.

As always, we’d love to know what you think about this post, so please feel free to comment with any feedback/questions you may have. Please get in touch with us if you’d like to request an API key, and visit this page for more information about embedding the Altmetric badges. Thanks for reading!

You’re an Academic Librarian or Research Officer, and you support researchers in making their research freely available online via open access. A major part of this process is enabling researchers to deposit open access versions of their work in your institutional repository or research publications system. So how do you encourage your researchers to deposit full-text papers? Do you focus on advocating the societal benefits of increased access to research during presentations and training? Perhaps you share repository download statistics to demonstrate usage? Or implement tools to encourage high open access deposit rates? Do funder open access policies also feature high on your agenda?

Let’s talk about funder open access mandates for a moment. Internationally, there’s a big focus on open access compliance across Higher Education institutions: with the UK HEFCE Open Access Policy kicking off in 2016, a number of funding agencies elsewhere are adopting similar policies. Many funding agencies allow an embargo period before a research output is made available via open access. For example, the recently announced Canadian Open Access policy requires outputs either to be made available via open access in an institutional repository within 12 months of publication, or to be published in a journal that offers free access within 12 months. Such policies create an additional driver for institutions to enable researchers to deposit their work within given timeframes.

Researchers, however, are faced with an increasing number of mandates, so it’s important to encourage OA deposit in a way that gives something back to researchers.

Here are our ideas about how you can use Altmetric tools to encourage researchers to make their work freely available online by depositing in your institutional repository, and what’s in it for them:

Install the free Altmetric Badges in your institutional repository

We offer free embeddable Altmetric Badges for institutional repositories. This helps researchers track attention to specific papers – so if a paper has been deposited in your repository, researchers can monitor attention to that output via the repository record. This might be displayed alongside other metrics, such as repository downloads or citation counts. See this example of an embedded Altmetric badge in a University of Glasgow Enlighten record.

 Let us know your repository domains so we can make sure we’re tracking your IR content – just email support@altmetric.com.
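If you’re wondering what the embed itself involves, here’s a rough sketch of the kind of markup a repository page template needs: the badge script plus one placeholder element per DOI. The script URL and data attributes below follow the embed pattern documented on our badge pages at the time of writing, but treat them as illustrative and copy the exact snippet from the embed documentation when you roll it out.

# Generates illustrative Altmetric badge markup for a repository page template.
# NOTE: the script URL and attribute names follow the documented embed pattern,
# but should be verified against the current badge documentation before use.
EMBED_SCRIPT = '<script type="text/javascript" src="https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js"></script>'

def badge_for(doi, badge_type="donut"):
    """Return a placeholder element that the embed script turns into a badge."""
    return '<div class="altmetric-embed" data-badge-type="{}" data-doi="{}"></div>'.format(badge_type, doi)

# Example: embed markup for two placeholder DOIs held in a repository record.
print(EMBED_SCRIPT)
for doi in ["10.1234/placeholder.one", "10.1234/placeholder.two"]:
    print(badge_for(doi))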

Altmetric for Institutions: Syncing with your Publications System

You can also sync your publications system or institutional repository with Altmetric for Institutions. When we populate Altmetric for Institutions, we often use your publications system as a data source. This is an extra incentive for researchers to deposit the metadata and full-text of their papers: by simply depositing the metadata, their papers will appear in Altmetric for Institutions, and researchers can then track the attention to all of their papers in one place. Badges can also be embedded in Research Information Management or Current Research Information Systems (CRIS) – see our example of Altmetric badges in Symplectic Elements, which helps raise awareness of altmetrics within existing workflows.

Register for a free Librarian Altmetric Explorer account

If you’re an Academic Librarian, you can also register for a free Altmetric Explorer account. Although these free accounts don’t have all of the functionality of our full institutional platform, you could analyse the data for all the papers for which we’ve ever tracked attention, or compare the altmetrics of a small set of your own open access papers with those behind paywalls. Do you see any trends? Are there any open access success stories to share across the institution?

Open access altmetrics advantage?

Here at Altmetric, we think our tools can help institutions encourage open access compliance, and we’re particularly interested in the correlation between open access papers and altmetrics attention. In Euan’s recent blog post, we found that open access papers from a set of Nature Communications articles generated significantly more tweets and Mendeley readers than articles behind paywalls. We also know that open access journals perform well when looking at average attention per article, e.g. papers from PLOS ONE, the BioMed Central series and Scientific Reports. 37% of papers in the Altmetric 2014 Top 100 list were available via open access. So there is anecdotal evidence to suggest that open access papers attract more attention across the sources we track, but we’d love to see more research in this area.

We want to hear how you have used Altmetric to encourage researchers to share their work via open access. Have you embedded the badges in your repository? How do you share altmetrics best practice with your researchers? Get in touch!

Altmetric Product Specialist Ben McLeish recently attended an Altmetrics workshop hosted at the Carol Davila University Bucharest. Here he reports on what took place and what he took away from the day:

On February 18th 2015 Carol Davila University Bucharest held an event dedicated to the emerging field of alternative metrics. The event was organized and introduced by Cristina Huidiu, one of Carol Davila’s librarians and a metrics expert in her own right. You can catch up on comments from the event on the #altmetricsWHO hashtag on Twitter.

The four-hour overview of the now burgeoning and varied field included appearances by Mike Taylor of Elsevier, who is midway through a PhD on the subject; Peter Porosz, also of Elsevier, who was there to present an overview of SciVal’s latest developments; and Victor Velter, a Romanian government official who has been active in the field of scientometrics for some twenty years. Victor spoke at great length (although, unfortunately for me, in Romanian) on his view of metrics and scholarly work. I was also in attendance to give an overview of Altmetric’s own institutional offering.

By way of introducing the event, Cristina gave an overview of what traditional scholarly metrics are. Article citation counts, impact factors and the h-index were all explained, along with many others. Mike then gave an illuminating talk on the field of alternative metrics. He explained what they are, and what his work at Elsevier has allowed him to research in the last few years. I followed him, and took the audience through what Altmetric has been doing to gather data from news, blogs, social media, grey literature and many other sources, and to aggregate it not only at the article level but also at the institutional level.

Even though the adoption of altmetrics for institutional research evaluation is presently in its early stages in Romania, the questions from the audience following my talk were well-focused.

“Alternative metrics are only alternative to the existing methods of quantification – these aren’t complete replacements for traditional metrics.”

How, asked one audience member, are we to allow altmetrics to gain traction as an accepted method of research evaluation when authors are still required to publish in high-impact journals (which are high-impact on account of traditional metrics such as the impact factor and citation counts)? I explained that it’s important to remember that alternative metrics are only alternative to the existing methods of quantification – these aren’t complete replacements for traditional metrics. Instead, we are filling a gap between publication and the ultimate citations and impact factors which will only be available in years to come. This gap exists now, and so altmetrics are already useful to tell a story about the broader impacts and conversation surrounding a piece of research.

“We are filling a gap between publication and the ultimate citations and impact factors which will only be available in years to come.”

Peter Porosz of Elsevier began his SciVal talk with a very entertaining video by Hungarian visual artist Istvan Banyai, whose animated piece “Zoom” is a meditation on shifting context. Check out the five-minute animation on YouTube.

One of the most powerful aspects of altmetrics is the ability to gather more information about the context of the discussion surrounding an article, rather than just flat numbers devoid of detail. We have previously blogged about articles gaining unwanted attention, demonstrating that a high Altmetric score doesn’t imply higher quality in a piece of research; in fact, it can sometimes signal quite the opposite. This angle of context really helped the audience understand that altmetrics are more than just numbers: they are the organized effort to present and allow interpretation of the metrics produced when users interact with research online.

The event gathered a lot of very positive feedback from those who attended. It is always a real delight to see growing interest within the scholarly community in new forms of engagement with the public, and with distributed and open data.

If you are organizing your own event, be sure to let us know so we can spread the word!

Many of you will be familiar with the Altmetric donut, often to be found nestling on article details pages across many publisher platforms both large and small. The data provided there gives authors and readers some great insight into who is talking about the article, and can be useful to them in identifying new content to read or new communities to engage with. But how are the publishers themselves using Altmetric data, and what value do they get from it?

The (kind of) short answer is: in lots of different ways we would have never even thought of. Many of our publisher partners have reported back to us that they are not only using the Altmetric data to keep an eye on the levels and types of attention their articles are getting, but also applying this extensively across both their internal and external activities. In this post we’ll take a look at some of the use cases we’ve come across so far, but do let us know by email or in the comments below if you have anything to add.

The challenge
The obstacles publishers face today are in many ways very similar to those they have always faced: competing for library budgets, adding value for users, producing the best content, and understanding their audience and markets.

New Open Access mandates and guidelines have meant that many publishers have had to adapt their traditional models and work to attract new authors, whilst at the same time ensuring usage remains high on content published under the subscription model. This new era of research dissemination and a focus on public engagement offers an opportunity: for publishers to align their goals more closely with the institutions and academics they serve, and to proactively demonstrate this.

The solution
An ever-growing list of metrics provides publishers and their customers with more insight than ever before into the content they publish and how it is being used. The bigger challenge, for many, is determining which of those data are most useful, and applying them effectively.

Download counts, impact factors and citation stats will get you so far, but actually tell you little about why someone is reading a paper, what they think of it, or even who they are. This is where Altmetric and our data can help. We surface not only the number of mentions of an article from a particular source, but also the content of that mention, who made it, and what they were saying about the research.

Here are some of the ways publishers are using Altmetric data (and in particular the Altmetric Explorer reporting tool) across their teams to help drive their strategy and better serve their stakeholders:

 

As part of their editorial activities

Reporting back to society partners, or providing evidence of success to potential collaborators
With the Altmetric Badges embedded in article pages, all users can click through to see the original mentions of each paper. Even better, with the Altmetric Explorer you can explore the entire Altmetric database and view all of the articles from a particular journal or set of journals in one go – allowing you to easily benchmark between articles and report on performance month by month. This is great for when you need to show society publishing partners the reach and attention that content published in a journal has received (and how your journal is doing compared with competing titles).

Identifying high profile authors to attract
With the Altmetric Explorer, you can easily search and report on not only articles in your own journals, but also those of similar titles from other publishers. By filtering the data you can identify authors publishing in other journals who are getting a lot of attention – and because we show you not just how many people are talking about an article but also what they’re saying, you can ensure that it’s attention for the right reasons.

Repurposing existing content
Using altmetrics data to understand which of your own content is popular can be really useful. We spoke last year to MIT Press, who had done exactly this. They looked at the Altmetric mentions for all of their articles to help narrow down which content it might be good to include in their new ebook series. Altmetric data was of course not the only factor involved, and a team of editors had the final say, but it gave them a good starting point and enabled them to easily see which content might have a broader appeal.

Enriching author feedback and reporting to encourage future submission
As more and more publishers switch to an Open Access author-pays model, attracting and building relationships with authors is more crucial than ever. When Wiley first launched Altmetric Badges on their articles they ran a survey to see what their authors and readers thought of the data provided. The result? Nearly 80% of those that took the survey agreed that article metrics enhanced the value of a journal article. More pertinently, nearly 50% said that they were more likely to submit to a journal that provided article-level metrics, with over 75% reporting that they were using the information provided to discover and network with researchers working in the same area.

Gathering evidence for future strategy decisions
Knowing which of your content is getting the most attention, and where, can be invaluable when it comes to forming future editorial strategy – as can monitoring the online activity happening around articles in competitor journals. Altmetric data can be useful for determining hot topics for special issues, or help you identify where in the world you should consider doing additional author and reader outreach.

 

In their marketing

Uncovering and further promoting hot topic content
Timing is crucial for promoting your content – once something is published you have a timespan of maybe a few months in which to really make the most of it. But other than editorial guidance, what can help you determine which content is worth promoting? Citations take too long to accrue, and downloads offer only one dimension of the full picture. By monitoring the Altmetric data you can quickly and easily identify which articles are proving popular and turn that insight into a content discovery collection – ‘if you liked this, then you might like…’.

We’ve seen quite a few publishers so far using the Altmetric data as a way of generating top 10 lists of articles by journal or discipline that have been attracting a lot of online discussion and attention (see this example from Nature and this example from Elsevier) – a great way to showcase content that may not otherwise have been promoted.

Assessing the impact of marketing and PR activity
Monitoring the effects and ROI of your marketing activity is tricky at the best of times – and even more difficult is communicating the benefits of outreach work to your authors. Surfacing Altmetric data helps them and you to immediately see the results of your efforts, without needing to pull together lengthy reports. Collating all of the attention for an article in one place means you can easily see who is retweeting you, which press have picked up the article, and which bloggers are writing about it.

Identifying new contacts and channels to engage with
Along with monitoring the success of existing campaigns, the Altmetric data can also help you identify gaps and new opportunities. Perhaps there’s a blogger or a particular news outlet that it would be beneficial to build relationships with – or even an entire user group that you weren’t aware of with an interest in the research. Take a look at similar articles from other publishers to see where they are getting most of their engagement, and work out how you can improve your existing plans.

Support existing authors and institutional customers
Helping your authors understand where their work is getting attention and who is engaging with it is a great way to encourage them to undertake more of this kind of activity themselves. Even if they don’t want to join in the conversation, it’s beneficial for them to at least be able to see who is talking about their work, and what’s being said.

The data is also useful for demonstrating to institutional customers how much attention the research their faculty have published with you has attracted, and might even provide them with additional insight that they can in turn use for funding or to support career development.

 

To support sales objectives

Identify new markets
Publishers are already analysing turn-aways and paying close attention to who is downloading their content, but what about all the other people who are sharing and commenting on it? With Altmetric data you can get a better understanding of who is playing a key part in the online networks that surround your content – and this extra insight can help uncover new audiences who have an interest in the content but might not yet be subscribers.

Understand networks
Paying attention to how your content is being disseminated and shared online is an increasingly important part of any publisher strategy. It might become obvious that there is a key portal by which you should be making your content available, or it might even be a chance to become more closely involved with the institutional environment, and therefore better understand how you can work alongside your customers to help them achieve their ideal outcomes.

 

If you’re interested in what’s been discussed in this blog post you might like to watch our short video to learn more about what Altmetric offers for publishers, or do get in touch if you’d like to talk to us about any of our data or tools.


Welcome to The High Five research articles of the month!
Image: NASA Goddard Space Flight Center, Flickr.com

Welcome to the February 2015 High Five here at Altmetric! In this blog post, my first for Altmetric, I’ll be leading you on a tour of the top 5 peer-reviewed scientific articles this month according to Altmetric’s scoring system. On a monthly basis from here on out, my High Five posts will examine a selection of the most popular research outputs that Altmetric has tracked attention for that month.

To tell you a bit more about me, and what this ‘High Five’ bit is all about… My name is Paige B. Jarreau, and I’m a PhD candidate studying science communication. (My blog From The Lab Bench is over at SciLogs.com where I am also the blog manager, if you want to check it out). I also have a Master’s degree in Biological Engineering. So my expertise is wide-ranging. In these posts you might find me commenting in-depth on molecular biology and nanotechnology on the one hand, and psychology and communication on the other. But the point with this High Five series isn’t for me to interpret and summarize the top papers of the month at Altmetric – there are plenty of scientists and journalists doing that already. The point is to look at the published research each month that is resonating with readers, journalists and bloggers, and to explore what they are saying about this academic work and perhaps why they were inspired to tweet about it, blog about it, write about it or visualize it. So, let’s go!

#1. Marine Trash

The first of the High Five is a research report published this month in Science, titled “Plastic waste inputs from land into the ocean” and authored by Jenna R. Jambeck and colleagues.

“Considerable progress has been made in determining the amount and location of plastic debris in our seas, but how much plastic actually enters them in the first place is more uncertain. Jambeck et al. combine available data on solid waste with a model that uses population density and economic status to estimate the amount of land-based plastic waste entering the ocean. Unless waste management practices are improved, the flux of plastics to the oceans could increase by an order of magnitude within the next decade.” – Editor Summary, Science

The New Zealand Herald headlined with “More than 25,000kg of plastic littered in NZ daily,” writing that according to this new study, between five and 13 million tonnes of plastic waste wind up in the world’s oceans every year. The more precise estimate according to the Science study authors is 8 million tonnes of marine plastic a year. According to data from Jambeck and colleagues, the “worst offenders” in terms of dumping plastic into the oceans are China, Indonesia, Vietnam and the Philippines. The US also makes the list of “worst offenders,” ranking 20th, as shown in a graphic produced by The Guardian.

“Coastal populations put about 8m tonnes of plastic rubbish into the oceans in 2010, an annual figure that could double over the next decade without major improvements in waste management efforts, scientists warn.

The mountain of plastic litter, including bags, food packaging and toys, was equivalent to five full shopping bags of debris for every foot of coastline bordering nearly 200 countries the team studied.” – The Guardian

“This input of plastic waste to the oceans is several orders of magnitude more than we can see, which means there’s a lot of plastic out there that we are not finding,” study author Jenna Jambeck told Ian Sample at The Guardian.

This study seemed to resonate with scientists, news makers and citizens alike. For one, estimates of marine litter have until now been very coarse. The documentation of ocean-bound waste streams reveals an alarmingly clear picture of what has been classically mysterious to us as our trash “disappears” every day from the bins at the end of our driveway or from the sides of our streets.

“I take photos of the way people manage waste all over the world,” Jenna Jambeck told NPR. “I take pictures of garbage cans. I met my husband at the landfill. … At least he understands me.”

Beautiful Bali. "Plastic bags, bottles and other trash entangled in spent fishing nets litter the potentially beautiful Jimbaran beach on the island of Bali, Indonesia." Image by killerturnip, Flickr.com

Beautiful Bali. “Plastic bags, bottles and other trash entangled in spent fishing nets litter the potentially beautiful Jimbaran beach on the island of Bali, Indonesia.” Image by killerturnip, Flickr.com

Some news outlets covered the story with a “blame” frame, mostly highlighting the “worst offenders” data published by Jambeck and colleagues. Other outlets, including Vox, took the story in a “where does it all go?” direction, which I believe is the better question to leave scientists and publics with.

“A separate study last year in the Proceedings of the National Academy of Sciences identified massive swirling garbage patches in each of the world’s oceans that contain up to 35,000 tons of plastic.

Yet those patches accounted for less than 1 percent of the plastic thought to be in the oceans — and no one quite knows where the other 99 percent went. One possibility is that marine creatures are eating the rest of the plastic and it’s somehow entering the food chain. But that’s still unclear.” – Brad Plumer, Vox

The bottom line? Reduce, re-use and recycle your plastic products. These things don’t “disappear,” even if we’re not quite sure where they are going.

#2. Twitter Language Patterns Reveal Heart Disease Factors?


Image by Kooroshication. Flickr.com

The second paper in our High Five list is an interesting study that uses language expressed on Twitter to characterize community-level psychological correlates of heart disease. “Language patterns reflecting negative social relationships, disengagement, and negative emotions—especially anger—emerged as risk factors; positive emotions and psychological engagement emerged as protective factors” (Eichstaedt et al., 2015, Psychological Science). This study got lots of attention on Twitter (of course!) and in online science news outlets. Pacific Standard headlined with “Happier Tweets, Healthier Communities: New research finds county-level mortality from heart disease can be accurately predicted by analyzing the emotional language of local Twitter users.”

“Measuring such things is tough, but newly published research reports telling indicators can be found in bursts of 140 characters or less. Examining data on a county-by-county basis, it finds a strong connection between two seemingly disparate factors: deaths caused by the narrowing and hardening of coronary arteries and the language residents use on their Twitter accounts.” – Tom Jacobs, Pacific Standard

Guess which Twitter-based factor emerged as a protective factor against heart disease mortality in the study? Engagement, or participation in community activities. Apparently, community influences play more of a role in predicting heart disease mortality than the directly expressed sentiments on Twitter. Twitter users are young (median age is 31 years old), but the sentiment of young people’s tweets says something about the nature and “health” of local communities, which in turn says something about heart disease mortality. So it’s not as if tabbing over to Twitter right now and typing in a bunch of hostile words will make your blood pressure rise (although it might make your followers’ blood pressure rise!).

David Glance points out in an article at The Conversation that “[w]hat this study definitely does not say is that being angry on twitter will lead to, or is anyway related to, heart disease, which has been the unfortunate suggestion in the selling of the study in news reports.”

“The combined psychological character of the community is more informative for predicting risk than are the self-reports of any one individual.” – Eichstaedt and colleagues, 2015

It is very important to point out that this study is based upon correlational data and can’t say anything about causation or prediction of heart disease at the individual level, as some news headlines suggested. A few scientists pointed this out. Glance highlighted the misleading nature of several graphics accompanying news stories about this study, which make the Twitter sentiment data look like it has more predictive power than it actually does.

“The study is interesting in that it suggests that a specific communities’ makeup can be identified by the language that they use on social media. This in turn may be an indicator of other factors such as the specific community’s health and well-being and the consequences of that health and well-being.

However, as attractive as those suggestions may be, the study showed weak correlations and gave very little underlying support for any particular theory of “causation”. So as much as they could speculate, the researchers could really not say why they saw these statistical results. The paper actually highlights some of the fundamental problems with so-called big data where large numbers distort statistics to point at imagined relationships.” – David Glance, The Conversation

 #3. Using Your Smartphone to Detect Infectious Disease

The next High Five article takes mobile apps to a whole new level. In a February 2015 report in Science Translational Medicine, researchers describe “a smartphone dongle for diagnosis of infectious diseases at the point of care,” a lab-on-a-chip technology. It’s easy to see why this report captured readers’ imaginations. Smithsonian magazine ran the headline “$34 Smartphone-Assisted Device Could Revolutionize Disease Testing: A new low-cost device that plugs into a smartphone could cut down on expensive lab tests.”

This video from the Sia lab (Samuel K. Sia was the PI on this research) shows how the device works: http://youtu.be/TC9XNqSgj4w

“All it takes is a drop of blood from a finger prick. Pressing the device’s big black button creates a vacuum that sucks the blood into a maze of tiny channels within its disposable credit card–sized cartridge. There, several detection zones snag any antibodies in the blood that reveal the presence of a particular disease. It only takes a tiny bit of power from the smart phone to detect and display the results: A fourth-generation iPod Touch could screen 41 patients on a single charge, the team says.” – Nicholas Weiler, Science Mag

#4. Forever Young

Our fourth stop is a study titled “Forever Young(er): potential age-defying effects of long-term meditation on gray matter atrophy,” published in the journal Frontiers in Psychology in January 2015.

“Last week, a study from UCLA found that long-term meditators had better-preserved brains than non-meditators as they aged. Participants who’d been meditating for an average of 20 years had more grey matter volume throughout the brain — although older meditators still had some volume loss compared to younger meditators, it wasn’t as pronounced as the non-meditators.” – Alice Walton, Forbes

This study received quite a bit of Twitter and blog attention, especially from members of the public and psychology bloggers. The study didn’t seem to attract much attention from science communicators. NeuroscienceStuff at tumblr.com provides a breakdown of the study here.

“The researchers cautioned that they cannot draw a direct, causal connection between meditation and preserving gray matter in the brain. Too many other factors may come into play, including lifestyle choices, personality traits, and genetic brain differences.” - NeuroscienceStuff

#5. Martian Mystery Cloud


Grupo Ciencias Planetarias (GCP) – UPV/EHU, W. Jaeschke, D. Parker

What better way to start off a popular science story than with a tale of a mysterious Martian cloud? A letter in Nature published this month reports “the occurrence in March and April 2012 of two bright, extremely high-altitude plumes at the Martian terminator (the day–night boundary) at 200 to 250 kilometres or more above the surface, and thus well into the ionosphere and the exosphere.” According to Eric Mack writing at Forbes, “A strange cloud or haze rising extremely high above the surface of Mars has astronomers struggling to come up with an explanation for the phenomena that fits with the existing scientific understanding of the Martian atmosphere. [...] The phenomena extended 250 kilometers above the surface and was the size of a major tropical storm on Earth.”

“Such record-breaking feature defies our current understanding of processes on Mars atmosphere.” - Grupo de Ciencias Planetarias

George Dvorsky pondered several theories for the occurrence of these plumes in his io9 article, as did Christian Schroeder at Discover’s D-brief. Did a volcano or asteroid impact cause these clouds?

 Did other scientific research articles inspire you this month? Share them here or tweet them at us! Find Altmetric on Twitter @altmetric. Find me on Twitter @FromTheLabBench.

Can data emerging from media ‘mentions’ of research provide timely indicators of outcomes linked to social and technical innovation for stakeholders in networks beyond academia? This was the focus of a study released this week by the consultancy arm of Digital Science.

The study, funded and commissioned by Nesta, used Altmetric data to examine and identify how online mentions of research might offer insight into the broader impact of a piece of academic work.

Focussing particularly on outputs in medicine, the study acknowledges that there are professional, academic and social motivations for mentions, and recognizes that the challenge is to discern patterns and concentrations that capture these factors. The authors ask:

“Do mentions have innovative value in communicating impact beyond the academic world?”

They propose that there are two key types of communities who regularly engage with published research online, and the characteristics of each can be distinguished by their motivations:

Impact in communities of practice. Rapid and accessible communication of innovative research outcomes relevant to practitioners and professionals in the health sector is also of value to research users and managers. We interviewed a range of experts but found no clear characteristics of research publications with economic, social or professional – rather than academic – impact. Analysis confirmed many motivations for research mentions and highlighted their communication potential, but found no consistent view as to why some articles get mentioned frequently.


Impact in communities of interest. Patients, carers and supporters of disease charities represent a network that wants to look at, understand and communicate research about new treatments. Statistical analysis showed more mentions were given to papers associated with diseases tackled by charities with larger research funds. Cardiovascular disease receives more attention than its charitable research spend suggests, however, whereas spend on Immune and Musculo-skeletal diseases is high but media mentions are relatively low.

The study observes that health and clinical networks can enable timely, rapid and ‘serious’ media communication of innovative research with non-academic social and professional stakeholder benefits, but they may need key people as active nodes to engage them.

Click here to access the full report online.

 

 

We recently announced that over 20 academic institutions around the world have adopted the Altmetric for Institutions platform to help them track and report on the attention surrounding their research outputs.

Here we talk to Scott Taylor, Research Services Librarian at the University of Manchester, one of the early adopters of the Altmetric platform.

Hi Scott, thanks for taking the time to talk to us today. Could you tell us a little bit about your role within the university?
Of course; I work as part of the Research Services Team at the University Library. One of our roles is to provide research tools to the entire institution – with the aim of identifying and implementing those that can provide the most benefit to our researchers.


So how did you first become aware of altmetrics, and what motivated you to find out more?
I first became aware of altmetrics via conferences and industry newsletters. It’s important in my role that we stay up to date with the latest opportunities in research management and assessment, and I was particularly interested in what tools and data there might be that we could use across the institution.

What kinds of institutional goals are you hoping to achieve?
We and our researchers spent a lot of time collating and summarizing the attention and engagement our research generates as evidence to submit to the 2014 Research Excellence Framework (REF), which takes place every six to seven years in the UK. It’s important for us to be able to demonstrate this broader impact and show how our work is having an influence on society as a whole. We’re always aiming to make this process easier for our researchers, and this goal more attainable as a result.

Added to that, there are a lot of public outreach and other activities that go on at Manchester – and it can be hard to keep track of what is happening where, or to get any idea of how effective it is. We wanted to validate efforts and provide our researchers, administrative staff, and communications and marketing colleagues with qualitative feedback on the outcomes of their strategies.

And what challenges do you face?
As I’ve mentioned, for research and administrative staff, time is always a massive constraint in terms of what we can realistically achieve. Ultimately, this means that core activities will always get prioritised, and any evaluation or planning for improvement can be pushed aside until it becomes vital (when submitting to the REF, for example). We wanted to provide a reliable system that would make it easy and quick for people to regularly monitor and report on their wider impact, and one which would offer valuable and relevant data that they would benefit from.

“Not only does having all of this data in one place save us time, but it offers a lot of interesting indicators of impact and qualitative data that we would not necessarily have gathered elsewhere.”

How do you hope the Altmetric for Institutions platform can help you realize your objectives?
We’ve set up site license access to the Altmetric for Institutions platform, and are rolling this out across the institution via training and support from the Altmetric team. Already we’ve identified a number of use cases – not only does having all of this data in one place save us time, but it offers a lot of interesting indicators of impact and qualitative data that we would not necessarily have gathered elsewhere. This includes things like where our research has been referenced in policy documents, for example, and not only where in the world our authors’ research outputs are getting attention, but what people are actually saying about them.

Could you tell us a bit about your experience of using the platform so far? Are there any particular scenarios you can see it being particularly useful for?
Our experience so far has been really positive. The implementation process was fairly straightforward, and we had good support from the Altmetric team throughout. We’ve worked closely with our Academic Engagement Librarians to ensure they have a good understanding of the platform and the data it provides – so that they are well equipped to go out and introduce it to our researchers.


A particular feature that we really like on the platform is the summary reports. We can see at a glance how much attention all of our research has attracted over the last week, the last month, or even the last year or few years. It helps us identify which channels are the main drivers of traffic to our content, and which researchers or departments have been particularly successful in engaging with a broader audience.

We’ve also been busy setting up automated alerts to notify us when research from a department gets new attention in one of the sources tracked by Altmetric, which we can then highlight to authors and departmental heads to provide them with further insight.

What has the feedback from your researchers been like?
It’s been mostly positive! They’re still getting to grips with altmetrics but already I’m hearing how much better it is now that they can explore this kind of data consistently across all of their published outputs in one place. Quite a few of them have been surprised and pleased to discover coverage for their work that they were unaware of, or have identified new communities and stakeholders to engage with. I think it makes them feel like they have a lot more control over how their work is disseminated and represented. They don’t have to join every conversation or respond to every comment, but if they wanted to, having this platform makes it much easier to uncover those conversations.

“Quite a few [of our researchers] have been surprised and pleased to discover coverage for their work that they were unaware of, or have identified new communities and stakeholders to engage with.”

Thanks so much for taking the time to share your experience with us – just lastly, what do you have in mind for altmetrics at Manchester in future?
We’re hoping that the Altmetric platform and data might be particularly useful for our early-career researchers – who are perhaps not yet often cited (or haven’t had the chance to be) but are having an impact in other ways. More generally, we’re keen that the Altmetric platform becomes a tool for discovery – either of new content, new potential collaborators, or just new communities to engage with. Our researchers are doing some great work and we want to help them demonstrate that and form the most effective strategies for getting the most out of their academic outputs.

A couple of weeks ago we released a post announcing a new series of entries on how researchers can use Altmetric and altmetrics to enhance their scholarly workflows. In this initial overview, we look at a variety of different use cases to identify how and where researchers can gain value from altmetrics data. Before we get into that, we thought it would be useful to quickly recap exactly what altmetrics are:

Briefly defined, alternative metrics, or altmetrics, are indicators of impact and engagement that extend far beyond traditional methods of influence measurement. Whereas to date academics and those involved in scholarly communication have looked primarily to impact factors and citation counts to assess the success of a piece of research, altmetrics provide a much more granular level of insight – both at the research output level and, crucially, beyond the academic sphere.

At Altmetric, we track a defined list of sources (including public policy documents, mainstream and social media sites, post-publication peer review forums, and online reference managers) for shares and mentions of research outputs (that’s published articles, datasets, images, and things like white papers and grey literature). We disambiguate between different versions of the same output (the publisher and the institutional repository version, for example), collate all of the attention data together, and surface both a count of mentions per source and the underlying qualitative data (the actual text of the original mentions) via the Altmetric donut visualization and Altmetric details page.
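To make that more concrete, here’s a small sketch in Python that pulls the per-source counts for a single (placeholder) DOI from our free public API. Rather than assuming particular field names, it simply prints every per-source counter present in the response; the API documentation remains the definitive reference for what each field means.

import requests

def per_source_counts(doi):
    """Fetch the Altmetric record for a DOI and return its per-source mention counters."""
    response = requests.get("https://api.altmetric.com/v1/doi/" + doi)
    if response.status_code != 200:
        return {}  # the output isn't tracked, or there was a temporary error
    record = response.json()
    # Per-source counters in the response follow a "cited_by_..._count" naming
    # pattern; we report whatever is present rather than hard-coding field names.
    return {key: value for key, value in record.items()
            if key.startswith("cited_by_") and key.endswith("_count")}

# Placeholder DOI -- replace with one of your own outputs.
for source, count in sorted(per_source_counts("10.1234/placeholder.doi").items()):
    print("{:40s} {}".format(source, count))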

So, now that we’ve got all the data together, here are some of the ways we’re seeing researchers put altmetrics to use:

Monitoring and tracking early engagement
Collated altmetrics mean you can identify how your work is being received immediately after publication, and respond to the conversation. By viewing online media attention, researchers can easily see attention from people and places they may not have been aware of previously.

This real-time data is particularly useful for early career researchers with a smaller body of work and a lower citation count, or those who are seeking to report back on their impact for a deadline soon after publication. It also presents an opportunity: whereas in the course of traditional citation and referencing methods the original author has little or no say in what context their work is represented, altmetrics offers stakeholders a chance to see who is saying what about a piece of research, and to respond accordingly.

@altmetric showed me that the BBSRC blog actually cites our CHEMINF paper http://blogs.bbsrc.ac.uk/index.php/2011/10/oxford-nanopore-technology-ibers/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+bbsrc+%28BBSRC+-+News+stories+and+features%29 … (would never have seen that otherwise)” – @egonwillighagen

Tools like the free Altmetric bookmarklet can save you a massive amount of time, enabling you to view all of the data simultaneously, rather than checking multiple platforms and trying to accumulate the relevant online attention piece by piece. When combined with citation data, this information can help researchers to understand social impact and engagement as well as academic recognition.

Showcasing Impact
By understanding, analysing and interpreting their altmetrics data, researchers can add interesting evidence of broader impact to funding applications and reports.

We’ve already seen a lot of activity that speaks to this need on the part of research funding bodies, who are attempting to position themselves as thought leaders in terms of narrowing the gap between research and the public.

According to the Research Excellence Framework, grant funders are looking for proof of “broader impacts”, while The Royal Netherlands Academy of Arts and Sciences defines the “primary aim” of their SEP assessments as “to reveal and confirm the quality and relevance of the research to society”. In a short film on the Wellcome Trust’s funding website, Professor Jeremy Farrar said: “researchers don’t exist in a bubble outside of society… to cocoon ourselves is no longer viable”.

This would suggest that if researchers can demonstrate that they have adopted the relevant practices to demonstrate the public visibility of their work, their applications and reports are likely to be stronger.

We’ve seen examples of this being done numerous times – you might like to take a look at this recent blog post, in which researcher Fernando Maestre details how he has used Altmetric data in his funding proposals.

Researchers can also boost their online profiles by adding data to personal websites, faculty profile pages, and CVs – as Marine Ecologist James Grecian has done on his publications page.


Having to show “evidence of research engagement” to their institutions can seem less of a chore if they can start hosting the data in one easy step. Click here for instructions on how to embed the free version of our donut badges on your own page.

Altmetrics for Discovery
Using altmetrics, researchers can find new and interesting content, identify and follow new research trends, and contribute to the conversation around the work of others as well as their own. Being able to see who has tweeted or blogged about your work allows you to engage with a community of researchers outside the walls of your own institution.

“Only just discovered #altmetric. Very proud to discover that all of my (few) papers have been tweeted about at some point!” – @Frances_Tyler

This means you can start communicating with an online network of academics with similar research interests, which could potentially lead to new collaborative research partnerships. By identifying influential figures in their field through social media, researchers can adapt future outreach strategies for disseminating research accordingly.

Altmetrics can also be useful for identifying which journals in your field have effective outreach strategies for promoting their research. Take a look at other articles in your field to see which ones are getting the most attention, where they were published, and what channels or outlets have been used to disseminate the content. Equally, if you’re teaching, altmetrics data can be useful for identifying which articles are particularly noteworthy or might be good for your students to read as an example of engaging content.

Next month, we will be analyzing some more specific examples of how academics have used Altmetric data on personal web pages to showcase their impact. We will also be publishing some Q&A responses from altmetrics-savvy researchers, to provide a first-hand perspective on the benefits of altmetrics data.

We’d love to hear any ideas or experiences you might have on how researchers can get the most out of altmetrics – feel free to share any comments below or drop us a line to discuss further.

Want to know more? Join our free webinar at 3pm ET/8pm GMT on February 18th to hear direct from researchers how they have been putting altmetrics to good use. Click here to register.

Since its launch in 2001, Wikipedia has quickly grown to become the largest electronic encyclopedia in the world. The English version alone contains over 4.7 million articles, and as of February 2014 there were nearly 500 million unique users and over 73,000 active editors.

Within those millions of articles exist hundreds of thousands of links to academic research. Publishers, institutions and researchers are increasingly moving to leverage the exposure and traffic that a reference on Wikipedia can generate for their content. Although the value and relevance of that traffic is sometimes debated, Wikipedia for its part enforces strict editorial guidelines to try to ensure that quality and standards are consistent across articles, and that over-zealous posters aren’t given undue weight.

There’s a long-standing connection between Wikipedia and altmetrics – Dario Taraborelli at the Wikimedia Foundation is one of the authors of the altmetrics manifesto, and he’s doing a lot of great work with groups like CrossRef to make Wikipedia data available for altmetrics tools and research projects, and to set standards around its use and presentation.

Adding Wikipedia to the list of sources we track has been a goal of ours for a while, and in the course of working on an exciting new project with Springer, one of our customers (more on that at a later date), we had the chance to spend some time on it. We’re really pleased to be able to announce that any mentions of articles or other academic outputs in Wikipedia will now be reflected in a new ‘Wikipedia’ tab on the Altmetric details page.

The new ‘Wikipedia’ tab on the Altmetric details page

In order to capture these mentions, the academic output being mentioned must be referenced with a proper Wikipedia citation template – see http://en.wikipedia.org/wiki/Wikipedia:Citation_templates for information on how these work. We capture that as a mention and add it to the details page for that output. Users can then click on the heading to be taken to the original Wikipedia article, on the username of the person who added the mention to see their profile on Wikipedia, and on the date stamp to view the edit record for that article:

A Wikipedia mention on the details page, linking to the Wikipedia article, the editor’s profile and the article’s edit record

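To make the citation-template requirement concrete, here’s a minimal sketch of how a DOI might be pulled out of a {{cite journal}} template using the open-source mwparserfromhell parser. This isn’t our actual pipeline (more on that below), and the article text and DOI are invented purely for illustration:

import mwparserfromhell  # open-source MediaWiki markup parser

# Invented snippet of wiki markup containing a citation template
wikitext = (
    "Recent studies have examined this question. "
    "{{cite journal |last=Doe |first=J. |title=An example study "
    "|journal=Journal of Examples |year=2014 |doi=10.1234/example.5678}}"
)

parsed = mwparserfromhell.parse(wikitext)
for template in parsed.filter_templates():
    name = str(template.name).strip().lower()
    # Citation templates are typically named "cite journal", "cite book", etc.
    if name.startswith("cite") and template.has("doi"):
        print(str(template.get("doi").value).strip())  # -> 10.1234/example.5678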
Because we’re looking for lots of different identifiers and have some constraints around auditability – every mention we collect has to have an author and a timestamp – we wrote our own system to get the data out in the format we need, but we’re hoping to share tips, approaches and data with others in the community.
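To illustrate that auditability constraint, a collected mention needs to look something like the record below. The field names here are illustrative only, not our actual internal schema – the point is simply that every mention carries the editor who added it and the time of the edit:

from dataclasses import dataclass
from datetime import datetime

@dataclass
class WikipediaMention:
    output_identifier: str  # e.g. the DOI of the mentioned research output
    wikipedia_page: str     # title or URL of the citing Wikipedia article
    editor: str             # username of the editor who added the reference
    edited_at: datetime     # timestamp of the edit, giving an audit trail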

Excitingly, this means we have a new colour to add to the Altmetric donut! From now on, Wikipedia mentions of a research output will be shown in dark gray in the donut. Any mention in Wikipedia (or even multiple mentions) adds a flat count of 3 to the score – so even if an article is mentioned 3 times across Wikipedia, it will still only add 3 to the score.
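As a toy illustration of that flat-count rule (not production code, just the arithmetic described above):

def wikipedia_score_contribution(mention_count: int) -> int:
    # Any number of Wikipedia mentions adds exactly 3 to the score; none adds 0.
    return 3 if mention_count > 0 else 0

assert wikipedia_score_contribution(1) == 3
assert wikipedia_score_contribution(3) == 3  # multiple mentions still add only 3
assert wikipedia_score_contribution(0) == 0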

The Altmetric donut with its new dark gray Wikipedia segment

For now we’ll just be tracking the English-language version of Wikipedia, but we’ll be adding the other language versions over the next few months. Please also note that we’ve had a big batch of historical data to update all at once, so some of it is still feeding through to the details pages – but give it a week or so and it should all be up to date.

Going forward, the article details pages will reflect any Wikipedia references as soon as they are made (the only caveat being that we cache our API responses for an hour when the details page for an output is loaded, so there might occasionally be a short delay). If a reference is deleted from Wikipedia the details page will update at the same speed to reflect that, and if all Wikipedia references for a research output are removed, the gray segment will disappear from the donut and the score will drop by 3.

The addition of this source increases our coverage: we will now have captured online activity for a greater number of research outputs. Wikipedia references are a key driver of referral traffic for publishers and institutions, and collating the associated references allows us to more accurately reflect the broader engagement surrounding a piece of research.

We’re certainly not the first to notice this trend. A number of scholarly content providers and research institutions have taken the step of employing a Wikimedian in Residence to provide outreach within their community and to ensure that their research is represented appropriately. The Bodleian Libraries in Oxford are currently hiring their own curator, and several national libraries and museums have also invested in this role. The Royal Society of Chemistry (RSC) in the UK has not only hired a Wikimedian, but has also extended full open access across all of its content to active chemistry Wikipedia editors.

Andy Mabbett, Wikipedian in Residence for the RSC, notes that “Wikipedia is the first port of call for both students and lay people seeking information about chemicals and other chemistry topics. Recognising the importance of providing citations to reliable sources as part of Wikipedia’s policy of verifiability, by which readers can be assured that its content is reliable, the Royal Society of Chemistry is delighted when its publications are referenced by Wikipedia editors, and it is in the interests of all concerned that such citations are clear, precise, unambiguous and formatted in a standard manner, both for the convenience of humans, and to allow parsing by computers.”

Andy reports that the response to their efforts to date has been “overwhelmingly positive”, and that there is already a great deal of interest in the program of public engagement events he is planning to run later in the year.

Combining the Wikipedia references with mentions from the wide variety of sources that we already track will allow researchers, institutions and funders to quickly and easily see where their articles have been referenced on Wikipedia, and how that fits into the broader picture of the attention they have received. This in turn can help identify gaps, uncover opportunities for further outreach or more extensive curation, and give context to reading material.

If you’ve got questions or comments about this new source please do get in touch – you can email us at support@altmetric.com, or send us a tweet @altmetric.

On January 5th, Kelli Marshall published an article in The Chronicle of Higher Education entitled “How to Curate Your Digital Identity as an Academic”, in which she discussed the importance of having an online presence. Marshall argues, “as an academic or would-be academic, you need to take control of your public persona and then take steps to build and maintain it”. She also points out: “if you do not have a clear online presence, you are allowing Google, Yahoo and Bing to create your identity for you”.

Kelli talks about the need for academics to build and maintain websites about themselves that are separate from their institutional faculty pages, in order to improve their search engine visibility and ensure that they, not a doppelganger, are the top hit in a list of Google results. She argues that once you have built a presence, you can build a network and “piggyback off sites whose web pages rank high in Google”.

So, how does this relate to Altmetric? We talk to a lot of researchers about how altmetrics can help them achieve their goals, and one of the most frequently raised issues is “lack of time”. A typical researcher at a university divides their time between teaching, writing papers, peer review, site- or laboratory-based research, grant applications and committee duties. Because their existing workload is so high, many feel that the increasing requirement to “show evidence of research engagement” is simply another plate to spin, another chore to add to an already intimidating to-do list.

We recently talked to Terrie Moffitt, a researcher at Duke University, who said that “this kind of housekeeping can lessen the amount of time that can be allocated for genuine research”, and that “it is therefore both advantageous and convenient if researchers can outsource this kind of data curation”.

To elaborate further on Terrie’s points, I’ll be posting regularly to our blog, Facebook and Google+ pages with tips and ideas for how researchers can enhance their workflows with the use of altmetrics and Altmetric. We’ll feature a number of real-life case studies and conduct interviews with researchers who have already begun to use these tools.

From embedding badges on departmental websites to using Altmetric for teaching and supervision, we’ll be aiming to clearly demonstrate the uses of altmetrics data to faculty across an institution.