Welcome to the High Five research articles of the month! Image: NASA Goddard Space Flight Center, Flickr.com

Welcome to the February 2015 High Five here at Altmetric! In this blog post, my first for Altmetric, I’ll be leading you on a tour of the top 5 peer-reviewed scientific articles this month according to Altmetric’s scoring system. From here on out, my monthly High Five posts will examine a selection of the research outputs that have attracted the most attention that month, as tracked by Altmetric.

To tell you a bit more about me, and what this ‘High Five’ bit is all about… My name is Paige B. Jarreau, and I’m a PhD candidate studying science communication. (My blog From The Lab Bench is over at SciLogs.com where I am also the blog manager, if you want to check it out). I also have a Master’s degree in Biological Engineering. So my expertise is wide-ranging. In these posts you might find me commenting in-depth on molecular biology and nanotechnology on the one hand, and psychology and communication on the other. But the point with this High Five series isn’t for me to interpret and summarize the top papers of the month at Altmetric – there are plenty of scientists and journalists doing that already. The point is to look at the published research each month that is resonating with readers, journalists and bloggers, and to explore what they are saying about this academic work and perhaps why they were inspired to tweet about it, blog about it, write about it or visualize it. So, let’s go!

#1. Marine Trash

The first of the High Five is a research report published this month in Science, titled “Plastic waste inputs from land into the ocean” and authored by Jenna R. Jambeck and colleagues.

“Considerable progress has been made in determining the amount and location of plastic debris in our seas, but how much plastic actually enters them in the first place is more uncertain. Jambeck et al. combine available data on solid waste with a model that uses population density and economic status to estimate the amount of land-based plastic waste entering the ocean. Unless waste management practices are improved, the flux of plastics to the oceans could increase by an order of magnitude within the next decade.” – Editor Summary, Science

The New Zealand Herald headlined with “More than 25,000kg of plastic littered in NZ daily,” writing that, according to this new study, between five and 13 million tonnes of plastic waste wind up in the world’s oceans every year; the study authors’ central estimate is 8 million tonnes of marine plastic a year. According to data from Jambeck and colleagues, the “worst offenders” in terms of dumping plastic into the oceans are China, Indonesia, Vietnam and the Philippines. The US also makes the list of “worst offenders,” ranking 20th, as shown in a graphic produced by The Guardian.

“Coastal populations put about 8m tonnes of plastic rubbish into the oceans in 2010, an annual figure that could double over the next decade without major improvements in waste management efforts, scientists warn.

The mountain of plastic litter, including bags, food packaging and toys, was equivalent to five full shopping bags of debris for every foot of coastline bordering nearly 200 countries the team studied.” – The Guardian

“This input of plastic waste to the oceans is several orders of magnitude more than we can see, which means there’s a lot of plastic out there that we are not finding,” study author Jenna Jambeck told Ian Sample at The Guardian.

This study seemed to resonate with scientists, newsmakers and citizens alike. For one, estimates of marine litter have until now been very coarse. Documenting ocean-bound waste streams gives an alarmingly clear picture of something that has long been invisible to us, since our trash simply “disappears” every day from the bins at the end of our driveways or from the sides of our streets.

“I take photos of the way people manage waste all over the world,” Jenna Jambeck told NPR. “I take pictures of garbage cans. I met my husband at the landfill. … At least he understands me.”

Beautiful Bali. “Plastic bags, bottles and other trash entangled in spent fishing nets litter the potentially beautiful Jimbaran beach on the island of Bali, Indonesia.” Image by killerturnip, Flickr.com

Some news outlets covered the story with a “blame” frame, mostly highlighting the “worst offenders” data published by Jambeck and colleagues. Other outlets, including Vox, took the story in a “where does it all go?” direction, which I believe is the better question to leave scientists and publics with.

“A separate study last year in the Proceedings of the National Academy of Sciences identified massive swirling garbage patches in each of the world’s oceans that contain up to 35,000 tons of plastic.

Yet those patches accounted for less than 1 percent of the plastic thought to be in the oceans — and no one quite knows where the other 99 percent went. One possibility is that marine creatures are eating the rest of the plastic and it’s somehow entering the food chain. But that’s still unclear.” – Brad Plumer, Vox

The bottom line? Reduce, re-use and recycle your plastic products. These things don’t “disappear,” even if we’re not quite sure where they are going.

#2. Twitter Language Patterns Reveal Heart Disease Factors?

Image by Kooroshication, Flickr.com

The second paper in our High Five list is an interesting study that uses language expressed on Twitter to characterize community-level psychological correlates of heart disease. “Language patterns reflecting negative social relationships, disengagement, and negative emotions—especially anger—emerged as risk factors; positive emotions and psychological engagement emerged as protective factors” (Eichstaedt et al., 2015, Psychological Science). This study got lots of attention on Twitter (of course!) and in online science news outlets. Pacific Standard headlined with the phrase “Happier Tweets, Healthier Communities: New research finds county-level mortality from heart disease can be accurately predicted by analyzing the emotional language of local Twitter users.”

“Measuring such things is tough, but newly published research reports telling indicators can be found in bursts of 140 characters or less. Examining data on a county-by-county basis, it finds a strong connection between two seemingly disparate factors: deaths caused by the narrowing and hardening of coronary arteries and the language residents use on their Twitter accounts.” – Tom Jacobs, Pacific Standard

Guess which Twitter-based factor emerged as a protective factor against heart disease mortality in the study? Engagement, or participation in community activities. Apparently, community-level influences play more of a role in predicting heart disease mortality than the sentiments any individual expresses on Twitter. Twitter users are young (the median age is 31), but the sentiment of young people’s tweets says something about the nature and “health” of local communities, which in turn says something about heart disease mortality. So it’s not as if tabbing over to Twitter right now and typing in a bunch of hostile words will raise your blood pressure (although it might raise your followers’ blood pressure!).

David Glance points out in an article at The Conversation that “[w]hat this study definitely does not say is that being angry on twitter will lead to, or is anyway related to, heart disease, which has been the unfortunate suggestion in the selling of the study in news reports.”

“The combined psychological character of the community is more informative for predicting risk than are the self-reports of any one individual.” – Eichstaedt and colleagues, 2015

It is very important to point out that this study is based upon correlational data and can’t say anything about causation or prediction of heart disease at the individual level, as some news headlines suggested. A few scientists pointed this out. Glance highlighted the misleading nature of several graphics accompanying news stories about this study, which make the Twitter sentiment data look like it has more predictive power than it actually does.

“The study is interesting in that it suggests that a specific communities’ makeup can be identified by the language that they use on social media. This in turn may be an indicator of other factors such as the specific community’s health and well-being and the consequences of that health and well-being.

However, as attractive as those suggestions may be, the study showed weak correlations and gave very little underlying support for any particular theory of “causation”. So as much as they could speculate, the researchers could really not say why they saw these statistical results. The paper actually highlights some of the fundamental problems with so-called big data where large numbers distort statistics to point at imagined relationships.” – David Glance, The Conversation

#3. Using Your Smartphone to Detect Infectious Disease

The next High Five article takes mobile apps to a whole new level. In a February 2015 report in Science Translational Medicine, researchers describe “a smartphone dongle for diagnosis of infectious diseases at the point of care,” a lab-on-a-chip technology. It’s easy to see how this report captured readers’ imaginations. Smithsonian magazine reported on it under the headline “$34 Smartphone-Assisted Device Could Revolutionize Disease Testing: A new low-cost device that plugs into a smartphone could cut down on expensive lab tests.”

This video from the Sia lab (Samuel K. Sia was the PI on this research) shows how the device works: http://youtu.be/TC9XNqSgj4w

“All it takes is a drop of blood from a finger prick. Pressing the device’s big black button creates a vacuum that sucks the blood into a maze of tiny channels within its disposable credit card–sized cartridge. There, several detection zones snag any antibodies in the blood that reveal the presence of a particular disease. It only takes a tiny bit of power from the smart phone to detect and display the results: A fourth-generation iPod Touch could screen 41 patients on a single charge, the team says.” – Nicholas Weiler, Science Mag

#4. Forever Young

Our fourth stop is a study titled “Forever Young(er): potential age-defying effects of long-term meditation on gray matter atrophy,” published in the journal Frontiers in Psychology in January 2015.

“Last week, a study from UCLA found that long-term meditators had better-preserved brains than non-meditators as they aged. Participants who’d been meditating for an average of 20 years had more grey matter volume throughout the brain — although older meditators still had some volume loss compared to younger meditators, it wasn’t as pronounced as the non-meditators.” – Alice Walton, Forbes

This study received quite a bit of Twitter and blog attention, especially from members of the public and psychology bloggers, though it didn’t seem to attract much attention from science communicators. NeuroscienceStuff at tumblr.com provides a breakdown of the study here.

“The researchers cautioned that they cannot draw a direct, causal connection between meditation and preserving gray matter in the brain. Too many other factors may come into play, including lifestyle choices, personality traits, and genetic brain differences.” - NeuroscienceStuff

#5. Martian Mystery Cloud

Image: Grupo Ciencias Planetarias (GCP) – UPV/EHU, W. Jaeschke, D. Parker

What better way to start off a popular science story than with a tale of a mysterious Martian cloud? A letter in Nature published this month reports “the occurrence in March and April 2012 of two bright, extremely high-altitude plumes at the Martian terminator (the day–night boundary) at 200 to 250 kilometres or more above the surface, and thus well into the ionosphere and the exosphere.” According to Eric Mack writing at Forbes, “A strange cloud or haze rising extremely high above the surface of Mars has astronomers struggling to come up with an explanation for the phenomena that fits with the existing scientific understanding of the Martian atmosphere. [...] The phenomena extended 250 kilometers above the surface and was the size of a major tropical storm on Earth.”

“Such record-breaking feature defies our current understanding of processes on Mars atmosphere.” - Grupo de Ciencias Planetarias

George Dvorsky pondered several theories for the occurrence of these plumes in his io9 article, as did Christian Schroeder at Discover’s D-brief. Did a volcano or asteroid impact cause these clouds?

Did other scientific research articles inspire you this month? Share them here or tweet them at us! Find Altmetric on Twitter @altmetric. Find me on Twitter @FromTheLabBench.

Can data emerging from media ‘mentions’ of research provide timely indicators of outcomes linked to social and technical innovation for stakeholders in networks beyond academia? This was the focus of a study released this week by the consultancy arm of Digital Science.

The study, funded and commissioned by Nesta, used Altmetric data to examine and identify how online mentions of research might offer insight into the broader impact of a piece of academic work.

Focussing particularly on outputs in medicine, the study acknowledges that there are professional, academic and social motivations for mentions, and recognizes that the challenge is to discern patterns and concentrations that capture these factors. The authors ask:

“Do mentions have innovative value in communicating impact beyond the academic world?”

They propose that there are two key types of communities who regularly engage with published research online, and the characteristics of each can be distinguished by their motivations:

Impact in communities of practice. Rapid and accessible communication of innovative research outcomes relevant to practitioners and professionals in the health sector is also of value to research users and managers. We interviewed a range of experts but found no clear characteristics of research publications with economic, social or professional – rather than academic – impact. Analysis confirmed many motivations for research mentions and highlighted their communication potential, but found no consistent view as to why some articles get mentioned frequently.


Impact in communities of interest. Patients, carers and supporters of disease charities represent a network that wants to look at, understand and communicate research about new treatments. Statistical analysis showed more mentions were given to papers associated with diseases tackled by charities with larger research funds. Cardiovascular disease receives more attention than its charitable research spend would suggest, however, whereas spend on immune and musculoskeletal diseases is high but media mentions are relatively low.

The study observes that health and clinical networks can enable timely, rapid and ‘serious’ media communication of innovative research with non-academic social and professional stakeholder benefits, but they may need key people as active nodes to engage them.

Click here to access the full report online.

We recently announced that over 20 academic institutions around the world have adopted the Altmetric for Institutions platform to help them track and report on the attention surrounding their research outputs.

Here we talk to Scott Taylor, Research Services Librarian at the University of Manchester, which was one of the early adopters of the Altmetric platform.

Hi Scott, thanks for taking the time to talk to us today. Could you tell us a little bit about your role within the university?
Of course; I work as part of the Research Services Team at the University Library. One of our roles is to provide research tools to the entire institution – with the aim of identifying and implementing those that can provide the most benefit to our researchers.


So how did you first become aware of altmetrics, and what motivated you to find out more?
I first became aware of altmetrics via conferences and industry newsletters. It’s important in my role that we stay up to date with the latest opportunities in research management and assessment, and I was particularly interested in what tools and data there might be that we could use across the institution.

What kinds of institutional goals are you hoping to achieve?
We and our researchers spent a lot of time trying to collate and summarize the attention and engagement our research generates as evidence to submit to the 2014 Research Excellence Framework (REF), which takes place every six or seven years in the UK. It’s important for us to be able to demonstrate this broader impact and show how our work is having an influence on society as a whole. We’re always aiming to make this process easier for our researchers, and this goal more attainable as a result.

Added to that, there are a lot of public outreach and other activities that go on at Manchester – and it can be hard to keep track of what is happening where, or to get any idea of how effective it is. We wanted to validate efforts and provide our researchers, administrative staff, and communications and marketing colleagues with qualitative feedback on the outcomes of their strategies.

And what challenges do you face?
As I’ve mentioned, for research and administrative staff, time is always a massive constraint in terms of what we can realistically achieve. Ultimately, this means that core activities will always get prioritised, and any evaluation or planning for improvement can be pushed aside until it becomes vital (when submitting to the REF, for example). We wanted to provide a reliable system that would make it easy and quick for people to regularly monitor and report on their wider impact, and one which would offer valuable and relevant data that they would benefit from.

“Not only does having all of this data in one place save us time, but it offers a lot of interesting indicators of impact and qualitative data that we would not necessarily have gathered elsewhere.”

How do you hope the Altmetric for Institutions platform can help you realize your objectives?
We’ve set up site license access to the Altmetric for Institutions platform, and are rolling this out across the institution via training and support from the Altmetric team. Already we’ve identified a number of use cases – not only does having all of this data in one place save us time, but it offers a lot of interesting indicators of impact and qualitative data that we would not necessarily have gathered elsewhere. This includes things like where our research has been referenced in policy documents, and not only where in the world our authors’ research outputs are getting attention, but what people are actually saying about them.

Could you tell us a bit about your experience of using the platform so far? Are there any scenarios you can see it being particularly useful for?
Our experience so far has been really positive. The implementation process was fairly straightforward, and we had good support from the Altmetric team throughout. We’ve worked closely with our Academic Engagement Librarians to ensure they have a good understanding of the platform and the data it provides – so that they are well equipped to go out and introduce it to our researchers.


A particular feature that we really like on the platform is the summary reports. We can see at a glance how much attention all of our research has attracted over the last week, the last month, the last year or even the last few years. It helps us identify which channels are the main drivers of traffic to our content, and which researchers or departments have been particularly successful in engaging with a broader audience.

We’ve also been busy setting up automated alerts to notify us when research from a department gets new attention in one of the sources tracked by Altmetric, which we can then highlight to authors and departmental heads to provide them with further insight.

What has the feedback from your researchers been like?
It’s been mostly positive! They’re still getting to grips with altmetrics but already I’m hearing how much better it is now that they can explore this kind of data consistently across all of their published outputs in one place. Quite a few of them have been surprised and pleased to discover coverage for their work that they were unaware of, or have identified new communities and stakeholders to engage with. I think it makes them feel like they have a lot more control over how their work is disseminated and represented. They don’t have to join every conversation or respond to every comment, but if they wanted to, having this platform makes it much easier to uncover those conversations.

“Quite a few [of our researchers] have been surprised and pleased to discover coverage for their work that they were unaware of, or have identified new communities and stakeholders to engage with.”

Thanks so much for taking the time to share your experience with us – just lastly, what do you have in mind for altmetrics at Manchester in future?
We’re hoping that the Altmetric platform and data might be particularly useful for our early-career researchers – who are perhaps not yet often cited (or haven’t had the chance to be) but are having an impact in other ways. More generally, we’re keen that the Altmetric platform becomes a tool for discovery – either of new content, new potential collaborators, or just new communities to engage with. Our researchers are doing some great work and we want to help them demonstrate that and form the most effective strategies for getting the most out of their academic outputs.

A couple of weeks ago we released a post announcing a new series of entries on how researchers can use Altmetric and altmetrics to enhance their scholarly workflows. In this initial overview, we look at a variety of use cases to identify how and where researchers can gain value from altmetrics data. Before we get into that, we thought it would be useful to quickly recap exactly what altmetrics are:

Briefly defined, alternative metrics, or altmetrics, are indicators of impact and engagement that extend far beyond traditional methods of influence measurement. Whereas to date academics and those involved in scholarly communication have looked primarily to impact factors and citation counts to assess the success of a piece of research, altmetrics provide a much more granular level of insight – both at the research output level and, crucially, beyond the academic sphere.

At Altmetric, we track a defined list of sources (including public policy documents, mainstream and social media sites, post-publication peer review forums, and online reference managers) for shares and mentions of research outputs (that’s published articles, datasets, images, and things like white papers and grey literature). We disambiguate between different versions of the same output (the publisher and the institutional repository version, for example), collate all of the attention data together, and surface both a count of mentions per source and the underlying qualitative data (the actual text of the original mentions) via the Altmetric donut visualization and Altmetric details page.
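If you want to poke at that collated data yourself, here is a minimal sketch that queries the public Altmetric details API for a single DOI and prints a few of the per-source counts. Treat it as illustrative rather than a supported client: it assumes the public v1 DOI endpoint and the cited_by_*_count field names as described in the public API docs, and the DOI shown is just a placeholder.

```python
import requests

def mention_counts(doi):
    """Fetch per-source mention counts for one research output (illustrative sketch)."""
    # Public v1 details endpoint for a DOI; a 404 means no attention data is held for it.
    resp = requests.get("https://api.altmetric.com/v1/doi/" + doi, timeout=10)
    if resp.status_code == 404:
        return None
    resp.raise_for_status()
    data = resp.json()

    # Field names assumed from the public API docs: one counter per tracked source.
    sources = {
        "news": data.get("cited_by_msm_count", 0),
        "blogs": data.get("cited_by_feeds_count", 0),
        "twitter": data.get("cited_by_tweeters_count", 0),
        "wikipedia": data.get("cited_by_wikipedia_count", 0),
    }
    return data.get("score"), sources

result = mention_counts("10.1234/example-doi")  # hypothetical placeholder DOI
if result is None:
    print("No attention data tracked for this output yet")
else:
    score, sources = result
    print("Altmetric score:", score)
    for source, count in sources.items():
        print(source, count)
```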

So, now that we’ve got all the data together, here are some of the ways we’re seeing researchers put altmetrics to use:

Monitoring and tracking early engagement
Collated altmetrics mean you can identify how your work is being received immediately after publication, and respond to the conversation. By reviewing the online attention as it happens, researchers can easily spot interest from people and places they may not have been aware of previously.

This real-time data is particularly useful for early career researchers with a smaller body of work and a lower citation count, or those who are seeking to report back on their impact for a deadline soon after publication. It also presents an opportunity: whereas with traditional citation and referencing the original author has little or no say over the context in which their work is represented, altmetrics offer stakeholders a chance to see who is saying what about a piece of research, and to respond accordingly.

“@altmetric showed me that the BBSRC blog actually cites our CHEMINF paper http://blogs.bbsrc.ac.uk/index.php/2011/10/oxford-nanopore-technology-ibers/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+bbsrc+%28BBSRC+-+News+stories+and+features%29 … (would never have seen that otherwise)” – @egonwillighagen

Tools like the free Altmetric bookmarklet can save you a massive amount of time, enabling you to view all of the data simultaneously, rather than checking multiple platforms and trying to accumulate the relevant online attention piece by piece. When combined with citation data, this information can help researchers to understand social impact and engagement as well as academic recognition.

Showcasing Impact
By understanding, analysing and interpreting their altmetrics data, researchers can add interesting evidence of broader impact to funding applications and reports.

We’ve already seen a lot of activity that speaks to this need on the part of research funding bodies, who are attempting to position themselves as thought leaders in narrowing the gap between research and the public.

According to the Research Excellence Framework, grant funders are looking for proof of “broader impacts”, while The Royal Netherlands Academy of Arts and Sciences defines the “primary aim” of their SEP assessments as “to reveal and confirm the quality and relevance of the research to society”. In a short film on the Wellcome Trust’s funding website, Professor Jeremy Farrar said “researchers don’t exist in a bubble outside of society… to cocoon ourselves is no longer viable”.

This would suggest that if researchers can show they have adopted practices that demonstrate the public visibility of their work, their applications and reports are likely to be stronger.

We’ve seen examples of this being done numerous times – you might like to take a look at this recent blog post, in which researcher Fernando Maestre details how he has used Altmetric data in his funding proposals.

Researchers can also boost their online profiles by adding data to personal websites, faculty profile pages, and CVs – as Marine Ecologist James Grecian has done on his page.


Having to show “evidence of research engagement” to their institutions can seem less of a chore for researchers if they can start hosting the data in one easy step. Click here for instructions on how to embed the free version of our donut badges on your own page.

Altmetrics for Discovery
Using altmetrics, researchers can find new and interesting content, identify and follow new research trends, and contribute to the conversation around the work of others as well as their own. Being able to see who has Tweeted or blogged about your work allows you to engage with a community of researchers outside the walls of your own institution.

“Only just discovered #altmetric. Very proud to discover that all of my (few) papers have been tweeted about at some point!” – @Frances_Tyler

This means you can start communicating with an online network of academics with similar research interests, which could potentially lead to new collaborative research partnerships. By identifying influential figures in their field through social media, researchers can adapt future outreach strategies for disseminating research accordingly.

Altmetrics can also be useful for identifying which journals in your field have effective outreach strategies for promoting their research. Take a look at other articles in your field to see which ones are getting the most attention, where they were published, and what channels or outlets have been used to disseminate the content. Equally, if you’re teaching, altmetrics data can be useful for identifying which articles are particularly noteworthy or might be good for your students to read as an example of engaging content.

Next month, we will be analyzing some more specific examples of how academics have used Altmetric data on personal web pages to showcase their impact. We will also be publishing some Q&A responses from altmetrics-savvy researchers, to provide a first-hand perspective on the benefits of altmetrics data.

We’d love to hear any ideas or experiences you might have on how researchers can get the most out of altmetrics – feel free to share any comments below or drop us a line to discuss further.

Want to know more? Join our free webinar at 3pm ET/8pm GMT on February 18th to hear direct from researchers how they have been putting altmetrics to good use. Click here to register.

Since its launch in 2001, Wikipedia has quickly grown to become the largest electronic encyclopedia in the world. The English version alone contains over 4.7 million articles, and as of February 2014 there were nearly 500 million unique users and over 73,000 active editors.

Within those millions of articles exist hundreds of thousands of links to academic research. Publishers, institutions and researchers are increasingly moving to leverage the exposure and traffic that a reference on Wikipedia can generate for their content. Although the value and relevance of that traffic is sometimes debated, for their part Wikipedia enforces strict editorial guidelines to try to ensure that quality and standards are consistent across articles, and that over-zealous posters don’t introduce undue bias.

There’s a long-standing connection between Wikipedia and altmetrics – Dario Taraborelli at the Wikimedia Foundation is one of the authors of the altmetrics manifesto, and he’s doing a lot of great work with groups like CrossRef on helping to make Wikipedia data available for altmetrics tools and research projects, and to help set standards around its use and presentation.

Adding Wikipedia to the list of sources we track has been a goal of ours for a while, and in the course of working on an exciting new project with Springer, one of our customers (more on that at a later date), we had the chance to spend some time on it. We’re really pleased to be able to announce that any mentions of articles or other academic outputs in Wikipedia will now be reflected in a new ‘Wikipedia’ tab on the Altmetric details page.


In order to capture these mentions, the academic output being mentioned must be referenced with a proper Wikipedia citation tag – see http://en.wikipedia.org/wiki/Wikipedia:Citation_templates for information on how these work. We capture that as a mention and add it to the details page for that output. Users can then click on the heading to be taken to the original Wikipedia article, on the username of the person who wrote the mention to see their profile on Wikipedia, and on the date stamp to view the edit record for that article.


Because we’re looking for lots of different identifiers and have some constraints around auditability – every mention we collect has to have an author and a timestamp – we wrote our own system to get the data out in the format we need, but we’re hoping to share tips, approaches and data with others in the community.
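To give a rough idea of what such a system has to do (this is a hedged sketch of the general approach, not our actual pipeline), the snippet below uses the public MediaWiki API to find English Wikipedia articles whose text contains a given placeholder DOI, then pulls the user and timestamp of each page's latest revision. A real implementation would walk the revision history to find the edit that actually introduced the reference, and parse the citation template itself rather than relying on a plain text search.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def pages_mentioning(doi):
    """Search main-namespace English Wikipedia articles whose text contains the DOI."""
    params = {
        "action": "query",
        "list": "search",
        "srsearch": '"%s"' % doi,  # quoted phrase search for the identifier
        "srnamespace": 0,          # article namespace only
        "format": "json",
    }
    resp = requests.get(API, params=params, timeout=10)
    resp.raise_for_status()
    return [hit["title"] for hit in resp.json()["query"]["search"]]

def latest_edit(title):
    """Return (user, timestamp) for the most recent revision of a page."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp",
        "rvlimit": 1,
        "format": "json",
    }
    resp = requests.get(API, params=params, timeout=10)
    resp.raise_for_status()
    page = next(iter(resp.json()["query"]["pages"].values()))
    revision = page["revisions"][0]
    return revision["user"], revision["timestamp"]

for title in pages_mentioning("10.1234/example-doi"):  # hypothetical placeholder DOI
    user, timestamp = latest_edit(title)
    print(title, user, timestamp)
```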

Excitingly, this means we now have a new colour to add to the Altmetric donut! From now on Wikipedia mentions of a research output will be visualized by a dark gray color in the donut. Any mention in Wikipedia (or even multiple mentions) will add a flat count of 3 to the score (so even if an article is mentioned 3 times across Wikipedia, it will still only add 3 to the score).

donut

For now we’ll just be tracking the English-language version of Wikipedia, but we’ll be adding the other language versions over the next few months. Also please note that to begin with we’ve had a big load of historical data to update all at once, so some of it is still feeding through to the details pages – but give it a week or so and it should all be up to date.

Going forward, the article details pages will reflect any Wikipedia references as soon as they are made (the only caveat being that we cache our API for an hour when the details page for any output is loaded, so sometimes there might be a bit of a delay). If a reference is deleted from Wikipedia, the details page will update at the same speed to reflect that, and if all Wikipedia references for a research output are removed, the gray segment will disappear from the donut and the score will drop by 3.

The addition of this source will increase the number of research outputs for which we’ve captured online activity. Wikipedia references are a key driver of referral traffic for publishers and institutions, and collating the associated references allows us to more accurately reflect the broader engagement surrounding a piece of research.

We’re certainly not the first to notice this trend. A number of scholarly content providers and research institutions have taken the step of employing a Wikimedian in Residence to provide outreach within their community, and to ensure that their research is represented appropriately. The Bodleian libraries in Oxford are currently hiring their very own curator, and several national libraries and museums have also invested in this role. At the Royal Society of Chemistry (RSC) in the UK, they have not only hired a Wikimedian, but also extended fully open access across all of their content to active Chemistry Wikipedia editors.

Andy Mabbett, Wikipedian in Residence for the RSC, notes that “Wikipedia is the first port of call for both students and lay people seeking information about chemicals and other chemistry topics. Recognising the importance of providing citations to reliable sources as part of Wikipedia’s policy of verifiability, by which readers can be assured that its content is reliable, the Royal Society of Chemistry is delighted when its publications are referenced by Wikipedia editors, and it is in the interests of all concerned that such citations are clear, precise, unambiguous and formatted in a standard manner, both for the convenience of humans, and to allow parsing by computers.”

Andy reports that the response to their efforts to date has been “overwhelmingly positive”, and that there is already a large amount of interest for a program of public engagement events that he is planning to run later in the year.

Combining the Wikipedia references with mentions from the wide variety of sources that we already track will allow researchers, institutions and funders to quickly and easily see where their article has been referenced on Wikipedia, and how that fits into the broader picture of the attention it’s received. This in turn can help identify gaps, uncover opportunities for further outreach or more extensive curation, and give context to reading material.

If you’ve got questions or comments about this new source please do get in touch – you can email us at support@altmetric.com, or send us a tweet @altmetric.

On January 5th, Kelli Marshall published an article in The Chronicle of Higher Education entitled “How to Curate Your Digital Identity as an Academic”, in which she discussed the importance of having an online presence. Marshall argues, “as an academic or would-be academic, you need to take control of your public persona and then take steps to build and maintain it”. She also points out: “if you do not have a clear online presence, you are allowing Google, Yahoo and Bing to create your identity for you”.

Kelli talks about the need for academics to build and maintain websites about themselves that are separate from their institutional faculty pages, in order to improve their search engine visibility and ensure that they – and not a doppelganger – are the top hit on a list of Google results. She argues that once you have built a presence, you can build a network and “piggyback off sites whose web pages rank high in Google”.

So, how does this relate to Altmetric? We talk to a lot of researchers about how altmetrics can help them achieve their goals. One of the frequently raised issues is “lack of time”. A typical researcher at a university divides their time between teaching, writing papers, peer review, site or laboratory-based research, grant applications and committee duties. Because their existing workload is so high, many of them feel that the increasing requirement to “show evidence of research engagement” is simply another plate to spin, another chore to add to an already intimidating to-do list.

We recently talked to Terrie Moffitt, a researcher at Duke University, who said that “this kind of housekeeping can lessen the amount of time that can be allocated for genuine research”, and that “it is therefore both advantageous and convenient if researchers can outsource this kind of data curation”.

To elaborate further on Terrie’s points, I’ll be posting regularly to our blog, Facebook and Google+ pages with tips and ideas for how researchers can enhance their workflows with the use of altmetrics and Altmetric. We’ll feature a number of real-life case studies and conduct interviews with researchers who have already begun to use these tools.

From embedding badges on departmental websites to using Altmetric for teaching and supervision, we’ll be aiming to clearly demonstrate the uses of altmetrics data to faculty across an institution.

Ever wanted to know more about Altmetric and altmetrics? Here’s a list of the conferences where we’ll be presenting or represented over the first part of 2015 – please do come along to a session or set up a meeting with us to learn about the tools we offer and the approach we take to monitoring and reporting on the attention and engagement surrounding research and other institutional outputs:

Ontario Library Association meeting
28th – 31st January, Toronto

Digital Science rep Stuart Silcox will be attending and happy to meet to discuss how altmetrics can be useful for your institution. Please get in touch to arrange a meeting.

ALA midwinter
30th January – 3rd February, Chicago

Visit Digital Science at booth #2821 to learn about the Altmetric for Institutions platform and free tools we offer for researchers, librarians and institutional repositories. If you’d like to arrange a time to meet please email us.

PSP 2015 Annual Conference
4th – 6th February 2015, Washington D.C.

Head of Publisher outreach Phill Jones will chair a panel on Prospering in a Multimedia World, and Betsy and Adrian, the Digital Science Publisher Business Development team, will be on hand to discuss how altmetrics data can add value for your authors, readers, society partners and internal teams. Contact us to arrange a meeting.

AAAS Annual Meeting
12th – 16th February, San Jose, California

Digital Science Community Manager Laura Wheeler will be attending the AAAS meeting this year. Catch up with Laura to find out how altmetrics and the free tools Altmetric offer can help you better track and understand the attention surrounding your research.

SCELC colloquium
18th February, California
Digital Science rep Kortney Capretta will be attending and happy to fill you in on all things Altmetric! Please do email us if you’d like to arrange to meet Kortney.

ER&L 2015
22nd – 25th February, Austin, Texas
Altmetric’s Sara Rouhi will be attending and running the workshop “You talking ‘bout me? Altmetrics: a practical guide” on Mon Feb 23, 4:15pm–5:00pm, Room 301. Institutions are increasingly adopting altmetrics as a means of gaining a better understanding of the wider attention their research is attracting. In this session Sara will cover the why’s, what’s and how’s of these new metrics and how Altmetric.com provides this data, and explore some use cases to see how they can be used for reporting, discovery, and furthering academic engagement.

American Chemical Society National Meeting
22nd – 26th March, Denver, Colorado
Product specialist Sara Rouhi will be running an altmetrics workshop at this event. Join her to learn about the data we track and how it can be useful for the researchers and management teams across your institution. Or she’ll be more than happy to arrange a time to meet to explore which of our solutions would best meet your specific needs.

eResearch New Zealand
23rd – 25th March, Queenstown, New Zealand
Digital Science rep Anne Harvey will be at eResearch New Zealand, and happy to chat about the different solutions and tools that Altmetric offer institutions to help support their research management workflows. Contact Anne to arrange a meeting or to see where you can find her.

ACRL
25th – 28th March, Portland, Oregon

Digital Science reps Kortney Capretta and Meg Baker will be attending and happy to fill you in on all things Altmetric! Please do email us if you’d like to arrange to meet Kortney or Meg.

UKSG
30th March – 1st April, Glasgow, UK

Altmetrics feature strongly in the UKSG 2015 programme, giving you a great chance to find out more about what they are and to learn how they are already being used in research institutions and funding agencies around the UK. Visit Digital Science booth #95 to learn about the latest developments from all of the Digital Science portfolio companies – and do get in touch if you’d like to arrange a time to meet.

2014 saw some major developments in altmetrics: new products were launched, working committees formed and more and more researchers, publishers, funders and institutions started to investigate how they might put these new metrics to use.

As we go into 2015 we want to continue this trend. Much of the discussion last year focussed on what the numbers meant and how they should be interpreted. For us, 2015 is about looking beyond the numbers. Although they help identify how much attention and engagement a research output has generated (which is why we have the Altmetric score), and where that attention came from, they do not offer many clues as to what people actually think about the work, and can often be misleading. A case in point was brought to light in a story published in Nature this week, which found that when awarding funding grants the Medical Research Council makes a point of looking beyond the external reviewers’ scores alone and focuses on their written comments instead.

You may also have seen Altmetric Founder Euan Adie’s blog post from last year, where he discussed how the term ‘metrics’ itself can seem to promise a false solution. As Stephen Curry first pointed out following a HEFCE metrics review day in the UK, the assumption of definitive measurement that is associated with ‘metrics’ is something we should address and be wary of as part of the process of research evaluation. So rather than ‘metrics’, we now think of altmetrics as indicators of engagement and attention. A high Altmetric score or a count of mentions of a research output from the sources we track acts as an indicator that there is activity to be further examined, which may then go on to be taken into consideration in evaluation.

We plan to add a number of new sources this year – these are the places that we track for mentions of academic articles, datasets, figures and other research outputs. Each time we add a new source we’ll be considering what value it would offer our users, and ensuring it adds context beyond what numbers alone can provide. Within our interfaces we’re careful to make each original mention accessible, meaning that users can actually see who is talking about a piece of work, and what they are saying. Without this, knowing that your article has been picked up by 20 news sites, tweeted by hundreds of people, or received a comment on a post-publication peer-review platform has little relevance.

With the launch of our institutional platform last year we began including references from policy documents. We’re still growing the list of what we track in this space but have already uncovered and made available thousands of references to academic work. Feedback from research administrators and institutional partners tells us that the addition of this data has been incredibly beneficial for them – what would previously have taken weeks to collate in the search for evidence of the application of research is now easy to track and report on.

The debate about how and when altmetrics should or could be applied is ongoing. Steering group work is ongoing in the US and UK, and much discussion will take place at industry conferences over the year.

At the same time, researchers are increasingly using our data to provide evidence of their societal impact to funding committees and management, identify potential collaborators and new communities to engage with, identify research worth reading, and to monitor early uptake of their work. Librarians and publishers are considering how they can best support their faculty and readers – many will be keeping a close eye on the ongoing reviews and standards projects taking place in the UK and US. We’ll be offering support and training to these communities throughout the year, and hope to provide some useful case studies that may help generate ideas for your own institutions and platforms.

To begin the year and set things off on the right foot, there’s one thing we’d like to be clear on: quality, importance and impact are not numbers. A number cannot tell you what people think about a piece of research, why they were talking about it or sharing it, or if it was any good. Numbers can identify where there is something to be examined in more depth, but they alone cannot be used to measure the value of research. With altmetrics we have the chance to explore and question, and to make the numbers work for us – not the other way around. This is central to everything we do here, and an approach we’ll be encouraging others in the scholarly community to adopt.

Thinking back to the start of this year, it’s hard to take in how much has changed at Altmetric. We’ve had a brilliant, busy, fun 12 months – which now seem to have gone by so quickly! A few highlights to mention include:


- We built and launched our newest platform, Altmetric for Institutions. Take-up has been really positive, and we currently have over 15 institutions across the UK, Europe, Australia and the US either already up and running or in the process of being set up.

- We helped organise and run the 1:AM altmetrics conference – 2 days of great sessions that included presentations from publishers, researchers, librarians and funders. The event was more popular than we could have hoped for, and we’re already starting to discuss plans for next year.

- Our top 100 list, built by our developers with design help from the team at Digital Science, has received over 15,000 unique visits since its launch in mid-December. The accompanying blog post has been viewed nearly 2,000 times, and we saw some great press coverage in major outlets including the Telegraph, the Times Higher Education, the Huffington Post and the Guardian.

- We ran our first ever publisher day in London. Organised in conjunction with figshare, the event brought together about 50 interested stakeholders for a day of discussion and brainstorming. Stay tuned for news on future events!

- Speaking of publishers, we’re continuing to see a growing demand for our badge visualisations. Publishers Wiley, Health Affairs, Frontiers, Springer and ScienceOpen are among a number of content providers that have added Altmetric data to their article pages. Additionally, we announced further roll-outs with platform providers HighWire and Silverchair, meaning it is now even easier for their publishing partners to offer their readers and users the additional context and insight that Altmetric data offers.


- Our team grew from 7 at the end of 2013 to 18. Our new team members include developers (Scott, Maciej, Shane, Matt M, Matt P, and Jakub), product sales managers Sara and Ben, customer support exec Fran, training and implementation manager Natalia, and Kathy who joined as our COO. All of these new colleagues have added a great amount of knowledge and experience to our day-to-day operations, and we’re feeling well-placed to take on any challenges and opportunities that next year will bring!

Alongside all of this we’ve continued to attend and present at a huge number of events and industry conferences. We’ve been actively engaged with the altmetrics discussion and debate via platforms such as our blog and Twitter, and continue to work closely with ongoing projects and reviews (such as those being undertaken by HEFCE and NISO) to help shape direction for the use of altmetrics.

Looking ahead to 2015, we plan to build on the standard we’ve set this year. We’ll be further involved with driving the discussion around the use of new metrics in research evaluation, and will work to ensure that our platforms and data continually offer our users the best and most valuable insights.

So, with that, a very happy New Year to all of you! We’ll see you in 2015…

At a recent conference workshop on altmetrics tools, librarians were compared to drug dealers: getting researchers hooked on tools to help them get the most out of their research. As the momentum around altmetrics in higher education gathers pace and Altmetric for Institutions is increasingly adopted by research institutions, there are huge opportunities for librarians to position themselves as a central support team – albeit one promoting good scholarly communications practices rather than drugs – and to lead the way in integrating altmetrics in their institutions.

I joined the Altmetric team last month as the Training and Implementation Manager, having previously led the Research Support Services team at The London School of Economics Library. Part of our offering included providing altmetrics and bibliometrics support, and being the team researchers and professional departments could look to for advice, analysis and training on emerging metrics. But developing an altmetrics service doesn’t happen overnight, and there are various ways to get out there and be an expert in the field.

So, as I take up my new role and start planning how we can extend our education and outreach support, this seems like an ideal opportunity to introduce myself (hello!) and suggest ten ways librarians can support altmetrics:

1.   Be an expert

Get familiar with the practical applications of altmetrics for researchers and professional staff in institutions. Learn about monitoring altmetrics attention to new papers, embedding on a CV and demonstrating societal impact. Follow the #altmetrics discussion on Twitter and learn how other academic librarians are supporting altmetrics in their institutions. How can research administrators, communications teams and students also make use of altmetrics? Get to know the tools available and reach out to offer support. Researchers will look to you for advice about altmetrics if you’re able to demonstrate that the Library is the place to go with questions!

2.   Embed altmetrics training in existing programmes

Run altmetrics training as part of PhD development programmes and digital literacy sessions, and talk about altmetrics for researchers at faculty away days or departmental meetings. It’s useful to set the scene by offering some background on the altmetrics movement, explaining how altmetrics work alongside existing metrics, and demonstrating the practical uses.


3.   Add article level metrics to your institutional repository or resource discovery system

We offer free Altmetric badge embeds for institutional repositories and resource discovery systems – it’s just a line of code, and all the details can be found here. We’re increasingly seeing Altmetric badges in resource discovery systems such as Summon and Primo – see our blog post on this for more details.

4.   Provide altmetrics advice alongside traditional bibliometric analysis

Altmetrics complement existing bibliometric indicators. Have a read around the correlation between altmetrics and citation counts – a quick way to explore that relationship for your own institution’s outputs is sketched below. If you’re already providing bibliometrics advice and benchmarking reports for researchers, add altmetrics to this analysis to demonstrate how the data offers a broader view of attention to research. Talk openly about the limitations of using a single score to assess research papers, and encourage users to dig deeper into the original mentions.
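Here is a minimal sketch: it assumes you already have paired Altmetric scores and citation counts for the same set of papers (the numbers below are made-up placeholders, e.g. standing in for an export from the Altmetric Explorer and your citation database) and computes a Spearman rank correlation between them. Studies in the altmetrics literature typically report weak-to-moderate positive correlations, so expect overlap rather than agreement.

```python
from scipy.stats import spearmanr

# Hypothetical paired data for the same set of papers: Altmetric score and citation count.
altmetric_scores = [120.5, 4.2, 0.25, 56.0, 15.8, 2.1, 310.0, 8.9]
citation_counts = [85, 3, 1, 40, 22, 0, 150, 12]

rho, p_value = spearmanr(altmetric_scores, citation_counts)
print("Spearman rho = %.2f (p = %.3f)" % (rho, p_value))
# A positive but modest rho suggests the two indicators overlap without being
# interchangeable - a useful talking point in benchmarking reports.
```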

5.   Encourage researchers to adopt open practices

You’re probably already out there talking to researchers about open access and research data management. Include altmetrics in these conversations – for example when discussing the advantages of making papers open access, explain the opportunities for tracking the subsequent altmetrics. You could also mention the potential for tracking the online attention to open access research data. Altmetrics are another great reason for researchers to be open and can help demonstrate what’s in it for them.

6.   Support researchers using altmetrics in grant applications

Talk to researchers and research offices at your institution about adding altmetrics to grant applications and funder reports. Take a look at the Score tab in Altmetric details pages, which puts the Altmetric score of attention in context by ranking against papers of a similar age or other papers in that journal. Reporting back to a funder that a paper is top-ranked in that journal according to altmetrics attention is much more meaningful than citing a single number.

7.   Inform collection development

Use altmetrics data to identify emerging research interests in your institution to inform purchasing or renewal decisions. For example, identify papers with high altmetrics attention in a particular field of research and check those are part of your library holdings.

8.   Identify collaboration opportunities

Support researchers in finding potential collaborators by identifying researchers in a particular field with high altmetrics attention. Do this by running a keyword search in the Altmetric Explorer for a specific discipline and identifying authors with a lot of mentions and engagement surrounding their work.

9.   Demonstrate altmetrics uses across disciplines

Altmetrics aren’t just for STEM subjects. Talk to humanities and social science departments during liaison meetings about the opportunities for using altmetrics data to track different types of attention to their outputs. For example, we track attention to scholarly papers in policy documents – valuable for social science researchers in demonstrating real world impact.

10.   Get in touch!

We offer free access to the Altmetric Explorer for librarians – contact us for more information and sign up for a webinar to find out more about Altmetric for Institutions!