On January 5th, Kelli Marshall published an article in The Chronicle of Higher Education entitled “How to Curate Your Digital Identity as an Academic”, in which she discussed the importance of having an online presence. Marshall argues that “as an academic or would-be academic, you need to take control of your public persona and then take steps to build and maintain it”. She also points out that “if you do not have a clear online presence, you are allowing Google, Yahoo and Bing to create your identity for you”.

Marshall talks about the need for academics to build and maintain websites about themselves that are separate from their institutional faculty pages, in order to improve their search engine rankings and ensure that they, and not their doppelganger, are the top hit in a list of Google results. She argues that once you have built a presence, you can build a network and “piggyback off sites whose web pages rank high in Google”.

So, how does this relate to Altmetric? We talk to a lot of researchers about how altmetrics can help them achieve their goals. One of the most frequently raised issues is “lack of time”. A typical researcher at a university divides their time between teaching, writing papers, peer review, site or laboratory-based research, grant applications and committee duties. Because their existing workload is so high, many of them feel that the increasing requirement to “show evidence of research engagement” is simply another plate to spin, another chore to add to an already intimidating to-do list.

We recently talked to Terrie Moffitt, a researcher at Duke University, who said that “this kind of housekeeping can lessen the amount of time that can be allocated for genuine research”, and that “it is therefore both advantageous and convenient if researchers can outsource this kind of data curation”.

To elaborate further on Terrie’s points, I’ll be posting regularly to our blog, Facebook and Google+ pages with tips and ideas for how researchers can enhance their workflows with the use of altmetrics and Altmetric. We’ll feature a number of real-life case studies and conduct interviews with researchers who have already begun to use these tools.

From embedding badges on departmental websites to using Altmetric for teaching and supervision, we’ll be aiming to clearly demonstrate the uses of altmetrics data to faculty across an institution.

Ever wanted to know more about Altmetric and altmetrics? Here’s a list of the conferences we’ll be presenting at or represented at over the first part of 2015 – please do come along to a session or set up a meeting with us to learn about the tools we offer and the approach we take to monitoring and reporting on the attention and engagement surrounding research and other institutional outputs:

Ontario Library Association meeting
28th – 31st January, Toronto

Digital Science rep Stuart Silcox will be attending and happy to meet to discuss how altmetrics can be useful for your institution. Please get in touch to arrange a meeting.

ALA Midwinter
30th January – 3rd February, Chicago

Visit Digital Science at booth #2821 to learn about the Altmetric for Institutions platform and free tools we offer for researchers, librarians and institutional repositories. If you’d like to arrange a time to meet please email us.

PSP 2015 Annual Conference
4th – 6th February 2015, Washington D.C.

Head of Publisher Outreach Phill Jones will chair a panel on Prospering in a Multimedia World, and Betsy and Adrian, the Digital Science Publisher Business Development team, will be on hand to discuss how altmetrics data can add value for your authors, readers, society partners and internal teams. Contact us to arrange a meeting.

AAAS Annual Meeting
12th – 16th February, San Jose, California

Digital Science Community Manager Laura Wheeler will be attending the AAAS meeting this year. Catch up with Laura to find out how altmetrics and the free tools Altmetric offer can help you better track and understand the attention surrounding your research.

SCELC colloquium
18th February, California
Digital Science rep Kortney Capretta will be attending and happy to fill you in on all things Altmetric! Please do email us if you’d like to arrange to meet Kortney.

ER&L 2015
22nd – 25th February, Austin, Texas
Altmetric’s Sara Rouhi will be attending and running the workshop “You talking ‘bout me? Altmetrics: a practical guide” on Mon Feb 23, 4:15pm–5:00pm, Room 301. Institutions are increasingly adopting altmetrics as a means of gaining a better understanding of the wider attention their research is attracting. In this session Sara will cover the whys, whats and hows of these new metrics, explain how Altmetric.com provides this data, and explore some use cases to see how these data can be used for reporting, discovery and furthering academic engagement.

American Chemical Society National Meeting
22nd – 26th March, Denver, Colorado
Product specialist Sara Rouhi will be running an altmetrics workshop at this event. Join her to learn about the data we track and how it can be useful for the researchers and management teams across your institution. She’ll also be more than happy to arrange a time to meet and explore which of our solutions would best meet your specific needs.

eResearch New Zealand
23rd – 25th March, Queenstown, New Zealand
Digital Science rep Anne Harvey will be at eResearch New Zealand, and happy to chat about the different solutions and tools that Altmetric offer institutions to help support their research management workflows. Contact Anne to arrange a meeting or to see where you can find her.

ACRL
25th – 28th March, Portland, Oregon

Digital Science reps Kortney Capretta and Meg Baker will be attending and happy to fill you in on all things Altmetric! Please do email us if you’d like to arrange to meet Kortney or Meg.

UKSG
30th March – 1st April, Glasgow, UK

Altmetrics feature strongly in the UKSG 2015 programme, giving you a great chance to find out more about what they are and to see how they are already being used in research institutions and funding agencies around the UK. Visit Digital Science booth #95 to learn about the latest developments from all of the Digital Science portfolio companies – and do get in touch if you’d like to arrange a time to meet.

 

2014 saw some major developments in altmetrics: new products were launched, working committees formed and more and more researchers, publishers, funders and institutions started to investigate how they might put these new metrics to use.

As we go into 2015 we want to continue this trend. Much of the discussion last year focussed on what the numbers meant and how they should be interpreted. For us, 2015 is about looking beyond the numbers. Although they help identify how much attention and engagement a research output has generated (which is why we have the Altmetric score), and where that attention came from, they do not offer many clues as to what people actually think about the work, and can often be misleading. A case in point was brought to light in a story published in Nature this week, which found that when awarding funding grants the Medical Research Council makes a point of looking beyond the external reviewers’ scores alone and focuses on their written comments instead.

You may also have seen Altmetric Founder Euan Adie’s blog post from last year, where he discussed how the term ‘metrics’ itself can seem to promise a false solution. As Stephen Curry first pointed out following a HEFCE metrics review day in the UK, the assumption of finite measurement that is associated with ‘metrics’ is something we should address and be wary of as part of the process of research evaluation. So rather than ‘metrics’, we now think of altmetrics as indicators of engagement and attention. A high Altmetric score, or a count of mentions of a research output from the sources we track, acts as an indicator that there is activity to be further examined, which may then go on to be taken into consideration in evaluation.

We plan to add a number of new sources this year – these are the places that we track for mentions of academic articles, datasets, figures and other research outputs. Each time we add a new source we’ll be considering what value it would offer our users, and ensuring it adds context beyond what numbers alone can provide. Within our interfaces we’re careful to make each original mention accessible, meaning that users can actually see who is talking about a piece of work, and what they are saying. Without this, knowing that your article has been picked up by 20 news sites, tweeted by hundreds of people, or received a comment on a post-publication peer-review platform has little relevance.

With the launch of our institutional platform last year we began including references from policy documents. We’re still growing the list of what we track in this space but have already uncovered and made available thousands of references to academic work. Feedback from research administrators and institutional partners tells us that the addition of this data has been incredibly beneficial for them – what would previously have taken weeks to collate in the search for evidence of the application of research is now easy to track and report on.

The debate about how and when altmetrics should or could be applied is ongoing. Steering group work continues in the US and UK, and much discussion will take place at industry conferences over the year.

At the same time, researchers are increasingly using our data to provide evidence of their societal impact to funding committees and management, identify potential collaborators and new communities to engage with, identify research worth reading, and to monitor early uptake of their work. Librarians and publishers are considering how they can best support their faculty and readers – many will be keeping a close eye on the ongoing reviews and standards projects taking place in the UK and US. We’ll be offering support and training to these communities throughout the year, and hope to provide some useful case studies that may help generate ideas for your own institutions and platforms.

To begin the year and set things off on the right foot, there’s one thing we’d like to be clear on: quality, importance and impact are not numbers. A number cannot tell you what people think about a piece of research, why they were talking about it or sharing it, or if it was any good. Numbers can identify where there is something to be examined in more depth, but they alone cannot be used to measure the value of research. With altmetrics we have the chance to explore and question, and to make the numbers work for us – not the other way around. This is central to everything we do here, and an approach we’ll be encouraging others in the scholarly community to adopt.

Thinking back to the start of this year, it’s hard to take in how much has changed at Altmetric. We’ve had a brilliant, busy, fun 12 months – which now seem to have gone by so quickly! A few highlights to mention include:


- We built and launched our newest platform, Altmetric for Institutions. Take-up has been really positive and we currently have over 15 institutions either already implemented or in the process of being set up across the UK, Europe, Australia and the US.

- We helped organise and run the 1:AM altmetrics conference – 2 days of great sessions that included presentations from publishers, researchers, librarians and funders. The event was more popular than we could have hoped for, and we’re already starting to discuss plans for next year.

- Our top 100 list, built by our developers with design help from the team at Digital Science, has received over 15,000 unique visits since its launch in mid-December. The accompanying blog post has been viewed nearly 2,000 times, and we saw some great press coverage in major outlets including the Telegraph, the Times Higher Education, the Huffington Post and the Guardian.

- We ran our first ever publisher day in London. Organised in conjunction with figshare, the event brought together about 50 interested stakeholders for a day of discussion and brainstorming. Stay tuned for news on future events!

- Speaking of publishers, we’re continuing to see a growing demand for our badge visualisations. Wiley, Health Affairs, Frontiers, Springer and ScienceOpen are among a number of content providers that have added Altmetric data to their article pages. Additionally, we announced further roll-outs with platform providers HighWire and Silverchair, meaning it is now even easier for their publishing partners to offer their readers and users the additional context and insight that Altmetric data offers.


- Our team grew from 7 at the end of 2013 to 18. Our new team members include developers (Scott, Maciej, Shane, Matt M, Matt P, and Jakub), product sales managers Sara and Ben, customer support exec Fran, training and implementation manager Natalia, and Kathy who joined as our COO. All of these new colleagues have added a great amount of knowledge and experience to our day-to-day operations, and we’re feeling well-placed to take on any challenges and opportunities that next year will bring!

Alongside all of this we’ve continued to attend and present at a huge number of events and industry conferences. We’ve been actively engaged with the altmetrics discussion and debate via platforms such as our blog and Twitter, and continue to work closely with ongoing projects and reviews (such as those being undertaken by HEFCE and NISO) to help shape direction for the use of altmetrics.

Looking ahead to 2015, we plan to build on the standard we’ve set this year. We’ll be further involved with driving the discussion around the use of new metrics in research evaluation, and will work to ensure that our platforms and data continually offer our users the best and most valuable insights.

So, with that, a very happy New Year to all of you! We’ll see you in 2015…

At a recent conference workshop on altmetrics tools, librarians were compared to drug dealers: getting researchers hooked on tools to help them get the most out of their research. As the momentum around altmetrics in higher education gathers pace and Altmetric for Institutions is increasingly adopted by research institutions, there are huge opportunities for librarians to position themselves as a central support team – albeit promoting good scholarly communications practices rather than drugs – and leading the way to integrate altmetrics in institutions.

I joined the Altmetric team last month as the Training and Implementation Manager, having previously led the Research Support Services team at The London School of Economics Library. Part of our offering included providing altmetrics and bibliometrics support, and being the team researchers and professional departments could look to for advice, analysis and training on emerging metrics. But developing an altmetrics service doesn’t happen overnight, and there are various ways to get out there and be an expert in the field.

So, as I take up my new role and start planning how we can extend our education and outreach support, this seems like an ideal opportunity to introduce myself (hello!) and suggest ten ways librarians can support altmetrics:

1.   Be an expert

Get familiar with the practical applications of altmetrics for researchers and professional staff in institutions. Learn about monitoring altmetrics attention to new papers, embedding on a CV and demonstrating societal impact. Follow the #altmetrics discussion on Twitter and learn how other academic librarians are supporting altmetrics in their institutions. How can research administrators, communications teams and students also make use of altmetrics? Get to know the tools available and reach out to offer support. Researchers will look to you for advice about altmetrics if you’re able to demonstrate that the Library is the place to go with questions!

2.   Embed altmetrics training in existing programmes

Run altmetrics training as part of PhD development programmes and digital literacy sessions, and talk about altmetrics for researchers at faculty away days or departmental meetings. It’s useful to set the scene by offering some background on the altmetrics movement, explaining how altmetrics work alongside existing metrics, and demonstrating their practical uses.


3.   Add article level metrics to your institutional repository or resource discovery system

We offer free Altmetric badge embeds for institutional repositories and resource discovery systems – it’s just a line of code, and all the details can be found here. We’re increasingly seeing Altmetric badges in resource discovery systems such as Summon and Primo – see our blog post on this for more details.
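If it helps to picture what that line of code looks like, here’s a rough sketch of the kind of markup involved – the DOI is just a placeholder, and you should check our embed documentation for the current script URL and the full set of supported attributes:

```html
<!-- Load the Altmetric embed script once per page (check our embed docs for the current URL) -->
<script type="text/javascript" src="https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js"></script>

<!-- Render a donut badge for a single output; the DOI below is only a placeholder -->
<div class="altmetric-embed" data-badge-type="donut" data-doi="10.1234/example-doi"></div>
```

The same approach works for repository item pages and discovery-layer templates – the badge simply picks up the identifier from the data attribute on the page.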

4.   Provide altmetrics advice alongside traditional bibliometric analysis

Altmetrics complement existing bibliometric indicators. Have a read around the correlation between altmetrics and citation counts. If you’re already providing bibliometrics advice and benchmarking reports for researchers, add altmetrics to this analysis to demonstrate how the data offers a broader view of attention to research. Talk openly about the limitations of using a single score to assess research papers, and encourage users to dig deeper into the original mentions.

5.   Encourage researchers to adopt open practices

You’re probably already out there talking to researchers about open access and research data management. Include altmetrics in these conversations – for example when discussing the advantages of making papers open access, explain the opportunities for tracking the subsequent altmetrics. You could also mention the potential for tracking the online attention to open access research data. Altmetrics are another great reason for researchers to be open and can help demonstrate what’s in it for them.

6.   Support researchers using altmetrics in grant applications

Talk to researchers and research offices at your institution about adding altmetrics to grant applications and funder reports. Take a look at the Score tab in Altmetric details pages, which puts the Altmetric attention score in context by ranking the paper against papers of a similar age or other papers in that journal. Reporting back to a funder that a paper is top ranked in that journal according to altmetrics attention is much more meaningful than citing a single number.

7.   Inform collection development

Use altmetrics data to identify emerging research interests in your institution to inform purchasing or renewal decisions. For example, identify papers with high altmetrics attention in a particular field of research and check those are part of your library holdings.

8.   Identify collaboration opportunities

Support researchers in finding potential collaborators by identifying researchers in a particular field with high altmetrics attention. Do this by running a keyword search in the Altmetric Explorer for a specific discipline and identifying authors with a lot of mentions and engagement surrounding their work.

9.   Demonstrate altmetrics uses across disciplines

Altmetrics aren’t just for STEM subjects. Talk to humanities and social science departments during liaison meetings about the opportunities for using altmetrics data to track different types of attention to their outputs. For example, we track attention to scholarly papers in policy documents – valuable for social science researchers in demonstrating real world impact.

10.   Get in touch!

We offer free access to the Altmetric Explorer for librarians – contact us for more information and sign up for a webinar to find out more about Altmetric for Institutions!

 

It’s starting to feel Christmassy at Altmetric HQ, and we’ve been busy pulling together our annual list of which published research has been attracting the most attention online in 2014.

Based on data we’ve collated over the year, the list takes into account all mentions and shares of articles published from November 2013 onwards in mainstream and social media, blogs, post-publication peer-review forums, bookmarking sites and platforms such as Reddit and YouTube.

Access the full Top 100 list

We’ve excluded editorial, comment and review content, as we wanted to focus specifically on original research. The data can be filtered by discipline, journal, institution, country, and access type – and you can click through to view the Altmetric details page with all of the original mentions and shares for each article.


The 2014 Top 100 in summary

  • The top scoring article (at the point of data collection on the 14th of November 2014), published in PNAS in June 2014, was “Experimental evidence of massive-scale emotional contagion through social networks”.
  • 51 of the articles in the list have been published since the beginning of May this year (the others were published between November 2013 and the end of April 2014)
  • 37 of the articles in the top 100 were published as open access (63 were published under the paywall/subscription model, although some have now been made free)
  • 16 of the Top 100 were published in Nature, 11 in Science, 9 in PLoS ONE, 8 in PNAS, and 4 in JAMA
  • 68 of the Top 100 had authors from the United States, 19 had authors from the UK, 10 from Canada, 11 from Germany (the most in Europe), 4 from China, and 9 from South or Central America
  • Top performing institutions include Harvard, Harvard Medical School, and Harvard School of Public Health – whose authors featured on 15 articles in total. Authors from institutions that are part of the University of California System featured on 10 articles. In the UK 3 articles featured authors from the University of Cambridge, and 3 from the University of Oxford.

A closer look
As we’d expect to see, a large proportion of the list reflects research that dominated the mainstream media agenda throughout the year – for example studies on Ebola, a new black hole theory from Stephen Hawking, and research which resulted in the manipulation of Facebook users’ timelines all rank highly.

It’s often the studies which have relevance or can be made easily accessible to a wide audience that receive a lot of coverage, and it is therefore no surprise that studies which fall under medical and health sciences make up 44 of the 100 articles featured.

Geographically, authors from the US, Europe or UK are present on over 45% of the articles which made the list (this partly reflects a bias in our source coverage). Over 80% of articles which named a UK author were the result of an international collaboration, whilst in contrast just 46% of articles authored by US researchers featured input from overseas researchers.


Many people have questioned whether or not Open Access articles are more likely to get shared and discussed than those that are published behind a paywall – indeed we recently undertook a small scale study of our own to look at this. At the time the results we got proved positive; however, we do not see the same trend reflected in this list: just 37 of the 100 articles featured were published under an Open Access license.

(An addendum here – in a comment on a recent Scholarly Kitchen post, it was rightly pointed out that due to the much smaller proportion of papers published open access overall, a 37% share in our list actually does represent a higher proportion of OA articles being shared – tying in with what we identified in our original study.)

This may suggest that those sharing these articles do not stop to consider whether or not their peers will be able to read the material, or it may be that the mainstream news agenda is driven to some extent by what journal publishers or institutions choose to highlight through their press efforts.

It’s worth mentioning again that this list is of course in no way a measure of the quality of the research, or of the researcher. Studies of the life span of chocolate on hospital wards and an effort to search the internet for evidence of time travellers both feature highly in the list this year – entertaining content which provides a bit of light relief and is quickly distributed. Similarly, a case of unfortunate author error (which we discussed in more detail here) is still generating new attention online now, months after publication and weeks after the error was spotted and rectified.

Nonetheless we hope that this breakdown and the Altmetric data available for each article will offer some insight into which research has captured the public imagination this year, and why that might have been.


I agree with lots on the excellent ImpactStory blog but I really don’t agree with this post arguing that Nature’s new SciShare experiment is bad for altmetrics. It really isn’t. I figured it was worth a post here to explain my thinking.

In my view it’s mildly inconvenient for altmetrics vendors. :) But you can’t really spin it as “bad” in this context beyond that, and given that there are good aspects too I think the title of the ImpactStory post is overkill.

I may be biased. We share an office in London with Nature Publishing Group and Digital Science (their sister company) who invested in Altmetric. I also used to work for NPG and have a lot of love for them. I have a lot of respect for Readcube too, who’ve done some awesome things on the technical side. So bear that in mind.

Anyway, I actually agree to an extent with what are maybe the two main points from Stacy and Jason’s post and those are the ones I’ll cover in a bit more detail later.

Here for reference is the list of recommendations that they make:

[NPG should...]

  • Open up their pageview metrics via API to make it easier for researchers to reuse their impact metrics however they want
  • Release ReadCube resolution, referral traffic and annotation metrics via API, adding new metrics that can tell us more about how content is being shared and what readers have to say about articles
  • Add more context to the altmetrics data they display, so viewers have a better sense of what the numbers actually mean
  • Do away with hashed URLs and link shorteners, especially the latter which make it difficult to track all mentions of an article on social media

First off I think the lack of a pageviews API on nature.com and the look of the altmetrics widget on the ReadCube viewer sidebar are sort of irrelevant – sure, those points are definitely worth making, but these aren’t SciShare things, or even issues unique to Nature.

That same widget has been in the ReadCube HTML viewer (which hundreds of thousands of Google search users have seen on a regular basis) for years, and to be fair you’ve got to admit that almost nobody – except for PLoS and eLife AFAIK – has an open pageviews API.

Leaving aside how useful or not pageviews actually are for most altmetrics use cases (I actually have problems with them, as they’re neither transparent nor particularly easy to extract meaning from) I’d love for there to be more APIs available so tool makers had options… but yeah, not really anything to do with SciShare per se.

The final recommendation contains the bits I agree with and it’s worth diving into.

The sharing URL doesn’t include a DOI or other permanent identifier

I’d definitely agree that it’d be useful for the link to include an identifier. It saves us work. That said, lots of publishers (the majority, even) have links without DOIs in them and we have to work round it. It’s not a big deal.

Some hard numbers from just us to back this up – I imagine other providers see similar ratios: there are 2,714,864 articles mentioned at least once in the Altmetric database.

Of them only 813,024 (~ 30%) had a DOI in at least one of the links used by people in a mention (this number will also include people who used dx.doi.org links rather than links to the publisher’s platform).

The URLs may break

Exact same deal here: I agree, it’d be nice to encourage the use of a more ‘permanent’ link, and it’d definitely be good to hear somebody clarify what’ll happen to the links after the experiment is over. I’m surprised somebody hasn’t already (update: I should have checked Twitter first, Tom Scott at NPG has said they will be persistent).

But… for whatever reason only a very small fraction of users on social media use dx.doi.org links.

We have 11,088,388 tweets that mention a recognized item in the database.

Only 25,132 (0.2%) of them contain a dx.doi.org or doi.org link (this actually really surprised me, I thought the figure would be more like 10%, but there you go).

You could say that SciShare doesn’t help these problems, and you’d be right. It’s really not going to make them noticeably worse though. I think altmetrics has bigger problems with links to content.

Non-SciShare problem A: News outlets not linking to content at all

I didn’t have any inside track on SciShare and we weren’t involved in any of the planning, but I did hear about it a little early when we got asked to help identify news and blog sources for whitelisting purposes (I don’t know if the data we put together is what eventually got used, or if some extra editing was involved).

My first thought was: if it means a single news source starts actually linking to content instead of just mentioning the journal in passing then it’ll probably be worthwhile.

The biggest problem with news outlets and altmetrics is that even when they’re covering a specific article they usually don’t link to it (happily there are a growing number of exceptions, but they’re still just that, exceptions). Publishers usually blame journalists, and journalists blame publishers or the fact that they’re writing for print and online is an afterthought.

We end up having to rely on text mining to track mentions in news sources which works but means we have to balance precision and recall. Anything that helps supplement this with links and make mainstream media data more reliable sounds good to me.

Non-SciShare problem B: Lack of machine readable metadata

This topic came up a lot at the PLoS altmetrics workshop last week.

I’d argue that the single biggest problem for altmetrics vendors when it comes to collecting data is actually insufficient machine readable metadata – and especially identifiers – on digital objects containing or representing scholarly outputs, especially in PDFs.

Incidentally NPG has actually always been a leader here. If you curl a SciShare URL you’ll notice machine readable metadata, including the DOI, comes back.
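For illustration only – the exact tag names vary by publisher and the DOI here is a placeholder – this is the kind of machine readable metadata we mean, exposed in an article page’s head:

```html
<!-- Illustrative example of identifier metadata in an article page's <head>.
     Tag names vary by publisher; the DOI below is a placeholder. -->
<meta name="citation_doi" content="10.1234/example-doi" />
<meta name="dc.identifier" content="doi:10.1234/example-doi" />
```

Tags like these let aggregators tie a mention or a shared link back to a canonical identifier without having to guess from the URL alone.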

Unfortunately it’s not a particularly interesting issue unless you like scholarly metadata. It doesn’t have an exciting open access angle and there’s unfortunately not a hashtag to use to campaign for it, but there probably should be.

We often get asked during demos, webinars and conference sessions for more detail on the news sources we track for mentions of academic research – and in particular how we do it, and how global our coverage is.  

So firstly, what do we track, and why?
We see mentions in mainstream media and news as a crucial part of the wider engagement a piece of research achieves. If an article, book, or dataset is picked up and getting a lot of coverage amongst mainstream media outlets, chances are it has notable societal significance and the potential to generate further discussion or study.

Being able to provide this data helps researchers:

  • easily see and report on how their work has been communicated by the press to the general public
  • get ideas for new outlets to engage with in future
  • identify where a new approach might be needed in order to ensure their work is fairly and accurately reported

To this end, we maintain a curated list of over 1,000 news outlets, which we are adding to every week. These come from all corners of the globe, and in many different languages. Our current regional coverage is represented here:

[Image: the current regional coverage of the news outlets we track]

 

You can see a full list of all the news outlets we track on the website.

How do we do it?
We use a mixture of automated link scanning and text mining to try and ensure our news coverage is as comprehensive as possible. As long as the news outlet and the domain of the content it refers to (for example The Washington Post and an article on nature.com) are both on our whitelist, we’ll pick up the mention.
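To make the whitelisting idea concrete, here’s a deliberately simplified sketch of the link-scanning half of it – this is not our production pipeline, and the domains listed are just examples:

```javascript
// Simplified illustration of the whitelist check: a mention is only recorded when
// both the news outlet and the domain of the linked content are on our curated lists.
// The domains below are examples, not our actual whitelist.
var newsOutletDomains = ["washingtonpost.com", "theguardian.com"];
var contentDomains = ["nature.com", "sciencemag.org"];

function hostOf(url) {
  // Strip the protocol, the path and a leading "www." to get a bare domain
  return url.replace(/^https?:\/\//, "").split("/")[0].replace(/^www\./, "");
}

function isTrackableMention(newsStoryUrl, linkedContentUrl) {
  return newsOutletDomains.indexOf(hostOf(newsStoryUrl)) !== -1 &&
         contentDomains.indexOf(hostOf(linkedContentUrl)) !== -1;
}

// A Washington Post story linking to an article on nature.com would be picked up:
console.log(isTrackableMention(
  "http://www.washingtonpost.com/news/some-story/",
  "http://www.nature.com/articles/some-article"
));
```

Text mining then handles the cases where a story discusses a paper without linking to it at all.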

You can read more about the technical detail in this blog post contributed by one of our developers earlier this year, and this one from our Product Development Manager which gives a little more detail on the text mining.

Any mentions we find relating to a particular research output are then displayed on the ‘news’ tab of the Altmetric details page for that piece of research – you’ll see the title of the news piece that mentioned the research and be able to click through to read the original news article in full.

Want to suggest news sources we might not be tracking?
Tell us about it! Please make sure it’s news and not personal opinion, and has a working RSS feed – we’ll add it as long as everything checks out ok.

Identifying the right literature to spend time reading has long been a challenge for researchers – often it is driven by table of contents alerts sent straight to an inbox, or a recommendation from a superior or colleague. Libraries have invested in systems to make the most relevant content easily accessible and, above all, easily discoverable. But a search in a discovery platform can return hundreds of results, and it is sometimes difficult just from those to make an informed decision about what might be worth digging further into.

This is where altmetrics might be able to help. Including the Altmetric badges and data for an article within a discovery platform makes it easy for a researcher to determine which of those articles have been generating a buzz or picking up a lot of attention online, and with just a few clicks they can view the full Altmetric details page to identify whether the attention is coming from news articles, blogs, policy makers, or being shared a lot on a social network such as Twitter or Facebook.

But it’s not just about what’s popular – it’s about context: this level of detail makes it easy to understand who is talking about the research and what they thought of it. Insight such as this may be particularly useful for younger researchers who are still building their discipline knowledge and looking for new collaborators and wider reading material.

At Altmetric we’re already supporting the implementation of our data and badges in platforms such as Primo (from ExLibris) and Summon (from ProQuest).

 

Primo
There’s a free plugin which can be added to any Primo instance. Anybody can download and install it, enabling their users to see scores and mentions for any articles matched in the system via a new “metrics” tab on the item details page.

Clicking through on the donut brings you to the Altmetric ‘details page’, which displays the original mentions for the article. If you get in touch we can open up the data so that your users can see all of the mentions from each source – otherwise they’ll see just 3 of each type.

You can find the documentation that details the long and short form badge embeds on the Primo Developer Network. Here’s an example of an implementation at Wageningen UR:

[Screenshot: Altmetric badges displayed in the Wageningen UR Primo instance]

 

Summon
Summon clients using a custom interface (like Heidelberg and the University of Toronto) can easily integrate the Altmetric badges themselves.

You’ll need to use the JSON API, and as long as the results have identifiers (such as a DOI or PubMed ID) you’ll be able to display the altmetrics data for your articles.
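As a rough illustration – not tied to any particular Summon setup, and assuming the Altmetric embed script (embed.js) is already loaded on the page – once your custom interface has a DOI for a result, you can drop a badge into that result’s markup along these lines:

```javascript
// Sketch only: given a search result element and its DOI, add an Altmetric badge.
// Assumes embed.js is already on the page; for badges added after the initial page
// load you may need to re-trigger the embed initialisation (see the embed docs).
function addAltmetricBadge(resultElement, doi) {
  var badge = document.createElement("div");
  badge.className = "altmetric-embed";            // class picked up by embed.js
  badge.setAttribute("data-badge-type", "donut"); // or another badge style
  badge.setAttribute("data-doi", doi);            // PubMed IDs can be passed via data-pmid
  resultElement.appendChild(badge);
}
```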

And again, please do let us know once you’ve got them up and running so that we can ensure your users can click through and see all of the mentions for each article (not just the first 3 from each source).

 

If you’re running another discovery service and would like to find out if you can integrate the Altmetric badges, please drop us a line and we’ll see what we can do to help.

Articles and other research outputs don’t always get attention for the reasons we might first assume. There’s a reason you shouldn’t ever rely on numbers alone…

This was demonstrated in spectacular form once again this week when the Twittersphere jumped on a recent article that contained a rather unfortunate error – an offhand author comment asking, “should we cite the crappy Gabor paper here?”

The article got a lot of attention – it is now one of the most popular items we’ve picked up mentions for this week (here’s another), rocketing to near the top of the rankings for the journal as the error was shared.

Indicators like the attention score we use reflect the fact that lots of people were talking about the article, but not that the attention was (and here we’re just guessing) probably unwanted.

This isn’t the first time we’ve seen cases like this. As you would expect articles get attention for all sorts of reasons which aren’t just to do with the quality of the research.

A few favourite examples we’ve come across over the years include this paper authored by a Mr Taco B. Monster – currently claiming an Altmetric score of 485, with almost 600 mentions to date. Also brought to our attention this week was the tale of the disappearing teaspoons, which is still causing quite a stir ten years after it was first published.


Flawed Science
A more serious example of attracting attention for all the wrong reasons is this article published in Science in 2011. The researchers suggested that a type of bacteria could use arsenic, as opposed to the phosphorus used by all other life on the planet, to generate DNA. The article initially received a huge amount of press attention, but other scientists quickly pointed out errors – you can dive into some of the relevant mentions by looking at the Altmetric details page.


Similarly, a suggestion that neutrinos may have been measured as travelling faster than the speed of light did not stand up to further scrutiny, although the truth was only uncovered months later following numerous (successful, but flawed) re-tests.

 


Amongst the blogs, news outlets, general public and other scientists questioning the results coming out of CERN, this article, published just weeks after the original data was made available, generated some impressive altmetrics of its own, most likely due to its humorous abstract. 

 

Playing politics
Typically we’ll also see a high volume of attention around research that is particularly topical or controversial at the time. An article published in the Lancet this year, which examined the privatisation of the NHS in Scotland in relation to a yes or a no vote in the recent referendum, received a very high volume of tweets as those in the ‘yes’ campaign shared it to encourage their followers to vote in favour of independence.

We’ll be releasing our Top 100 most mentioned articles for 2014 in a couple of weeks (you can see the results for 2013 here) – it’ll be interesting to explore why and how those that make the list caught the public and academic imagination this year.