The Altmetric 500 dataset has been put together by Kathryn Weber Boer, Technical Product Specialist at Digital Science.

Context is crucial to understanding and measuring attention, and that is what the Altmetric 500 offers: a dataset that gives a direct look at the number of mentions per attention source, shining a light on global engagement with research. The Altmetric 500 dataset gives insights into the types of attention that scientific articles and books have garnered across different platforms. It includes attention from news media and blogs, policy documents, YouTube, Wikipedia, StackExchange and patents, and from social media outlets including X (Twitter), Reddit and Facebook. In addition, the dataset reveals research that was not only highly mentioned overall but also drew broad attention across multiple areas, including policy sources, social media and scientific citations.

In doing so we hope to give readers a sense of which research gained traction on different public platforms, and how that might compare with the articles that resonated within the scholarly space. 

Customizable attention analysis

The Altmetric 500 covers 11 attention sources. These are:

  1. blog posts
  2. news articles
  3. scientific citations
  4. StackExchange posts
  5. patent application citations
  6. policy citations
  7. Wikipedia pages
  8. Facebook walls
  9. X (Twitter) profiles
  10. subreddits
  11. breadth of attention

For each of these attention sources, the research outputs with the most mentions were identified in 53 categories across four areas: 

  1. Sustainable Development Goals (SDGs)
  2. Field of Research
  3. Geographic region
  4. a top overall category

With this approach, we have highlighted the most influential outputs of 2023 (those published from December 2022 to December 2023): close to 500 research outputs. 

Why do these analyses?

The Altmetric 500 aims to explore the different meanings of “attention” to research. It is not a definitive list of the “best” papers published in 2023 but rather a conversation starter about the varying impact of and engagement with research across different platforms.

Initial observations

An interesting feature of attention to research is that a given research output often gains attention in one channel while being barely acknowledged in another (for example, shared widely on X/Twitter but never mentioned in the mainstream media). This holds true for the 2023 list in general, and it was interesting to note that the publications with the most attention in public engagement channels were not necessarily those with a relatively high academic citation count.

Notable examples

A deeper look at the attention data reveals, among other things, five publications notable for having garnered attention both from policy organizations and in scientific citations, and five with attention from both policy makers and news/blog writers.

Next steps

In the coming months, various aspects of this dataset will be explored. The diversity of attention sources makes it an interesting dataset, and its connection to Altmetric on GBQ will allow results to be sorted and aggregated by publisher, funder, university, and author. The Altmetric 500 dataset could be a step towards deeper insight into the multifaceted impact of research, the diverse ways in which scientific work resonates with varied audiences, and a better understanding of what attention means.

Explore the dataset yourself, or if you want to learn more about Altmetric, contact our team.

In our busy world, scientists must work harder than ever to ensure their research receives the attention it deserves, writes Alex Zhavoronkov in Forbes. And while there are various ways to boost a paper’s visibility, there is a singular ‘blueprint’ for tracking attention: the Altmetric Attention Score.

Altmetric tracks the visibility of articles by monitoring social networks including X (formerly Twitter) and Facebook. It also tracks top-tier mainstream media, mainstream science blogs, policy documents, patents, Wikipedia articles, peer review websites, F1000, syllabi, Stack Exchange sites and YouTube. The score an article receives gives an early indication of its impact and can be a powerful way of attracting wider attention (learn more about how the score is calculated). 

Zhavoronkov writes that the key to boosting a paper’s score is increasing its visibility, and there are several ways researchers can do this:

Write a press release and distribute to science-focused media

“If your paper is significant, for example, you elucidated a novel disease biology, discovered a new drug, developed a new fancy algorithm, designed a new material, or developed a new application for a quantum computer, it is worthwhile investing some time and resources in writing a press release,” he explains.

Create a blog post

Posts can be longer and more comprehensive than the press release and can be shared with journalists alongside it. The blog may serve as inspiration for third-party news coverage. “Make sure to reference your paper’s DOI and URL.”

Tweet and ask your team members to tweet

“Each post on X gives a quarter of an Altmetric point. If a paper goes viral on X, its Altmetric score can be considerable. Plus, once journalists notice that it went viral, they will be more likely to cover the story, further increasing the score,” explains Zhavoronkov. But beware: retweets count less than original tweets, and Altmetric will reduce the weight of a tweet where it finds markers suggesting bias or promiscuity, such as a publisher tweeting only their own articles.
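To make the weighting concrete, here is a minimal sketch of how a weighted attention score of this kind can be computed. Only the quarter-point for an X post comes from the text above; the other weights and the repost discount are hypothetical placeholders, not Altmetric’s published values.

```python
# Illustrative weights per mention type. Only "x_post" (0.25) is taken
# from the article; "news", "blog" and the repost discount are made up.
WEIGHTS = {
    "news": 8.0,               # hypothetical
    "blog": 5.0,               # hypothetical
    "x_post": 0.25,            # "a quarter of an Altmetric point"
    "x_repost": 0.25 * 0.85,   # reposts count less; discount is a guess
}

def attention_score(mentions):
    """Sum weighted mention counts, e.g. {"news": 2, "x_post": 40}."""
    return sum(WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

score = attention_score({"news": 2, "x_post": 40, "x_repost": 10})
```

The point of the sketch is simply that a viral thread of individually low-weighted posts can rival a handful of news stories, which is why the article treats X activity as worth cultivating.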

Experiment, learn, repeat

“The popularity of a paper depends on many factors. Ones which capture the public imagination or have widespread appeal are of course, much more likely to gain traction online.” Papers published by top-tier institutions and in top journals naturally attract more attention. By experimenting with different approaches and measuring success through Altmetric, scientists can develop new strategies to popularize scientific research and raise its profile. 

For more information, read LSE blog post tips on getting more out of Altmetric or contact the Altmetric team.

Wikipedia, with its vast repository of citations to scholarly publications, is a significant source of traffic to referenced websites. It also plays a crucial role in quantifying the attention received by research outputs cited on its platform. Altmetric’s expansion to multi-language tracking was a step towards a deeper comprehension of attention beyond the English-speaking world. Drawing on the webinar “Tracking multiple languages in the Altmetric Explorer,” this blog gives insights into why it is important to track different languages.

Expansion of Altmetric tracking

Improving diversity has been an important principle for Altmetric: it not only underlines the commitment to open research and inclusivity but also ensures that Altmetric quantifies attention as widely as possible. That is why, two years ago, Altmetric expanded its tracking of Wikipedia citations beyond the English-language Wikipedia. Before this expansion, the Altmetric Data Insights team investigated how the different language editions represent a broader view of research and to what extent academics trust Wikipedia. 

Investigating language diversity

There has been an assumption that the English-language Wikipedia, being larger and more active than other language versions, is the primary one to consider. However, conversations with academics working in other languages suggested that excluding other editions could mean overlooking crucial sources.

To explore this diversity, Mike Taylor, Head of Data Insights at Digital Science, and his colleagues examined 21 different language versions of Wikipedia. Although numerous other Wikipedia editions exist, and Altmetric covers more languages than those examined here, each language version had to have at least 1,000 editors and 100,000 citations to academic research to be included in the study.

Of the 2 million research citations found in the English language Wikipedia, 1.1 million citations were unique to English and not present elsewhere. However, non-English Wikipedia editions cited 1.45 million research articles and books, constituting 42% of the total citations.

Individual language sites can be examined in detail. For instance, the French language site cites 108,000 research publications exclusively, and the team identified similarly substantial numbers for German, Italian, Japanese, and Spanish. However, the team observed that each language, even those with lower citation counts, makes a unique contribution. For example, Farsi/Persian, Turkish, Vietnamese, and Serbian each cite approximately 12,500 publications that are not referenced elsewhere.
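The per-language tallies above come down to simple set arithmetic: a citation is "unique" to an edition if no other edition cites the same work. A minimal sketch, using made-up DOI labels rather than real citation data:

```python
# Each Wikipedia edition maps to the set of works it cites.
# The DOIs here are placeholders, not real citation data.
citations = {
    "en": {"doi:A", "doi:B", "doi:C"},
    "fr": {"doi:B", "doi:D", "doi:F"},
    "de": {"doi:C", "doi:D", "doi:E"},
}

def unique_to(lang, citations):
    """Works cited by `lang` and by no other edition."""
    others = set().union(*(cited for l, cited in citations.items()
                           if l != lang))
    return citations[lang] - others

print(sorted(unique_to("fr", citations)))  # works only the French edition cites
```

Scaled up to millions of citations, this is how figures like "1.1 million citations unique to English" or "108,000 publications cited exclusively by the French site" can be derived.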

Disciplinary insights 

The team broke the data down into six disciplines, based on Dimensions fields of research, and looked at how unique each language’s citations were. They found, for example, that Polish has a strongly unique citation base in engineering technology, medical and health sciences, and social sciences.

In terms of fields that differ across languages, the team found that engineering technology is probably the most different overall, followed by medical and health sciences. The initial assumption was that there would be greater variability across languages in citations for the social sciences and humanities, due to expected differences in linguistic and cultural research as well as references to heritage. However, contrary to this assumption, the team found less differentiation in these areas than in the sciences.

Taylor says: “Investigating the publications that are cited in Wikipedia gave us fascinating insights into how different languages represent their own research. The data shows that different languages contribute a unique voice to the representation of research. Excitingly, this enables us to start to establish a new way of thinking about Wikipedia content through the eyes of scientific citations.”

Getting a deeper view on topics

Looking at individual pages shows that some Wikipedias celebrate the lives of researchers who do not appear in the English-language Wikipedia. Elsewhere, there are differences in approach across languages. For example, the French philosopher and mathematician Émilie du Châtelet has a large page in the English-language Wikipedia with many more academic references than the French entry. Yet the French Wikipedia has separate, individual pages for many of her works, and each of these has numerous academic citations. 

Taylor adds, “By increasing the diversity that we cover in Altmetric we represent a broader coverage of scholarship and scholars globally than we could by indexing only the English language Wikipedia.”

Altmetric now tracks 35 languages, the latest additions being Swahili, Afrikaans, Egyptian Arabic, and Uzbek, with each language contributing substantially to the overall count. If you want to listen to the complete webinar, sign in to “Tracking multiple languages in the Altmetric Explorer.”

For more details about Altmetric, contact the team.

So you think you know Altmetric badges? These distinctive, colourful donuts are a common sight across many research publications today. Developed to provide an understanding of attention beyond traditional bibliometrics, Altmetric badges have become an important tool for publishers, institutions and researchers alike. They unveil the layers of research attention across platforms ranging from news and blogs to social media. 

Before Altmetric and its colourful badges, or Altmetric donuts, were born, researchers often had to wait months or years for metrics showing how much attention their research output had garnered. And the measure of this attention was usually limited to citations and attention within academic circles. Today, Altmetric badges, which can be seen alongside thousands of journal articles and on institutional and preprint repository pages, provide real-time updates on attention far beyond academic citations.

The badges, or ‘donuts’ as they are sometimes known, capture metrics from blogs, posts on X (tweets), news reports, policy and patent documents, academic sources and more. These ‘mentions’ combine to contribute to the Altmetric Attention Score: a weighted count of the attention an item has received, designed to give an at-a-glance indicator of its reach and influence. And because the badges update in real time, there is no delay in finding out who is talking about a particular piece of research and where the conversation is taking place. 

Broadening publisher engagement: Altmetric for author support and beyond

“We are committed to providing a wide range of impact metrics about our publications beyond citations and the Impact Factor. We believe Altmetric plays an important role in this.” Andri Johnston, Cambridge University Press (Source: https://www.altmetric.com/case-studies/cambridge-university-press/)

Many publishers use Altmetric badges to support authors by helping showcase the wider influence of their research output across different platforms thanks to the instantly recognizable visualization. 

For example, one research output may receive attention from a wide range of sources, while another attracts mentions predominantly from news and policy.

In addition, publishers are using Altmetric more broadly: in search results, on author profile pages and article views, and some even include the badge when users save articles to their personalised pages or accounts on their sites. For example, Wiley includes Altmetric in its author service information, where it features prominently in an infographic about journal and article metrics, and Taylor & Francis provides information about Altmetric as an editor resource.

In his January 2024 CEO’s letter, MDPI CEO Stefan Tochev underlined the important role Altmetric plays. “At MDPI, we are committed to providing our authors with the essential tools to publish, promote, and track their research,” he wrote. “Our collaboration integrates …[the] Altmetric tool, offering us and our authors the ability to track a variety of sources that monitor and report attention surrounding publications.” The utility of Altmetric for publishers, however, goes beyond supporting authors. It helps publishers identify the nature of the attention, its sources, and the platforms it originates from. The Journal of Consumer Research gives a glimpse into how publishers are tapping into Altmetric in its blog post “Are We Getting Attention?”

Showcasing attention: Altmetric badges on institutional and individual researcher webpages

“We wanted researchers to easily see who is attracted by their research.” - Oliver Renn, Lecturer at the Department of Chemistry and Applied Biosciences, ETH Zürich (Source: https://www.altmetric.com/case-studies/eth-zurich/)

For institutions and researchers, quick access to the information gleaned from Altmetric badges is useful when putting together grant applications, tenure submissions, website biographies and more. “Altmetric data can be used to demonstrate the attention and impact of your research, for example, in grant applications, CVs, promotion applications, etc.,” states the University of Dundee LibGuides.

“We like to support our authors as well as we can, and one way we can do that is by helping them promote their works,” explains Bethanne Wilson, Director of Journal Business and Operations at the non-profit Radiological Society of North America (RSNA). “Having more detailed metrics is very useful for this, and the Altmetric badges allow authors to show all the online attention their paper has received at a glance.” (Source: https://www.altmetric.com/case-studies/radiological-society-of-north-america/)

The Smithsonian Research Online web page (see screen grab below) gives a clear illustration of how institutions can showcase the attention their researchers receive.

This kind of information can also be accessed and displayed by individual researchers. Knowing who is talking about their research allows researchers to respond in real time on the platform where the research is being discussed, and it also surfaces useful contacts to follow up with for potential collaborations. Moreover, Altmetric badges on a personal webpage or publications list let visitors click on the donuts to access the full Altmetric Details Page, where they can explore all of the original online mentions and shares of the work. 

Below are two examples of researchers who use Altmetric badges on their personal webpages.

Steve Davis, Associate Professor of Earth System Science (webpage), and James Grecian, marine biologist (website).

The Altmetric badges have been used and loved by researchers for over a decade and continue to provide value to publishers and researchers alike. In a recent LinkedIn post, Pitch Science, a science communication and digital marketing consultancy helping scientists, research institutes, not-for-profits, and other science brands communicate their work to the public, stated: “Altmetric badges are a great way to show the online attention a piece of research is receiving. And it’s not just academic journals who can use Altmetric badges. It’s actually free and easy for researchers to embed their publications’ Altmetric badges into their personal websites.”

For more information about Altmetric badges and how they can be used to interpret and leverage data, we welcome you to sign up and watch a recent on-demand webinar.
If you want more general information about Altmetric, don’t hesitate to contact the Altmetric team.