Altmetric Blog

How to save time finding your "broader impacts" for grant applications

Stacy Konkiel, 19th November 2015

Screencap of tweets, policy documents, and news articles showcasing various types of research impact.

Has your research introduced popular new methods to your discipline, had an influence on public policy, or changed the way the public understands complex topics like knowledge transfer? If so, such evidence likely exists online and can be used to make a case for funding.

Did you know that the average science PI spends 116 hours preparing a single grant proposal?

How much of that time is spent finding and documenting evidence of “broader impacts” and engagement?

Luckily, it’s now possible to streamline at least part of the grant preparation process, so researchers can spend less time on paperwork and more time doing research. Tools like the Altmetric bookmarklet and Impactstory can help you discover evidence of your “broader impacts” without a lot of work. And that evidence will help you stand out when applying for grants.

Think about it: so much of a grant proposal is an explanation of why you are qualified to advance research in a particular area. It follows that if you can provide hard evidence of your past success in public outreach and other types of “broader impacts”, in addition to excelling in other areas of the application, you can give yourself an advantage over applicants whose claims to greatness remain unsubstantiated.

This is the first of two posts that will offer practical advice (from experts including an NSF program officer, a microbiologist, and librarians who work regularly with NIH-funded faculty) on the best use of metrics in grant applications.

However, as this area of application for research metrics is so new, it is as much a thought exercise as anything else. I’d welcome your feedback in the comments below!

In this post, I’m going to describe the types of potential impact funding agencies look for in fundable projects, explain how researchers are currently using metrics in grant applications, and reveal the surprising reason why most funding agencies won’t tell you how to use metrics in your proposals.

The types of impact granting agencies seek

Increasingly, funders want to know how your work is having an effect upon society. Is it making a difference in the lives of everyday people? This type of impact is often described as “broader impacts”. Broader impacts can be:

  • Increasing diversity in STEM
  • Improving scientific literacy among the public
  • Improving public health or safety
  • and many more

Funders are also interested in supporting research of “intellectual merit”: research that advances knowledge in a discipline. Examples of intellectual merit include:

  • Making new connections between disciplines
  • Applying new approaches to existing questions
  • Developing new tools or methods for data analysis

Not all granting agencies will use the terms “broader impacts” and “intellectual merit” in their own programs. For example, the Wellcome Trust simply wants to “improve health for everyone by helping great ideas to thrive.” But the idea is usually the same, no matter the funder: they want to support research that will change the world.

OK, but how on earth does one prove that they’re changing the world?

How researchers are currently using metrics in their grant applications

Researchers already include many different kinds of metrics in their grant applications, such as:

  • Amount of grant dollars previously awarded
  • Number of graduate and undergraduate mentees
  • Reach of previous research, in terms of people educated, lives saved, or other benefits

Why do people use these metrics? A simple answer can be found in this NIH guideline for writing a proposal:

[Applicants should] capture the reviewers’ attention by making the case for why NIH should fund your research. Tell reviewers why testing your hypothesis is worth NIH’s money, why you are the person to do it [emphasis mine], and how your institution can give you the support you’ll need to get it done. Be persuasive.

Is there anything more persuasive to a scientist than well-applied data?

The key phrase here is well-applied: the metrics you use have to match the point you’re trying to make about the importance of your work. For your metrics to be useful, the metric(s) you choose must:

  • relate to the grant program you’re applying for, and
  • speak to the type(s) of impact you’re arguing you’ve had.

Here’s an example: are you an excellent mentor who’s applying for an NSF “Professional Formation of Engineers: REvolutionizing engineering and computer science Departments” grant? Then the number of graduates you’ve helped and their recent grants, publications, and job offers could be good data to include in your application, as it provides specific evidence of your impact in changing student-oriented practices in your department.

“But,” you might be asking, “what about citations?” Though their use is pervasive in other areas of academia, citation-based metrics are not often used in grant applications. For example, citation counts for articles tend not to be included in applications for some NSF directorates, and many people agree that the journal impact factor should never be used to judge grant applications. A research group within the NIH did recently propose the Relative Citation Ratio, but the metric has its drawbacks and does not appear to be in actual use for evaluations.

Citation-based metrics don’t tell grant reviewers how your work has contributed to your field. Research is cited for many, many reasons, after all. And those metrics can’t be used to describe “broader impacts” upon larger society (how research is translated into practice for the benefit of humanity), because they only measure the discussion of research among scholars.

That’s where altmetrics come in. Altmetrics are data that can better illustrate the various types of impact that research might have: educational, policy, public health, technology commercialization, and more. They’re especially useful to help describe the impact of research that comes in forms other than a journal article (datasets, software, interactive websites, etc).
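If you want to see what this data looks like in practice, Altmetric offers a free public details API that returns attention counts for a DOI. Below is a minimal sketch of turning such a response into plain-English evidence lines for a proposal; the `SAMPLE_RESPONSE` dict is hypothetical, and the `cited_by_*` field names are my assumption about the API’s response shape, so check a live response for your own records before relying on them.

```python
# Sketch: turning Altmetric-style attention counts into "broader impacts"
# evidence lines. The response below is a made-up example; the field names
# are assumed to mirror Altmetric's public details API
# (https://api.altmetric.com/v1/doi/<DOI>) -- verify against a real response.

SAMPLE_RESPONSE = {  # hypothetical API response for one paper
    "title": "Example paper",
    "cited_by_policies_count": 2,    # mentions in policy documents
    "cited_by_msm_count": 5,         # mainstream-media news stories
    "cited_by_tweeters_count": 140,  # unique Twitter accounts
}

# Map raw field names to reader-friendly labels for the proposal text.
IMPACT_LABELS = {
    "cited_by_policies_count": "policy documents",
    "cited_by_msm_count": "news stories",
    "cited_by_tweeters_count": "Twitter accounts",
}

def summarize_impacts(record):
    """Convert nonzero counts into plain-English evidence lines."""
    lines = []
    for field, label in IMPACT_LABELS.items():
        count = record.get(field, 0)
        if count:
            lines.append(f"Mentioned by {count} {label}")
    return lines

for line in summarize_impacts(SAMPLE_RESPONSE):
    print(line)
```

A summary like this is only a starting point: as the rest of this post argues, you would still pick out the one or two counts that actually speak to the impact your target grant program cares about.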

Microbiologist Holly Bik told me via email,

“If you’re creating public outreach tools like blog posts or things that are often used by other scientists like software, it can be a problem because these newer outputs aren’t valued like journal articles are, and aren’t cited in the same ways. But if you have metrics for how those outputs are being reused, you can prove how valuable your work is: you can show people charts of regular website visitors, number of software citations, interesting collaborations that result, and more. Altmetrics are the only way to communicate those important but non-traditional impacts.”

And as computer scientist Noah Smith explains over on Quora,

“The only thing I can think of that’s vaguely related [to using metrics in grant applications] is providing data on how often a software package released to other researchers was downloaded or (maybe) cited as having been used. This can be taken as an indicator that the researcher can produce tools that others value.”

Noah and Holly confirm my previous point: the metrics you use must be appropriate to your goals.

What the grant guidelines won’t tell you

Grant preparation guidelines rarely give instructions on how to use metrics in your application; that’s a fact. The omission is intentional, and it’s for a very good reason.

As NSF Program Manager Daniel S. Katz explains, “There is a group working on gathering metrics that have proven useful for [NSF Software Infrastructure for Sustained Innovation] projects, but I’m hesitant to provide examples myself, with the fear that new proposers will read my examples and decide they are the ‘right’ ones for them to use too.”

Dan knows that it’s human nature to want to know (and use) the proven, “right” metrics, even when they’re not very applicable to one’s own work.

Be aware of this inclination and try not to succumb to it when preparing your own grant applications! Of course, knowing what others have done can sometimes be useful, but it can just as often be irrelevant.

So what metrics should you use?

There’s a lot of data out there that can illustrate the many types of broader impacts and intellectual merit your work has had: your influence on public policy, the widespread use of your software in your discipline, or the fact that your articles and books are used to teach students worldwide.

In my next post, I’ll describe the time-saving tools you can use to collect this data in one place, as-it-happens. Stay tuned!
