NSERC 2030: Discovery. Innovation. Inclusion.

Discussion papers

Improving funding efficiency and reducing administrative burden

This document is one of a series of discussion papers generated by NSERC staff to foster discussion during the development of the NSERC 2030 strategic plan. Items presented do not represent policy directions; they are meant to elicit discussion among NSERC’s stakeholders. Likewise, all themes discussed in these papers are cost-neutral: they would require neither new program funding nor cuts to existing programming to fund new initiatives.

Researchers are spending a significant portion of their time trying to obtain funds to conduct research (Schneider, 2020, p. 8). As funding instruments multiply, the number of proposals grows, and so do the administrative requirements for reviewing them. Furthermore, with increased competition and thus declining success rates, the resources allocated to preparing and reviewing unsuccessful proposals increase (CCA, 2021).

The results of a survey that NSERC conducted this year with the Canadian research community support the need to reduce administrative burden. Fewer than 10% of respondents rated our research administrative processes a “low burden,” and over a third rated one or more processes a “high burden.”

According to Canada’s Fundamental Science Review, “the principle of simplicity should govern all programs and competitions to avoid wasting the scarcest non-renewable resources of some of Canada’s brightest people: the waking and working hours of our scientists and scholars” (Advisory Panel on Federal Support for Fundamental Science, 2017, p. 13). This paper explores opportunities for improving efficiency, with a focus on reducing administrative burden for the research community, through two themes: application and review processes, and support for organizations that administer research grants (called “administering organizations”).

Opportunities — Application and review process

Joint grant application and review

Canada’s natural sciences and engineering (NSE) funding landscape includes multiple research funding organizations, both federal and provincial/territorial. This multiplicity increases the funding opportunities available to researchers, but coordinating the numerous opportunities is both a challenge and an opportunity. NSERC has already implemented several joint funding opportunities (e.g., NSERC Alliance-Mitacs Accelerate Grants and DND/NSERC Discovery Grant Supplements), but these are typically limited in size (funding) and scope (topics). In addition to looking for more opportunities to launch joint funding calls, including international ones, funding organizations could also reuse peer review results rather than undertaking a completely new review.

Alternative approaches for review

“To fund research more efficiently, effectively, and transparently, agencies are experimenting with alternative approaches” (CCA, 2021). This is in line with one of the recommendations from Canada’s Fundamental Science Review, that there is “support for experimentation and evaluation to study new approaches to peer review, including use of iterative review processes” (Advisory Panel on Federal Support for Fundamental Science, 2017). In this regard, multi-step applications and/or pre-screening to reduce applicant pools have been used in several current and past NSERC programs (e.g., Discovery Horizons, Strategic Partnership Grants for Networks, Technology Access Centres Grants). However, ample opportunity remains to explore other approaches, including the following examples:

  • partial lotteries among applications deemed equally excellent: randomized selection around cut-offs after merit review has been conducted; these are already used in international competitions (CCA, 2021)
  • restrictions on resubmissions: if a previous application scored poorly, applicants could not submit a new one to the same funding opportunity for a specified period of time (CCA, 2021)
  • distributed peer review, or self-review: applicants review other applications in the same funding call; this can be used for small competitions or for programs centred on a specific facility or topic (CCA, 2021)
  • automatic renewal of awards, especially for discovery-based research, while ensuring that new applicants still have opportunities to obtain grants
  • use of artificial intelligence (AI) in various areas, such as finding and matching reviewers to applications, verifying research security requirements, providing feedback to applicants and helping reviewers with their evaluations (Andersen, 2020; Kerzendorf et al., 2020, as cited in CCA, 2021)
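To make the first approach concrete, the sketch below shows one way a partial lottery could be run: applications are ranked by merit score, clear winners above the cut-off band are funded outright, and the remaining awards are drawn at random from the band of roughly equal scores. The function and parameter names are hypothetical, and a real implementation would require policy decisions (e.g., the width of the “equally excellent” band).

```python
import random

def partial_lottery(applications, budget, band=0.05, seed=None):
    """Select applications by merit, with a random draw among those
    whose scores fall within a narrow band around the funding cut-off.

    applications: list of (app_id, score) pairs, higher score = better.
    budget: number of awards available.
    band: score width treated as "equally excellent" at the cut-off.
    """
    rng = random.Random(seed)
    ranked = sorted(applications, key=lambda a: a[1], reverse=True)
    if budget >= len(ranked):
        return [app_id for app_id, _ in ranked]
    # The cut-off is the score of the last application that would be
    # funded under a purely rank-ordered selection.
    cutoff = ranked[budget - 1][1]
    # Clear winners: scores strictly above the band around the cut-off.
    winners = [app_id for app_id, s in ranked if s > cutoff + band]
    # Tie zone: scores within the band; awarded by lottery.
    tie_zone = [app_id for app_id, s in ranked if abs(s - cutoff) <= band]
    remaining = budget - len(winners)
    winners += rng.sample(tie_zone, remaining)
    return winners
```

The random draw only ever applies inside the band, so applications well above the cut-off are unaffected by the lottery.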

Persistent identifiers

Strategy development requires understanding and leveraging digital assets. The research enterprise generates a great deal of information — from descriptive information about researchers to the publications and datasets resulting from a research project. To increase the FAIRness (Findability, Accessibility, Interoperability, Reusability) of these outputs, persistent identifiers (PIDs, long-lasting references to digital resources) can link related pieces of information by referring to a specific entity in the information landscape, such as an object, organization, person or dataset. A system for managing PIDs (e.g., software and an associated repository) enables the easy discovery of objects related to a PID. PIDs could help streamline the application and review process by linking an author (through PIDs) to a set of publications, grants, students, etc. (Research Data Canada IDs Working Group, Standards and Interoperability Committee, 2016).
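To illustrate the linking idea, here is a minimal, hypothetical sketch of a PID registry that resolves identifiers and follows links between records. Real PID systems (e.g., ORCID for researchers, DOIs for publications) resolve identifiers through web services rather than an in-memory table; all identifiers and class names below are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class PIDRecord:
    pid: str                    # e.g., a hypothetical "person:1" or "pub:1"
    kind: str                   # "person", "publication", "grant", ...
    metadata: dict = field(default_factory=dict)
    links: list = field(default_factory=list)  # PIDs of related records

class PIDRegistry:
    """Toy in-memory registry; a real system would resolve PIDs over the web."""

    def __init__(self):
        self._records = {}

    def register(self, record):
        self._records[record.pid] = record

    def resolve(self, pid):
        # Returns the record for a PID, or None if it is unknown.
        return self._records.get(pid)

    def related(self, pid, kind=None):
        """Return records linked from `pid`, optionally filtered by kind —
        e.g., all publications linked to an applicant's person PID."""
        rec = self.resolve(pid)
        if rec is None:
            return []
        linked = [self._records[p] for p in rec.links if p in self._records]
        return [r for r in linked if kind is None or r.kind == kind]
```

In an application context, a reviewer could resolve an applicant's person PID and retrieve the linked publications and grants directly, instead of relying on manually entered lists.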

Opportunities — Support for administering organizations

Currently, integration between NSERC and administering organizations is limited. Although research administrators are expected to help their organization’s researchers with applications, budgets, ongoing reporting requirements and more, NSERC’s online systems offer them little functionality, and other supporting resources are limited. Opportunities to help research administrators include:

  • providing greater access to functionality in our online systems
  • integrating their current processes and procedures with NSERC’s
  • granting them direct access to data that is applicable to them (currently planned for the Tri-Agency grants management solution and being piloted in the Convergence platform)
  • improving training and support for research administrators
  • providing more collaborative tools, such as ones that allow researchers and partner organizations to easily find each other, interact and assess the potential to collaborate, or that better support non-traditional research by enabling more community-based interactions

Discussion questions

  • What types of joint funding and Tri-Agency funding opportunities would be of most benefit? Could NSERC’s peer review results for one program be used by other NSERC programs or by other organizations?
  • With respect to different methods of merit review that can be used to reduce the burden on reviewers, are there situations where these methods are or are not suited? What thresholds or criteria should be used? Are there any considerations when implementing them?
  • Where could AI technologies most benefit the Canadian research funding ecosystem, and, conversely, are there areas where AI could be applied but should be avoided? In the context of the government’s Directive on Automated Decision-Making, what other factors should we consider when implementing them?
  • What would be the best use of PIDs for NSERC? Are there risks?
  • What can NSERC do to better support research administrators throughout the entire grants management lifecycle (including the implementation of collaborative tools)? How would administering organizations like to be more integrated in NSERC’s processes and systems (e.g., integrate with research administration software in use at administering organizations)?

References


Advisory Panel on Federal Support for Fundamental Science. (2017). Investing in Canada’s Future: Strengthening the Foundations of Canadian Research. Ottawa, ON: Canada’s Fundamental Science Review. Retrieved from http://www.sciencereview.ca/eic/site/059.nsf/vwapj/ScienceReview_April2017-rv.pdf/$file/ScienceReview_April2017-rv.pdf

Council of Canadian Academies. (2021). Powering Discovery: The Expert Panel on International Practices for Funding Natural Sciences and Engineering Research. Ottawa, ON: Council of Canadian Academies. Retrieved from https://cca-reports.ca/wp-content/uploads/2021/05/Powering-Discovery-Full-Report-EN_DIGITAL_FINAL.pdf

Research Data Canada IDs Working Group, Standards and Interoperability Committee. (2016). Persistent Identifiers: Current Landscape and Future Trends (version 1.10). Research Data Canada, Zenodo. Retrieved from https://zenodo.org/record/557106#.YUnytWBYaUk

Schneider, S.L. (2020). 2018 Faculty Workload Survey, Research Report: Primary Findings. Federal Demonstration Partnership. Retrieved from http://thefdp.org/default/assets/File/Documents/FDP%20FWS%202018%20Primary%20Report.pdf



