NSERC 2030: Discovery. Innovation. Inclusion.


Discussion papers

Improving the evaluation of the natural sciences and engineering funding practices

This document is one of a series of discussion papers generated by NSERC staff to foster discussion during the development of the NSERC 2030 strategic plan. Items presented do not represent policy directions; they are meant to elicit discussion among NSERC’s stakeholders. Similarly, all themes discussed in these papers are cost-neutral: they would not require new program funding or cuts to existing programming in order to fund new initiatives.

Overview

To support decision-making, accountability and transparency, the Council of Canadian Academies (CCA) report Powering Discovery challenges research funders to advance their program evaluation practices by expanding both how and what they evaluate in natural sciences and engineering (NSE) research funding. The information generated through this work also plays an important role in advancing the science of science.


How NSERC evaluates

Powering Discovery (CCA, 2021) highlights the conceptual and methodological limitations of conventional types of evaluation (e.g., ex-post evaluations, which are conducted after the fact), especially when the broader social and economic impacts of research and research funding are of interest. Such limitations include the difficulty of attributing outcomes to specific funding practices, the time lag between research investment and impact, the difficulty of identifying counterfactual measures (what would have happened to beneficiaries in the absence of the intervention) against which to compare observed outcomes, and the fact that researchers often receive funding from multiple sources. NSERC’s Evaluation Division attempts to mitigate these limitations by applying best practices in program evaluation, including quantitative and qualitative indicators, triangulation of data across multiple sources, bibliometric indicators, prizes, qualitative assessments of outcomes and of the research process, and counterfactual measures (when feasible). The division is also building its capacity to use altmetrics (alternatives to traditional citations, such as mentions on websites and blogs, mainstream media coverage, bookmarks on reference managers, and mentions on social networks), in line with the San Francisco Declaration on Research Assessment (DORA).
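
To illustrate the counterfactual logic in its simplest form, the sketch below applies a difference-in-differences comparison, a standard quasi-experimental technique for approximating a counterfactual when a randomized comparison is not possible. It is purely illustrative: the figures are hypothetical and this is not a description of NSERC's actual methodology.

# A minimal sketch of a difference-in-differences estimate. All figures
# are hypothetical and do not reflect NSERC data or methodology.

# Mean publication counts per researcher, before and after a funding cycle.
funded_pre, funded_post = 3.1, 5.4          # grant recipients
comparison_pre, comparison_post = 2.9, 3.8  # unfunded applicants

# Change observed in each group over the same period.
funded_change = funded_post - funded_pre              # 2.3
comparison_change = comparison_post - comparison_pre  # 0.9

# The comparison group's change stands in for the counterfactual: what
# funded researchers might have done in the absence of the grant.
estimated_effect = funded_change - comparison_change  # 1.4

print(f"Estimated effect: {estimated_effect:.1f} extra publications per researcher")

The subtraction removes background trends common to both groups, which is exactly the role a counterfactual plays; the estimate is only as credible as the assumption that the two groups would otherwise have followed parallel trends.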

With the implementation of the Policy on Results in 2016, federal evaluation functions have greater opportunities to use an expanded range of evaluation approaches. For instance, rapid evaluation and assessment methods may be used to deliver trustworthy and actionable findings to decision-makers at critical moments (McNall & Foster-Fishman, 2007), such as when making submissions to Treasury Board, while participatory and culturally responsive approaches can be applied to evaluate funding for Indigenous research.


What NSERC evaluates

In addition to calling on research funders to re-examine how they evaluate their funding practices, the Powering Discovery report describes the importance of expanding the scope of evaluations by developing inclusive evaluation frameworks to assess a broader range of research impacts. In NSERC’s context, this could include expanding its lens beyond how well an individual funding opportunity is performing to whether the agency is producing the right portfolio of funding (i.e., across a set of funding opportunities) and how well the research NSERC funds ultimately aligns with its overarching thematic objectives. Experts in organizational learning likewise urge evaluators to ask whether the organization is doing the right things and whether it makes sense to keep doing what it is doing, regardless of how well it may be doing it (e.g., Leeuw & Sonnichsen, 2019).

In expanding the scope of evaluation questions, there is a need to re-conceptualize the linear notion of knowledge acquisition (Leeuw & Sonnichsen, 2019) and to move beyond a focus on evaluating individual funding opportunities, recognizing that NSE funders operate in a dynamic and evolving landscape. While it may be more straightforward to evaluate results generated at the funding opportunity level, NSERC generates impacts that extend well beyond this level. Consequently, there are times when it may be beneficial to ask and answer questions about the complex, adaptive and interdependent relationships among funding opportunities, particularly when decision-makers require information at a higher level. In other words, this involves taking a strategic-level perspective by conducting thematic evaluations or evaluations of funding portfolios, when feasible and relevant to support strategic decision-making.

An important consideration for NSERC is that evaluation coverage is governed by the Financial Administration Act (1985), which requires that all transfer payment programs be evaluated within a five-year period. Additionally, evaluations of grants and contributions programs must always include an assessment of relevance, effectiveness and efficiency (Government of Canada, Treasury Board Secretariat, 2016).

Within these parameters, NSERC’s Evaluation Division is piloting approaches to fulfil legislative requirements while addressing questions of strategic importance to the agency. For instance, a current evaluation is studying how NSERC, SSHRC and CIHR support graduate student training across more than 15 funding opportunities. It will provide an evidence-based analysis of the strengths and limitations of the current portfolio of funding to support the training of students for research-intensive careers. The results of this evaluation are expected to support strategic decisions about talent-related funding moving forward, while fulfilling the federal requirement for evaluation of the portfolio’s program components. Similarly, a thematic evaluation related to equity, diversity and inclusion (EDI) has been identified in NSERC’s Departmental Evaluation Plan.


Opportunities

NSERC’s goal is to improve the evaluation of NSE funding practices and the evidence available to inform decision-making, a goal that ultimately advances knowledge about the science of science. In this paper, we propose that developing an organizational learning agenda offers the best opportunity to improve both how and what NSERC evaluates. It would support a systematic approach to leveraging evaluations and other research activities as tools for continuous improvement and innovation agency-wide, strengthening decision-making, accountability and transparency.

An organizational learning agenda can be understood as a set of prioritized research questions and related activities that enable an organization to work more efficiently and effectively with regard to evaluation and decision-making (Evidence-Based Policymaking Collaborative, 2018). Developing one involves bringing stakeholders together in a consultative process to identify areas of interest for decision-making. Existing research and evaluation findings are then reviewed to inform the development of key questions that will guide the organization’s short- and long-term research and evaluation projects. The intent is to guide and strengthen an organization’s knowledge and evidence base to support better decision-making (US Agency for International Development, n.d.).
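
To make this concrete, the sketch below shows one hypothetical way a learning agenda’s prioritized questions could be captured as structured data to coordinate evidence-building activities. The questions, fields and priorities are invented for illustration; no such NSERC artifact is implied.

# A purely illustrative sketch: a learning agenda captured as structured
# data. The questions and fields are hypothetical, not an NSERC artifact.
learning_agenda = [
    {
        "priority": 1,
        "question": (
            "Is the current portfolio of talent funding preparing students "
            "for research-intensive careers?"
        ),
        "activities": ["thematic evaluation", "administrative data analysis"],
        "horizon": "short-term",
    },
    {
        "priority": 2,
        "question": (
            "How well does funded research align with the agency's "
            "overarching thematic objectives?"
        ),
        "activities": ["portfolio review", "bibliometric study"],
        "horizon": "long-term",
    },
]

# List the questions in priority order for planning purposes.
for item in sorted(learning_agenda, key=lambda entry: entry["priority"]):
    print(f"Q{item['priority']} ({item['horizon']}): {item['question']}")

Keeping the agenda in an explicit, structured form like this is one way to make prioritization visible and shareable across the offices involved in evidence building.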

The final report of the US Commission on Evidence-Based Policymaking (2017) recommended that federal organizations implement learning agendas, because they would benefit from an approach in which they regularly identify short- and long-term priority research, policy and evaluation questions relevant to the organization’s mission and legal responsibilities. The questions identified in the agenda could be prioritized and pursued by the organization’s leadership over a given period and applied as a coordination tool by the organization’s evidence-building community. However, two of the challenges identified by the commission were: (1) that federal organizations’ capacity to support the full range of evidence-building functions was uneven and, where capacity existed, was often poorly coordinated; and (2) that federal organizations frequently lacked an integrated approach or a long-range plan for evidence building.

NSERC has the organizational conditions and processes for organizational learning required to develop a learning agenda (Sonnichsen, 2019). By engaging NSERC management, staff and stakeholders in developing a set of broad questions of interest at the organizational level, a learning agenda can:

  • facilitate coordination, collaboration and evidence-sharing among the various offices involved in evaluation, performance measurement and data activities, such as the Office of the Chief Data Officer (Evidence-Based Policymaking Collaborative, 2018)
  • assist in planning and scoping evaluations at a more strategic level, including broader evaluation questions, while balancing the constraints of federal accountability requirements for evaluation with the information needs of senior management
  • improve the quality and relevance of the information presented to senior management to support decision-making
  • facilitate strategic engagement across all programs and working levels

There are also opportunities to leverage existing information-gathering initiatives within the agency, such as the Departmental Evaluation Plan and NSERC 2030, to help develop this agenda. For example, there are anticipated synergies between NSERC 2030 and an organizational learning agenda: identifying information needs and developing questions of interest for NSERC 2030 closely resembles the work undertaken for learning agendas. Consequently, NSERC 2030 may identify some of the questions for consideration in the learning agenda, and the resulting learning agenda could then be deployed to assess the progress and outcomes of NSERC 2030.


Discussion questions

  • What recent or novel steps has your organization taken to improve the evaluation of your funding practices and/or to develop organization-wide research agendas?
  • What measures do you have in place to increase agility and be more responsive to your community (e.g., funders improving support for researchers, or research institutions developing and supporting strategy and operations)?
  • What value would you see in developing an organizational learning agenda for your organization?
    • What kinds of questions posed by your senior management are challenging to answer?
    • What information needs to support decision-making within your organization are currently not being met? What steps are you taking to address these needs?

References

Council of Canadian Academies. (2021). Powering Discovery: The Expert Panel on International Practices for Funding Natural Sciences and Engineering Research. Ottawa, ON: Council of Canadian Academies. Retrieved from https://cca-reports.ca/wp-content/uploads/2021/05/Powering-Discovery-Full-Report-EN_DIGITAL_FINAL.pdf

Evidence-Based Policymaking Collaborative. (2018). Evidence Toolkit: Learning Agendas. Retrieved from https://www.urban.org/research/publication/evidence-toolkit-learning-agendas/view/full_report

Government of Canada, Justice Canada. (1985). Financial Administration Act. Retrieved from https://laws-lois.justice.gc.ca/eng/acts/F-11/page-9.html#h-228742

Government of Canada, Treasury Board Secretariat. (2016). Directive on Results. Retrieved from https://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=31306

Government of Canada, Treasury Board Secretariat. (2016). Policy on Results. Retrieved from https://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=31300

Leeuw, F. L. & Sonnichsen, R. C. (2019). Introduction: Evaluation and Organizational Learning: International Perspectives. In F. L. Leeuw, R. C. Rist and R. C. Sonnichsen (Eds.), Can Governments Learn? Comparative Perspectives on Evaluation and Organizational Learning (pp. 1–13). Routledge.

McNall, M. & Foster-Fishman, P. G. (2007). Methods of Rapid Evaluation, Assessment, and Appraisal. American Journal of Evaluation, 28(2), 151–168.

Sonnichsen, R. C. (2019). Effective Internal Evaluation: An Approach to Organizational Learning. In F. L. Leeuw, R. C. Rist and R. C. Sonnichsen (Eds.), Can Governments Learn? Comparative Perspectives on Evaluation and Organizational Learning (pp. 125–141). Routledge.

US Agency for International Development. (n.d.). Implementing a Learning Agenda Approach. Retrieved from https://usaidlearninglab.org/sites/default/files/resource/files/defning_a_learning_agenda.pdf

US Commission on Evidence-Based Policymaking. (2017). The Promise of Evidence-Based Policymaking. Retrieved from https://bipartisanpolicy.org/download/?file=/wp-content/uploads/2019/03/Appendices-e-h-The-Promise-of-Evidence-Based-Policymaking-Report-of-the-Comission-on-Evidence-based-Policymaking.pdf




Contact

nserc2030crsng@nserc-crsng.gc.ca
