1. Measuring S&T Performance at Environment Canada Eric Gagné A/ Director, Science Policy Division Environment Canada PIPSC Science Policy Symposium May 13, 2010
5. Overview of R&D / RSA at EC (Science-Metrix, 2006)
Related Scientific Activities (RSA): ~45% of EC expenditures ($450M) and ~40% of personnel (2,500 full-time equivalents). EC is the largest performer of natural science RSA in the federal government. RSA directly allows EC to deliver on its mandate through monitoring, testing, prediction, etc.
R&D: ~25% of EC expenditures ($250M) and ~15% of personnel (1,000 full-time equivalents). EC is one of the top 10 environmental R&D institutions in the world, and the top producer of environmental research in Canada (2006). R&D provides the basic knowledge and credibility to support EC's regulations, policies, programs, etc.
12. Collaborations between EC branches / directorates and with partners (figure produced by Science-Metrix, 2009). Linkages: national collaboration networks. Note: the thickness of the lines and the size of the circles (EC directorates / branches) indicate the number of collaborations. http://www.ec.gc.ca/scitech : “Managing S&T”
Editor's notes
I’ll first provide quick context as to why we undertook this performance measurement report. I'll then share results from our 2009 report on R&D performance, with a focus on the process and methods, and provide an update on work currently underway on the measurement of related scientific activities. Finally, I’ll conclude by discussing goals and opportunities for the federal S&T community associated with this ongoing work.
EC undertook these activities for three main reasons: (1) to respond to key commitments in the EC Science Plan and the Federal S&T Strategy; (2) to better communicate the S&T story and its impacts; and (3) to better manage S&T as a horizontal, cross-cutting activity.
R&D is associated with the innovation chain and knowledge creation. RSA is associated with “public good” science: monitoring, risk assessments, scientific service delivery, surveys, standards development. Basically, RSA is all S&T that is not R&D. In measuring performance, there is generally more focus on R&D because of its links to innovation and economic prosperity (refer to Benoit Godin’s important work on this). Though R&D and RSA are closely linked, it is difficult to measure how RSA contributes to the innovation system or to economic prosperity.
EC is a large producer of both R&D and RSA: 70% of our expenditures go towards S&T. We also know that EC is the largest producer of environmental research in Canada, ranking 7th in the world according to an independent 2006 study by Science-Metrix. StatCan data reveal that EC is the largest natural science RSA performer in Canada.
Current government reporting structures are vertical, but S&T is a horizontal activity. Though EC has three S&T strategic outcomes, we did not have any mechanism to measure S&T within the department. Our approach to S&T measurement is a step towards overcoming this challenge.
Because the various components of S&T (R&D, RSA, science policy) require different methodologies to assess performance, EC is taking a phased approach. R&D was a good starting point because of its readily accessible indicators; we completed this phase last December. With the experience and competencies gained in performance measurement, we are now tackling RSA; I’ll talk about the process in a few slides. Phase 3 aims to take it one step further and look at how we manage our science: for example, are our current governance structures effective? Basically, evaluating our science policy capacity. Having gained experience in measuring all three components, the last step would be to integrate them into one report. This is currently a lofty goal, and we hope to achieve this vision one day.
To measure R&D performance within EC, a new framework was created. The four principles of the framework (alignment, linkages, excellence, enabling environment) are aligned with the Federal S&T Framework of 2005 and with the EC Science Plan. Each principle implied the development of several key qualitative and quantitative indicators: bibliometrics (collaborations, citation rates, etc.), surveys, quantitative HR information, and expenditures and personnel statistics.
Qualitative and quantitative data were used to assess indicators for each of the four principles. I will provide an example for each principle in the following four slides. For more details on the results, see the report itself (online at http://www.ec.gc.ca/scitech : “Managing S&T”).
To assess alignment, we surveyed R&D performers, users and funders across the department to examine the timeliness and responsiveness of EC’s R&D. This slide represents the links between performers and users: the arrows point from R&D producers to users, and the thickness of the lines represents the intensity of the links (i.e. the number of times links between OPs were reported). The “hubs”, or circles, happen to be priority areas for EC and the federal government. For example, people in the Risk Assessment group are both R&D producers and users; for instance, they exchange R&D data with the Aquatic Ecosystems Protection and Conservation group.
We have found that EC’s collaboration rate in publications is increasing, particularly through new international collaborations, though national collaboration rates were already very high. [Transition]: We were also able to map collaborations with key Canadian partners across all sectors.
This graph shows collaborations between EC and its partners, based on peer-reviewed publications and co-authorships. The thicker the line and the larger the circle, the greater the number of collaborations. At close to 50%, universities account for the majority of collaborations. Other important connections are with other major players in environmental research (University of Toronto, DFO, UBC).
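A map like this can be reproduced from co-authorship records: for every pair of institutions appearing on the same paper, count the joint papers (edge thickness), and total each institution's collaborations (circle size). A minimal sketch, using hypothetical paper data rather than the actual Science-Metrix dataset:

```python
from collections import Counter
from itertools import combinations

# Hypothetical co-authorship records: the set of institutions on each
# peer-reviewed paper. (Illustrative only; not the real EC data.)
papers = [
    {"EC", "University of Toronto"},
    {"EC", "University of Toronto", "DFO"},
    {"EC", "UBC"},
    {"EC", "DFO"},
]

edge_weight = Counter()  # line thickness: joint papers per institution pair
node_weight = Counter()  # circle size: total collaborations per institution

for authors in papers:
    for a, b in combinations(sorted(authors), 2):
        edge_weight[(a, b)] += 1
        node_weight[a] += 1
        node_weight[b] += 1

print(edge_weight[("EC", "University of Toronto")])  # prints 2
```

Sorting each paper's institutions before pairing keeps the edge keys canonical, so ("EC", "DFO") and ("DFO", "EC") are counted as the same link.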
The average of relative citations (ARC) index was used to measure the impact of our science. The ARC reflects the number of times a paper is cited compared to other papers in the same field. On average, Canadian environmental research publications are cited 10% more often than the world average; EC publications in this area are cited 40% more often than the world average. This type of result demonstrates that our contribution to the national and international environmental science field is strong.
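As described here, the ARC normalizes each paper's citation count by the world-average citation count for papers in the same field (and typically the same publication year), then averages the ratios, so an ARC of 1.0 means "cited at the world average" and 1.4 means "cited 40% more often". A minimal sketch of that calculation; the field names, world averages and citation counts below are hypothetical:

```python
# Sketch of an Average of Relative Citations (ARC) calculation.
# Each paper's citations are divided by the world-average citation count
# for its (field, year); the ARC is the mean of those ratios.
# All data below are hypothetical illustrations.

world_avg = {  # world-average citations per paper, by (field, year)
    ("environmental science", 2006): 10.0,
    ("environmental science", 2007): 8.0,
}

papers = [  # (field, year, citations received)
    ("environmental science", 2006, 15),
    ("environmental science", 2006, 12),
    ("environmental science", 2007, 12),
]

def arc(papers, world_avg):
    """Average of Relative Citations for a set of papers."""
    ratios = [cites / world_avg[(field, year)]
              for field, year, cites in papers]
    return sum(ratios) / len(ratios)

print(round(arc(papers, world_avg), 2))  # ratios 1.5, 1.2, 1.5 -> prints 1.4
```

Field normalization is the key design point: raw citation counts are not comparable across fields with different citation cultures, while ratios against a field-specific baseline are.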
Our enabling environment was rated as “fair” for several reasons. One is that federal and EC intramural (i.e. in-house) R&D funding in O&M and salaries has remained relatively constant since 2004. Funding for R&D infrastructure (bricks and mortar, equipment) appears to be on the rise; however, these funds are usually tied to specific programs and are not sufficient to address the general rust-out issue we are all facing in the federal science community. In addition, some new investments (not shown here) include $14 million under Budget 2009 for modernizing EC laboratories and $770K (through INAC) for upgrading Arctic research facilities. Personnel: how can we mitigate the risks of retirement? Are we hiring enough people to fill the gap? Categories for R&D personnel are based primarily on classifications.
[Transition slide – read quickly] So far we have focussed on R&D. What about everything else we do in federal S&T? Let’s look at RSA.
Why measure RSA? To better account for the $1.5B of expenditures on natural science RSA performed across the federal government. To better understand and communicate the role of “public good” science. To situate RSA in the federal science and innovation system. To better manage RSA as a cross-cutting, horizontal function, addressing potential gaps and challenges and taking advantage of opportunities. Basically, we need to take stock of RSA and RSA performance.
Taking a different approach than with the R&D report, we created an opportunity to work with other SBDAs on a product that would benefit the larger S&T community. EC hosted and facilitated a series of workshops that eventually led to an outline of a white paper on RSA measurement. These workshops revealed common challenges but also new ideas on how to proceed. Bottom line: participants across SBDAs have agreed that this is an important issue to work on, together.
Why are we here today? (1) We have lessons to share with anyone considering this type of work. (2) This type of tool has provided us with evidence that now allows us to “tell the S&T story” more effectively to key stakeholders. (3) At EC, before we launch the RSA assessment internally, we would like to ensure that the toolbox we create is transferable to the entire science community, so that by using the same tools, we can share best practices and learn from one another. Your contributions to the draft White Paper on GCPedia will be greatly appreciated until the end of May.