TRANSCRIPT
Perspectives on metrics-based research evaluation: An ERA with impact
Professor Robin J. Batterham, Kernot Professor of Engineering
The University of Melbourne
16th May 2011
• Broad acceptance of the need for an ERA
• The results
• Innovation and the need for output measures
• Impact – a way forward
Broad acceptance of the need for an ERA
• Public funding of R&D accepted as having great value
• Benefits highly variable and not always clear
• Government and community expect assessment
• Many countries have assessment schemes
• Excellence in and of itself seen as worthwhile
Welcomed:
• Hardly a surprise when funding will follow rankings

But significant concerns:
• Human assessment is needed to complement bibliometrics
• Bibliometrics require expert interpretation
• Interdisciplinary work is a challenge (e.g. on journal rankings)
• Need more emphasis on peer review and esteem indicators
• Concerns that applied research is not well covered
Overall results impressive:
• There is clearly much in Australia of world class
[Figure: the 25 two-digit research (sub)codes by rating category 1–5. Vertical axis: number reported in category; ratings range from "well below world standard" through "at world average" to "well above world standard".]
And similarly for the more detailed analysis
[Figure: the 131 applicable four-digit research codes by rating category 1–5. Vertical axis: number reported in category; ratings range from "well below world standard" through "at world average" to "well above world standard".]
An alternative evaluation:
• 1999–2009, 10 institutions, 20 research codes, Essential Science Indicators
R. Batterham, "A 10 year citation analysis of major Australian research institutions", Australian Universities Review, Vol. 53 (1), 2011, 35–41.
Top 10 institutions in Australia ranked on the basis of total publications and citations listed in the Thomson Essential Science Indicators.
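A ranking of this kind reduces to sorting institutions on totals drawn from the ESI. A minimal sketch of that step, using entirely hypothetical institution names and figures (not the published data):

```python
# Hypothetical publication and citation totals; the real analysis uses
# Thomson Essential Science Indicators figures for Australian institutions.
institutions = {
    "Institution A": {"publications": 42000, "citations": 510000},
    "Institution B": {"publications": 38000, "citations": 560000},
    "Institution C": {"publications": 45000, "citations": 430000},
}

def rank_by(metric):
    """Return institution names sorted by the given metric, highest first."""
    return sorted(institutions, key=lambda name: institutions[name][metric],
                  reverse=True)

print("By publications:", rank_by("publications"))
print("By citations:   ", rank_by("citations"))
```

Note that the two orderings need not agree: a high-volume institution can rank lower on citations, which is why both totals are reported.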
Performance of the 10 Institutions by ESI categories
[Figure: number reported in each rating category 1–5, from "well below world standard" through "at world average" to "well above world standard".]
[Figure: Labour productivity growth cycle. Source: Treasury, Australian Bureau of Statistics]
[Figure: Sources of ideas for innovation. Source: ABS 2006/07]
Company financial performance correlates with strength of underpinning science.
Source: Zhen Deng, Baruch Lev and Francis Narin, "Science and Technology as Predictors of Stock Performance", Financial Analysts Journal, May/June 1999, p. 24.
Measures of innovation
• R&D data is unsatisfactory: it measures an input to innovation rather than an output
• Similarly for data on scientific personnel
• Bibliometric data has been used, but:
  o it can be difficult to develop efficient search strategies
  o the link between publication and value is likely to be imperfect at best
ENV/EPOC/GSP(2010)10/FINAL, Climate Policy and Technological Innovation and Transfer: An Overview of Trends and Recent Empirical Results.
Patent data as a measure of innovation
• Patent data, however, relates to outputs of the inventive process
  o there are very few examples of economically significant inventions that have not been patented
  o patents claimed elsewhere in the world are known to be more valuable
Correlation between Government funding of R&D and Innovation for climate change mitigation technologies (CCMT)
(Number of CCMT claimed priorities worldwide by inventor country and priority year; IEA’s Energy Technology R&D expenditures by country and year)
General innovation capacity is a key determinant of innovation, more so than R&D spend
Effectiveness of R&D spend varies with area
• For a well-developed area, e.g. wind and biomass:
  • increase the stock of knowledge by 10%
  • investment in production increases by 1%
• For emerging areas, e.g. solar PV:
  • increase the stock of knowledge by 10%
  • investment in production increases by 5%
This is an extraordinary multiplier: a factor of 5.
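The factor-of-5 multiplier quoted above is simply the ratio of the two investment responses to the same 10% increase in the stock of knowledge. A one-line check of that arithmetic:

```python
# Investment response to a 10% increase in the stock of knowledge,
# as quoted above for mature vs emerging technology areas.
mature_response = 0.01    # wind, biomass: production investment rises 1%
emerging_response = 0.05  # solar PV: production investment rises 5%

multiplier = emerging_response / mature_response  # ratio of the two responses
print(f"Emerging-area multiplier: {multiplier:.0f}x")
```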
The ERA is not a strong pointer for innovation
Research Outputs >330,000 Books, Journals, Chapters, Conf. Papers +
Units of Evaluation 2,435
Staff Numbers 39,668 FTE 55,842 head count
HERDC Cat. 1 Income $3.2 billion Competitive grants
HERDC Cat. 2 Income $1.9 billion Other Public Sector Research Income
HERDC Cat. 3 Income $2.1 billion Industry & Other Research Income
HERDC Cat. 4 Income $381 million CRC Research Income
Research Commercialisation Income $358 million i.e. 4.7% of total income
Patents Sealed 671 *
Plant breeders Rights 31 *
Registered Designs 1 *
NHMRC Endorsed guidelines 49 *
Esteem Measures ~2,000 *
Research outputs include books, book chapters, conference papers, journal articles, curated or exhibited events, live performances, original creative works, and recorded/rendered works.
HERDC: Higher Education Research Data Collection
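The "4.7% of total income" figure above can be reproduced from the HERDC category amounts, assuming "total income" means the sum of the four HERDC categories:

```python
# HERDC income by category, in $ millions, as listed above.
cat_income_m = {
    "Cat 1 (competitive grants)":  3200,
    "Cat 2 (other public sector)": 1900,
    "Cat 3 (industry & other)":    2100,
    "Cat 4 (CRC)":                  381,
}
commercialisation_m = 358  # research commercialisation income, $ millions

total_m = sum(cat_income_m.values())
share = commercialisation_m / total_m
print(f"Total income: ${total_m}M; commercialisation share: {share:.1%}")
```

The sum is $7,581M, and 358/7581 is 4.7%, matching the figure on the slide.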
UK Higher Education Funding Bodies:
• Impact pilot exercise, part of the Research Excellence Framework
• Aims to identify and reward the impact that excellent research has had on society and the economy
• Encourages the sector to achieve the full potential impact across a broad range of research activity in the future
• Proposed 25% of funding for 2014, then to rise!
• A very different approach to the ERA
Research Excellence Framework impact pilot exercise: Findings of the expert panels. http://www.hefce.ac.uk/research/ref/pubs/other/re01_10/
The pilot exercise covered a wide range of disciplines:
• Clinical Medicine
• Physics
• Earth Systems and Environmental Sciences
• Social Work and Social Policy
• English Language and Literature
Findings of the expert panels
• It is possible to assess impacts across these disciplines
• Expert review of case studies is an appropriate means for assessing impact
• It is possible to have a common broad approach with generic criteria and the same weighting for impact
• Impact covers social, economic, cultural, environmental, health and quality-of-life benefits
• Impact within academia is not included
• 15 years allowed for impact, but assessed only for the current review period
Feedback from the participating institutions
• In the main, supportive
• The common menu of indicators is light on measures relevant to the social sciences and humanities
• Subject-specific assessment panels mean a generic model can work for all disciplines
• Feedback confirmed the validity of the approach
REF Research Impact Pilot Exercise Lessons-Learned Project: Feedback on Pilot Submissions http://www.hefce.ac.uk/research/ref/pubs/other/re02_10/re02_10.pdf
In conclusion
• The ERA appears to be heavily weighted towards research outputs
• Independent analysis of the Essential Science Indicators is in line with the ERA findings for much of Australia's research
• The ERA does not help much in outlining the impact of our R&D
• The feedback to date from the sector suggests more human assessment is needed to complement bibliometrics
• The UK pilot suggests that expert panels evaluating case studies can assess broad impacts of R&D

So: let's discuss how impact might be incorporated in the next ERA, or should it be a separate exercise?
© Copyright The University of Melbourne 2008
[Figure: Ranking of Australian institutions by field against the rest of the world. Source: R. Batterham, Australian Universities Review, Vol. 53 (1), 2011, 35–41.]
Number of inventions (priorities) that involve co-invention between a pair of countries, 1978–2007:

          DE    GB    CN    CH    AU
US       309   181   132    43    27
DE         –     –    23   102    17
NL        47    48     1     3     5
GB        58     –    22     7     6
JP        46    38    31    13     6
FR        44     –     3    26     1
IT        26    13     1    14     –
KR        22     6     9     –     –
DK        35     –    18     4     2
TW         6     3    44     –     –
IN        13    16    11     1     1
SE        13     2     1    12     –
RU        12    15     2     2     2
ES        16     –     –     5     –
CH         –     –     –     –    12
CN         –     –     –     4     5
CA         –     –     –     –     1
Other     90    82    27    14    11
[Figure: International research collaboration in selected CCMT technologies (1988–2007). Source: ENV/EPOC/GSP(2010)10/FINAL.]