The role of evaluation and ranking of universities in the quality culture

Professor Jean-Marc Rapp, EUA President
2 July 2009
Overview

- Evaluation vs. ranking – different concepts
- Evaluation – an EUA perspective
  - EUA's quality policy
  - EUA's activities focusing on the development of a quality culture
- Rankings – an EUA perspective
  - Reflection on current initiatives
  - Rankings and their impact on quality culture
- Conclusion – the way forward
Ranking:
- Relative positions within the participating group, produced by applying mathematical "formulas" to performance on a number of selected measures
- Data gathered independently, or obtained from and usually verified by the HEI
- Usually no site visit

Quality assurance/evaluation:
- Judgement of strengths and concerns on a number of measures related to the input, process and output of HE, aimed at quality enhancement
- Data always obtained from the HEI as a self-evaluation report
- Almost always involves a site visit by peers
Evaluation & Quality Assurance
Key EUA activities in the field of QA:
- Institutional Evaluation Programme, since 1994
- Quality Culture project, 2002–2006
- Creativity project, 2007
- Quality Assurance for Higher Education Change Agenda (QAHECA), 2008–2009
- European Quality Assurance Forum, in co-operation with the other E4 partners (ESU, ENQA and EURASHE)
- Founding member of EQAR
Quality Culture
Quality management – the technocratic element:
- Tools and mechanisms to measure, evaluate, assure and enhance quality
- Top-down

Quality commitment – the cultural element:
- Individual level: personal commitment to strive for quality
- Collective level: individual attitudes add up to a culture
- Bottom-up

The two are linked through communication, participation, trust and facilitation.
EUA’s Policy Positions on QA/Evaluations
- Main responsibility for quality assurance lies with the institutions
- Context sensitive (institutional and disciplinary diversity)
- Fitness-for-purpose approach
- Enhancement oriented
- Internal and external evaluations or QA processes should be complementary
- Transparency and co-operation
Rankings
The present landscape 1 – Global initiatives
- Global rankings: Shanghai ARWU, Times-QS World University Ranking, Leiden Ranking
- Newspaper driven
- Emerging Global Model (EGM) of a 'world class university'
- Therefore, rankings increasingly reflect the prestige and reputation of HEIs according to one specific model
- OECD feasibility study for the international assessment of HE learning outcomes: AHELO
The present landscape 2 – European initiatives
- European Commission feasibility study to develop a multi-dimensional university ranking
- EU Commission-supported statistical database on Higher Education (via Eurostat)
- European Commission-supported projects to develop a classification of European HEIs
- DG Research expert group working on methodologies for University-Based Research Assessment
- CHE (D) – various classification initiatives with different foci (universities, research rankings, departmental excellence, employability rating)
The present landscape – some observations
Significant limitations of existing rankings:
- Not comprehensive: provide an incomplete, once-off snapshot of a small segment of a rapidly changing sector
- 'One-size-fits-all' methodology: do not take account of the increasingly differentiated HE landscape in Europe
- Lack of transparency in the way they are compiled
- Compilers use available data rather than compiling data
- Reflect largely reputational factors (40% in THES)
- Dominance of research and metrics – little focus on the other missions of the university

Therefore, existing rankings typically favor old, large, Anglo-Saxon, research-intensive institutions with around 24,000 students and a $2 billion annual budget.
The present landscape – some observations (continued)
Despite the commonly acknowledged limitations, rankings are increasingly used.

Institutions:
- seek to influence compilers (HEFCE 2008)
- senior management KPIs are influenced (HEFCE 2008)
- change promotion and marketing efforts (HEFCE 2008)
- argue for a "value added" approach

Governments:
- increased interest in transparency instruments at the HE system level (Leuven Communiqué 2009)
- key priorities such as LLL and widening access are not accounted for in the rankings
- explore alternatives
Rankings – what is their purpose?

The common "politically correct" purpose: providing transparent information to students and reflecting the prestige of institutions.

The "real" purposes:
- to drive research and teaching performance
- to allocate and lobby for resources
- to identify stakeholders and partners
- to promote other policy objectives
Rankings & quality issues

Rankings are increasingly equated with quality standards. This is a danger because:
1. Rankings rely on externally defined indicators that are not necessarily linked to an institution's core mission and objectives
2. Some HEIs are tempted to chase rankings and focus on improving what the indicators can measure rather than on their core mission
3. Rankings are based on a one-size-fits-all methodology that does not take account of diversity
4. Poor positioning in the rankings can have a negative impact on staff morale (HEFCE 2008)
Rankings – the way forward?

For those developing rankings: promote the use of the Berlin Principles (CEPES, CHE, IHEP, 2006):
1. Recognise the diversity of HEIs and take account of different missions and goals
2. Be transparent regarding methodology
3. Measure outcomes in preference to inputs
4. Use audited and verifiable data wherever possible
5. Provide consumers with a clear understanding of the factors involved and offer a choice in how they are displayed, i.e. let them attach their own weightings
Rankings – EUA's response
- The debate on rankings has been launched in EUA policy bodies – Board and Council
- There has also been discussion in policy dialogue with Asian universities
- EUA has established an internal working group on rankings to consider next steps; proposals will be made to the October 2009 Council meeting
- EUA has commissioned a study on institutional diversity
- EUA continues to advocate that rankings should not be used as a proxy for quality, and thus not for QA purposes
Conclusions

There is a fundamental difference between quality assurance and rankings:
- A QA process should always be internally driven (even if there are external incentives) and aim at enhancing the quality of activities (usually through recommendations), thereby fostering a quality culture.
- Rankings are externally driven and only state the current situation of an institution in comparison to other institutions, on the basis of selected indicators.
Conclusions (continued)

- QA and evaluations usually take into account the variety of missions (the diversity of HE) and the processes behind the indicators.
- Rankings measure the performance of an institution against a certain (ideal) model of an institution, reflected in the compilers' choice of selective indicators.
- Whilst a compiler may use objective indicators, combining these indicators is always subject to judgement and is hence subjective.
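The last point – that combining "objective" indicators through a weighting formula is itself a subjective act – can be illustrated with a minimal Python sketch. The institutions, indicators, scores and weights below are all invented for illustration; the only claim is that identical data yield a different rank order under different compiler-chosen weights.

```python
# Hypothetical indicator scores (0-100) for two invented institutions.
scores = {
    "University A": {"research": 90, "teaching": 60, "internationalisation": 70},
    "University B": {"research": 65, "teaching": 85, "internationalisation": 80},
}

def composite(inst_scores, weights):
    """Weighted sum of indicator scores -- the 'mathematical formula' a compiler chooses."""
    return sum(inst_scores[indicator] * w for indicator, w in weights.items())

# A research-heavy weighting, as in many global rankings ...
research_heavy = {"research": 0.6, "teaching": 0.2, "internationalisation": 0.2}
# ... versus a teaching-oriented one.
teaching_heavy = {"research": 0.2, "teaching": 0.6, "internationalisation": 0.2}

for weights in (research_heavy, teaching_heavy):
    ranked = sorted(scores, key=lambda inst: composite(scores[inst], weights),
                    reverse=True)
    print(ranked)  # the order of the two institutions flips with the weights
```

With the research-heavy weights University A comes first; with the teaching-heavy weights University B does. Neither order is more "correct": the choice of weights encodes the compiler's model of an ideal institution.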