

A History and Overview of the PBRF

Executive summary

This report provides:

• an overview of tertiary education and research funding prior to the establishment of the PBRF

• an outline of the development of the PBRF in the context of the early 2000s review of tertiary education

• a description of the PBRF in practice.

Prior to reforms in the 2000s, tertiary education institutions received funding based on equivalent full-time students adjusted by weighting for different course costs. This funding covered capital and operating costs, as well as tuition and research. In allocating funding, there was little attention to accountability, capacity building, and governance.

The Tertiary Education Advisory Commission was established by the Government in April 2000 to devise a long-term strategic direction for the tertiary education system. The overall aim of the strategic direction was to make New Zealand a world-leading knowledge society by providing all New Zealanders with opportunities for lifelong learning.

The TEAC concluded that there was a strong case for a greater concentration of research effort within the tertiary education system in the interests of enhancing quality and building a greater research capacity. Specifically, it recommended the introduction of a performance-based research fund for tertiary education providers.

The PBRF was established in 2002. It is intended to ensure that excellent research in the tertiary education sector is encouraged and rewarded. This entails assessing the research performance of tertiary education organisations (TEOs) and then funding them on the basis of their performance. Degree-granting TEOs are eligible to participate in the PBRF. All universities and some institutes of technology and polytechnics, wānanga, and private training establishments participate in the PBRF.


Reviewing research funding in the early 2000s

1. Prior to reforms in the 2000s, tertiary education institutions (TEIs) received funding based on equivalent full-time students (EFTS) adjusted by weighting for different course costs. This funding covered capital and operating costs, as well as tuition and research. In allocating funding, there was little attention to accountability, capacity building, and governance. This bulk funding system is also described as research top-ups.

2. A review of tertiary education in the late 1990s made recommendations to change research funding (The White Paper, 1998). The review recognised a need to hold institutions accountable for at least a portion of research funding. It recommended allocating $20 million (approximately 20% of the funding that institutions then used to support research activities) through a contestable pool for advanced research.

3. Appendix 1 provides an overview of tertiary education and research funding prior to 2000.

Reviewing tertiary education in the early 2000s

4. The Tertiary Education Advisory Commission (TEAC) was established by the Government in April 2000 to devise a long-term strategic direction for the tertiary education system. The overall aim of the strategic direction was to make New Zealand a world-leading knowledge society by providing all New Zealanders with opportunities for lifelong learning.

5. The TEAC issued a series of reports making a range of recommendations, including creating the Tertiary Education Commission (TEC), developing a tertiary strategy document, and developing a new funding formula with research and teaching separated.

6. In its initial deliberations, the TEAC concluded that there was a strong case for a greater concentration of research effort within the tertiary education system in the interests of enhancing quality and building a greater research capacity. It recommended that the Government establish a series of Centres of Research Excellence (CoREs).

7. The TEAC believed, however, that more significant changes were needed to the way research in the tertiary education system was funded.

8. In its last report, the TEAC recommended the introduction of a performance-based research fund for tertiary education providers. This led to the establishment, in July 2002, of the PBRF Working Group to advise the then Transition TEC and the Ministry of Education on the detailed design and implementation of a performance-based system for funding research in New Zealand’s degree-granting institutions. The report of the Working Group, Investing in Excellence, was delivered in late 2002, and Cabinet endorsed the report’s recommendations in December 2002.

9. Appendix 2 describes the TEAC in more detail.

Strengths and weaknesses of research funding prior to the early 2000s

10. The TEAC’s 2001 report, Shaping the Funding Framework, identified a number of strengths of the then research funding arrangements, summarised in Table 1.


Table 1: Strengths of the research funding framework as perceived by the TEAC

Compliance costs: Compliance costs associated with the allocation of EFTS subsidies were low, significantly lower (per dollar allocated) than would be the case under a PBRF.

Wide-ranging funding options: Tertiary education organisations (TEOs) had access to a range of funding sources and were not solely dependent upon public funding or a single granting agency.

TEOs had discretion over spending: Funding allocated through EFTS subsidies was delivered to TEOs as a bulk grant, giving a high degree of discretion to TEOs to easily support:
• new academic staff who were not established to compete for external funding
• disciplinary areas where little external funding was available
• new areas of research which may have struggled to attract funding
• academic staff in their role of critic and conscience of society.

Links between teaching and research: Because research funding was driven by EFTS, close links between teaching and research were reinforced, especially at the postgraduate level.

11. The TEAC also noted that the quality and productivity of research were relatively high, especially by comparison with other OECD countries, though the performance of New Zealand universities’ research output against bibliometric measures was static in the late 1990s, and New Zealand’s record of registering patents was comparatively poor.

12. The same report also listed several problems with research funding to that point, summarised in Table 2.

Table 2: Problems with the research funding framework as perceived by the TEAC

Limited research funding: By international standards, the total quantum of funds (both public and private) available for research within New Zealand’s tertiary education system was very modest.

Limited sources of research funding: By international standards, the sources of non-governmental funding were limited, because New Zealand had relatively small industrial and charitable sectors and a relative lack of large firms with the ability to support long-term research projects.

Lack of incentives for research excellence: The primary source of public funding for research was based on the level and pattern of student demand, with no weighting for the quantity or quality of research being undertaken.

Lack of public accountability for research funding: The EFTS funding system was characterised by inadequate transparency and public accountability.

Lack of concentrated research funding: The available research funding was spread thinly across all disciplines and institutions, with little scope for existing research providers to coordinate their funding allocations in the interests of a more concentrated approach.

Volatile and short-term nature of most research funding*: By international standards, most of the research funding available was relatively short term in nature, thus reducing funding certainty and predictability.

Large imbalance between disciplinary areas in allocated contestable research funding*: Only a modest proportion of the funding provided by research granting agencies was devoted to the humanities, social sciences, law and commerce, despite these areas having constituted a large proportion of research activity within the tertiary sector.

Inappropriate and inadequate subsidy rates for research training: The subsidy rates for postgraduate students were significantly below the costs of provision.

Weak incentives for private sector research funding: The funding framework provided only modest incentives for leveraging private sector contributions.

Source: Jonathan Boston, “Rationale for the Performance-Based Research Fund and its Evaluation,” Evaluating the Performance-Based Research Fund: Framing the Debate, edited by Leon Bakker, Jonathan Boston, Lesley Campbell and Roger Smyth (Wellington: Institute of Policy Studies, 2006): 13.

* These statements relate to research funding allocated outside of Vote Education, largely by Vote Research, Science and Technology.

13. The TEAC compared New Zealand’s funding systems with a number of other jurisdictions.[1] By comparison with the countries reviewed, New Zealand’s funding arrangements for the tertiary education system were found to be distinctive on a number of counts, including:

a. a relatively narrow focus on funding projects and placements, and a corresponding lack of attention to developing or supporting new organisational arrangements, such as centres of excellence, shared infrastructure funds, co-operative research centres, business incubators, post-graduate research schools, and collaborative arrangements between higher education institutions, business and the community

b. a relative lack of funding predictability and stability

c. no separate funds for supporting investments in research infrastructure

d. limited emphasis placed on quality and accountability.

14. It was also noted in the review that a growing number of countries had moved to separate (to varying extents) the funding of research from the funding of tuition, and to allocate a significant proportion of research funds in universities on the basis of various measures of research performance.

Alternative approaches for assessing research quality

15. In parallel with the TEAC’s work, the New Zealand Vice-Chancellors’ Committee (NZVCC) undertook an international survey of research funding mechanisms for higher education. The NZVCC survey informed the TEAC’s decisions.

16. The survey led the TEAC to focus on two types of performance-based research funding, each utilised in two countries:

a. the performance-indicator approach taken by the Australian Institutional Grants Scheme (IGS) and the Israeli research funding model

[1] Australia, Britain, Israel, Finland, Hong Kong, and the Netherlands.


b. the peer-review approach taken by the Research Assessment Exercise (RAE) conducted periodically in Britain since 1986 and in Hong Kong since 1993.

17. The performance-indicator approach involved using a series of quantity measures as proxies for assessing the quality of research undertaken within tertiary institutions. Three indicators measured volume of publications, funds for research from external sources, and completions of postgraduate degrees.

18. The approach was relatively low cost. However, the measures used could disadvantage certain institutions. For instance, if the available external research funds from contestable pools and private sources tended to be concentrated in a limited number of disciplinary areas (such as medical and biological sciences), certain institutions were automatically favoured (institutions with medical schools, for example), while others suffered a decline in their share of funding (institutions without medical schools, for example).

19. The peer-review approach took into account the three measures of the performance-indicator approach, though primary emphasis was placed upon the assessment of research quality, as judged by expert disciplinary panels.

20. Research quality in the RAE was assessed by discipline-specific peer-review panels at the level of each university department. Departments would submit information on their research-active staff, research outputs, research students, external research income and research infrastructure.

21. This was generally seen as providing a more accurate assessment of research quality than the performance-indicator approach. However, the peer-review approach involved significantly higher costs for researchers, TEIs, and government agencies, as information on research had to be collected and researchers’ outputs had to be individually analysed.

Developing the PBRF

TEAC recommendations

22. Weighing the advantages and disadvantages of each international approach, the TEAC determined that neither was ideal for New Zealand’s conditions. Instead, the TEAC, in conjunction with the NZVCC, developed a mixed model, adopting elements of both the performance-indicator and the peer-review approaches.

23. The mixed model the TEAC recommended involved a quality rating of academic staff (weighted at 50% of the fund), a measurement of external research income (weighted at 25% of the fund), and a measurement of post-graduate research degree completions (weighted at 25% of the fund). External research income and post-graduate research degree completions would be measured yearly, with funding determined by a two-year rolling average.
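
The arithmetic of the TEAC’s recommended mix can be illustrated with a minimal sketch in Python. The normalisation of each measure against a sector-wide total, and all figures used, are assumptions for illustration only, not part of the TEAC’s specification.

    # Illustrative sketch of the TEAC's proposed 50/25/25 mixed model for one
    # provider. The normalisation against sector-wide totals and all figures
    # are assumptions for illustration, not the TEAC's specification.

    def two_year_rolling_average(yearly_values):
        """Average the two most recent yearly measurements (ERI or RDC)."""
        return sum(yearly_values[-2:]) / 2

    def teac_mixed_model_share(quality_share, eri_by_year, rdc_by_year,
                               sector_eri_average, sector_rdc_average):
        """Return a provider's indicative share of the fund under the TEAC model."""
        eri_share = two_year_rolling_average(eri_by_year) / sector_eri_average
        rdc_share = two_year_rolling_average(rdc_by_year) / sector_rdc_average
        return 0.50 * quality_share + 0.25 * eri_share + 0.25 * rdc_share

    # Hypothetical example: a provider holding 12% of the sector's quality
    # ratings, with rising external research income and degree completions.
    share = teac_mixed_model_share(
        quality_share=0.12,
        eri_by_year=[8.0, 10.0],      # $m per year, hypothetical
        rdc_by_year=[90, 110],        # completions per year, hypothetical
        sector_eri_average=75.0,      # sector two-year averages, hypothetical
        sector_rdc_average=900,
    )
    print(f"Indicative share of the fund: {share:.3f}")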

24. Quality ratings would be determined using a system of institutional self-assessment, subject to five-yearly external audits (of a random sample of about 10% of staff), conducted by about 11 independent, multi-disciplinary assessment panels.

25. Rather than academic departments comprising the unit of assessment, as with the RAE, TEAC proposed that each individual eligible academic staff member be evaluated. The TEAC felt that a department- or disciplinary-based assessment regime would be problematic in the context of New Zealand’s small tertiary system (relative to the United Kingdom), where there are frequently only two or three departments in particular disciplinary areas.

26. Relative to the peer-review approaches undertaken in Britain and Hong Kong, the TEAC viewed its mixed model as having six advantages, including:

a. lower transaction costs as a result of: the use of self-assessment, the reliance upon random audits rather than the external peer review of every research-active academic, and the use of a relatively small number of panels

b. greater incentives for providers to secure external research income

c. a reduced risk of measurement bias by the use of three separate measures.

27. The TEAC acknowledged several drawbacks in its proposal, including increased transaction costs compared with the status quo, and the potential for the weighting given to the measure of external research income to favour universities with medical schools (which attract significant external research income). Further, the TEAC acknowledged that the proposed system for assessing the quality of the research of individual academic staff was relatively novel, and that there could be unforeseen implementation difficulties and unintended impacts on institutional cultures and human resource management.

28. The TEAC recommended that funding for the PBRF consist of the amount then distributed through research top-ups – approximately $120 million – plus at least $20 million per year of additional funding.

PBRF Working Group recommendations

29. Based on the TEAC’s advice on the development of a PBRF, a PBRF Working Group, including representatives of a wide range of stakeholders, was established in July 2002 to provide advice to the Transition TEC and Ministry of Education on the detailed design and implementation arrangements for a PBRF.

30. Members of the Working Group met and corresponded regularly from July to October 2002 to discuss options and issues, consulting with their networks in the process. In addition, Ministry of Education and Transition TEC officials consulted widely with providers, representative bodies, Māori, Pasifika, and other stakeholders. International experts were also consulted.

31. The Working Group developed the PBRF based on the TEAC’s recommendations, the objectives agreed by Cabinet, and the objectives and directions for research set in the Tertiary Education Strategy 2002-2007.

32. The Working Group recommended external peer review to evaluate the quality of researchers, rather than self-assessment audited by external panels. It was determined that this would build the credibility of the PBRF and develop widespread understanding of the research quality standards.

33. The Working Group recommended a two-step quality evaluation process. First, eligible staff would be internally reviewed by their provider and provisionally placed in a quality category, in accordance with the generic descriptors. Second, the relevant external peer-review panel would either confirm the provisional category or classify the individual in a new category. Staff deemed under internal review to have produced research at a level insufficient for recognition by the PBRF would not be reassessed by the external peer-review panel.


34. This two-step process was designed to:

a. reduce the workload of external peer-review panels by introducing an element of self-evaluation

b. build expertise within the tertiary sector about the quality evaluation process

c. build common understandings of the relative levels of quality

d. enhance opportunities for the quality evaluation process to be linked to providers’ internal staff development systems.

35. The Working Group thought that once expertise and common understandings reached robust and reliable levels, more weight could be placed upon the categories nominated by providers, and external peer-review panels could play an audit role, rather than directly assessing all research outputs, thereby reducing compliance costs.

36. Further, the Working Group designed the peer-review measure to be more comprehensive than the TEAC’s recommendations. The result was an integrated assessment of individuals’ contributions by way of research output, esteem by peers, and contribution to the wider research environment and research training, as opposed to the narrower scrutiny of research outputs envisaged by the TEAC.

37. To increase the importance of quality assessments, the Working Group proposed that a slightly higher proportion of funding be allocated by peer assessment (60%), and a correspondingly lower proportion of funding be allocated in the external research income measure (15%).

38. The Working Group anticipated potential effects of the PBRF, identifying and exploring the following concerns, among others, during the consultation process:

a. impacts on new and emerging researchers

b. impacts on incentives to engage in new and/or “risky” research

c. impacts on Māori and Pacific research

d. impacts on teaching.

The PBRF in practice

39. In sum, the PBRF is intended to ensure that excellent research in the tertiary education sector is encouraged and rewarded. This entails assessing the research performance of research staff at TEOs and then funding the TEOs on the basis of their performance. New Zealand-based, degree-granting TEOs are eligible to participate in the PBRF, as are all subsidiaries that are wholly-owned by a New Zealand-based degree-granting TEO. All universities and some institutes of technology and polytechnics (ITPs), wānanga, and private training establishments (PTEs) participate in the PBRF.

40. The key principles underpinning the participation of a TEO in the PBRF are:

a. the TEO has the authority to grant degrees

b. participation in the PBRF is voluntary


c. TEOs that participate in the PBRF must do so in all three measures even if their funding entitlement in one or more of the measures is zero, or likely to be zero

d. if a PBRF-eligible TEO does not participate in a Quality Evaluation round, then it is unable to make claims for the Research Degree Completion and External Research Income measures until the next Quality Evaluation.

41. Appendix 3 provides the PBRF’s Definition of Research.

42. The PBRF funding formula is based on three indicators (described in more detail in Appendix 4), which together assess both quantity and quality of research (an illustrative sketch of how the three components combine follows the list):

a. Quality Evaluation (QE): the assessment of the research quality of TEO staff, based largely on peer review of a researcher’s Evidence Portfolio (EP) of research outputs, accounting for 60% of the fund

b. Research Degree Completions (RDC): the number of postgraduate research-based degrees completed in the TEO, accounting for 25% of the fund

c. External Research Income (ERI): the amount of income for research purposes received by the TEO from external sources, accounting for 15% of the fund.
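
As noted above, the following minimal Python sketch shows how the three weighted components could combine into a TEO’s allocation, assuming each component is first expressed as the TEO’s share of the sector-wide total for that measure; the shares and the fund size used are hypothetical.

    # Minimal sketch of the 60/25/15 weighting of the three PBRF components.
    # It assumes each component has already been expressed as the TEO's share
    # of the sector-wide total for that measure; all figures are hypothetical.

    QE_WEIGHT, RDC_WEIGHT, ERI_WEIGHT = 0.60, 0.25, 0.15

    def pbrf_allocation(fund_total, qe_share, rdc_share, eri_share):
        """Return a TEO's indicative PBRF allocation (same units as fund_total)."""
        return fund_total * (QE_WEIGHT * qe_share
                             + RDC_WEIGHT * rdc_share
                             + ERI_WEIGHT * eri_share)

    # Hypothetical TEO: 15% of quality-weighted staff, 18% of research degree
    # completions, 20% of external research income, against a $250m fund.
    print(pbrf_allocation(250.0, qe_share=0.15, rdc_share=0.18, eri_share=0.20))
    # 250 * (0.60*0.15 + 0.25*0.18 + 0.15*0.20) = 250 * 0.165 = 41.25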

PBRF funding appropriations

43. In the four-year period from 2003/04 to 2006/07, the PBRF was phased in so that a decreasing portion of research funding was allocated as EFTS-based research top-ups, and an increasing portion was allocated through the PBRF. The EFTS-based research top-ups were uncapped. From 2007/08, no funding was allocated through EFTS-based research top-ups.

44. The PBRF has been capped since 2007/08 and can only increase through government budget decisions. The PBRF budget comes from Vote Tertiary Education. Table 3 shows funding for the PBRF from 2003/04 through 2016/17 in nominal and real terms.

45. Graph 1 below shows that PBRF funding in real terms has decreased slightly since 2009/10 but is expected to increase from 2013/14.

Table 3: Funding (in millions, GST exclusive) allocated per year through the PBRF

Year             2003/04   2004/05   2005/06   2006/07   2007/08   2008/09   2009/10
Nominal          $115.104  $134.012  $165.196  $200.376  $220.231  $236.114  $244.294
Real ($2003/04)  $115.104  $130.302  $154.476  $183.699  $194.100  $204.248  $207.863

Year             2010/11   2011/12   2012/13   2013/14   2014/15   2015/16   2016/17
Nominal          $250.000  $250.000  $256.250  $268.750  $281.250  $293.750  $300.000
Real ($2003/04)  $206.165  $204.223  $203.975  $209.283  $214.052  $218.497  $217.916

Notes:

Figures for 2003/04 to 2006/07 represent EFTS-based research top-ups plus funding allocated through the PBRF.


The June quarter CPI has been used to calculate funding in real terms. CPI forecasts from Budget Economic and Fiscal Update (BEFU) 2012 have been used to deflate the funding for the 2012/13 to 2016/17 years.

Graph 1: funding (in millions, GST exclusive) allocated per year through the PBRF
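
The real-terms conversion described in the notes to Table 3 is a straightforward deflation by the June quarter CPI; the Python sketch below illustrates it with placeholder index numbers rather than actual Statistics New Zealand CPI data.

    # Sketch of the deflation used for the "Real ($2003/04)" row of Table 3:
    # nominal funding scaled by the ratio of the 2003/04 June quarter CPI to
    # the relevant year's June quarter CPI. Index values are placeholders.

    def to_real_2003_04(nominal, cpi_june_year, cpi_june_2003_04):
        """Express a nominal amount in constant 2003/04 dollars."""
        return nominal * cpi_june_2003_04 / cpi_june_year

    # Placeholder example: a $250.000m nominal appropriation deflated by a
    # hypothetical index of 1091 against a 2003/04 base index of 900.
    print(round(to_real_2003_04(250.000, cpi_june_year=1091,
                                cpi_june_2003_04=900), 3))   # about 206.233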

46. At $250 million in 2011/12, the PBRF is the largest single fund by which TEOs receive funding from the Government to support research. The Government also funds research through a number of other mechanisms, many of which benefit TEOs. These include the Centres of Research Excellence and Vote Science and Innovation, which are summarised in Appendix 5.

Objectives

47. The main aims of the PBRF as agreed by Government in 2002 are to:

a. increase the average quality of research

b. ensure that research continues to support degree and postgraduate teaching

c. ensure that funding is available for postgraduate students and new researchers

d. improve the quality of public information on research outputs

e. prevent undue concentration of funding that would undermine research support for all degrees or prevent access to the system by new researchers

f. underpin the research strength in the tertiary education sector.

48. The prime focus of the PBRF is to reward and encourage excellence. Excellence is not just about the production of high quality research articles, books and other forms of research output, but also includes:

a. the production and creation of leading-edge knowledge

b. the application of that knowledge

c. the dissemination of that knowledge to students and the wider community


d. supporting current and potential researchers (e.g. postgraduate students) in the creation, application and dissemination of knowledge.

49. The PBRF aims towards cultural inclusiveness, and does so in part through a series of mechanisms to support Māori and Pacific researchers and research.

50. The Māori Knowledge and Development Panel evaluates research into distinctly Māori matters, such as aspects of Māori development, te reo Māori, and tikanga Māori. The panel also provides advice on research that has a significant Māori component but is assessed by other panels.

51. An esteemed group of Pacific researchers helps to define excellence in Pacific research and develops guidance for the peer review panels and specialist advisors on Pacific research. (As discussed at paragraph 66, a Pacific Research Expert Advisory Group has since been established.)

52. Growth in Māori and Pacific research capability is encouraged through a higher equity weighting for research degree completions by Māori and Pacific students – two to one compared to all other ethnicities.
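
As a small illustration of the equity weighting described above, the Python sketch below doubles the count of completions by Māori and Pacific students; any other weightings that apply to the research degree completions measure are outside this sketch.

    # Sketch of the 2:1 equity weighting for research degree completions by
    # Māori and Pacific students. Other weightings that may apply to the RDC
    # measure are not modelled here.

    def equity_weighted_completions(maori_pacific, other):
        """Return the equity-weighted research degree completion count."""
        return 2 * maori_pacific + 1 * other

    print(equity_weighted_completions(maori_pacific=25, other=200))   # 250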

Reporting

53. A reporting framework was built into the PBRF to ensure public access to a wide range of information relating to the research performance and activities of participating TEOs. This was expected to enhance accountability and to improve the capacity of relevant stakeholders in the tertiary education sector to make informed decisions.

54. At the conclusion of each QE round, a major report on the overall results is prepared by the TEC and publicly released. The report includes a brief summary of the QE process, a commentary on the major findings and a detailed description of the results. The results of each QE round are reported at the following levels: each participating TEO, each peer review panel, each subject area, and each academic unit nominated by participating TEOs.

55. Quality categories assigned to individual staff members are reported only to the staff member’s TEO and are not publicly released. TEOs share quality categories with staff members on request, and staff members can also request them from the TEC, including a detailed breakdown of the component scores for the different elements of their EP.

56. The TEC publicly reports the annual funding allocated to each participating TEO via the PBRF, including information on the funding generated by each of the three performance measures. The TEC also publishes each year the most recent information concerning the number of research degrees completed in each TEO and the level of PBRF eligible external research income generated by each TEO. PBRF funding allocations by institution from 2004 to 2011 are provided in Appendix 6.

57. The Ministry of Education considers research performance in its regular monitoring of the sector, including analysing the performance of the PBRF in its annual report on the inputs, outcomes, and outputs of the Government’s tertiary education expenditure (What we get for what we spend). The Ministry has also produced reports on topics including research funding shifts since the PBRF was introduced, the impact of the PBRF on the retention of doctoral students, and the impact of the PBRF on research productivity at universities.


Reviews of the PBRF

58. There have been three rounds of QE so far – in 2003, 2006 (partial round)[2], and 2012. In establishing the fund, Cabinet agreed that the PBRF be reviewed in 2005, in advance of the 2006 QE round. Soon after, a more comprehensive evaluation strategy was established with evaluation occurring over three phases:

a. The first was to be completed by April 2004 and was to include an evaluation of the implementation process, paying particular attention to the first QE round in 2003, and the short-term impacts of the PBRF on the tertiary education sector.

b. The second was to be completed by 2005 and was to provide a more detailed review and evaluation of the wider impacts of the PBRF on the tertiary education sector. This review actually took place in 2008.

c. The third was initially set to take place between the 2006 and 2012 QE rounds, but was later planned to occur after the 2012 QE. This phase was to determine whether the PBRF achieved its stated objectives and whether the overall benefits exceeded the costs. Note that the current review is beginning simultaneously with the 2012 QE round.

59. Results of the first phase evaluation were published in July 2004 and found no grounds for a fundamental redesign of the PBRF or of the QE, for three reasons:

a. there was no evidence of major design failure reported or observed in the evaluation

b. while participants identified problems with design elements and aspects of the implementation, these were seen as concerns that could be remedied, rather than immediate or fatal threats

c. it was too soon to observe or evaluate the impact of the PBRF upon TEOs and individuals.

60. The evaluation recommended a number of adjustments around staff participation, timing of the 2006 QE, administrative costs, TEO self-assessment, peer-review panels, the definition of research, EP assessment guidelines, scoring systems, and information gathering and reporting.

61. Results of the second phase evaluation were published in June 2008[3] and showed that the PBRF had been effective in delivering beneficial outcomes in financial, reputational and formative terms. The evaluation also raised issues regarding the relative status of different kinds of research, the individual as a unit of assessment, and a narrowing of types of research outputs.

[2] Researchers did not have to participate in the 2006 round. The Working Group had recommended that a QE three years after the first would help to ensure a managed transition, to develop good practices in performance evaluation, and to acknowledge the need to learn from experience after the first QE. In reviewing the 2003 QE, it was determined that a full round was not necessary. Instead, the partial round assessed staff who had not been assessed in 2003, staff who wished to be reported under a different subject area, and staff who wished to be reassessed in the hopes of achieving a higher quality rating.

[3] Jonathan Adams, Strategic Review of the Performance-Based Research Fund: The Assessment Process (June 2008).


62. In addition to the planned reviews, all PBRF participants (panellists, moderators, etc.) had opportunities after each QE to provide feedback on the process, which fed into both the TEC’s evaluations of the QEs and the subsequent redesign and implementation processes.

Changes to the PBRF from 2003 to today

63. The PBRF has been altered in a number of ways since its introduction, resulting from information gained through reviews and from issues arising between reviews.

64. The refinement of the PBRF in preparation for the 2006 QE resulted in a number of changes to the QE measure. The most significant of these changes included:

a. the “partial” round provision (quality categories assigned to the EPs of most staff assessed as part of the 2003 QE were carried over to the 2006 QE)

b. changes to the definition of research (clarifying what constitutes research in the creative and performing arts)

c. a specific assessment pathway for new and emerging researchers (recognising that new and emerging researchers were unlikely to have had an opportunity to develop extensive evidence of peer esteem or contribution to the research environment).

65. The refinement of the PBRF in preparation for the 2012 QE resulted in minimal changes. Consultation conducted by a Sector Reference Group determined that there was a strong sector preference for continuity between QEs: the sector considered that the QE was not substantially flawed and that it had built procedures on the assumption that no major change would occur.

66. However, several changes were made, including forming two Expert Advisory Groups (EAGs) – the Professional and Applied Research EAG and the Pacific Research EAG – to ensure that these two types of research receive appropriate assessment.

67. The Professional and Applied Research EAG, in particular, was recommended in the process of reviewing the PBRF in 2008. The review, in part, investigated concerns that the PBRF was biased against professional, practice-based and applied research, thereby not upholding its principle of comprehensiveness. The EAG was established to help uphold comprehensiveness, and has four sub-groups: Commercial, Professional Practice, Social and Environmental. Addressing the same issue, the Definition of Research was updated for the 2012 QE to clarify how it reflects the principle of comprehensiveness (see Appendix 3).

68. There have been ongoing concerns with the application of the staff eligibility rules by institutions as part of their participation in the staff census requirement of the QE. An audit of institutions’ preparedness for the 2012 QE found that there was the potential for different human resource practices applied by TEOs, along with differences in the application of the staff eligibility criteria, to affect institutions’ Average Quality Score (AQS).


Appendix 1: An overview of tertiary education and research funding prior to 2000

To 1990

1. Prior to the late 1980s, New Zealand had what has been described as an “elite” system of tertiary education, in which relatively few students were subsidised at relatively high rates. There was clear differentiation between providers. Universities were funded through the University Grants Committee, which negotiated funding levels with the government based on EFTS, with a separate capital fund. Polytechnics and colleges of education were funded directly by the Department of Education based on staffing levels and requests for equipment and capital expenditures.

2. Along with many other countries, New Zealand shifted over the 1980s, through what became known as the “learning for life” reforms, to a “mass” system of tertiary education, in which a larger number of students were subsidised at a lower rate per student. The shift to broader participation in tertiary education was largely a result of tertiary education becoming increasingly necessary due to changing labour market conditions and the need for a more highly skilled and flexible labour force.

3. In 1988, the then Government commissioned reviews covering the entire education system, resulting in the Education Act 1989 (with Part 15, relating to tertiary education, inserted via the Education Amendment Act 1990). Various topical reviews of post-school education and training culminated in the 1988 “Report of the Working Group on Post Compulsory Education and Training”, known as “the Hawke report”. The Hawke report recommended, among other things, decentralised decision-making, more effective use of resources, enhanced accountability, a centralised policy and implementation structure, and a national educational qualifications system.

4. Regarding research, the report recommended a distinct system for funding scholarship and research. It recommended that reforms to the post-compulsory education and training system include the creation of a “Public Scholarship and Research Agency” to disburse research funding within the sector.

Early 1990s Reforms

5. Stemming from the recommendations in the Hawke report, a new bulk funding system for institutions was introduced around 1990, applying the same rules to all public tertiary institutions. Institutions were given more decision-making authority over how they spent the funds they received. Institutions received a certain amount per EFTS adjusted by weighting for different course costs. This amount covered capital and operating costs, as well as tuition and research.

6. Reforms in the early 1990s removed enrolment caps and moved towards a more competitive, market-based approach for tertiary education. Private contributions to tuition fees increased, institutions had more autonomy over setting their own fees, and income-contingent student loans were established. Polytechnics expanded their offerings, enrolments in PTEs increased, and wānanga were established as public institutions.

7. The reforms encouraged significant change, especially in terms of increased enrolments, but technical assistance to develop stronger management and financial capabilities among TEIs was missing, as were accountability measures to encourage successful change.

Late 1990s Reviews

8. A further review of tertiary education in 1997 and 1998 resulted in a series of recommendations outlined in the 1998 White Paper, including more and better information for students, providers and Government; improved accountability, governance, quality assurance and audit; and changes in research funding. Regarding research funding, the White Paper’s recommendations sought to put in place appropriate policies to assure research quality and accountability, while keeping compliance costs and administrative complexity to a minimum.

9. The White Paper analysis of expenditure patterns showed that providers allocated about $100 million of total tuition subsidies towards research activities. Rather than allocating all of this research funding from Vote Education via subsidies per EFTS, the White Paper recommended allocating $80 million via subsidies per EFTS and the remaining $20 million through a contestable pool for advanced research. It was also recommended that the proportion change over three to five years, such that the end result was the opposite: $20 million would be allocated via subsidies per EFTS, and $80 million through the contestable pool. Subsidies per EFTS would be research-contingent – that is, institutions would be eligible for subsidies based on the quantity and quality of research they produced.

10. Recommendations around research funding, among others, were deemed inadequate: the pool of contestable funds was too small to be cost effective, given the administrative and compliance costs required to support it.

11. After the 1999 election, the incoming Government’s policy towards tertiary education shifted from a hands-off approach to an approach emphasising lower costs of student borrowing and lower fees as well as more coordination, strategic direction and central steering. This reflected a change from market-led growth to an emphasis on meeting national objectives.

12. The incoming Government focused on reducing costs for students by eliminating interest on student loans and by incentivising institutions not to raise tuition fees. Regarding broader strategic issues, the Government commissioned the review undertaken by the Tertiary Education Advisory Commission (TEAC), discussed throughout this paper.


Appendix 2: The Tertiary Education Advisory Commission (TEAC)

1. The TEAC provided advice to the Minister on the strategic direction of the tertiary sector, and carried out reviews as agreed with the Minister. It was formed in April 2000 to investigate and provide advice on five issues:

a. the shape of the tertiary system

b. the promotion of collaboration and cooperation

c. relevant courses of study and learning opportunities

d. research in tertiary education, and in particular the principles which should underpin its funding

e. the principles which should underpin funding within the tertiary education system generally.

2. From these themes, the TEAC developed a lengthy list of the tertiary sector’s relative positions, problems and opportunities, including, for example, that a lack of good performance data prevents openness, that unequal access and outcomes prevent equitable and affordable access, and that a lack of strategic planning undermines sustainability.

3. The TEAC gave detailed attention to the question of how research in the tertiary sector should be funded. It consulted extensively with the New Zealand Vice-Chancellors’ Committee and other relevant stakeholders.

4. Specific to research and related issues, the TEAC identified a number of problems with the tertiary sector, including, among others, a lack of stable, long-term resourcing for research-intensive tertiary education providers; the vulnerability of important research areas to volatile learner demand; and the limited range of funding sources for research, including in the humanities and social sciences.

5. Similarly, the TEAC identified a number of challenges and opportunities for the tertiary sector regarding research, including, among others, developing durable and predictable ways of resourcing research and research training within the system; maintaining a broad and evolving research capability while also nurturing centres or networks of excellence; and building a vibrant research environment capable of attracting and retaining excellent researchers.

6. The TEAC produced a series of reports:

a. Shaping a Shared Vision (July 2000) outlined the terms of the review, recommended new directions for tertiary education, and argued that the challenge of ensuring access to lifelong learning in a knowledge society would require new ways of organising, delivering, and recognising tertiary education and learning.

b. Shaping the System (February 2001) described system-wide structures and instruments that could bring about change, and recommended that the tertiary system required mechanisms, policy instruments, and structures that would allow for more effective engagement and steering of the system by the Government and stakeholders.

c. Shaping the Strategy (July 2001) outlined strategies and priorities for the tertiary education system.

d. Shaping the Funding Framework (November 2001) established a funding framework for the tertiary education system.

e. Shaping the Funding Framework: Summary Report (November 2001).

7. The second report addressed broad concerns about research, indicating needs for:

a. greater specialisation and concentration of research activity within the tertiary education system

b. increased collaboration and cooperation in research across the system, as well as improved linkages between centres and networks of research excellence and other parts of the tertiary education system and with those outside it, so as to bring about knowledge transfer and application

c. supporting research-led teaching as a prerequisite for degree and post-graduate teaching programmes

d. system-wide involvement in knowledge production and dissemination of research to some extent, even if not all parts of the system have a role at the cutting edge

e. universities to remain the primary providers of post-graduate education

f. more emphasis upon the development of a research workforce, including that of Māori and Pasifika

g. greater investment in research infrastructure.

8. The third and fourth reports addressed funding. The TEAC determined that the then funding system, along with regulatory policies, was a barrier to the goals of higher quality, greater capacity, and better industry linkage. It noted that overseas jurisdictions had moved to allocate funding based on specified quality criteria and had established accountability mechanisms.

9. In shaping a new funding framework, the TEAC’s key proposals were:

a. to develop a new unified and coherent funding framework involving the following measures:

i. the use of charters and profiles in conjunction with quality and desirability tests to steer the system

ii. the development of a new single funding formula

iii. a comprehensive cost and funding category review

iv. a separate fund for financing adult and community education


b. to separate much of the funding of tuition and research, and to re-allocate the funding for research by means of a performance-based assessment system

c. to introduce a series of funds to support research:

i. a performance-based research fund

ii. two funds to support centres/networks of research excellence

d. to introduce a strategic development fund to assist in system innovation and management, to be partly funded by the discontinuation of the base grants for TEIs

e. to develop mechanisms to improve the quality, efficiency, and effectiveness of the system

f. to implement specific measures to support the learning of Māori and Pasifika

g. to review policies surrounding student financial support.

10. The TEAC’s recommendation to introduce a performance-based research fund for tertiary education providers led to the establishment, in July 2002, of the PBRF Working Group to advise the then Transition TEC and the Ministry of Education on the detailed design and implementation of a performance-based system for funding research in New Zealand’s degree-granting institutions.


Appendix 3: The PBRF’s Definition of Research

Definition

1. For the purposes of the PBRF, research is original investigation undertaken in order to contribute to knowledge and understanding and, in the case of some disciplines, cultural innovation or aesthetic refinement.

2. It typically involves enquiry of an experimental or critical nature driven by hypotheses or intellectual positions capable of rigorous assessment by experts in a given discipline.

3. It is an independent[4], creative, cumulative and often long-term activity conducted by people with specialist knowledge about the theories, methods and information concerning their field of enquiry. Its findings must be open to scrutiny and formal evaluation by others in the field, and this may be achieved through publication or public presentation.

4. In some disciplines, the investigation and its results may be embodied in the form of artistic works, designs or performances.

5. Research includes contribution to the intellectual infrastructure of subjects and disciplines (e.g. dictionaries and scholarly editions). It also includes the experimental development of design or construction solutions, as well as investigation that leads to new or substantially improved materials, devices, products or processes.

Excluded activities

6. The following activities are excluded from the Definition of Research except where they are used primarily for the support, or as part, of research and experimental development activities:

a. preparation for teaching

b. the provision of advice or opinion, except where it is consistent with the PBRF’s Definition of Research

c. scientific and technical information services

d. general purpose or routine data-collection

e. standardisation and routine testing (but not including standards development)

f. feasibility studies (except into research and experimental development projects)

g. specialised routine medical care

h. the commercial, legal and administrative aspects of patenting, copyrighting or licensing activities

[4] The term ‘independent’ here should not be construed to exclude collaborative work.


i. routine computer programming, systems work or software maintenance (but note that research into and experimental development of, for example, applications software, new programming languages and new operating systems is included)

j. any other routine professional practice (e.g. in arts, law, architecture or business) that does not comply with the Definition.[5]

Professional and Applied Research[6]

7. The definition of research given above is specifically intended to be a broad characterisation that includes original investigation of a professional and applied nature.

8. The PBRF Quality Evaluation explicitly recognises that high-quality research is not restricted to theoretical inquiry alone but occurs across the full spectrum of original investigative activity.

9. With this in mind, the 2012 QE will introduce a Professional and Applied Research expert advisory group to be consulted by peer review panels for assistance in the assessment of EPs containing Research Outputs (ROs) of a professional and/or applied nature.

10. In addition, the panel specific guidelines for each peer review panel will contain information to assist the Evidence Portfolio preparation of researchers working in applied fields.

[5] Clinical trials, evaluations and similar activities will be included, where they are consistent with the definition of research.

[6] This section was added to the Definition of Research for the 2012 QE.


Appendix 4: The PBRF’s measures

Quality Evaluation

1. The assessment of research quality – Quality Evaluation (QE) – is undertaken by interdisciplinary peer review panels consisting of disciplinary experts from both within New Zealand and overseas. These panels provide expert coverage of the subject areas within each panel’s respective field of responsibility.

2. Each individual researcher presents their research in the form of an evidence portfolio (EP). The EP has three components:

a. research outputs: the outputs of a staff member’s research (each staff member nominates up to four of their best research outputs for primary consideration by the panel, and up to 30 other research outputs (ORO))

b. peer esteem: an indication of the quality of the research of the staff member, as recognised by their peers in the form of fellowships, prizes, awards, memberships of learned societies, participation in editorial boards, invitations to present at conferences, favourable reviews, etc. (each staff member determines their top 30 examples, providing a list and details to the peer review panel)

c. contribution to the research environment: the staff member’s contribution to a vital, high-quality research environment, both within the TEO and beyond it, as evidenced by membership in research consortia, generation of external research income, supervision of student research, etc. (each staff member determines their top 30 examples, providing a list and details to the peer review panel).

3. In assessing the EP, the scores assigned to each component are weighted to calculate a weighted total score, which corresponds to a quality category. There are six quality categories:

a. Quality Category “A”: For an EP to be assigned an “A” it would normally be expected that the staff member has, during the assessment period in question, produced research outputs of a world-class standard, established a high level of peer recognition and esteem within the relevant subject area of their research, and made a significant contribution to the New Zealand and/or international research environments.

b. Quality Category “B”: For an EP to be assigned a “B” it would normally be expected that the staff member has, during the assessment period in question, produced research outputs of a high quality, acquired recognition by peers for their research at least at a national level, and made a contribution to the research environment beyond their institution and/or a significant contribution within their institution.

c. Quality Category “C”: For an EP to be assigned a “C” it would normally be expected that the staff member has, during the assessment period in question, produced a reasonable quantity of quality-assured research outputs, acquired some peer recognition for their research, and made a contribution to the research environment within their institution. (This Quality Category is available for the EPs of all PBRF-eligible staff members except new and emerging researchers.)


d. Quality Category “C(NE)”: For an EP to be assigned a “C(NE)” a new or emerging researcher would normally be expected, during the assessment period in question, to have produced a reasonable platform of research, as evidenced by having: either (a) completed their doctorate or equivalent qualification and produced at least two quality-assured research outputs, or (b) produced research outputs equivalent to a doctorate and at least two quality-assured research outputs. (This Quality Category is available for the EPs of new and emerging researchers only.)

e. Quality Category “R”: An EP will be assigned an “R” when it does not demonstrate the quality standard required for a “C” Quality Category or higher. (This Quality Category is available for the EPs of all PBRF-eligible staff members except new and emerging researchers.)

f. Quality Category “R(NE)”: An EP will be assigned an “R(NE)” when it does not demonstrate the quality standard required for a “C(NE)” Quality Category or higher. (This Quality Category is available for the EPs of new and emerging researchers only.)

4. EPs are evaluated through a rigorous, collaborative process. Each EP is assigned to a primary and a secondary panellist, who independently assess the EP and then agree an initial score together. This score is then discussed at the panel meeting and a final score is decided. All the scores are then moderated within each panel and between panels.
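
The scoring arithmetic in paragraphs 3 and 4 can be sketched in Python as follows. The component weights, the scoring scale, and the score bands that map a weighted total score to a quality category are set out in the TEC’s guidelines and are not reproduced in this report; the values used below are placeholders only.

    # Sketch of mapping panel scores for the three EP components to a quality
    # category via a weighted total score. Component weights, the scoring
    # scale and the category bands are placeholders, not the TEC's settings.

    COMPONENT_WEIGHTS = {
        "research_outputs": 0.70,        # placeholder weight
        "peer_esteem": 0.15,             # placeholder weight
        "research_environment": 0.15,    # placeholder weight
    }

    PLACEHOLDER_BANDS = [(6.0, "A"), (4.0, "B"), (2.0, "C"), (0.0, "R")]

    def quality_category(component_scores, new_and_emerging=False):
        """Weight the component scores and map the total to a category."""
        total = sum(COMPONENT_WEIGHTS[name] * score
                    for name, score in component_scores.items())
        for threshold, category in PLACEHOLDER_BANDS:
            if total >= threshold:
                # New and emerging researchers use the C(NE)/R(NE) categories.
                if new_and_emerging and category in ("C", "R"):
                    return category + "(NE)"
                return category

    # Placeholder scores on a 0-7 scale for one EP.
    scores = {"research_outputs": 5, "peer_esteem": 4, "research_environment": 3}
    print(quality_category(scores))   # weighted total 4.55 -> "B"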

5. Funding in relation to the QE is based on the following factors (an illustrative sketch of how they combine follows the list):

a. the quality categories assigned to EPs

b. the funding weighting for the subject area to which EPs have been assigned

c. the full-time-equivalent (FTE) status of the participating TEO’s PBRF-eligible staff as at the date of the PBRF Census.[7]
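
As flagged above, the Python sketch below shows one way the three factors could combine into weighted funding units for a TEO; the numeric category and subject-area weights are placeholders rather than the TEC’s actual settings, and a TEO’s QE funding would then reflect its share of sector-wide units.

    # Sketch of combining quality categories, subject-area funding weightings
    # and FTE status into weighted funding units for a TEO. The numeric
    # weights below are placeholders, not the TEC's actual settings.

    CATEGORY_WEIGHT = {"A": 5, "B": 3, "C": 1, "C(NE)": 1, "R": 0, "R(NE)": 0}
    SUBJECT_WEIGHT = {"low_cost": 1.0, "medium_cost": 2.0, "high_cost": 2.5}

    def qe_funding_units(staff):
        """Sum weighted units over (category, subject cost band, FTE) tuples."""
        return sum(CATEGORY_WEIGHT[category] * SUBJECT_WEIGHT[band] * fte
                   for category, band, fte in staff)

    teo_staff = [("A", "high_cost", 1.0), ("B", "low_cost", 0.5), ("R", "low_cost", 1.0)]
    print(qe_funding_units(teo_staff))   # 5*2.5*1.0 + 3*1.0*0.5 + 0 = 14.0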

6. QEs are conducted every six years. However, given the need for a managed transition, the second QE round took place three years after the first, but was a partial round. Thus, QEs have taken place in 2003 and 2006 (partial). The third QE is taking place this year (2012).

7. Table 4 shows the twelve peer review panels that assess EPs and the subject areas that each panel is responsible for assessing.

8. There are two key principles underpinning the eligibility of a TEO’s staff member to participate in a QE:

a. The individual is expected to contribute to the learning environment at the degree level.

AND/OR

b. The individual is expected to make a sufficiently substantive contribution to research activity.

[7] Funding is generated in proportion to FTE status of eligible staff. This calculation takes into account such things as staff who are concurrently employed at more than one institution and staff who changed their employment status in the year leading up to the QE.

Table 4: PBRF EP assessment panels and subject areas

PANEL | SUBJECT AREA
Biological Sciences | Agriculture and other applied biological sciences; Ecology, evolution and behaviour; Molecular, cellular and whole organism biology
Business and Economics | Accounting and finance; Economics; Management, human resources, industrial relations, international business and other business; Marketing and tourism
Creative and Performing Arts | Design; Music, literary arts and other arts; Theatre and dance, film and television and multimedia; Visual arts and crafts
Education | Education
Engineering, Technology and Architecture | Architecture, design, planning, surveying; Engineering and technology
Health | Dentistry; Nursing; Other health studies (including rehabilitation therapies); Pharmacy; Sport and exercise science; Veterinary studies and large animal science
Humanities and Law | English language and literature; Foreign languages and linguistics; History, history of art, classics and curatorial studies; Law; Philosophy; Religious studies and theology
Māori Knowledge and Development | Māori knowledge and development
Mathematical and Information Sciences and Technology | Computer science, information technology, information sciences; Pure and applied mathematics; Statistics
Medicine and Public Health | Biomedical; Clinical medicine; Public health
Physical Sciences | Chemistry; Earth sciences; Physics
Social Sciences and Other Cultural/Social Studies | Anthropology and archaeology; Communications, journalism and media studies; Human geography; Political science, international relations and public policy; Psychology; Sociology, social policy, social work, criminology and gender studies

9. Other elements underpinning the staff-participation criteria are:

a. The staff member has an explicit requirement to teach and/or undertake research as one of their employment functions, as at the date of the PBRF Census.

b. A sufficiently substantive contribution is determined by applying the substantiveness test.

c. The full-time equivalent (FTE) value counted in the QE for each PBRF-eligible staff member is generally that contained in their employment agreement.

d. Employment history in the 12-month period prior to the PBRF Census date is to be apportioned on an FTE basis to ensure fair representation of staff time, and to minimise “poaching”.

e. Staff employed in wholly owned subsidiaries and fully controlled trusts of the TEO are PBRF-eligible, since these bodies operate under the control of the participating TEO.

f. Provision has been made to allow staff members based overseas, and staff members sub-contracted to TEOs by non-TEOs, to be PBRF-eligible under certain conditions.

Research Degree Completions

10. Research degree completions (RDC) is a measure of the number of research-based postgraduate degrees (e.g. master's degrees and doctorates) that are completed within a TEO and that meet the following criteria (a simple eligibility check is sketched after this list):

a. The degree has an externally assessed research component of 0.75 EFTS value or more.

b. The student who has completed the degree has met all compulsory academic requirements by the end of the relevant year (the year preceding the return).

c. The student has successfully completed the course.
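
As a minimal illustration, the three criteria above can be expressed as a single check; the field names used here are hypothetical and not drawn from any official data specification.

    def rdc_eligible(completion):
        # Apply the three RDC criteria listed above; field names are hypothetical.
        return (
            completion["externally_assessed_research_efts"] >= 0.75  # criterion (a)
            and completion["requirements_met_by_year_end"]           # criterion (b)
            and completion["course_completed"]                       # criterion (c)
        )

    # Hypothetical example: a doctorate with a 1.0 EFTS externally assessed research component.
    print(rdc_eligible({
        "externally_assessed_research_efts": 1.0,
        "requirements_met_by_year_end": True,
        "course_completed": True,
    }))  # True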

11. The use of RDC as a performance measure in the PBRF serves two key purposes:

a. It captures, at least to some degree, the connection between staff research and research training – thus providing some assurance of the future capability of tertiary education research.

b. It provides a proxy for research quality. The underlying assumption is that students who choose to undertake lengthy, expensive and advanced degrees (especially doctorates) tend to search out departments and supervisors who have reputations in the relevant fields for high-quality research and research training.

12. Within the RDC component of the PBRF, a funding allocation ratio calculated on a rolling average basis determines the amount allocated to each TEO annually. For example, in 2009, the funding allocation ratio for each TEO was 15 percent of its RDC figure for 2005, 35 percent of its RDC figure for 2006, and 50 percent of its RDC figure for 2007.

13. The funding formula for the RDC component includes weightings for the following factors (the sketch after this list illustrates the calculation, together with the rolling average described in paragraph 12):

a. the funding category of the subject area (a cost weighting; the same as applies in the QE part of the PBRF; the funding categories are also the same as in the Student Achievement Component (SAC) funding)

b. Māori and Pacific student completions (an equity weighting, aimed at encouraging TEOs to enrol and support Māori and Pacific students, who are under-represented at higher levels of the qualifications framework)

c. the volume of research in the degree programme (a research-component weighting using a volume of research factor (VRF) to represent the amount of research associated with the qualification completed).
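
The following minimal Python sketch combines the rolling-average allocation ratio from paragraph 12 with the weightings listed above. The 15/35/50 percent year weights follow the 2009 example given in paragraph 12; the cost weightings, equity weighting and VRF values are illustrative assumptions only, not official PBRF figures.

    # Illustrative sketch of the RDC component described in paragraphs 12 and 13.
    YEAR_WEIGHTS = [0.15, 0.35, 0.50]                        # from the 2009 example, oldest year first
    COST_WEIGHTS = {"low": 1.0, "medium": 2.0, "high": 2.5}  # assumed subject funding categories
    EQUITY_WEIGHT = 2.0                                      # assumed uplift for Māori and Pacific completions
    VRF = {"masters": 1.0, "doctorate": 3.0}                 # assumed volume-of-research factors

    def weighted_completion(c):
        # Weight a single research degree completion by cost, research volume and equity.
        weight = COST_WEIGHTS[c["funding_category"]] * VRF[c["degree"]]
        if c["maori_or_pacific"]:
            weight *= EQUITY_WEIGHT
        return weight

    def rdc_measure(completions_by_year):
        # completions_by_year: three lists of completions, oldest year first.
        return sum(
            year_weight * sum(weighted_completion(c) for c in year)
            for year_weight, year in zip(YEAR_WEIGHTS, completions_by_year)
        )

    # Hypothetical example: one completion in each of the three years of the window.
    example_years = [
        [{"funding_category": "low", "degree": "masters", "maori_or_pacific": False}],
        [{"funding_category": "high", "degree": "doctorate", "maori_or_pacific": True}],
        [{"funding_category": "medium", "degree": "doctorate", "maori_or_pacific": False}],
    ]
    print(rdc_measure(example_years))  # 0.15*1.0 + 0.35*15.0 + 0.50*6.0 = 8.4

    # A TEO's annual RDC allocation is then its measure divided by the sum of all
    # participating TEOs' measures, multiplied by the RDC pool.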

External Research Income

14. External Research Income (ERI) is a measure of research income received by a TEO and/or any wholly-owned subsidiary.

15. ERI is included as a performance measure in the PBRF on the basis that it provides a good proxy for research quality. The underlying assumption is that external research funders are discriminating in deciding whether and whom to fund, and that they allocate their limited resources to those they see as undertaking research of a high quality.

16. Only research funding from outside the tertiary sector, together with contestable funding from within the tertiary sector, can be included as ERI. All eligible forms of ERI are treated equally in the funding formula. Income cannot be included in the ERI calculation until the work has been “undertaken”.

17. Government funding secured for research from sources other than the PBRF – such as the Foundation for Research, Science and Technology, New Zealand Trade and Enterprise, and the Marsden Fund – is declared by each TEO in its ERI return.

18. This measure excludes income from TEO employees who receive external research income in their personal capacity (i.e. the external research income is received by them and not their employer). Also excluded is income from controlled trusts, partnerships, and joint ventures.

19. Within the ERI component of the PBRF, a funding allocation ratio calculated on a three-year rolling average basis determines the amount allocated to each TEO annually. This is the same as with the RDC component.
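
As a minimal sketch, and assuming the allocation is simply proportional to each TEO's three-year average of eligible ERI (consistent with all eligible ERI being treated equally), the calculation looks like the following; all figures are hypothetical.

    def eri_allocation(eri_by_teo, eri_pool):
        # eri_by_teo maps a TEO to its eligible ERI for the last three years.
        averages = {teo: sum(years) / len(years) for teo, years in eri_by_teo.items()}
        total = sum(averages.values())
        return {teo: eri_pool * average / total for teo, average in averages.items()}

    # Hypothetical example: two TEOs sharing a $37.5 million ERI pool.
    print(eri_allocation({"TEO 1": [9e6, 10e6, 11e6], "TEO 2": [4e6, 5e6, 6e6]}, eri_pool=37.5e6))
    # TEO 1 receives $25.0m, TEO 2 receives $12.5m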

20. Each participating TEO submits an ERI return to the TEC. This return shows the TEO’s total PBRF-eligible ERI for the 12 months ending December 31 of the previous year. In addition, in support of each ERI calculation, the TEO must provide an independent audit opinion and a declaration signed by the TEO’s Chief Executive.

Appendix 5: Other government research funding

Fund/Organisation | Amount (2011/12, $ millions, GST excl) | Established | Purpose

Vote Tertiary Education
Centres of Research Excellence (CoREs) | $33.7 | 2002 | produce world-class research at TEIs that contributes to New Zealand's future development
Wānanga Research Capability Fund | $1.5 | 2008 (see note 8) | provide funding for research within the wānanga sector, particularly within mātauranga Māori

Vote Science and Innovation (see note 9)
Crown Research Institutes (CRIs) | $220.1 | 1992 | address New Zealand's most pressing issues through the CRIs and achieve economic growth
Health Research Council | $76.6 | 1990 | support research that has the potential to improve health outcomes and delivery of healthcare, and to produce economic gain for New Zealand
Marsden Fund | $46.8 | 1994 | fund excellent fundamental research that benefits society as a whole by contributing to the development of researchers with knowledge, skills and ideas
Vision Mātauranga Capability Fund | $6.6 | 2010 | support the Vision Mātauranga policy that aims to unlock the science and innovation potential of Māori knowledge, resources and people for the environmental, economic, social and cultural benefit of New Zealand

8 The current funding determination provides funding from 1 January 2013 to 31 December 2014. Funding for the initiative within the tertiary education baseline is ongoing.
9 In addition to the Vote Science and Innovation funds listed in the table, the Government is forecast to have spent approximately $330 million in 2011/12 through Vote Science and Innovation for science-led research and business-led research and development. Further, the Ministry of Science and Innovation is developing the National Science Challenges (NSC), which provide a means to address the most pressing complex issues facing New Zealand, by concentrating scientific effort and providing additional focus on these key areas.

Appendix 6: PBRF funding (GST exclusive, nominal values) allocated per year

Table 6.1 PBRF total allocation

TEO 2004 2005 2006 2007 2008 2009 2010 2011

Auckland University of Technology $292,427 $689,104 $2,258,785 $5,437,322 $6,505,517 $6,420,128 $7,580,719 $8,106,404

Lincoln University $557,361 $1,350,025 $4,367,831 $7,035,924 $8,302,565 $8,597,344 $8,622,299 $8,481,944

Massey University $2,338,067 $5,607,100 $16,927,897 $31,011,666 $34,574,343 $35,350,726 $35,016,295 $34,663,490

University of Auckland $4,783,807 $12,006,894 $38,371,320 $62,730,249 $69,518,099 $69,799,017 $73,244,499 $73,956,926

University of Canterbury $1,971,450 $4,952,909 $14,410,713 $20,623,571 $22,833,213 $24,713,383 $27,130,968 $27,138,898

University of Otago $3,728,635 $8,450,375 $25,018,092 $42,156,338 $48,908,987 $50,623,041 $52,946,805 $52,519,856

University of Waikato $1,216,926 $2,805,316 $8,661,027 $13,296,585 $15,166,348 $15,251,174 $15,628,083 $15,367,926

Victoria University of Wellington $1,430,912 $3,326,702 $10,263,260 $18,830,313 $19,648,563 $21,487,096 $23,217,071 $23,088,969

Unitec New Zealand $155,854 $358,976 $1,102,732 $2,275,564 $2,607,837 $2,772,155 $2,916,711 $3,115,684

Waikato Institute of Technology $31,138 $70,991 $242,450 $435,077 $583,670 $640,263 $614,320 $577,497

Otago Polytechnic $14,006 $46,969 $487,967 $589,675 $593,863 $696,044 $662,137

Manukau Institute of Technology $424,492 $472,652 $484,795 $503,977 $509,338

Christchurch Polytechnic Institute of Technology $329,812 $378,677 $423,837 $424,254 $403,389

Open Polytechnic of New Zealand $209,227 $237,083 $212,688 $193,970 $178,525

Eastern Institute of Technology $132,429 $149,179 $154,148 $165,732 $218,013

Nelson Marlborough Institute of Technology $70,443 $79,071 $81,490 $85,350 $85,360

Whitireia Community Polytechnic $57,247 $67,216 $77,237 $76,728 $70,273

Northland Polytechnic $46,595 $52,693 $54,866 $61,623 $59,737

Te Wānanga o Aotearoa $8,088 $19,422 $59,631 $143,670 $156,279 $156,823 $162,641 $162,661

Te Wānanga o Raukawa $362 $404 $329

Te Whare Wānanga o Awanuiarangi $7,003 $28,377 $202,132 $294,782 $279,214 $277,078 $275,864

Whitecliffe College of Arts and Design $9,946 $22,230 $79,072 $141,603 $218,253 $282,753 $210,713 $158,057

Laidlaw College (originally Bible College of NZ) $4,565 $10,962 $33,656 $94,743 $91,114 $60,208 $51,233 $61,010

Carey Baptist College $3,064 $7,357 $22,587 $42,711 $47,943 $49,790 $52,001 $51,853

Bethlehem Institute of Education $0 $0 $0 $25,335 $28,200 $28,094 $29,258 $28,891

Anamata $612 $1,471 $4,518 $45,145 $45,512 $27,518 $16,012 $12,939

AIS St Helens $1,226 $2,943 $9,035 $18,305 $20,547 $21,175 $22,178 $22,181

Te Whare Wānanga o Te Pihopatanga o Aotearoa $1,226 $2,943 $9,035

International Pacific College New Zealand $1,222 $1,313

Good Shepherd College $18,305 $20,547 $21,175 $22,178 $22,181

Total $16,536,889 $39,708,445 $121,917,317 $206,322,767 $231,598,565 $238,664,001 $249,968,740 $250,000,003

Table 6.2 PBRF QE allocation

TEO name 2004 2005 2006 2007 2008 2009 2010 2011

Auckland University of Technology $210,721 $505,984 $1,553,528 $3,390,225 $3,805,474 $3,921,882 $4,107,649 $4,108,163

Lincoln University $297,108 $713,415 $2,190,408 $3,861,572 $4,334,554 $4,467,147 $4,678,741 $4,679,326

Massey University $1,325,959 $3,183,899 $9,775,564 $18,017,865 $20,224,767 $20,843,440 $21,830,726 $21,833,456

University of Auckland $3,015,170 $7,240,038 $22,229,174 $33,443,196 $37,539,456 $38,687,782 $40,520,297 $40,525,364

University of Canterbury $1,177,932 $2,828,456 $8,684,243 $13,118,260 $14,725,038 $15,175,475 $15,894,288 $15,896,276

University of Otago $2,158,243 $5,182,379 $15,911,520 $27,758,620 $31,158,612 $32,111,749 $33,632,779 $33,636,984

University of Waikato $690,105 $1,657,082 $5,087,760 $7,896,033 $8,863,172 $9,134,295 $9,566,957 $9,568,154

Victoria University of Wellington $869,633 $2,088,167 $6,411,324 $12,049,109 $13,524,934 $13,938,659 $14,598,889 $14,600,714

Unitec New Zealand $136,369 $327,450 $1,005,373 $1,924,044 $2,159,709 $2,225,774 $2,331,202 $2,331,493

Waikato Institute of Technology $18,689 $44,876 $137,784 $285,615 $320,599 $330,406 $346,056 $346,099

Otago Polytechnic $0 $0 $413,322 $463,947 $478,139 $500,787 $500,849

Manukau Institute of Technology $410,027 $460,248 $474,327 $496,795 $496,857

Christchurch Polytechnic Institute of Technology $310,388 $348,405 $359,063 $376,071 $376,118

Open Polytechnic of New Zealand $144,242 $161,909 $166,862 $174,765 $174,787

Eastern Institute of Technology $131,795 $147,937 $152,462 $159,684 $159,704

Nelson Marlborough Institute of Technology $70,433 $79,071 $81,490 $85,350 $85,360

Whitireia Community Polytechnic $52,474 $58,901 $60,703 $63,578 $63,586

Northland Polytechnic $45,029 $50,545 $52,091 $54,559 $54,566

Te Wānanga o Aotearoa $8,088 $19,422 $59,631 $134,235 $150,677 $155,286 $162,641 $162,661

Te Wānanga o Raukawa $0 $0 $0

Te Whare Wānanga o Awanuiarangi $0 $164,620 $184,784 $190,437 $199,457 $199,482

Whitecliffe College of Arts and Design $3,425 $8,225 $25,253 $35,389 $39,724 $40,939 $42,878 $42,883

Laidlaw College (originally Bible College of NZ) $4,565 $10,962 $33,656 $21,356 $23,971 $24,705 $25,875 $25,878

Carey Baptist College $3,064 $7,357 $22,587 $42,711 $47,943 $49,409 $51,749 $51,756

Bethlehem Institute of Education $0 $0 $0 $18,305 $20,547 $21,175 $22,178 $22,181

Anamata $612 $1,471 $4,518 $10,677 $11,986 $12,352 $12,937 $12,939

AIS St Helens $1,226 $2,943 $9,035 $18,305 $20,547 $21,175 $22,178 $22,181

Te Whare Wānanga o Te Pihopatanga o Aotearoa $1,226 $2,943 $9,035

International Pacific College New Zealand $0 $0

Good Shepherd College $18,305 $20,547 $21,175 $22,178 $22,181

Total $9,922,133 $23,825,067 $73,150,393 $123,786,160 $138,948,004 $143,198,399 $149,981,244 $149,999,998

Table 6.3 PBRF RDC allocation

TEO 2004 2005 2006 2007 2008 2009 2010 2011

Auckland University of Technology $66,028 $136,773 $519,782 $1,610,426 $2,031,786 $1,747,509 $2,750,001 $3,297,160

Lincoln University $151,212 $330,007 $1,046,292 $1,288,303 $1,996,892 $2,111,036 $1,847,373 $1,805,332

Massey University $704,195 $1,633,511 $4,723,206 $8,977,919 $9,958,596 $9,953,870 $8,553,337 $8,097,302

University of Auckland $880,778 $2,546,391 $9,081,237 $17,533,849 $18,941,949 $17,952,070 $18,880,667 $19,688,355

University of Canterbury $610,146 $1,694,895 $4,679,754 $5,748,401 $5,954,158 $7,105,856 $8,628,278 $8,231,019

University of Otago $893,745 $1,791,331 $4,819,082 $7,199,552 $9,861,896 $10,479,672 $10,729,964 $10,603,094

University of Waikato $390,257 $815,826 $2,549,434 $3,693,500 $4,485,196 $4,289,982 $4,272,878 $3,979,003

Victoria University of Wellington $402,076 $900,734 $2,796,146 $4,883,820 $3,687,545 $4,905,701 $5,727,470 $5,543,847

Unitec New Zealand $16,303 $17,507 $53,819 $282,105 $375,723 $419,245 $431,075 $683,148

Waikato Institute of Technology $11,738 $23,809 $81,414 $93,030 $199,466 $263,376 $238,888 $211,225

Otago Polytechnic $14,006 $46,969 $56,120 $101,843 $95,517 $179,314 $119,303

Manukau Institute of Technology $0 $0 $0 $0 $0

Christchurch Polytechnic Institute of Technology $0 $0 $0 $0 $0

Open Polytechnic of New Zealand $0 $0 $0 $0 $0

Eastern Institute of Technology $0 $0 $0 $0 $43,462

Nelson Marlborough Institute of Technology $0 $0 $0 $0 $0

Whitireia Community Polytechnic $0 $0 $0 $0 $0

Northland Polytechnic $0 $0 $0 $0 $0

Te Wānanga o Aotearoa $0 $0 $0 $0 $0 $0 $0 $0

Te Wānanga o Raukawa $0 $0 $0

Te Whare Wānanga o Awanuiarangi $7,003 $28,377 $32,391 $73,846 $65,290 $59,771 $47,446

Whitecliffe College of Arts and Design $6,521 $14,006 $53,819 $106,213 $178,529 $241,814 $167,835 $115,174

Laidlaw College (originally Bible College of NZ) $0 $0 $0 $71,939 $66,137 $35,063 $25,334 $35,132

Carey Baptist College $0 $0 $0 $0 $0 $0 $0 $0

Bethlehem Institute of Education $0 $0 $0 $0 $0 $0 $0 $0

Anamata $0 $0 $0 $0 $0 $0 $0 $0

AIS St Helens $0 $0 $0 $0 $0 $0 $0 $0

Te Whare Wānanga o Te Pihopatanga o Aotearoa $0 $0 $0

International Pacific College New Zealand $1,222 $1,313

Good Shepherd College $0 $0 $0 $0 $0

Total $4,134,222 $9,927,111 $30,479,332 $51,577,567 $57,913,562 $59,666,001 $62,492,185 $62,500,002

Table 6.4 PBRF ERI allocation

TEO 2004 2005 2006 2007 2008 2009 2010 2011

Auckland University of Technology $15,678 $46,347 $185,475 $436,671 $668,257 $750,737 $723,069 $701,081

Lincoln University $109,043 $306,603 $1,131,131 $1,886,049 $1,971,119 $2,019,161 $2,096,185 $1,997,286

Massey University $307,913 $789,690 $2,429,127 $4,015,882 $4,390,980 $4,553,416 $4,632,232 $4,732,732

University of Auckland $887,860 $2,220,465 $7,060,909 $11,753,204 $13,036,694 $13,159,165 $13,843,535 $13,743,207

University of Canterbury $183,372 $429,558 $1,046,716 $1,756,910 $2,154,017 $2,432,052 $2,608,402 $3,011,603

University of Otago $676,647 $1,476,666 $4,287,490 $7,198,166 $7,888,479 $8,031,620 $8,584,062 $8,279,778

University of Waikato $136,564 $332,408 $1,023,833 $1,707,052 $1,817,980 $1,826,897 $1,788,248 $1,820,769

Victoria University of Wellington $159,203 $337,801 $1,055,790 $1,897,384 $2,436,084 $2,642,736 $2,890,712 $2,944,408

Unitec New Zealand $3,182 $14,019 $43,539 $69,416 $72,405 $127,136 $154,434 $101,043

Waikato Institute of Technology $710 $2,305 $23,253 $56,432 $63,605 $46,481 $29,376 $20,173

Otago Polytechnic $0 $0 $18,525 $23,885 $20,207 $15,943 $41,985

Manukau Institute of Technology $14,466 $12,404 $10,468 $7,182 $12,481

Christchurch Polytechnic Institute of Technology $19,425 $30,272 $64,774 $48,183 $27,271

Open Polytechnic of New Zealand $64,985 $75,174 $45,826 $19,205 $3,738

Eastern Institute of Technology $635 $1,242 $1,686 $6,048 $14,847

Nelson Marlborough Institute of Technology $0 $0 $0 $0 $0

Whitireia Community Polytechnic $4,773 $8,315 $16,534 $13,150 $6,687

Northland Polytechnic $1,565 $2,148 $2,775 $7,064 $5,171

Te Wānanga o Aotearoa $0 $0 $0 $9,436 $5,602 $1,537 $0 $0

Te Wānanga o Raukawa $362 $404 $329

Te Whare Wānanga o Awanuiarangi $0 $5,120 $36,152 $23,487 $17,850 $28,936

Whitecliffe College of Arts and Design $0 $0 $0 $0 $0 $0 $0 $0

Laidlaw College (originally Bible College of NZ) $0 $0 $0 $1,449 $1,006 $440 $24 $0

Carey Baptist College $0 $0 $0 $0 $0 $381 $252 $97

Bethlehem Institute of Education $0 $0 $0 $7,030 $7,653 $6,919 $7,080 $6,710

Anamata $0 $0 $0 $34,468 $33,526 $15,166 $3,075 $0

AIS St Helens $0 $0 $0 $0 $0 $0 $0 $0

Te Whare Wānanga o Te Pihopatanga o Aotearoa $0 $0 $0

International Pacific College New Zealand $0 $0

Good Shepherd College $0 $0 $0 $0 $0

Total $2,480,533 $5,956,266 $18,287,592 $30,959,040 $34,736,999 $35,799,601 $37,495,311 $37,500,003