PBRF Sector Reference Group – Consultation Paper 1
Approach to the design of the PBRF Quality Evaluation 2025
Contents
Purpose
Review of the PBRF and Cabinet decisions
Role of the SRG
Operational design of the PBRF Quality Evaluation 2025
Table 1: Proposed consultation paper topics
Consultation feedback
Appendix 1: Key background information about the PBRF
Appendix 2: Quality Evaluation 2018 information
Status: Consultation paper
Distribution: Public
Direct feedback to: https://www.surveymonkey.com/r/BHCMJVX
Feedback due 5pm, 5 November 2021
Purpose
1 This paper sets out the proposed approach of the Performance-Based Research Fund (PBRF) Sector Reference Group (SRG) to the design of the Quality Evaluation 2025, and invites feedback from the tertiary education sector and other stakeholders. Specifically, it:
› provides information on the outcomes of the 2019 Review of the PBRF, and Cabinet’s decisions on changes to the PBRF;
› sets out the scope of the SRG’s role in supporting the design of the Quality Evaluation 2025;
› identifies issues arising from Cabinet’s decisions that require sector consultation, along with other issues identified in the PBRF Quality Evaluation 2018 which the TEC and SRG propose to consult upon;
› proposes a series of issues papers and an indicative timetable for consultation on these papers; and
› invites feedback on any other matters that should be considered as part of the design process.
Review of the PBRF and Cabinet decisions
2 Following the PBRF Quality Evaluation 2018, the Ministry of Education set up an independent PBRF review panel. A comprehensive review and evaluation of the PBRF has followed each Quality Evaluation round, identifying issues to be addressed in subsequent Quality Evaluations.
3 The Ministry of Education review of the PBRF drew on peer review and moderation panel reports from the Quality Evaluation 2018, along with public submissions. The review recommendations formed the basis for Cabinet’s decisions on changes to the PBRF. The review noted that the fundamentals of the PBRF are working well in many respects and Cabinet’s decisions, released in July 2021, reflect this conclusion. As a result, many major aspects of the design of the Quality Evaluation, including its focus on excellence, will remain unchanged for the 2025 round.
4 The changes Cabinet has agreed include measures to better support and recognise Māori and Pacific research and researchers and directions to the TEC to revise extraordinary circumstances criteria, new and emerging researcher criteria, and to broaden the PBRF definitions of research and research excellence. Collectively the changes reflect Government’s equity and wellbeing objectives, its vision for a sustainable and diverse research workforce, and its commitment to Māori-Crown partnership.
5 Cabinet’s decisions include a number of policy changes which will be implemented by the Ministry of Education and the TEC:
› add the following PBRF objective: ‘Support a robust and inclusive system for developing and sustaining research excellence in Aotearoa New Zealand’;
› delete the existing ‘Cultural Inclusiveness’ principle and add the following three PBRF principles:
Partnership: the PBRF should reflect the bicultural nature of Aotearoa New Zealand and the special role and status of the Treaty of Waitangi | Te Tiriti o Waitangi;
Equity: different approaches and resources are needed to ensure that the measurement of research excellence leads to equitable outcomes;
Inclusiveness: the PBRF should encourage and recognise the full diversity of epistemologies, knowledges, and methodologies to reflect Aotearoa New Zealand’s people;
› fix the minimum allocation for Te Pūkenga in the next Quality Evaluation round in 2025 at 90% of the Institutes of Technology and Polytechnics allocation from the 2018 round;
› amend the External Research Income component by:
increasing the weighting for external research income in the ‘Overseas Research Income’ category from 1.5 to 3.5; and
increasing the weighting for external research income in the ‘Aotearoa New Zealand Non-Government Income’ category from 2 to 4;
› apply a funding weighting of 2.5 for Evidence Portfolios submitted by Māori staff;
› apply a funding weighting of 2 for Evidence Portfolios submitted by Pacific staff;
› increase the subject area weighting for Evidence Portfolios assessed by the Māori Knowledge and Development panel from 1 to 3; and
› increase the subject area weighting for Evidence Portfolios assessed by the Pacific Research peer-review panel from 1 to 2.5.
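To illustrate what these weightings mean in practice, the sketch below combines the factors that could apply to a single Evidence Portfolio multiplicatively. This is a hypothetical illustration only: the function name, the multiplicative model, and the quality-category weight of 1 are assumptions made for the example, and the actual funding formula is defined in the TEC’s PBRF guidelines.

```python
# Hypothetical sketch of how per-EP funding weightings might combine.
# The multiplicative model and the quality-category weight below are
# illustrative assumptions, not the TEC's actual funding formula.

def ep_funding_weight(quality_weight: float,
                      subject_weight: float,
                      staff_weight: float) -> float:
    """Combine the weighting factors applied to a single Evidence Portfolio."""
    return quality_weight * subject_weight * staff_weight

# Example: an EP assessed by the Maori Knowledge and Development panel
# (subject area weighting 3) and submitted by a Maori staff member
# (funding weighting 2.5), with an assumed quality-category weight of 1.
weight = ep_funding_weight(quality_weight=1.0, subject_weight=3.0, staff_weight=2.5)
print(weight)  # 7.5
```

Under this assumed model, the subject-area and staff weightings compound, so an EP attracting both would carry 7.5 times the funding weight of an otherwise identical unweighted EP.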
6 Cabinet’s decisions are summarised on the Ministry of Education website.
7 The Quality Evaluation 2025 was originally expected to take place in 2024, following the usual six-year cycle. In 2020, the Government made the decision to delay the Quality Evaluation by a year due to the impact of COVID-19. As a result, the assessment period for the Quality Evaluation 2025 covers seven years rather than the usual six years.
Role of the SRG
8 The TEC has convened an SRG to operationalise changes to the Quality Evaluation 2025. As well as considering operational issues flowing from Cabinet’s decisions and instructions, the SRG will consider a number of recommendations made by participants in the Quality Evaluation 2018.
9 The TEC has established an SRG to prepare for each of the Quality Evaluations from 2003 onwards. The role of the SRG is to develop options, consult, and make recommendations to the TEC for changes to the design of the Quality Evaluation, in line with Cabinet’s directions and decisions.
10 The SRG will agree the grouping and ordering of issues to be addressed. It will discuss issues in the order agreed, approve the content of consultation papers, and make recommendations to the TEC following consideration of sector feedback. The TEC will release ‘in principle’ decisions as they are made to enable the sector to commence preparations as early as possible. The final full guidelines will then be published on the TEC website at the end of the SRG process.
11 In carrying out its role, the SRG will adhere to its Terms of Reference, which set out the membership, processes, and guiding principles. The Terms reflect the SRG’s commitment to a process which meets the TEC’s obligations arising from Te Tiriti o Waitangi and which upholds Māori-Crown partnership. The SRG ratified the Terms of Reference at its first meeting on 24 September 2021. The published Terms are available on the TEC website.
12 Cabinet’s policy decisions listed above are out of scope, but the SRG will consider how the operational changes to the Quality Evaluation 2025 should be implemented. The SRG will also consider some additional operational issues which have been identified by the peer review and moderation panels and TEC project staff.
13 The PBRF Guidelines published by the TEC set out the framework of the Quality Evaluation by which both the TEC and TEOs must abide. The TEC has committed to releasing the guidelines for the Quality Evaluation 2025 by June 2023 so that the sector can put in place the necessary processes to ensure compliance. These Guidelines will be the final product of the SRG process.
Operational design of the PBRF Quality Evaluation 2025
Operational design principles
14 SRG design-work ahead of the Quality Evaluation 2025 will be based on a number of principles and considerations:
› upholding the objectives and principles of the PBRF (detailed in Appendix 1);
› learning from the previous Quality Evaluations in order to make improvements to the design of the PBRF and the implementation of Quality Evaluation 2025;
› drawing on relevant experience and expertise across the tertiary education sector;
› exposing proposed changes to rigorous sector and expert scrutiny;
› achieving as much sector agreement as possible about how the next Quality Evaluation should be conducted; and
› avoiding costly or time-consuming changes unless there are good reasons for believing they will bring significant improvements.
15 Based on the above, the SRG is working on the following assumptions:
› the Quality Evaluation 2025 process will be undertaken following a similar timeline as the 2018 Quality Evaluation, i.e. EP submission in June 2025 with outcomes released in April 2026;
› the assessment period for research outputs will be seven years from 1 January 2018 until 31 December 2024; and
› the submission of EPs and research outputs will be electronic.
Process for operational design
16 The process the SRG will follow in considering the identified design and implementation issues is:
› SRG makes decisions on the issues to be considered, their grouping into issues/consultation papers, and the order in which those papers will be developed;
› TEC prepares issues papers for the SRG that give background information, detail the issues and potential consequences, and outline a range of potential options for resolution;
› SRG considers issues papers (in terms of the quality of the analysis, accuracy, clarity, coverage of the relevant issues and options), and agrees a recommended approach;
› TEC prepares consultation papers for the sector providing background information, clarification of issues, analysis, and recommended approach;
› consultation with the sector, and the receipt and incorporation of feedback as required;
› SRG makes recommendations to the TEC;
› receipt of feedback from TEC on the recommendations; and
› if agreed by TEC, SRG proposals integrated into the PBRF guidelines.
17 Given the importance of clear audit and evidence requirements to the conduct of the Quality Evaluation, the SRG will consider this aspect in relation to each of the issues proposed below in Table 1. In making decisions about the SRG’s recommendations, the TEC will also consider any implications for the audit process.
18 At the conclusion of the PBRF design phase in June 2023, a new set of guidelines for the operation of the Quality Evaluation 2025 will be issued, containing the integrated process. As in the previous Quality Evaluation, it is proposed that a complete set of draft guidelines is released for consultation ahead of the final guidelines publication.
Operational design consultation papers
19 Table 1 below sets out the issues and changes the SRG proposes to consult with the sector on, and includes a proposed order and timeframe for the consultation process. The issues and changes have been grouped into seven proposed consultation papers.
20 The issues/changes outlined in the table fall into two categories:
› Cabinet direction: SRG will develop options, consult, and make recommendations for implementation; and
› Operational issue arising: the SRG proposes to consider, develop options, consult, and make recommendations on a number of operational issues arising from or since the 2018 Quality Evaluation.
21 Appendix 2 contains information on the content of the Quality Evaluation 2018 Guidelines, information on the peer review panels, and a list of PBRF-related abbreviations to inform sector feedback.
Table 1: Proposed consultation paper topics
Columns: consultation paper group; issue number; change or issue; rationale; indicative consultation period
Research definitions
1 Broaden the PBRF definition of research and research excellence to encompass the production of research, engagement, and impact relating to that research; and to support diverse research cultures and epistemologies including Mātauranga Māori and kaupapa Māori research, and Pacific research.
Following Cabinet directions, the SRG will consider options, consult with the sector, and make a recommendation to the TEC for implementation
December 2021 - end January 2022
2 Review Quality Category definitions in light of new research and research excellence definitions.
Following TEC advice, the SRG has decided to consider this issue
Evidence Portfolio design
3 Make changes to the design of the Evidence Portfolio to be submitted by staff in the Quality Evaluation to reflect Cabinet’s agreed changes (e.g. replacing NROs with EREs) and the new PBRF definitions of research and research excellence.
Following Cabinet directions, the SRG will consider options, consult with the sector, and make a recommendation to the TEC for implementation
March 2022
Individual researcher circumstances and identification
4 Review the current extraordinary circumstances qualifying criteria, with a view to replacing them with a ‘merit relative to opportunity’ concept.
Following Cabinet directions, the SRG will consider options, consult with the sector, and make a recommendation to the TEC for implementation
May 2022
5 Review and simplify the ‘new and emerging researcher’ qualifying criteria.
Following Cabinet directions, the SRG will consider options, consult with the sector, and make a recommendation to the TEC for implementation
6 Consider how to account for the impact of the COVID-19 pandemic on individual researchers’ research activity.
Following TEC advice, the SRG has decided to consider this issue
7 Staff identification criteria: in light of new weightings for Māori and Pacific researchers, the SRG will consider what staff identification processes may be required.
Following TEC advice, the SRG has decided to consider this issue
Panels: assessment criteria
8 Holistic assessment boundaries: the SRG will consider whether the criteria used to select Evidence Portfolios for holistic assessment should be refined.
Following TEC advice, the SRG has decided to consider this issue
July 2022
9 Subject area selection criteria: in light of new weightings for EPs submitted to the Māori Knowledge and Development and Pacific Research panels, the SRG will consider whether high-level subject area selection criteria guidelines may be required.
Following TEC advice, the SRG has decided to consider this issue
10 Clarifying cross-referral processes and guidance across all subject areas/panels, including but not limited to the Māori Knowledge and Development and Pacific panels.
Following TEC advice and feedback from QE 2018 participants, the SRG has decided to consider this issue
11 Calibration training: Given the Cabinet instructions to redesign Evidence Portfolios for the 2025 Quality Evaluation, it is not feasible to provide panels with example calibration EPs from previous QEs. However, the SRG will consider how panels will receive appropriate calibration training.
Following TEC advice and feedback from QE 2018 participants, the SRG has decided to consider this issue.
Panels: membership and working methods
12 Māori representation on panels: The report of the Moderation and Peer Review Panels recommended the SRG consider appointing more Māori researchers to the Peer Review Panels across the piece so that members can provide advice on the assessment of relevant Evidence Portfolios.
Following TEC advice and feedback from QE 2018 participants, the SRG has decided to consider this issue. Note also that this change would give effect to the new PBRF principles.
September 2022
13 Skills and competencies of moderators: Consider how to ensure that the selection criteria for moderators include skills and competencies such as familiarity with Te Ao Māori and Mātauranga Māori that are underrepresented in the research workforce.
Following TEC advice and feedback from QE 2018 participants, the SRG has decided to consider this issue. Note also that this change would give effect to the new PBRF principles.
14 Panel and subject names: The report of the Moderation and Peer Review Panels recommended that the ‘Māori Knowledge and Development Panel’ be renamed ‘Māori Knowledge Panel’, and the subject area of ‘Statistics’ should be renamed ‘Statistics and Data Science’
Following TEC advice and feedback from QE 2018 participants, the SRG has decided to consider this issue.
15 Size and number of panels: the SRG will consider combining and/or separating some panels to more fairly distribute workload over panels.
Following TEC advice and feedback from QE 2018 participants, the SRG has decided to consider this issue.
16 Panellist membership criteria: the SRG will consider whether the selection criteria for members of the Peer Review Panels should be revised to account for the changes to the objectives, principles and assessment framework. This includes consideration of how to ensure that panels have access to sufficient expertise in specific technical areas (such as language skills), and how to enable cross-panel appointments.
Following TEC advice, the SRG has decided to consider this issue
17 Panel-specific guidelines: The report of the Moderation and Peer Review Panels recommended that the panel-specific guidelines provide clearer descriptions about the breadth and quality of research contributions expected for each Quality Category. The SRG may consider high-level principles particularly given the new research definition and Quality Category definitions. However, this may be an issue for Panel Chairs to address.
Following TEC advice, the SRG has decided to consider this issue
18 Tikanga: The report of the Moderation and Peer Review Panels recommended that consideration be given to how to ensure tikanga is embedded in a consistent and informed manner across the design, operation, and processes for the Quality Evaluation.
Following TEC advice and feedback from QE 2018 participants, the SRG has decided to consider this issue
Technical matters
19 Examples of Research Excellence (ERE) request and supply processes: Given Cabinet instructions to review the PBRF definition of research, and changes to digital capabilities since 2018, the SRG will consider whether ERE request and supply processes require revision.
Following TEC advice, the SRG has decided to consider this issue
November 2022
20 Unique staff identifiers: the SRG will consider options for changing the staff identifier regime.
Following TEC advice, the SRG has decided to consider this issue
21 Other technical matters identified during the process.
As the SRG considers the issues above, further technical matters may be identified
Reporting
22 Reporting framework: Given Cabinet’s decision that TEC will discontinue the Average Quality Score (AQS) metric, the SRG will consider how the Quality Evaluation results should be reported, and more broadly how the information collected in the Quality Evaluation can be used to fulfil the purpose of the fund and meet its intended outcomes.
Following TEC advice, the SRG has decided to consider this issue
January – end February 2023
Consultation feedback
22 Feedback is sought on the proposed content and ordering of the consultation papers. The survey questions are:
› Do you agree with the proposed list of issues which the Sector Reference Group will consider in advising on the operational design of the Quality Evaluation 2025?
› Are there any additional issues which you think the SRG should consider?
› Are there any issues identified which you think the SRG should not consider?
› Do you agree with the proposed grouping of issues into Consultation Papers, and the order in which those papers will be released?
› If not, please indicate your preferred grouping and order of papers, including (if applicable) any additional issues which you think the SRG should consider.
› Any other comments.
23 The TEC and the SRG acknowledge that there are likely to be impacts arising from the COVID-19 pandemic which go beyond the specific issues of the effects on individual researchers’ eligible activities (issue 6), and that these impacts may cut across many of the issues identified. Given the significant uncertainty as to the nature and degree of those impacts at this time, the SRG proposes to revisit any decisions that may be affected by the impact of COVID-19 towards the end of the SRG process in 2023.
24 Feedback can be provided to the TEC via the online survey here: https://www.surveymonkey.com/r/BHCMJVX
25 Responses must be submitted by 5pm, 5 November 2021.
Appendix 1: Key background information about the PBRF
Creation of the PBRF
Cabinet agreed to establish a performance-based research fund for tertiary education organisations (TEOs) in 2002. The design details were developed by a PBRF Working Group, supported by the Ministry of Education and the Tertiary Education Commission (TEC), working in consultation with the sector. The recommendations and rationale were published as a report, Investing in Excellence. Cabinet accepted this report, and its recommendations still form the basis for most of the PBRF design and implementation.
Cabinet agreed that the fund would be allocated through three separate components:
› the Quality Evaluation;
› Research Degree Completions (RDC); and
› External Research Income (ERI).
The Quality Evaluation is an assessment of research quality for the purpose of allocating bulk funding to participating TEOs. The unit of assessment used is individual academic staff at participating TEOs. These results are used in the funding and reporting calculations.
TEOs are required to apply a set of eligibility criteria to their staff in order to determine which individuals are eligible to participate. Evidence Portfolios (EPs) consisting of published research outputs and other examples of research-related activity for each eligible academic are compiled by staff in conjunction with their employing TEO. These EPs are then submitted to the TEC along with a census of staff employed by the TEO on a specific date. This information is audited by the TEC to ensure that it is correct and robust. EPs are assessed by subject-specific peer review panels and awarded a quality category. The Quality Evaluation results, along with the results of the RDC and ERI components, form the basis of PBRF funding for each TEO for a six-year period.
Objectives of the PBRF
The primary objectives of the PBRF are to:
› increase the quality of basic and applied research at New Zealand’s degree granting TEOs;
› support world-leading research-led teaching and learning at degree and postgraduate levels;
› assist New Zealand’s TEOs to maintain and lift their competitive rankings relative to their international peers;
› provide robust public information to stakeholders about research performance within and across TEOs; and
› Support a robust and inclusive system for developing and sustaining research excellence in Aotearoa New Zealand.1
In doing so the PBRF will also:
› support the development of postgraduate student researchers and new and emerging researchers;
› support research activities that provide economic, social, cultural and environmental benefits to New Zealand, including the advancement of Mātauranga Māori; and
› support technology and knowledge transfer to New Zealand businesses, iwi and communities.
1 This objective was added as a part of the Ministry of Education’s review of the PBRF and agreed by Cabinet in May 2021.
Principles of the PBRF
The PBRF is governed by the following principles:
› Partnership: the PBRF should reflect the bicultural nature of Aotearoa New Zealand and the special role and status of the Treaty of Waitangi | Te Tiriti o Waitangi; 2
› Equity: different approaches and resources are needed to ensure that the measurement of research excellence leads to equitable outcomes;
› Inclusiveness: the PBRF should encourage and recognise the full diversity of epistemologies, knowledges, and methodologies to reflect Aotearoa New Zealand’s people;
› Comprehensiveness: the PBRF should appropriately measure the quality of the full range of original investigative activity that occurs within the sector, regardless of its type, form, or place of output;
› Respect for academic traditions: the PBRF should operate in a manner that is consistent with academic freedom and institutional autonomy;
› Consistency: evaluations of quality made through the PBRF should be consistent across the different subject areas and in the calibration of quality ratings against international standards of excellence;
› Continuity: changes to the PBRF process should only be made where they can bring demonstrable improvements that outweigh the cost of implementing them;
› Differentiation: the PBRF should allow stakeholders and the government to differentiate between providers and their units on the basis of their relative quality;
› Credibility: the methodology, format and processes employed in the PBRF must be credible to those being assessed;
› Efficiency: administrative and compliance costs should be kept to the minimum consistent with a robust and credible process;
› Transparency: decisions and decision-making processes must be explained openly, except where there is a need to preserve confidentiality and privacy; and
› Complementarity: the PBRF should be integrated with new and existing policies, such as charters and profiles, and quality assurance systems for degrees and degree providers.
Existing PBRF definitions of research and research excellence
› “Research” is original investigation undertaken in order to contribute to knowledge and understanding and, in the case of some disciplines, cultural innovation or aesthetic refinement. It typically involves enquiry of an experimental or critical nature driven by hypotheses or intellectual positions capable of rigorous assessment by experts in a given discipline. It is an independent, creative, cumulative and often long term activity conducted by people with specialist knowledge about the theories, methods and information concerning their field of enquiry. Its findings must be open to scrutiny and formal evaluation by others in the field, and this may be achieved through publication or public presentation. In some disciplines, the investigation and its results may be embodied in the form of artistic works, designs or performances. Research includes contribution to the intellectual infrastructure of subjects and disciplines (e.g. dictionaries and scholarly editions). It also includes the experimental development of design or construction solutions, as well as investigation that leads to new or substantially improved materials, devices, products or processes.
2 This and the following two new principles were added as a part of the Ministry of Education’s review of the PBRF and agreed by Cabinet in May 2021.
› “Excellence” as a researcher includes all of the following activities:
the production and creation of leading-edge knowledge;
the application of that knowledge; and
the dissemination of that knowledge.
Quality Category descriptions from the Quality Evaluation 2018
Quality Categories are awarded to each PBRF-eligible staff Evidence Portfolio (EP). Quality Categories A, B, C and C(NE) are funded Quality Categories and are reported on by the TEC. Quality Categories R and R(NE) are not funded and are not reported on by the TEC.
Quality Category A:
› expected to contain evidence of research outputs of a world-class standard;
› research-related activity that shows a high level of peer recognition and esteem within the relevant research subject area;
› indicates a significant contribution to the New Zealand and/or international research environments; and
› may also show evidence of other significant demonstrable impact.
Can be awarded to the EPs of all PBRF-eligible staff members including new and emerging.
Quality Category B:
› expected to contain evidence of research outputs of a high quality;
› research-related activity that shows acquired recognition by peers for their research at least at a national level;
› indicates a contribution to the research environment beyond their institution, and/or significant contribution within their institution; and
› may also show evidence of other significant demonstrable impact.
Can be awarded to the EPs of all PBRF-eligible staff members including new and emerging.
Quality Category C:
› expected to contain evidence of quality-assured research outputs;
› research-related activity that shows some peer recognition for their research; and
› indicates contribution to the research environment within their institution or the wider community during the assessment period.
Can be awarded to the EPs of all PBRF-eligible staff members except new and emerging.
Quality Category C(NE):
› expected to contain evidence of quality-assured research outputs produced during the assessment period; and
› may have limited or no research-related activity in the research contribution component.
Can be awarded to the EPs of new and emerging researchers only.
Quality Category R:
› does not demonstrate the quality standard required for a C Quality Category or higher.
Can be awarded to the EPs of all PBRF-eligible staff members except new and emerging.
Quality Category R(NE):
› does not demonstrate the quality standard required for a C(NE) Quality Category or higher.
Can be awarded to the EPs of new and emerging researchers only.
Appendix 2: Quality Evaluation 2018 information
1. Guidelines for tertiary education organisations participating in the 2018 Quality Evaluation
An overview
What happens in the Quality Evaluation?
Key dates for the 2018 Quality Evaluation
Which organisations are eligible for funding from the PBRF?
What is research?
What counts as research in the 2018 Quality Evaluation?
PBRF definition of research
Who is eligible to participate?
Who is eligible to participate in the 2018 Quality Evaluation?
Staff eligibility criteria for the 2018 Quality Evaluation
Additional information on dates relating to the staff eligibility criteria
Additional information on determining staff eligibility
Eligibility of staff members employed by two or more tertiary education organisations or who leave in the year before 14 June 2018
PBRF Staff Data file
How to complete an Evidence Portfolio
What is an Evidence Portfolio?
What information is in an Evidence Portfolio?
Evidence Portfolio and Researcher Details section
Completing the Evidence Portfolio Details section
Completing the Researcher Details section
Completing the Panel Details section
Which panel should be nominated as the primary panel?
What are the peer review panels and subject areas?
Completing the Field of Research Description
Completing the Māori and Pacific Research elements
What are research outputs and research contributions?
Completing the Platform of Research – Contextual Summary section
Completing the Research Output component
Eligibility criteria for research outputs
Determining the date that research outputs are available within the assessment period
Types of research outputs
Quality assurance
Outputs involving joint research
Outputs with similar content
Information required in an Evidence Portfolio about a Nominated Research Output
Information required in an Evidence Portfolio about an Other Research Output
Assessing Nominated Research Outputs
Forms of evidence required for assessing and auditing research outputs
Evidence required for assessment and audit
Research output evidence requirements
Completing the Research Contribution component
Definition of a Research Contribution
Types of research contribution
Information required in an Evidence Portfolio about research contribution items
Evidence required for auditing research contribution items
Evidence of research contribution items
What are extraordinary circumstances?
Claiming extraordinary circumstances
Eligibility of extraordinary circumstances
General extraordinary circumstances
Canterbury extraordinary circumstances
Describing extraordinary circumstances
Validating claims under the extraordinary circumstances provisions
Conflicts of interest
Submitting conflict of interest notices for staff members
What is a conflict of interest?
Submitting a conflict of interest notice
Consideration of a conflict of interest notice
Notification to tertiary education organisations
What happens in the audit process?
Underpinning principles of the audit process
Objectives of the audit process
Stages of the audit process
Process for managing errors
Sanctions
Reporting of audits
How will the results be reported?
Reporting the results of the 2018 Quality Evaluation
Principles underpinning the reporting framework
Reporting on the 2018 Quality Evaluation
Calculating the Average Quality Score measures
Calculating PBRF allocations
Individual staff members’ Quality Categories
Protocol for tertiary education organisations on the treatment of PBRF Quality Categories
Recommended protocol
Staff requesting their own results
Requesting results
Submitting requests for results
Processing of requests
Information that will be released
How to make a complaint about errors
Complaints about administrative and procedural errors
Making a complaint
Processing complaints
Possible outcomes from complaints
Glossary
Tertiary Education Organisation Audit Declaration
2. A guide for staff members participating in the 2018 Quality Evaluation
Information for staff participating in the 2018 Quality Evaluation
What happens in the Quality Evaluation?
What is my role in the Quality Evaluation process?
What do I need to know about staff eligibility?
What do I need to know about completing an Evidence Portfolio?
Your Evidence Portfolio should reflect your best research
Quality over quantity
Evidence Portfolios should provide a coherent view of your research
Presentation of the Evidence Portfolio
Completing the sections in the Evidence Portfolio
Selecting a panel
Completing the Field of Research description
Completing the Platform of Research – Contextual Summary
Completing the Research Output component
Completing the Research Contribution component
Extraordinary circumstances
How does the scoring and assessment process work?
The assessment process
How can the peer review panels be expected to assess all the different research areas?
What happens if I know a panel member?
Am I involved in the auditing of PBRF information?
What happens when the results are released?
How can I get my PBRF funding?
What can I do if I think my results are wrong?
How can I get a copy of my detailed results?
Who do I ask if I have questions about the PBRF or I need help?
Do I have to participate?
Key dates for the 2018 Quality Evaluation
Quality Evaluation assessment process
Guides to scoring and Quality Categories
Glossary
3. Guidelines for the 2018 Quality Evaluation assessment process
What happens in the Quality Evaluation?
TEO overview
Key dates for the 2018 Quality Evaluation
What is research?
What counts as research in the 2018 PBRF Quality Evaluation?
PBRF Definition of Research
What is an Evidence Portfolio?
What is an Evidence Portfolio?
What information is in an Evidence Portfolio?
What is the Quality Evaluation assessment?
The 2018 Quality Evaluation assessment
The general principles of the Quality Evaluation assessment
What is the platform of research?
Guidance about quantity of research or research-related activity
Assessing new and emerging researchers
What is the role of peer review panels?
Responsibilities of panel Chairs, Deputy Chairs and members
Responsibilities of a panel Chair in the assessment process
Responsibilities of a Deputy Chair in the assessment process
Responsibilities of panel members in the assessment process
How do conflicts of interest and confidentiality work?
Guidelines for conflict of interest and confidentiality
Conflict of interest policy
Consideration of a conflict of interest notice
Confidentiality policy
How does the scoring system work?
The scoring system for Evidence Portfolios
The numerical scoring system
The weighting system for scores
What are the Quality Categories?
Defining ‘world-class research’
What are the stages of the assessment process?
The panel assessment process
Assignment of Evidence Portfolios to panel members
Cross-referring an Evidence Portfolio to the Māori Knowledge and Development Panel and the Pacific Research Panel
Cross-referring an Evidence Portfolio to another panel for assessment
Transferring an Evidence Portfolio to another panel
Pre-meeting assessment and scoring
Panel meeting assessment and scoring
How are research outputs assessed?
Assessing the Research Output component
General principles for assessing the Research Output component
Allocating scores to the Research Output component
Selecting, accessing and examining Nominated Research Outputs
Selecting a Nominated Research Output for assessment
Accessing copies of selected Nominated Research Outputs
Examining selected Nominated Research Outputs
Number of Nominated Research Outputs to be examined
How are research contributions assessed?
Assessing the Research Contribution component
General principles for assessing the Research Contribution component
Allocating scores to the Research Contribution component
What is the moderation process?
The moderation process
Purpose of the moderation process
The moderation process
Reconvening of panels
Moderation Panel reporting
4. Quality Evaluation 2018 peer review panel abbreviations and subject areas
Abbreviation – full panel name, followed by the subject areas covered:

BIOS – Biological Sciences
› Agriculture and other applied biological sciences
› Ecology, evolution and behaviour
› Molecular, cellular and whole organism biology

BEC – Business and Economics
› Accounting and finance
› Economics
› Management, human resources, industrial relations, international business and other business
› Marketing and tourism

CPA – Creative and Performing Arts
› Design
› Music, literary arts and other arts
› Theatre and dance, film and television and multimedia
› Visual arts and crafts

EDU – Education
› Education

ETA – Engineering, Technology and Architecture
› Architecture, design, planning, surveying
› Engineering and technology

HEALTH – Health
› Dentistry
› Nursing
› Other health studies (including rehabilitation therapies)
› Pharmacy
› Sport and exercise science
› Veterinary studies and large animal science

HAL – Humanities and Law
› English language and literature
› Foreign languages and linguistics
› History, history of art, classics and curatorial studies
› Law
› Philosophy
› Religious studies and theology

MKD – Māori Knowledge and Development
› Māori knowledge and development

MIST – Mathematical and Information Sciences and Technology
› Computer science, information technology, information sciences
› Pure and applied mathematics
› Statistics

MEDPH – Medicine and Public Health
› Biomedical
› Clinical medicine
› Public health

PAR – Pacific Research
› Pacific research

PHYSC – Physical Sciences
› Chemistry
› Earth sciences
› Physics

SSOCSS – Social Sciences and Other Cultural/Social Sciences
› Anthropology and archaeology
› Communications, journalism and media studies
› Human geography
› Political science, international relations and public policy
› Psychology
› Sociology, social policy, social work, criminology and gender studies
5. List of PBRF abbreviations
AQS – average quality score
CRE – contribution to the research environment
EAG – expert advisory group
EFTS – equivalent full-time student
EP – evidence portfolio
ERE – Examples of Research Excellence
ERI – external research income
FTE – full-time equivalent
ITP – institutes of technology and polytechnics
NRO – nominated research output
ORO – other research output
OERE – Other Examples of Research Excellence
PBRF – Performance-Based Research Fund
PE – peer esteem
PTE – private training establishments
RDC – research degree completions
RO – research output
SDR – single-data return
SRG – Sector Reference Group
TEC – Tertiary Education Commission
TEO – tertiary education organisation