2019 Annual Report on Evaluation

Independent Evaluation Office

United Nations Development Programme



IEO REPORT TEAM & CONTRIBUTORS

Director Indran A. Naidoo

Deputy Director Arild Hauge

Project Manager Richard Jones

Contributors All staff of the Independent Evaluation Office

Associated funds and programmes contributors: Andrew Fyfe (UNCDF); Martin Hart-Hansen, Sandra Koch, and Hendrik Smid (UNV)

Production and outreach Sasha Jahic

ANNUAL REPORT ON EVALUATION 2019 © UNDP March 2020

Manufactured in the United States of America.

Permission to reproduce any part of this publication is required. Please contact: [email protected].

Suggested citation: Annual Report on Evaluation 2019, Independent Evaluation Office of UNDP, New York, March 2020.

Independent Evaluation Office of the United Nations Development Programme

1 United Nations Plaza 20th Floor New York, NY 10017, USA Tel. +1(646) 781-4200

Connect with us:

/UNDP_Evaluation

/ieoundp

/evaluationoffice

www.undp.org/evaluation

Acronyms

3RP Regional Refugee and Resilience Plan

AEAC Audit and Evaluation Advisory Committee

CIS Commonwealth of Independent States

EAP Evaluation Advisory Panel

ERC Evaluation Resource Centre

GEF Global Environment Facility

ICPE Independent Country Programme Evaluation

ICPR Independent Country Programme Review

IEO Independent Evaluation Office

IFAD International Fund for Agricultural Development

ILO International Labour Organization

IPDET International Programme for Development Evaluation Training

ISIL Islamic State in Iraq and the Levant 

LDC Least developed country

M&E Monitoring and evaluation

MAP Making Access to Financial Services Possible

OECD/DAC Organisation for Economic Co-operation and Development/Development Assistance Committee

PFIP Pacific Financial Inclusion Programme

SAARC South Asian Association for Regional Cooperation

SDG Sustainable Development Goal

SHIFT Shaping Inclusive Finance Transformations

UNCDF United Nations Capital Development Fund

UNDAF United Nations Development Assistance Framework

UNDP United Nations Development Programme

UNEG United Nations Evaluation Group

UNFPA United Nations Population Fund

UNICEF United Nations Children’s Fund

UNV United Nations Volunteers


Foreword

It gives me great pleasure as President of the Executive Board of the United Nations Development Programme (UNDP) to introduce the 2019 Annual Report on Evaluation, submitted to the Executive Board by the Independent Evaluation Office (IEO).

2019 marked a further seminal year for evaluation at UNDP and for the IEO. A review of the UNDP Evaluation Policy and the subsequent revision adopted by the Executive Board in its second regular session further embedded the central role of evaluation at UNDP. It also recognized the good work of the IEO and UNDP in evaluation.

The Board highly appreciates the increase in evaluative coverage undertaken by the IEO, which conducted 38 country programme evaluations in 2019. This will support the Board’s consideration of new UNDP country programme documents through 2020 and 2021.

We further greatly appreciate the IEO’s and UNDP’s continued support to decentralized evaluations, the strengthening of an evaluation culture across the organization, and the pursuit of more credible and usable evaluations through the new Evaluation Guidelines, regional training and improved oversight.

While 2019 was a year of high evaluative coverage through numerous country programme evaluations, we are glad to see that the IEO has continued to engage with UNDP across a range of corporate and thematic evaluations, detailed throughout this report. This will help UNDP to enhance its programming and align its development work with the 2030 Agenda for Sustainable Development and its 17 Sustainable Development Goals (SDGs).

Finally, we would like to thank Indran Naidoo, who served as Director of the IEO from 2012 to 2020 and brought about major transformations. The Director’s full-term report contained within this Annual Report is an important reflective piece. It illustrates the major role played by the Executive Board as custodian of the Evaluation Policy, as well as the UNDP Administrator for ensuring an enabling environment for evaluation. We on the Board have greatly valued UNDP’s emphasis on evidence-based decision-making, critical in the context of the SDGs and a rapidly changing development landscape. We welcome the new IEO Director, Oscar A. Garcia, and look forward to an ongoing, productive relationship as we continue to advance UNDP through credible evaluations and support UN-wide evaluation initiatives.

Walter Webson
Permanent Representative, Permanent Mission of Antigua and Barbuda
President of the UNDP Executive Board


Preface

It gives me great pleasure to present the 2019 Annual Report on Evaluation to the President of the UNDP Executive Board, His Excellency Ambassador Walter Webson. This report also marks the end of my tenure as the head of the Independent Evaluation Office, in what has been a fruitful and exciting eight years. During this time, we have made several changes to align the office to the size and complexity of UNDP. These changes included adjusting the evaluation model from one based on commissioning evaluations to one grounded in a highly capable and effective UNDP evaluation team conducting evaluations. The progressive increase and range of outputs delivered over the years have come about due to the Board’s support and encouragement, and I leave a well-resourced and capacitated office poised for further progress. In this report, in addition to highlighting results from 2019, we illustrate some key changes since 2012 that have led to efficiency and impact gains.

There are many partners who need to operate in synchronicity to achieve evaluation success. I am pleased to note that these are now working in harmony and with mutual support. The Executive Board, representative of UN Member States, has been a constructive authorizing body in its approval of the IEO workplan and budget. UNDP Administrator Achim Steiner has clearly signalled the importance of evaluation to the organization and beyond, and continues to encourage an increasingly close collaboration with the office to ensure that evaluation helps to inform and steer strategic decisions, towards a stronger UNDP and the achievement of the 2030 Agenda and the 17 SDGs.

The IEO’s workload and evaluative coverage have increased hugely in recent years, along with its ability to engage with UNDP strategically in providing evaluative evidence to support planning and change across the organization. This occurs both centrally through thematic and corporate evaluation findings as well as at the programme level through broad country programme evaluation coverage, which reached 38 countries in 2019.

The IEO has also recognized and pursued a goal to strengthen evaluation culture outside of the organization. In 2019, UNDP again supported the National Evaluation Capacities Conference, which took place in Hurghada, Egypt, in partnership with the Egyptian Government. The conference saw unprecedented participation from government representatives of over 100 countries, with over 500 participants coming together under the theme of ‘Leaving No One Behind: Evaluation for 2030’.

I am especially grateful to members of the Evaluation Advisory Panel, most of whom have walked the journey with me since the panel’s inception in 2013. They have supported the model transition by bringing in best practices and advancing the quality of our products. I also thank all UNDP management and staff for their collaborative support to me and the office.

I would like to give my deepest thanks and gratitude to the IEO team today and over the years. They have been exemplary in always projecting professionalism and the values of evaluation throughout their engagement in the evaluation processes and the development of strong evaluation products. They have been a pleasure to work with.

As I move on, I would like to welcome the IEO’s new Director, Oscar A. Garcia. I trust he will receive continued partnership, cooperation and goodwill from UNDP and the Executive Board as I have in recent years, and that he will find the office on a solid footing to take his own evaluative strategy forward.

Indran Naidoo
Director, Independent Evaluation Office, UNDP


Contents

Chapter 1. Evaluation in UNDP
1.1 The 2019 Evaluation Policy review and revision
1.2 Evaluation architecture
1.3 IEO key performance indicators
1.4 Regional bureau engagement
Indran Naidoo: Director’s Full-Term Report, 2012 to 2020

Chapter 2. Key evaluations undertaken in 2019
2.1 Country programme evaluations
2.2 Corporate and thematic evaluations

Chapter 3. Advancing global evaluation culture and practice in 2019
3.1 National Evaluation Capacities Conference 2019
3.2 The International Program for Development Evaluation Training
3.3 The United Nations Evaluation Group
3.4 African Evaluation Association

Chapter 4. Oversight and support to decentralized evaluation
4.1 UNDP investment in evaluation in 2019
4.2 Decentralized evaluation implementation in 2019
4.3 What is being evaluated in UNDP?
4.4 Quality of decentralized evaluations, 2019
4.5 IEO and UNDP support to decentralized evaluation

Chapter 5. The United Nations Volunteers and the United Nations Capital Development Fund
5.1 The United Nations Volunteers
5.2 The United Nations Capital Development Fund

Chapter 6. Staffing and finances of the IEO
6.1 IEO staffing
6.2 IEO finances
6.3 Programme of work in 2020 and 2021

Annexes
Annex 1: Snapshot of all decentralized evaluations in 2019
Annex 2: Africa snapshot of decentralized evaluations in 2019
Annex 3: Arab States snapshot of decentralized evaluations in 2019
Annex 4: Asia and the Pacific snapshot of decentralized evaluations in 2019
Annex 5: Europe and the CIS snapshot of decentralized evaluations in 2019
Annex 6: Latin America and the Caribbean snapshot of decentralized evaluations in 2019
Annex 7: Snapshot of global/headquarters-based decentralized evaluations in 2019
Annex 8: Average budgets for evaluation
Annex 9: Quality assessment of decentralized evaluations in 2017, 2018 and 2019
Annex 10: Monitoring and evaluation capacity, 2014 to 2019
Annex 11: Key performance indicators
Annex 12: Evaluation Advisory Panel 2019 summary letter
Annex 13: Evaluation Advisory Panel summary of work 2019

Endnotes
Photo credits


CHAPTER 1

Evaluation in UNDP


This Annual Report on Evaluation details the work undertaken by the Independent Evaluation Office (IEO) in 2019. This included 38 country programme evaluations, preparatory work for several thematic evaluations to be finalized in 2020, and measures to strengthen the evaluation function across the United Nations Development Programme (UNDP) and in partner countries.

1.1 The 2019 Evaluation Policy review and revision

The IEO continues to regularly review its Evaluation Policy to sustain its relevance, with independently commissioned policy reviews taking place in 2010, 2014 and 2019. Each iteration has advanced the policy.

UNDP Evaluation Policy review

On approval of the 2016 Evaluation Policy, the Executive Board requested a 2019 review of the policy’s implementation. An independent external panel of three senior evaluation experts undertook a detailed assessment, interviewing a range of stakeholders across UNDP, including the Administrator, bureau heads and staff as well as IEO management and staff. The review examined evaluative efforts from September 2016 to January 2019, taking into consideration contextual and organizational changes since the Board’s approval of the policy.

Overall, the independent review provided a positive assessment of the 2016 Evaluation Policy and its implementation, while at the same time recommending opportunities for further improvement. A review report made 11 recommendations concerning Evaluation Policy principles, the evaluation architecture, procedures and quality assurance. The review team presented its key findings and recommendations at an informal session of the Executive Board in May 2019 (DP/2019/13).

The IEO and UNDP management considered the final policy review and recommendations in detail and presented a joint response (DP/2019/14).

Revised 2019 Evaluation Policy

The Evaluation Policy review set the foundation for a revision of the 2016 Evaluation Policy, which was jointly undertaken by the IEO and UNDP management. It responded to 8 of the 11 recommendations suggested by the policy review team. The policy (DP/2019/29) was presented to and accepted by the Executive Board at its second regular session in September 2019.


Evaluation Policy recommendations and the IEO/UNDP management response

Recommendation 1: Include reference to the Charter of the IEO and the 2019 Evaluation Guidelines.
Management response: The proposal to include a reference to the Charter of the IEO and the revised Evaluation Guidelines in the Evaluation Policy is welcomed.

Recommendation 2: Principles should include a reference to the 2030 Agenda, gender equality, diversity, inclusion, human rights and the private sector.
Management response: The inclusion of a reference to the 2030 Agenda in the policy is supported.

Recommendation 3: Consultation and engagement with stakeholders are key.
Management response: This is a useful and welcome proposal.

Recommendation 4: Decisions on what to evaluate should note the purpose and use for strategic decision-making.
Management response: UNDP and the IEO agree that this is a useful proposal.

Recommendation 5: Technical reporting lines of regional M&E specialists to the IEO Director should be established.
Management response: UNDP and the IEO consider that creating a matrix management system for the current regional M&E focal points is untenable. Instead, in order to address ongoing concerns about the decentralized evaluation system, UNDP and the IEO will consider extending IEO coverage and positions from headquarters to the regional hubs with the creation of a cadre of P4/P5 posts plus support staff.

Recommendation 6: Different/new types of evaluations and data collection methods to be encouraged.
Management response: UNDP and the IEO welcome this recommendation. There have been significant changes made to UNDP programming approaches, and it is important that the evaluation approaches recognize these different initiatives.

Recommendation 7: There should be increased efforts to elaborate messages from evaluations (syntheses, trends etc.).
Management response: The recommendation is welcomed, and UNDP is committed to undertaking more analysis for organizational learning going forward.

Recommendation 8: Management responses are not required for all evaluations.
Management response: Management responses are an integral part of the evaluation process, and contribute to programme/project implementation effectiveness and organizational accountability. UNDP will consider the capture of its response to ICPE recommendations through new country programme documents.

Recommendation 9: Link 0.8 percent for non-IEO evaluations with the UNDP activity and funds portfolio by introducing a budget line.
Management response: UNDP and the IEO do not agree with this recommendation, which is not clear and contains some inaccuracies. Programme units are required to submit a costed and timed evaluation plan to the Executive Board with each country, regional and global programme document considered for approval.

Recommendation 10: The Audit and Evaluation Advisory Committee (AEAC) should no longer be part of the UNDP evaluation architecture.
Management response: Both the IEO and UNDP management recognize the importance of having an oversight body for evaluation and value the AEAC in terms of providing advisory and oversight support. The recommendation is therefore not accepted. However, it is recognized that the Committee is still evolving in its focus and will in the forthcoming period examine ways to strengthen the evaluation coverage within its work.

Recommendation 11: An independent review of the evaluation function should be done every four years.
Management response: UNDP and the IEO concur with the recommendation and note that the Evaluation Policy has been reviewed twice since 2010.


IEO regionalization and the strengthening of decentralized evaluation

The most significant change proposed in the latest policy revision is an increase in the percentage of resources reserved for the work of the IEO within the 1 percent of UNDP’s combined programmatic resources (core and non-core) allocated to the evaluation function annually. The 2019 policy sets an annual resource allocation for the IEO of 0.3 percent of combined programmatic resources, representing a 50 percent increase from the previous 0.2 percent. This revision responds to policy review findings underscoring the need for concerted action to improve the quality, independence and impartiality of the decentralized evaluation function.
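To make the arithmetic of the allocation concrete, the short sketch below illustrates how the 1 percent evaluation allocation splits under the previous (0.2 percent) and revised (0.3 percent) policies. It is illustrative only: the programmatic resources figure is a hypothetical placeholder, not a UNDP budget number.

```python
# Illustrative sketch of the Evaluation Policy resource split described above.
# The programmatic_resources figure is a hypothetical placeholder, not an actual UNDP budget number.
programmatic_resources = 5_000_000_000  # combined core and non-core resources (USD, hypothetical)

evaluation_function = 0.01 * programmatic_resources   # 1 percent reserved for the evaluation function overall
ieo_share_previous = 0.002 * programmatic_resources   # 0.2 percent for the IEO under the previous policy
ieo_share_revised = 0.003 * programmatic_resources    # 0.3 percent for the IEO under the revised 2019 policy

relative_increase = (ieo_share_revised - ieo_share_previous) / ieo_share_previous

print(f"Evaluation function overall: ${evaluation_function:,.0f}")
print(f"IEO allocation (previous policy): ${ieo_share_previous:,.0f}")
print(f"IEO allocation (revised policy):  ${ieo_share_revised:,.0f}")
print(f"Relative increase: {relative_increase:.0%}")  # prints 50%
```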

In recent years, the Executive Board has repeatedly raised concerns about the quality and credibility of UNDP’s decentralized evaluations, in response to quality assessment findings presented in the IEO’s Annual Report on Evaluation. The office has worked closely with UNDP to address these concerns and strengthen guidance for and oversight of the decentralized evaluation function. This has included improved oversight of implementation through a more effective quality assessment process and the upgrading of the Evaluation Resource Centre (ERC). The IEO led the revision of the Evaluation Guidelines in 2019 and organized regional workshops and online webinars to roll these out. Direct support to country offices has historically come from a cadre of regional monitoring and evaluation (M&E) focal points in UNDP’s regional bureaux, whose work leaned strongly towards monitoring over evaluation.

Moving forward, the IEO will continue collaborative efforts with UNDP management. At the same time, the increased budget allocation will enable a significant expansion of IEO oversight of decentralized evaluation as well as technical support and guidance to regional bureaux and country offices in the planning and implementation of evaluations.

To fulfil this additional role, the IEO will consider various options, including but not limited to building a cadre of IEO evaluation advisers at the regional level focused solely on evaluation. To ensure independence, such personnel will remain under the IEO and report to the IEO Director. The greater proximity to country and regional offices will improve engagement and be consistent with practices at other UN organizations. The new Director of the IEO will initiate this approach from April 2020. He may also reflect on other options to improve IEO contributions to organizational priorities and UN reform imperatives.

Other revisions in the Evaluation Policy, while having less operational significance, are also very important. Evaluation principles now include explicit reference to the 2030 Agenda for Sustainable Development, and to universally shared values of equity, justice, gender equality and respect for diversity. Finally, the new policy underscores the contribution of UNDP evaluations to UN system-wide accountability and learning, in recognition of new mandates emanating from the Quadrennial Comprehensive Policy Review adopted by the UN General Assembly as well as the UN Secretary-General’s development system reforms.


1.2 Evaluation architecture

The UNDP Executive Board

The IEO continues to engage closely with the UNDP Executive Board as the custodian of the Evaluation Policy. During 2019, in addition to delivering evaluative and policy review findings at both informal and formal sessions of the Board, IEO country programme evaluation reports accompanied new country programme documents as they came to the Board for consideration. This has only been possible through the 100 percent coverage elaborated later in this report.

Evaluation Advisory Panel

The Evaluation Advisory Panel (EAP), established in 2013, saw its sixth year of supporting the IEO in 2019. The panel provides support and advice on producing high-quality evaluations that further the IEO’s objective of enhancing overall UNDP performance and results. It injects critical guidance and intellectual insight into the work of the office, enhancing its strategic impact. The EAP has helped the office improve the quality of its evaluations, increase evaluative competencies, and deepen its role in evaluation within UNDP and externally.

The EAP offers:

• Quality assurance: reviewing key products and deliverables, including occasional papers and evaluation documents, and providing specific evaluation guidance and detailed feedback.

• Methodological guidance: recommendations on improvements to the overall coherence and consistency of the IEO’s approach, work programme, guidance documents, procedures and methodologies.

• Strategic direction: advising the IEO on ways to raise its profile and credibility, including knowledge-sharing platforms and dissemination strategies, and

• Development perspectives: debating issues surrounding development in the international context, and evaluating development and the complexity of the geopolitical environment.

The annual EAP meeting in 2019 focused on a review of and lessons learned from the last six years. It also considered a review of the ICPE approach, corporate and thematic evaluations completed in the previous 12 months as well as those in development, and the independent review of the UNDP Evaluation Policy. A detailed discussion also took place on the possible future focus and role of the EAP.

In 2019, the IEO made presentations to the Executive Board as follows:

• First regular session 2019: Presentation of the evaluation of UNDP support to poverty reduction in the least developed countries

• Annual session 2019: Annual Report on Evaluation 2018; UNDP Evaluation Policy review

• Second regular session 2019: Presentation and adoption of the revised UNDP Evaluation Policy


EAP sessions from 2014 to 2019

• Sessions and half-day workshops were held on evaluation methodology and design, with individual members scheduled to meet with IEO evaluation managers on their ongoing evaluations/project areas.

• The IEO and EAP reflected on lessons learned from the first three years, differentiating between advisory work and quality assessment. A key conclusion of this reflection was that the panel has helped the IEO to systematically establish consistent practices and ensure the overall quality of its work.

• Sessions open to UNDP/external stakeholders covered topics including big data, complexity and unintended outcomes. Strategic sessions with IEO staff focused on thematic evaluations and IEO initiatives, as well as the IEO’s strategic direction for the coming year.

• Meetings focused on the IEO’s annual progress, emerging issues, IEO structure and ongoing evaluations. Other topics included evaluating the SDGs, audit and evaluation, the design and use of meta-synthesis, and the United Nations Evaluation Group (UNEG).

• Meetings focused on a retrospective review of the past five years of EAP operations, an analysis of EAP inputs in IEO projects, a tentative plan for addressing the challenges of 100 percent Independent Country Programme Evaluation (ICPE) coverage and roll-out, as well as past and ongoing evaluations.

• The sixth annual meeting brought to a close the tenure of the original panel members, with sessions dedicated to reflection, lessons learned and broad-stroke recommendations from the EAP on future operations. Major sessions were also allocated to discussions and decisions arising from the two EAP-led initiatives: the ICPE review project, and the independent review of the latest UNDP Evaluation Policy.


Role of the Audit and Evaluation Advisory Committee

The IEO has continued to engage with the Audit and Evaluation Advisory Committee (AEAC), presenting its work three times in 2019 in meetings with the IEO Director. The AEAC is charged with reviewing the activities of all of UNDP’s oversight entities, including the IEO, the Office of Audit and Investigation and the Ethics Office, which are increasingly harmonizing their work. It serves in an organizational advisory capacity for the IEO Director, and does not impact or undermine the principles of independence and accountability to the Executive Board. The 2019 policy review provided perspectives on these issues.

1.3 IEO key performance indicators

As the IEO expands its evaluative output and scope, it has developed several key performance indicators to manage progress, and ensure it applies qualitative and quantitative methods to realize its interlinked mandates of accountability and learning.

The key performance indicators were initially developed at the behest of the AEAC to monitor mostly operational elements, such as the efficient use of resources, as well as to track achievement towards a number of goals, such as gender distribution within the office at the professional level as well as geographical coverage and communications outreach. The IEO has now expanded the indicators to capture essential aspects of effectiveness, such as the timeliness of the delivery of evaluation products and the implementation of recommendations via management responses across its evaluations. Indicators now cover both quantitative and qualitative elements.

1.4 Regional bureau engagement

The IEO organized biannual meetings with senior managers of the five UNDP regional bureaux between July and August 2019. These took stock of evaluations and discussed current and upcoming work in the regions. The IEO presented a number of key issues drawn from recent ICPEs, and shared the evaluation plan for the 2020-2021 cycle. The discussions were fruitful, with all bureaux focused on improving understanding of the IEO’s role and evaluations as well as informing the IEO of regional changes during the year. Bureau-specific evaluative findings and common issues from ICPEs were shared and included:

• Diverse development contexts across and within bureaux (lower-income to upper middle-income and high-income countries): Bureau-level strategies are needed to support operating across these differing contexts. Many countries face quickly evolving, fluid environments. More attention is needed to emerging issues (e.g., crises).

• Low-income countries: Capacity development efforts need to be strengthened. There has been limited knowledge/technical capacity transfer under the national implementation mode. There is limited evidence of impact from poverty reduction activities.

• Middle-income countries: A declining donor base has accompanied increasing reliance on government cost-sharing and vertical funds. Greater diversification of funds should include innovative financing. Other priorities comprise a balance between government cost-sharing goals and UN/UNDP values and strategic goals (e.g., human rights and equality), and clarity in UNDP’s role and offer to middle-income countries. UNDP should facilitate greater South-South cooperation using its global presence.


• Net contributor countries: This country context requires a different UNDP offer. UNDP should articulate its role and added value in delivering government-funded/-driven programmes, and forge strong partnerships with civil society organizations, the private sector, academia and donors.

• Programme design: There is a need for greater flexibility in programming and decision-making (e.g., the suitability of a country programme document as a strategic framework versus a ‘bridging programme’ for post-conflict stabilization). More attention needs to go towards the issues of poverty and youth. Greater risk analysis is required.

• Strengthen results-based management practices: Challenges encompass the limited use of theories of change or strategic frameworks to guide programmes. Large claims in results-oriented annual reporting often cannot be substantiated. Programme units need to be measured against actual results rather than against financial delivery.

• Declining financial resources: In low-income countries, there is over-reliance on TRAC (and vertical funds) but little leverage of funds using TRAC, and a lack of strategic choices in the allocation of human and financial resources. Funding sources need more diversification (e.g., government cost-sharing, the private sector).

• Role of headquarters/regional hubs: Country offices need more external support (e.g., thematic guidance), and early and full engagement of the regional bureaux for timely guidance.

• UNDP partnerships: In many contexts, UNDP is seen as a valuable, trusted partner, with international expertise and networks, and an ability to respond swiftly to needs. There is a need to strike a balance between administrative services supporting governments and substantive programmes, especially in terms of government cost-sharing, and in middle-income countries and net contributor countries.

• Redefining UNDP’s roles: There have been challenges in redefining UNDP’s role in its programme countries since the Resident Coordinator system delinkage. This requires better delineating UNDP’s role as an ‘integrator’ (vis-à-vis the Resident Coordinator) and ‘convener’ (e.g., indigenous peoples, other sensitive topics).


Indran Naidoo: Director’s Full-Term Report, 2012 to 2020

It gives me pleasure to submit this full-term report as part of the Annual Report on Evaluation 2019. This chapter provides an overview of three phases of change that marked the evolution of the IEO during the Director’s term (2012 to 2020). These included the evaluation model change in 2012, a new UNDP Administrator, and the announcement of full evaluation coverage of country programmes and a higher IEO budget as part of the revised Evaluation Policy.

Phase 1: 2012 evaluation model change

The office I took over in 2012 had a solid foundation set by my predecessor, Saraswathi Menon, in terms of addressing evaluation imperatives and principles. My assessment of the evaluation context and needs of UNDP as a key development agency working in complex situations amid resource challenges, however, meant that the profile and outputs of the IEO needed to expand in scope and depth to optimize the impact of evaluation and better guide the organization.

For evaluation to have an impact in an international organization the size of UNDP, with its presence in 170 countries, there must be adequate evaluation coverage to justify its expenditure. Evaluation must produce credible findings, conclusions and recommendations, including statistically valid samples to make any generalizations about performance reliable and to answer big-picture questions.

The dominant approach at the time was that the office largely managed evaluations commissioned from contracted external parties. This was inadequate not only to meet demands for expansion but also to create a critical evaluation culture at the centre of UNDP. Only there could the value of evaluation be seen by all parties for learning and improvement, and not just accountability. The commissioning model did not provide a sustainable model for the coherent and uniform exercise of independent evaluation, which was necessary for the organizational credibility of the function. Several organizational changes were required to raise the status of evaluation, ranging from enhanced policy to the new Evaluation Guidelines. This has also improved the office’s relationship with programme countries, the Executive Board and UNDP management.



In this context, I began professionalizing the office through the introduction of an evaluation-conduct model. Under this approach, IEO evaluations are led by IEO evaluators and not commissioned from external parties. Several initiatives were required to embed the model, build confidence in the office and its products, and create and sustain an evaluation culture to move evaluation from the periphery to the centre.

This period of change also saw greater demand for and receptiveness to evaluations by UN Member States at both the country and Executive Board levels, as well as a greater move towards recognizing evaluations as useful in promoting learning, rather than simply as perfunctory exercises aimed primarily at accountability. Building on the team in the office, I began recruiting more widely to bring in a greater range of expertise and diversity. This helped develop a cadre of professional evaluators who could project and enhance credibility by showing substantive understanding of the work under review and the context that influences it. A further key output has been efficiency gains, through a 75 percent reduction in unit costs and time, and closer links between country programme evaluations and thematic and corporate evaluations.

A key element of professionalization has been the technical and methodological support brought to the office through the establishment of the EAP. I was fortunate to have a group of prominent peers from the global evaluation community accept my invitation to join the panel. As a team of thinkers and practitioners, they have walked the journey with IEO staff, providing their thoughts, reflections and advice on more than 200 IEO outputs over their term and engaging in six lively annual EAP meetings. This has raised the quality of evaluative thinking within the office and brought in new global perspectives on the meaning of development evaluation. It has provided opportunities for the IEO team to partner with evaluation thought leaders. As part of my commitment to transparency, the annual reports of the EAP have been included as part of the Annual Report on Evaluation.

Phase 2: 2016, the UNDP Administrator’s commitment to full coverage and budget

The Executive Board, as the custodian of the UNDP Evaluation Policy, has played an important role in promoting the office and its work. In 2016, based on an earlier review, the Board adopted a new policy. At the same time, it was announced that the IEO would undertake full evaluation coverage of all country programmes reaching their conclusion, leading to a five-fold increase in outputs. This required the office to undertake an internal reflection around how to build the capacity required to meet this obligation, evaluate on a higher scale and across a larger portfolio of UNDP’s programmatic budget, and conduct evaluations within shorter time frames without compromising engagement and quality. All of these aims were achieved. The office has now accumulated an evaluative base for examining the strategic impact of overarching UNDP policies and vision, allowing for more evidence-rich corporate and thematic evaluations. This has provided strong information for the forthcoming UNDP Strategic Plan evaluation. Full coverage also required an internal reorganization in the form of new protocols, the Evaluation Charter and new supporting guidelines. These have entrenched evaluation as a fundamental part of UNDP.

Phase 3: 2019 Evaluation Policy and the IEO global/regional expansion

This progression of changes has been codified in the 2019 revised Evaluation Policy and globally recognized as a success. The work of the office has been chosen as an international case study for instructional material by the International Program for Development Evaluation Training, used as an example at the National Evaluation Capacities Conferences, and has made UNDP a leader in the UN evaluation system. The new Director will need to continue these developments and move the office into the regions, so that it can engage more meaningfully at that level, supporting decentralized evaluation as well as national evaluation capacity as governments strive to attain the 2030 Agenda.

The IEO in 2020:

• The IEO is the leading evaluation office in the UN system, with one of the highest geographic spans and coverage of programming. With an annual budget of $14.7 million and a staff of 34, it is ready for any transition required for even greater impact.

• The policy foundation has been laid for the further geographic expansion and focus of the IEO, which will support evaluation needs in relation to organizational and UN priorities as well as building on the pivotal National Evaluation Capacities Conferences.

• A robust Evaluation Policy centred on adherence to the principles of independence, credibility and utility was affirmed in the independent 2019 policy review and in Executive Board statements on the Annual Report on Evaluation.

• An office with a mature and full management structure, including chiefs of section, delegations and protocols, has a staff drawn from a rich global evaluation network. It reflects sectoral and national diversity (over 25 nationalities) as well as an impressive average evaluation experience of 17-plus years.

• Full programme coverage has resulted in a substantial repository of evaluation work, with over 100 ICPEs, 25 thematic evaluations, papers, etc. forming the basis for evaluative commentary. Much of the work of the IEO has been presented in evaluation forums globally as offering examples of good practices.


• The first UN course on evaluation is being offered for the second year at the International Program for Development Evaluation Training (IPDET).

• The IEO partners with the Independent Evaluation Group of the World Bank on the Global Evaluation Initiative, affirming its recognized quality and capacity in the area.

Building capacity within and beyond UNDP

The National Evaluation Capacities Conferences are now the largest evaluation event globally. Focused attention on advancing the evaluation profession through these biennial conferences has meant they have grown from a 50-person event in 2011 to involve 160 governments and 2,000 participants across the six conferences held to date. The conference has helped set the evaluation agenda and supported governments in achieving the 2030 Agenda and the SDGs.


CHAPTER 2

Key evaluations undertaken in 2019


2019 was an extremely busy evaluative year for the IEO. All staff engaged across the globe in country programme evaluations as well as the preliminary design and desk reviews for several highly important corporate evaluations.


The year saw a number of firsts for the office. These included the highest number of evaluations undertaken in a single year, covering 38 countries, and the highest collective financial portfolio under evaluation, at $6.4 billion in UNDP programme budgets. The year also featured evaluations in a large number of crisis or post-crisis countries, and the first regional clustering of country programme evaluations in Europe and the Commonwealth of Independent States (CIS). Understandably, several challenges arose that will further inform IEO approaches moving forward.

2.1 Country programme evaluations

The 38 ICPEs carried out in 2019 covered $6.4 billion in UNDP programmes. The IEO plans to have the ICPEs ready for all country programme documents sent to the Executive Board in 2020, to aid and inform the Board’s consideration of the submissions. Country programme evaluations in 2019 (38 countries) and planned for 2020¹ (26 countries) included:

Regional Bureau for Africa
2019: Burkina Faso, Cameroon, Côte d’Ivoire, Eswatini, Ethiopia, Guinea-Bissau, Mauritius, Mozambique, Seychelles, Uganda, Zimbabwe (11)
2020: Botswana, Chad, Eritrea, The Gambia, Niger, São Tome and Principe, South Sudan, United Republic of Tanzania, Zambia (9)

Regional Bureau for Asia and the Pacific
2019: Afghanistan, Bangladesh, China, Indonesia, Malaysia, Maldives (6)
2020: Lao People’s Democratic Republic, Mongolia, Viet Nam (3)

Regional Bureau for the Arab States
2019: Bahrain, Iraq, Lebanon, Somalia, Syria (5)
2020: Algeria,² Morocco, Saudi Arabia (3)

Regional Bureau for Europe and the CIS
2019: Armenia, Azerbaijan, Belarus, Georgia, Kazakhstan, Kosovo,³ Republic of North Macedonia, Serbia, Tajikistan, Turkey, Turkmenistan, Uzbekistan (12)
2020: Albania, Montenegro (2)

Regional Bureau for Latin America and the Caribbean
2019: Argentina, El Salvador, Panama, Uruguay (4)
2020: Barbados and the Office of the Eastern Caribbean States, Belize, Brazil, Guyana, Haiti, Honduras, Jamaica, Suriname, Trinidad and Tobago (9)


To efficiently use resources and take into account geographical positioning and similarities in programme implementation, a number of country programme evaluations included combined data collection missions (Mauritius, the Maldives and the Seychelles; and Eswatini and Zimbabwe) or were implemented through a cluster approach. This applied to programmes under the Regional Bureau for Europe and the CIS as well as the Regional Refugee and Resilience Plan (3RP), as detailed below.

In 2018 and 2019, evaluations in crisis and post-crisis settings comprised Afghanistan in 2019 (instability and protracted conflict), Iraq in 2019 (immediate post-conflict stage), Somalia in 2019 (chronic instability and large-scale insurgency), Syria in 2019 (post-conflict, localized active conflict), Burkina Faso in 2019 (increased localized conflict and accompanying humanitarian crisis), Mali in 2018 (post-conflict, localized attacks), Yemen in 2018 (conflict) and Venezuela in 2018. Evaluations of Lebanon and Turkey (and Syria) in 2019 covered support to humanitarian work due to the ongoing refugee crisis.

Country Programme Evaluation of Afghanistan, 2015–2019

When UNDP Afghanistan’s current country programme was conceived in 2014, there was significant optimism about prospects for development, following the withdrawal of international military forces and completion of the presidential election. The increasing erosion of security since then, however, has posed major challenges to UNDP operations.

The evaluation found that UNDP has made significant efforts to align its work with Afghanistan’s national priorities, including through leadership demonstrated in developing the One UN for Afghanistan 2018–2021 strategic document. Further, the country programme under review coincided with a period of transition for UNDP in Afghanistan entailing a series of internal operational and programmatic adjustments. These produced positive changes (e.g., the establishment of dedicated results, strategy and communication teams; improved financial oversight; and the creation of four subregional offices) but also led to volatility (e.g., varying programme delivery rates and workplace issues).

Key messages from the evaluation included the following:

The Law and Order Trust Fund for Afghanistan (LOTFA) remained UNDP’s flagship initiative, contributing to systems development and improvement in police payroll management functions and oversight mechanisms, and improving the national police workforce, including through additional female police officers. Support to institutional development was limited due to midcourse adjustments, an issue that needs to be resolved in the next cycle. With a new multipartner trust fund scheme, LOTFA has entered a new phase with an expanded mandate to address rule-of-law and security reforms. This requires significant coordination with relevant partners.

UNDP continued to support elections, bringing credibility and legitimacy to Afghanistan’s polling process. It backed gains in human rights and gender equality by improving people’s access to justice through legal aid facilities, and supported the preparation of various studies for a multi-year anti-corruption programme. It also played a crucial coordination role in the country’s health sector by managing projects under the Global Fund to Fight AIDS, Tuberculosis and Malaria, which are now in a second grant cycle.

Stemming from past recommendations to deliver more direct impacts to the Afghan people, UNDP produced various project-level outputs in its livelihoods and resilience work. Several design issues were raised, however, including overly ambitious project goal and budget setting, the limited scale of interventions and the lack of evidence-based approaches.

UNDP contributed to strengthening the national SDG framework under a single governance project, but these efforts need to be widely integrated into programmes in Afghanistan overall.

Operationally, programme design and implementation need to be strengthened, including more risk-informed, evidence-based programming; synergies across different portfolios; increased engagement in policy dialogue; field monitoring practices; and enhanced subregional office capacity to substantively contribute to UNDP programmes.

A robust partnership and resource mobilization strategy would bolster UNDP’s role, ensuring engagement with major development players, including the international financial institutions, the United Nations Assistance Mission in Afghanistan, donors and civil society organizations.

The findings of country programme evaluations have provided significant lessons for UNDP on both the need for and the approaches to evaluations in crisis and unstable settings. The process has highlighted that evaluation is even more necessary in these contexts, and importantly, can be done. Planning and partnership are key to success. Evaluators need to be patient and flexible, and have access to additional resources given the greater complexity and security concerns in crisis settings.

Lessons from these evaluations have further strengthened the office’s work and methodological approaches for the future, especially in 2020, when the IEO will evaluate UNDP’s work in a number of additional crisis-affected countries.

A positive step in 2019 has been the strengthening of links between country programme evaluations and corporate and thematic evaluations. Country programme evaluations for Lebanon, Syria and Turkey have informed the corporate evaluation of UNDP’s support to the Syrian refugee crisis response and integrated resilience approach to human migration (3RP).

To further support 100 percent country programme coverage by the ICPEs, the IEO undertook an independent review of its ICPE methodology, helping to strengthen it and improve the efficiency and quality of IEO outputs.


Regional Bureau for Europe and the CIS country programme evaluation cluster

In 2019, 11 ICPEs from country offices under the Regional Bureau for Europe and the CIS were included in an ICPE cluster approach. This allowed the IEO to adopt a more cost-effective mode of operating.⁴ Each of 10 countries and one territory included in the cluster underwent an ICPE examining UNDP’s work at the country level during the programme cycle from 2016 to 2020. Results are expected to provide a set of forward-looking recommendations for developing the next country programme document for each country, and to feed into a regional synthesis report.

Country programme evaluation in Iraq, 2015 to 2018⁵

UNDP support to Iraq aimed to address the most pressing needs in areas newly liberated from ISIL (Islamic State in Iraq and the Levant) while maintaining reduced programmatic support in other areas.

Given that the country programme document was designed in 2015, prior to the ISIL crisis, it did not provide an adequate guiding programmatic framework to respond to the changes in the country. The ICPE covered the period from 2015 to 2018, following the programme structure, which organized work around four pillars: stabilization, economic diversification and employment, governance and reconciliation, and environment and energy.

• UNDP in Iraq successfully created a model of intervention under the stabilization component that supported key political objectives and recovery in the immediate post-conflict period. It demonstrated the importance of retaining programmatic flexibility in that setting and adjusting the programme to meet emerging needs.

• UNDP is delivering the largest stabilization programme to date, with significant results.⁶ It is considered a highly valued partner. Institutional partners are clearly committed to continue working with UNDP and supporting it directly if possible.

• Relatively limited attention has been paid to national priorities outside the newly liberated areas. At the time of the evaluation, only small interventions reached southern areas of Iraq that are experiencing major difficulties. Large programmes operating outside of newly liberated areas and in areas receiving internally displaced people had not progressed on the delivery of outputs and outcomes. A return to regular programming was taking longer than expected.



• While UNDP has effectively managed the delivery of the largest stabilization programme to date, employing innovative operational processes and improved turnaround times to increase transparency and efficiency, it has lacked a coherent and comprehensive programme structure in line with national and regional priorities. Further, resource mobilization efforts have not been adequate or capitalized on recent successes.

• Due to limited quality assurance and monitoring functions, and the absence of evaluation capacity, there is insufficient analysis of performance and effectiveness to support programme development, prioritization and implementation. The absence of knowledge management and information-sharing has exacerbated the tendency to implement programmes in isolation, thereby limiting the opportunity to leverage synergies and expertise.

2.2 Corporate and thematic evaluations

At the start of 2019, the IEO submitted its evaluation of UNDP’s work in the least developed countries (LDCs), which was followed by a detailed UNDP management response during the annual session of the Executive Board. During 2019, several significant corporate and thematic evaluations were in various stages of planning and implementation for final production in 2020 and 2021.

The management response to the evaluation on work in the least developed countries⁷

The IEO presented its findings, conclusions and recommendations from its evaluation of UNDP support to poverty reduction in the LDCs in the Annual Report on Evaluation 2018, and to the Executive Board at its first regular session in January 2019 (DP/2019/4). Due to the evaluation’s complexity and detailed recommendations, UNDP presented an initial management note that was followed by a more detailed management response (DP/2019/17) at the Executive Board’s annual session in June 2019. The latter fully outlined steps to address the evaluation’s conclusions and recommendations.


RECOMMENDATIONS AND MANAGEMENT RESPONSES

Recommendation 1: UNDP should consider a more consistent engagement in a set of poverty reduction subthemes.
Management response: UNDP appreciates the complexity of developing a consolidated offer on poverty eradication that can meet the needs of a diverse set of countries such as the LDCs.

Recommendation 2: UNDP should better define for government counterparts the poverty reduction areas where it intends to stake out a strong technical support role, and detail the substantive tools and solutions it can provide for sustainable income generation and livelihoods.
Management response: Signature solution 1, keeping people out of poverty, addresses the interconnected social, economic and environmental challenges faced by the poor and vulnerable by focusing on the determinants of both exiting poverty and falling back into poverty.

Recommendation 3: UNDP should demonstrate global leadership in the development and use of multidimensional poverty indices.
Management response: The Human Development Reports were first published in the late 1980s, when it became clear that progress was not defined by income growth alone, but by the ability of people to live the lives they value. Going forward, UNDP will continue to forge closer collaboration with the United Nations system and other partners to strengthen the capacities of national statistical institutions to implement, monitor, track and report on Sustainable Development Goal achievement.

Recommendation 4: UNDP should increase the pace and thrust of its support to private sector development and impact investment in LDCs. Given the structural constraints in harnessing market opportunities, innovative private sector finance tools should be improvised and promoted in LDC contexts.
Management response: The forthcoming UNDP private sector development and partnerships strategy will drive progress on three strategic priorities: unlocking private finance for the Sustainable Development Goals, aligning business strategies and operations with the Goals, and developing policies that foster a green and inclusive economy.

Recommendation 5: Further emphasis is needed to enable linkages between UNDP community-level sustainable livelihood programmes and rural poverty alleviation policies in LDCs. While fulfilling respective funding stream commitments, synergies between various sustainable livelihood interventions under the Global Environment Facility (GEF) and the Green Climate Fund in the country programmes need to be strengthened. UNDP should take measures to leverage this important area of its work to better inform government policies and programmes.
Management response: UNDP recognizes the importance of strengthening its poverty and environmental approaches to sustainable livelihoods, as enshrined in the Strategic Plan. The linkages across the vertical funds and other aspects of the UNDP poverty portfolio will also benefit from the integrated thinking that underpins the Global Policy Network. UNDP acknowledges that the governing instruments of the vertical funds call for resources to contribute to the Sustainable Development Goals, thus providing a foundation for better integration with the UNDP poverty eradication focus.

Recommendation 6: Bridging the humanitarian and development divide for more sustainable poverty reduction should be systematically pursued in crisis and post-crisis contexts. UNDP should also pay sufficient attention to intersecting vulnerabilities that reverse poverty reduction outcomes.
Management response: UNDP recognizes that the root causes of many crises lie in endemic acute poverty, which requires a concurrent, coordinated and multifaceted response. UNDP works closely with humanitarian, peace and national partners to jointly identify medium-term collective outcomes that have an impact on protracted humanitarian challenges, including poverty indicators.

Recommendation 7: Partnerships for poverty reduction at the global and country levels should be pursued as a strategic programming option. UNDP should expand promising partnerships with United Nations and other development agencies that substantively and practically enhance its poverty-related programming in LDCs, especially to scale up pilot and community-level initiatives.
Management response: UNDP will capitalize on its existing partnerships at the country, regional and global levels to deliver an integrated package of poverty solutions that are country relevant. Along with the International Labour Organization (ILO), the United Nations Population Fund (UNFPA), the United Nations Children’s Fund (UNICEF) and the World Food Programme, UNDP is a core founding member of the Joint Fund for the 2030 Agenda, an inter-agency pooled funding mechanism to support the acceleration of Sustainable Development Goal achievement at the country level.

Recommendation 8: UNDP should pay further attention to strengthening gender-responsive poverty reduction policy processes. There is a need for more dedicated resources and commitment to gender equality and women’s empowerment in the LDCs.
Management response: The gender equality strategy, 2018–2021 will help UNDP to ensure that its support for eradicating poverty includes a focus on gender inequality.

Recommendation 9: UNDP should take steps to enhance its programming on youth employment and empowerment.
Management response: UNDP is committed to scaling up its programming on youth employment and empowerment. Its focus is to facilitate youth engagement in economic, social and political activities, and to enhance institutional capacities (public and private) to interact with youth and create the conditions for youth empowerment and employment for poverty reduction.

Forthcoming evaluations

Evaluation of the common chapter of the strategic plans of UNDP, UNICEF, UNFPA and UN Women

In line with a request of the UN General Assembly in the Quadrennial Comprehensive Policy Review (resolution 71/243), as well as the Secretary-General’s reforms for the repositioning of the United Nations development system to deliver on the 2030 Agenda for Sustainable Development, UNDP, UNFPA, UNICEF and UN Women have committed to working better together to support the achievement of development results. This commitment is embodied in a common chapter of the respective organizational strategic plans for 2018 to 2021. The chapter detailed six areas of collaboration and four approaches to working better together.

AREAS OF COLLABORATIVE ADVANTAGE

• Eradicating poverty
• Addressing climate change
• Improving adolescent and maternal health
• Achieving gender equality and the empowerment of women and girls
• Ensuring greater availability and use of disaggregated data for sustainable development
• Contributing to peacebuilding and sustaining peace in conflict and post-conflict situations

APPROACHES TO WORKING BETTER TOGETHER

• Planning together
• Implementing programmes together differently
• Enhancing multistakeholder partnerships
• Enhancing efficiency together

In 2019, the evaluation offices of UNDP, UNFPA, UNICEF and UN Women began a joint evaluation of the common chapter to provide an independent assessment of progress and results. The evaluation has taken place in two phases, starting with a baseline study and evaluability assessment. The evaluability assessment report (finalized in February 2020) verified the existence of basic evaluation parameters, such as the quality of design and data availability. It identified the extent to which the common chapter has changed how UNDP, UNFPA, UNICEF and UN Women work together, in programmes and operations, to leverage results at the country level, with a focus on the six areas of collaborative advantage.

Based on available information and an assessment of process indicators, the evaluability assessment highlighted how the absence of a conceptual framework, combined with weak management and support mechanisms and insufficient guidance, has made it difficult to actualize the common chapter at the country level. The scope and intent of the common chapter, as well as roles and responsibilities in its operationalization, remain unclear or have been interpreted differently. Perceptions about the value added of the chapter varied, particularly in the context of UN reform.

The evaluability assessment found very limited evidence that the common chapter had significantly improved collaboration among the four agencies. Most gains in planning and implementing programmes and achieving efficiency have occurred through pre-existing structures. The extent to which the common chapter has led the four agencies to work ‘better’ and ‘differently’ together remained uncertain. The planned accelerators have progressed only when linked to pre-existing initiatives loosely connected with the common chapter.

The initial plan was for a second phase to examine the modalities through which the four agencies have been working together, as well as with other United Nations partners, in the context of the ongoing reform of the United Nations development system, including the common chapter. At the 2020 annual sessions of the respective Executive Boards, a decision will be needed on whether to move forward.

Evaluation of UNDP’s development cooperation in middle-income countries

UNDP’s global strategic presence is predicated on the recognition that while there are obvious development challenges in the least developed and crisis countries, most middle- and high-income countries have unfinished development agendas. These include, albeit not exclusively, pockets of poverty and high spatial, income and gender inequality. UNDP’s programme expenditures in the 83 middle-income countries where it operates amounted to over $11.5 billion from 2014 to 2019 (as of October 2019), representing approximately 59 percent of overall programme expenditures. Contributions from programme countries to projects in their own countries (government cost-sharing) take on greater significance in middle-income countries, although the amounts vary substantially.

Given the importance and prominence of middle-income countries in UNDP’s work, an evaluation of work in them is highly relevant. The evaluation is expected to:

a. Assess the nature, type and scale of UNDP support to middle-income countries, considering their wide diversity of development conditions and needs;

b. Assess UNDP’s contributions through key priority areas of support; and

c. Identify the factors affecting UNDP’s positioning and engagement in middle-income countries.

While the evaluation will inevitably adopt a retrospective approach, evaluating what has already been done, its prime benefit to UNDP should be through its compilation, assessment and systematic presentation of key lessons and recommendations for future approaches and actions.

The evaluation will cover the first two years of the current UNDP Strategic Plan (2018-2021) and the four years of the previous plan (2014-2017). While this period of six years will be examined in detail, it will be contextualized through a brief historical analysis of UNDP’s longer-term engagement in middle-income countries, highlighting any significant events or trends. The evaluation will cover support in all five UNDP regions. The final evaluation is due for submission to the Executive Board at the second regular session in 2020.

Evaluation of UNDP programming for climate resilience

Through its signature solution 3, UNDP’s Strategic Plan emphasizes an integrated approach to helping countries address disaster risks and adapt to climate change.8 The strategy singles out countries that are highly exposed to hazards and slow- or rapid-onset crises as requiring a distinct form of support. UNDP’s work on disaster resilience and climate change is defined through four interrelated areas of engagement: disaster risk reduction, climate change, disaster recovery and sustainable energy.

Global funding for UNDP climate change adaptation and disaster risk reduction work is approximately $250 million annually, up from $182 million five years ago. Climate change mitigation, or a mixture of adaptation and mitigation, absorbed $185 million in 2017. UNDP also provides a significant amount of disaster risk reduction support in recovery efforts, which accounted for over $600 million in 2017.

The objectives of the evaluation are to:

• Evaluate UNDP achievements and performance in addressing the vulnerability of partner countries to natural hazards and increasing risks brought about by climate change; and

• Provide actionable recommendations for future UNDP strategic planning and programme implementation that can enhance support services for countries in the areas of disaster risk reduction and climate change adaptation.

Combining disaster risk reduction and climate change adaptation in one evaluation recognizes that the two challenges are closely related and should ideally be addressed in an integrated manner, and that UNDP has closely linked these two areas in its strategic planning and programming.


UNDP provides governance support to partners across a wide array of social and economic activities that could be construed as contributing tangentially to climate adaptive capacity. The evaluation will not endeavour to gauge the extent and results of this wider governance work, even as it recognizes that building adaptive capacity requires a combination of climate risk management (specific adaptive capacity) and broader socio-economic and political reform (generic adaptive capacity). These elements together shape a country’s overall vulnerability.

Since conflict impairs disaster preparation and recovery efforts, the evaluation will analyse UNDP’s support to some affected countries. It is beyond the scope of the evaluation, however, to assess the extent to which natural disasters and climate change may intensify conflicts.

While sustainable energy occupies an important niche in UNDP’s climate and disaster resilience programming, the evaluation will not look at results in this area other than to recognize the contribution to climate change mitigation and the overall portfolio of UNDP’s climate-related programming. The final evaluation is due for submission to the UNDP Executive Board at the second regular session of 2020.

Evaluation of UNDP support to the Syrian refugee crisis responses

The evaluation assesses the contribution of UNDP to the Syrian refugee response at the national level and to the 3RP, with a specific emphasis on the integrated resilience approach and corporate learning in other human migration responses.

As the Syrian crisis enters its ninth year, the conflict’s protracted nature, complexity, severity and scale have led to the largest refugee displacement in the world. Massive humanitarian and development impacts continue to unfold. UNDP has supported a shift in approach towards resilience-building aimed at bridging the humanitarian-development divide. The UNDP programme portfolio encompasses initiatives on livelihoods, employment and local economic development; local and municipal service delivery; social cohesion; and natural resources and environmental sustainability. Programmes aim to mitigate the socio-economic impact of the crisis on the most vulnerable, and support concerned governments in coping, recovering and addressing the consequences of the influx of refugees.

The evaluation will build on four ICPEs, conducted in 2019 in Iraq, Lebanon, Syria and Turkey, to provide in-depth insights on UNDP's engagement and contributions to the Syrian refugee response. In addition, the evaluation will consider UNDP’s resilience-building approach in countries hosting refugees, and the organizational structure set up to coordinate 3RP interventions. The evaluation will examine to what extent the concept of resilience-based development has been used as an underpinning framework in other migration and displacement programmes and corporate approaches. More particularly, the evaluation will look at UNDP resilience-based development programmes in response to multicountry and cross-border migration and displacement in the more recent Rohingya crisis and the Lake Chad Basin9 region.

The evaluation will contribute to strengthening the Syrian refugee response, and developing corporate strategies, policies and programmes on migration and displacement. It will strengthen UNDP's accountability to global, regional and national development partners, including the Executive Board. The evaluation will be presented to the Executive Board at the annual session in June 2020.


Advancing global evaluation culture and practice in 2019

CHAPTER 3


This chapter outlines the IEO’s continued commitment and contribution to strengthening an evaluation culture globally, within UNDP and the United Nations, and beyond. This entails work with a broad range of regional and global communities of practice, including the UNEG, as well as with national governments in developing evaluation capacities. The office continues to provide support through the National Evaluation Capacities Conference, the International Programme for Development Evaluation Training (IPDET), the International Development Evaluation Association (IDEAS) and other platforms.

3.1 National Evaluation Capacities Conference 2019

The IEO organized the National Evaluation Capacities (NEC) Conference 201910 in Hurghada, Egypt, in partnership with the Government of Egypt’s Ministry of Planning, Monitoring and Administrative Reform. The conference took place from 20 to 24 October. It was the sixth in a series of biennial conferences on national evaluation capacities, aimed at advancing the use of evaluation to improve development effectiveness, which has become increasingly salient in the SDG era.

The IEO launched the first conference in 2009 in Casablanca, Morocco, with 55 participants from 30 countries. Ten years later, the sixth conference brought together over 500 participants from 107 countries around the theme ‘Leaving No One Behind: Evaluation for 2030’.

The conference opened with remarks by United Nations Deputy Secretary-General Amina Mohammed, who underlined the importance of evaluation, stating, “the implementation of the SDGs can be accelerated globally by bolstering evaluation, a powerful tool that improves public accountability and contributes to positive development change.” Welcoming addresses were also given by the IEO Director Indran Naidoo as well as Randa Aboul-Hosn, UNDP Resident Representative to Egypt, and Her Excellency Hala Helmy El Saeed, Minister of Planning, Monitoring and Administrative Reform, Egypt, who emphasized that “the monitoring and evaluation process is crucial for the development process and technology is important in improving the monitoring and evaluation process.”

Following the opening ceremony, the first plenary session set the scene for the conference, with an exploration of what ‘leaving no one behind’ means for evaluation in light of the 2030 Agenda. Pedro Conceição, Director of the UNDP Human Development Report Office, shared emerging findings of the 2019 Human Development Report on inequalities, pointing out that ‘leaving no one behind’ manifests in many different ways. A panel discussion with eminent evaluators followed, delving into the implications of these findings for evaluation.

The second day of the conference saw a further important plenary session focused on evaluation criteria. The five principal criteria of relevance, effectiveness, efficiency, impact and sustainability, first articulated by the Organisation for Economic Co-operation and Development/Development Assistance Committee (OECD/DAC) in 1991, have become core to evaluation policy and practice. During the previous National Evaluation Capacities Conference in 2017, a discussion began on taking stock of global experience using the five traditional evaluation criteria. At the 2019 conference, Megan Kennedy-Chouane of the OECD/DAC presented current thinking on the criteria following two years of discussion in the global evaluation community. This process has shaped emerging new definitions and principles for use. Panellists shared their reflections and critiques from a variety of perspectives, reminding the audience that evaluation criteria provide a foundation for better evaluation. This, they noted, requires not only asking the right questions, but also asking who is asking the questions and how the questions are answered.

Day three opened with a plenary session on architecture for evaluation effectiveness, beginning with a keynote speech by the IEO Director, who highlighted four critical areas for strengthening an evaluation function: evaluation policy, evaluation quality, evaluation coverage and communication.

Multiple conference sessions carried these themes forward, particularly with respect to strengthening national evaluation systems for the SDGs. For example, a panel with representatives from Bangladesh, Finland and Nigeria noted that successful efforts to track progress on the SDGs require a ‘whole-of-government approach’ with high-level commitment. The panellists concluded that countries need credible road maps of how to achieve the SDGs, underscoring why evaluation is imperative. Other sessions emphasized that strengthening an M&E system is not an event, but a process that requires commitment from all stakeholders. Such processes are not just technical but political. Addressing participation, voice and power is central to institutionalizing equity in evaluation, yet engaging citizens and ensuring their voices are heard is an iterative process that takes time.

A further set of conference sessions explored lessons learned as well as new tools and techniques that can help transform evaluation in support of the SDGs. One session examined SDG 13 on climate action in detail, noting that one of the world’s greatest collective challenges is coping with a changing climate. In this, evaluation has an important role. Another session demonstrated that geospatial data and methods offer powerful tools to ‘open up’ theories of change and show unanticipated consequences and impacts. Examples from Afghanistan, Liberia and Somalia illustrated that new technologies can provide real-time, ground-truth answers to key programmatic design and implementation questions.

Partnerships are crucial for transforming evaluation. Engagement with the private sector in particular is critical, even central, to achieving the SDGs. Risk, reticence and reluctance make evaluation essential to overseeing and managing these partnerships, towards ensuring the proper use of public funds to pursue the goals. Another session highlighted the contributions that youth and young evaluators can bring to evaluation.

In a nutshell: the National Evaluation Capacities Conference 2019, ‘Leaving No One Behind: Evaluation for 2030’

• Over 500 participants from 100 countries
• 76 governments represented
• 47 percent of participants were women
• 21 conference sessions
• 21 training workshops

Leaving no one behind was a constant theme throughout the conference. Speakers pointed to the importance of asking, “Why are we doing the evaluation? For whom?” They stressed engaging communities in evaluation to strengthen credibility and bring out the voices of diverse people. An important reflection was that data, and data collection, may be challenges, but people are there, ready to tell their stories. With respect to gender, despite progress, speakers noted that the evaluation community needs to collectively advocate for more gender-responsive evaluation. Another session shared six principles that should apply to evaluations to ensure that they leave no one behind, including mapping stakeholders at the outset of an evaluation, and sustaining stakeholder engagement throughout.

The three days of vibrant exchanges and the sharing of lessons learned, experiences, thoughts and ideas provided the 500-plus participants with new knowledge, renewed motivation and heightened commitment to fostering evaluation that leaves no one behind and helps accelerate progress towards the SDGs. This was complemented by two days of pre-conference training, when 30 evaluation experts from around the world offered 21 different workshops in three languages to 280 participants from governments, civil society, and UN and other development partner agencies.

3.2 The International Programme for Development Evaluation Training

The IPDET, founded in 2001, is a renowned executive training programme that provides evaluation managers and practitioners with tools to evaluate policies, programmes and projects at all levels, as well as to commission, manage and use evaluations for decision-making. In 2018, the programme moved to Bern, Switzerland, where it operates under a partnership between the Center for Continuing Education at the University of Bern, the Center for Evaluation at Saarland University in Germany, and the Independent Evaluation Group of the World Bank.

In 2019, the IPDET requested the IEO to develop and deliver a training course on evaluation in the United Nations. The course introduced the overall context of evaluation in the United Nations, with the first sessions presenting the importance of evaluation for accountability and learning in the UN context; the UNEG norms and standards; the SDGs and the implications for integrating human rights, gender equality and ‘leave no one behind’ perspectives; and the IEO and the types of evaluations it conducts. A resource package for the course can be modified and replicated as needed. Twenty-four participants from 18 countries, representing UN agencies, academia, governments and evaluation associations, took the two-and-a-half-day course. Their positive response led to an invitation for the IEO to continue offering the course in 2020.

3.3 The United Nations Evaluation Group

The UNEG Annual General Meeting was held in Nairobi, Kenya, from 16 to 17 May 2019, adopting a new Strategic Plan for 2020-2024. As Vice-Chair of the UNEG, the IEO Director chaired a session on progress achieved by working groups under UNEG Strategic Objective 2 from 2018 to 2019. During the UNEG Evaluation Week, prior to the Annual General Meeting, the IEO Director held side meetings, co-sponsored with UNAIDS, to follow up on the UNAIDS peer review, in which he has been an active participant.

Key discussions covered the roles and support of the UNEG membership in ongoing UN reforms, such as the revision of guidelines on UN Sustainable Development Cooperation Frameworks (UNSDCFs, now the main instrument for planning and implementing UN development activities to support national governments), the framework evaluation process, and the establishment of an independent system-wide evaluation function.

IEO staff are engaged across many UNEG working groups, including on the new system-wide evaluation policy, draft guidelines for evaluating the new UNSDCF and strengthened evaluation peer reviews.

System-wide evaluation policy

The UNEG has recommended a new system-wide evaluation policy to the UN Secretary-General. Expected to be launched in 2020, the policy revises the Independent System-wide Evaluation Policy established in 2013. The IEO has been a key participant in developing the new policy, serving as co-coordinator of the UNEG working group drafting it and establishing related support mechanisms.

The overarching purpose of the policy is to generate cohesive and timely evaluative evidence across the UN development system at country, regional and global levels. This offers particular value to UN governing and legislative bodies in oversight, decision-making and direction-setting for the UN system. It supports UN system leadership in seeking better understanding of SDG progress and impediments, and contributions to system-wide results. It also assists UN Member States as they look to the United Nations for guidance and support to achieve the SDGs in their respective countries.

The policy defines system-wide evaluation efforts, roles and responsibilities at country, regional and global levels. At the country level, the policy covers the evaluation of the UN Sustainable Development Cooperation Framework and related joint funds. At the regional level, it applies to regional system-wide evaluations, regional knowledge-sharing and regional engagement on Cooperation Framework evaluations. At the global level, the policy pertains to planning, resourcing and conducting system-wide evaluations, as well as knowledge-sharing and reporting.

A multipartner trust fund is being established to fund system-wide evaluations, with particular emphasis on how the UN is fulfilling its mission to advance the 2030 Agenda and the SDGs.

3.4 African Evaluation Association

As part of its support to national evaluation capacity development, the IEO contributes to regional and global evaluation events such as the African Evaluation Association’s (AfrEA) biennial conferences. Founded 20 years ago, the association has grown considerably. Its conferences now attract more than 600 participants and offer multiple platforms for learning and exchange. The ninth AfrEA conference was held in Abidjan from 11 to 15 March 2019 on the theme ‘Accelerating Africa’s Development: Strengthening National Evaluation Ecosystems’.

In partnership with CLEAR AA, Twende Mbele, the African Development Bank and Oxfam, the IEO contributed substantively to the 2019 conference by co-convening a strand of discussions on the roles of the judiciary, executive and legislature in responsive national evaluation systems. UNDP organized one of the sessions within the strand, and provided opportunities for evaluators and evaluation commissioners from programme countries to attend the conference with a contribution of $20,000. The IEO participated in a half-day round table workshop on the professionalization of evaluation, organized by the UNEG Working Group on Professionalization.



Oversight and support to UNDP decentralized evaluations

CHAPTER 4


In line with the IEO’s commitments under the Evaluation Policy,

this chapter provides detailed oversight and analyses of decentralized

evaluations undertaken across UNDP in 2019. The chapter also shows

the considerable support the IEO has given, in partnership with

UNDP, to strengthening the evaluation function, providing guidance

and capacity development, and assessing the quality of evaluations

across UNDP.

4.1 UNDP investment in evaluation in 2019

UNDP spent $25.7 million on evaluation in 2019. The IEO had a budget expenditure of $10.9 million for evaluations, institutional activities, and staff and rental costs, with $10.5 million allocated from core resources. The allocation was in line with the requested annual budget for the office approved by the Executive Board at its first session of 2018.11 In 2019, considerable additional expenses resulted from the increased number of ICPEs needed to secure full coverage of the new country programme cycle, as well as the greater costs of implementing evaluations in crisis countries such as Afghanistan, Iraq, Somalia and Syria.

UNDP country offices spent $13.2 million on evaluation during 2019. This included evaluation implementation costs ($7 million), staff costs ($4.8 million) and additional evaluation-related costs ($1.5 million).12 Expenditure at headquarters and by regional bureaux on implementing, supporting and overseeing evaluation amounted to $1.6 million, including evaluation costs ($250,000), staff ($1.3 million) and additional evaluation expenditures ($47,000).

Total expenditure for evaluation in UNDP: IEO, $10.9 million; country offices, $13.2 million; regional bureaux and headquarters, $1.6 million.


4.2 Decentralized evaluation implementation in 2019

Through the Annual Report on Evaluation, the IEO continues to oversee and report on the implementation of UNDP’s substantial portfolio of decentralized evaluations. In 2019, UNDP completed 290 of the evaluations planned for the year (Figure 1). These comprised 132 UNDP project evaluations (46 percent), 126 GEF terminal evaluations or mid-term reviews (43 percent), 15 outcome evaluations (5 percent), and 17 UN Development Assistance Framework (UNDAF), thematic or country programme document evaluations (6 percent).

The number of completed evaluations once again fell short of the number planned at the start of the year (531). This impacted expenditure on evaluations, which declined from a planned evaluation budget of $16.8 million to an actual recorded expenditure of $7.25 million (43 percent). During the year, 658 changes were made to evaluation plans, including 144 evaluation cancellations or deletions, 207 additions and 307 date changes. While 75 percent of GEF evaluations were completed as planned, only 55 percent of UNDP project evaluations, 30 percent of UNDAF and other evaluations, and 25 percent of outcome evaluations were completed as planned. For 2020, 504 decentralized evaluations are planned with a budget of $17.2 million.

Figure 1: Evaluation planning and implementation, 2019, numbers and budgets

REGION | PLANNED EVALUATIONS (1 FEBRUARY 2019) | COMPLETED EVALUATIONS (5 FEBRUARY 2020) | EVALUATIONS COMPLETED, PERCENTAGE | EXPENDITURE (US$) | AVERAGE EVALUATION EXPENDITURE (US$)
Africa | 108 | 59 | 55% | 1,724,356 | 29,226
Arab States | 71 | 41 | 58% | 964,104 | 23,515
Asia and the Pacific | 144 | 89 | 62% | 2,456,041 | 27,596
Europe and the CIS | 82 | 49 | 60% | 938,815 | 19,160
Latin America and the Caribbean | 102 | 45 | 44% | 913,291 | 20,295
Global | 24 | 7 | 29% | 249,783 | 35,683
Total | 531 | 290 | 55% | 7,246,390 | 24,988

Source: ERC, 5 February 2020.


4.3 What is being evaluated in UNDP?

As was highlighted in previous years, the IEO is concerned that UNDP is not evaluating across its portfolio to capture lessons and results that support the development of its programme work. In 2019, GEF evaluations again represented 43 percent (Figure 2) of all UNDP evaluations (126), while UNDP project evaluations accounted for 46 percent (132), and outcome, UNDAF, country programme document and other evaluations accounted for 11 percent (32). In some regions, GEF evaluations accounted for more than 50 percent of all evaluations completed, including in Africa, at 56 percent of the total (33 out of 59 completed), and Asia and the Pacific, at 53 percent of the total (47 out of 89 evaluations).

Figure 2: Numbers of completed decentralized evaluations by region, 2019

EVALUATION TYPE | Africa | Arab States | Asia and the Pacific | Europe and the CIS | Latin America and the Caribbean | Global | Total | Share, percentage
UNDP project evaluations | 19 | 26 | 39 | 27 | 5 | 16 | 132 | 46%
UNDP GEF evaluations | 33 | 9 | 47 | 16 | 1 | 20 | 126 | 43%
Outcome evaluations | 2 | 4 | 0 | 4 | 0 | 5 | 15 | 5%
UNDAF and other evaluations | 5 | 2 | 3 | 2 | 1 | 4 | 17 | 6%
Total evaluations completed | 59 | 41 | 89 | 49 | 7 | 45 | 290 |
Percentage of total evaluations | 20% | 14% | 31% | 17% | 2% | 16% | |

Source: ERC, 5 February 2020.

Analysis and comparison of UNDP’s budget allocation, SDG prioritization and evaluation implementation across 2018 and 2019 (Figure 3) showed that SDGs 1, 3 and 16 (no poverty; good health and well-being; and peace, justice and strong institutions) accounted for 58 percent of UNDP’s programmatic budget allocation,13 yet these three goals accounted for only 29 percent of evaluations for the same period. While SDGs 12, 13, 14 and 15,14 which are focused on environmental issues, accounted for 13 percent of the overall budget allocation, they accounted for 42 percent of evaluations.15 This suggests a misalignment between evaluation planning and focus and the SDGs. As GEF projects have clear and enforced mandatory evaluation requirements, these are often the focus of the evaluative work of programme units. This means UNDP is still not capturing the broader spectrum of interventions across the goals. With only 10 years to achieve the SDGs and meet obligations under the 2030 Agenda, broader evaluation will be essential to capture what is working or not, to best support governments in attaining the goals on time.

4.4 Quality of decentralized evaluations, 2019

2019 marked the fourth year of the decentralized evaluation quality assessment process following the 2016 revision of the system. The quality assessment tool has remained the same throughout, ensuring consistency; it will be reviewed in 2020 to consider changes to the UNDP evaluation guidelines.

Figure 3: UNDP budget allocation and evaluations by SDG (Goals 1 to 17 and others), 2018 and 2019 (two charts: UNDP budget allocation by SDG, and UNDP evaluation by SDG)

Figure 4: Quality assessment ratings, 2017-201916 (share of evaluations rated highly satisfactory, satisfactory, moderately satisfactory, moderately unsatisfactory, unsatisfactory and highly unsatisfactory in each year)


The IEO quality assessed 210 decentralized evaluations undertaken in 2019.17 The process found that 20.5 percent (43) were satisfactory, and 53 percent (112) were moderately satisfactory. Another 26.2 percent (55) were moderately unsatisfactory, unsatisfactory or of a highly unsatisfactory quality. This is in line with the quality assessment process findings from previous years, as reported in earlier Annual Reports on Evaluation.

4.5 IEO and UNDP support to decentralized evaluation

In response to quality concerns around UNDP decentralized evaluations, and in turn evaluation credibility and usability issues, the IEO and UNDP since 2016 have embarked on a comprehensive programme to strengthen decentralized evaluation across the organization. This was partially enabled through financial support from the Swiss Agency for Development and Cooperation, which has supported capacity development across regions, the updating of the Evaluation Guidelines and the development of online training.

Figure 5: UNDP evaluation quality, 2017-2019, by region (Africa, Arab States, Asia and the Pacific, Europe and the CIS, Latin America and the Caribbean, Global and Total), showing the share of evaluations rated from highly satisfactory to highly unsatisfactory


Roll-out of the revised Evaluation Guidelines in 2019:

• January: launched the new UNDP Evaluation Guidelines
• March: issued the French and Spanish versions of the guidelines
• February, May, July and December: hosted webinars to promote knowledge sharing
• February, June and October: organized regional evaluation workshops with regional bureaux (Europe and the CIS, Africa, Asia and the Pacific, Arab States)
• June to December: developed an online evaluation training

Revised Evaluation Guidelines

New Evaluation Guidelines were launched in January 2019, with French and Spanish versions issued in March 2019. The updated guidelines reflect several recent changes in UNDP in line with the UNDP Strategic Plan, the 2030 Agenda and the adoption of the SDGs.

The Evaluation Guidelines give renewed emphasis to the importance of planning for evaluations and ensuring appropriate evaluative coverage of UNDP’s work across programmes. They provide greater detail on expected roles and responsibilities for evaluation, and include links to examples of good quality evaluations, with a view to strengthening the quality and utility of future decentralized evaluations.

The UNDP Evaluation Guidelines replace the evaluation sections of the 2009 Handbook on Planning, Monitoring and Evaluating for Development Results.

Regional training and webinar series

As part of the roll-out of the new guidelines, the IEO, with support from the UNDP Bureau for Policy and Programme Support, conducted a series of webinars and training sessions for UNDP staff members. Easily accessible webinars took place across the year to introduce the new Evaluation Guidelines and detail key areas and changes. The webinars involved 200 participants, and included the following:

• Introduction to the Evaluation Guidelines (5 February 2019)

• UNDP Evaluation Guidelines: Gender Equality and Women’s Empowerment (14 May 2019)

• UNDP Evaluation Guidelines: Evaluation Plans (15 July 2019)

• UNDP Evaluation Guidelines: Evaluation Quality Assessment (10 December 2019)


The IEO, the Bureau for Policy and Programme Support, the GEF and UNDP’s regional bureaux collaborated to hold three workshops with 158 M&E focal points and programme staff from 100 country offices. In two-day sessions, participants further examined changes in the Evaluation Guidelines, and participated in practical exercises to review evaluation plans to ensure comprehensive coverage, draft effective evaluation terms of reference, identify skilled evaluators and manage evaluations, and ensure high-quality reports. Participants shared experiences and challenges as well as ideas to improve evaluation quality and culture.

The first workshop, in Istanbul in February 2019, drew together regional programme staff and included a session on the ICPE process, given the high number of these in the region in 2019. In June 2019, two workshops, one in French and one in English, took place in parallel in Addis Ababa for the Africa regional bureau, bringing together Francophone and Anglophone countries. The event marked one of the largest ever gatherings of members of the UNDP evaluation community. In Hurghada, Egypt, in October 2019, colleagues from the Arab States and the Asia and the Pacific region participated in an intensive joint workshop before taking part in the National Evaluation Capacities Conference.

Participation in workshops rolling out the Evaluation Guidelines

BUREAU | VENUE | DATE | PARTICIPANTS | COUNTRIES | NOTES
Europe and the CIS | Istanbul, Turkey | February | 51 | 20 | Including 25 regional bureau participants
Africa | Addis Ababa, Ethiopia | June | 60 | 46 | Two workshops held in French and English; three French-speaking countries from the Arab States region also represented
Arab States | Hurghada, Egypt | October | 20 | 14 | Held prior to the National Evaluation Capacities Conference
Asia and the Pacific | Hurghada, Egypt | October | 27 | 20 | Held prior to the National Evaluation Capacities Conference
Latin America and the Caribbean | Panama City, Panama | - | - | - | Scheduled for March 2020, but postponed due to COVID-19
Total | | | 158 | 100 |


Development of online evaluation training

During 2019, as part of introducing the new Evaluation Guidelines and improving evaluation quality, the IEO developed certified online training. When launched, it will be mandatory for all UNDP staff assigned as M&E focal points, officers or specialists, or as evaluation managers. A non-certified course for UNDP staff at large outlines key evaluation approaches and requirements based on the Evaluation Guidelines. Both courses will roll out throughout 2020 and will be carefully overseen to ensure all key evaluation staff in regions and country offices have the required knowledge to implement and manage evaluations, and produce usable evaluation results and follow-up.

Enhanced Evaluation Resource Centre

The ERC is an online platform that facilitates UNDP's efforts to strategically plan and effectively use evaluations for accountability, management for results and knowledge management. The ERC was initially designed as a database to house all UNDP evaluations. It currently gives the public, donors, governments, other UN agencies and UNDP access to 4,500 evaluations and evaluation terms of reference covering 15 years. Evaluations can be searched by a range of criteria, including the SDGs, Strategic Plan outcomes, country or region, and even evaluator. The ERC provides the organization with one of its most comprehensive knowledge management tools.

In recent years, the ERC has been further enhanced as an evaluation oversight tool for UNDP bureaux and country offices. Additional detailed dashboards quickly illustrate overdue evaluations and evaluation quality (through the quality assurance tool). Management can track management responses and key action implementation deadlines and delays using bureau and country office dashboards. During 2019, the IEO further enhanced the ERC and assigned a number of oversight responsibilities to regional focal points, including the review and approval of terms of reference and final reports, and the review, validation and approval of changes to evaluation plans.

Use of the ERC has increased over the last 12 months, from an already respectable 956,000 downloads of reports and other documents in 2018 to 1.1 million in 2019. This is likely due to the greater use of the ERC as both a database for evaluations and as an oversight and management tool. Linking the ERC to a number of UNDP knowledge management databases has fostered broader use of evaluations. The ERC is currently being considered in UNDP’s artificial intelligence and machine-learning endeavours.

Downloads from the ERC, 2018 and 2019

REGION | 2019 | 2018 | CHANGE
Africa | 300,574 | 247,984 | 21%
Arab States | 95,586 | 79,688 | 20%
Asia and the Pacific | 198,423 | 180,656 | 10%
Europe and the CIS | 161,600 | 135,934 | 19%
Latin America and the Caribbean | 203,340 | 197,007 | 3%
Global | 139,290 | 115,024 | 21%
Total | 1,098,813 | 956,293 | 15%


The United Nations Capital Development Fund and United Nations Volunteers

CHAPTER 5


The IEO continued to support the United Nations Capital Development Fund (UNCDF) and UN Volunteers (UNV) in various capacities in 2019, including through the quality assessment of all evaluations that the two organizations conducted during the year.

5.1 The United Nations Volunteers

The UNV budget for evaluation in 2019 was around $76,500, drawn from core and non-core funds. The budget covered the cost of UNV’s participation in the 2019 UNEG Evaluation Week and the National Evaluation Capacities Conference as well as the costs of the evaluation team at UNV headquarters in Bonn, Germany.

In 2019, as a result of UNV’s organizational transformation and its focus on decentralization, evaluation activities had a strong focus on internal capacity-building. UNV continued to provide technical support and quality assurance to decentralized evaluations. It carried out one evaluation, the midterm evaluation of a project to establish a National Volunteer Program in Côte d'Ivoire, jointly implemented by UNV, UNDP and the national Government.

With UNV’s current Strategic Framework 2018-2021 focusing almost exclusively on advisory services for strengthening national volunteer infrastructure and on volunteer mobilization for the UN system, only a few projects remain to be evaluated. Of the four project evaluations in UNV’s Evaluation Plan 2018-2021, one was finalized in 2018 and one in 2019. The two remaining evaluations will be conducted in 2020. They concern a project on poverty reduction among youth in Cambodia as well as UNV’s Support to the UN Peacebuilding Fund’s Gender Promotion Initiative.

A further two impact evaluations, two thematic evaluations and a midterm review of the Strategic Framework are planned. The midterm review and one thematic evaluation will take place in 2020. The midterm review will assess overall progress made against Strategic Framework results. It will consider UNV’s institutional effectiveness in view of its organizational and digital transformation processes.

To complement findings emanating from its own evaluations, UNV continued in 2019 to seek inclusion of its joint work with UNDP in the IEO’s ICPEs, as well as in IEO thematic evaluations that touch on areas of UNV specialization. Collaboration between the IEO and UNV led to the inclusion of information on UNV, UN Volunteers and volunteerism in the ICPE for Ethiopia. UNV intends to intensify this kind of collaboration by providing data for additional ICPEs in 2020.

Key challenges remaining include UNV’s limited evaluation space and competing priorities within the organization. In 2019, UNV’s M&E focal points participated in the UNEG Evaluation Week, the National Evaluation Capacities Conference and an IEO-led workshop on the UNDP Evaluation Guidelines. For UNV, capacity-building is at the heart of promoting a strengthened evaluation culture within the organization, and will help reach the goal of successfully using and benefitting from evaluations.

5.2 The United Nations Capital Development Fund

UNCDF continued to prioritize evaluation in 2019 in line with commitments made in its 2018-2021 Strategic Framework and accompanying Evaluation Plan. With a full staff complement of three evaluation professionals,18 UNCDF spent $882,000 on evaluation, or 1.3 percent of programmatic expenditure, which met the 1 percent target in the Evaluation Policy.

UNCDF’s Evaluation Unit completed three midterm and final evaluations of the fol-lowing multicountry programmes: the joint UNDP-UNCDF Pacific Financial Inclusion Programme (PFIP), the Shaping Inclusive Finance Transformations (SHIFT) South Asian

Association for Regional Cooperation (SAARC) programme, and the Making Access to Financial Services Possible (MAP) global programme. UNCDF also began a joint mid-term evaluation of the Inclusive and Equitable Local Development programme supported by UNDP, UN Women and UNCDF.

Results from the final evaluation of PFIP praised its relevance to Pacific Island countries through its focus on improved policy and regulation, consumer empowerment and financial education, and financial innovation. PFIP played a key role in supporting Pacific Island governments to develop and implement national financial inclusion strategies, with a new focus on digital financial services. The programme was successful in helping financial service providers roll out innovative new products in pensions, credit, insurance and mobile money, reaching 780,000 new consumers, although uptake by low-income populations in rural areas was less successful than envisioned. PFIP support to embedding financial education in national curricula in Fiji, Papua New Guinea and the Solomon Islands has been widely recognized.

The evaluators found that the programme is well managed and cost-efficient, with an average cost of $20 per new client. Going forward, they recommended increased support to governments to implement systems of digital payments; more focus on ensuring new financial products reach women and other marginalized populations; and continued support for consumer protection and the integration of financial education into school curricula in other Pacific Island countries to sustain financial inclusion over the long term.

At its midterm stage, the SHIFT SAARC programme was judged relevant to the priorities of the Government of Bangladesh in digital financial inclusion, and gender equality and women’s empowerment. Key results have encompassed increased dialogue between government and private sector actors, including through the Digital Finance Consultative Group, as well as support (together with UNDP’s Access to Information initiative) for new policies on financial inclusion for low-income populations and Electronic Know Your Customer guidelines. Six innovation grants are intended to promote the digitalization of fast-moving consumer goods supply chains and the use of digital financial services by low-income micro-merchants, as well as training for micro-merchants in digital technologies to improve their businesses. Evaluators highlighted initial weaknesses in programme management systems, including challenges with staffing and delays in meeting key targets, and a results management system that had not adequately generated evidence to support programme improvements. Going forward, they recommended that the programme take a more strategic approach to policy advocacy, adopt more systematic capacity development, and be more focused in promoting gender equality and women’s empowerment across programme activities.

Implemented in 14 countries since 2015, the MAP programme has provided comprehensive diagnostic studies of financial inclusion markets combined with a participatory approach to drawing up related policy roadmaps. It was found highly relevant to least developed country governments, and complementary to other initiatives such as the UK Department for International Development’s Financial Sector Deepening Trusts (in Southern African Development Community countries) and World Bank-backed financial sector development strategies. The quality of MAP deliverables was seen as technically high, although the evaluators highlighted the need for an improved evidence base for policymaking, and noted it has proven difficult for some governments to support roadmap funding and implementation. Despite some constraints, evaluators pointed to the ongoing evolution of market systems for financial inclusion in the countries they visited, for which MAP has provided a strong evidence base and clear policy framework. Recommendations called for an expanded set of more cost-efficient tools with better segmentation of data by socio-economic group, improved tracking of country-level results, and a clearer strategy to help partner governments implement the results of UNCDF’s support.

UNCDF continues to benefit from its partnership with the IEO, which encompassed independent assessment of UNCDF evaluation reports and inclusion in relevant IEO thematic evaluations. UNCDF kept up its cooperation with interested Member States on evaluation, including by working with the Government of Australia on the design and management of the PFIP evaluation. UNCDF also stepped up its support to the UNEG, co-launching with UNICEF an Interest Group on Evaluation Methods, and supporting the development of standards around the peer review of evaluation functions across the UN system.


Staffing and finances of the IEO

CHAPTER 6


The IEO is built on a foundation of skilled professional evaluators who come with a wide variety of professional experiences and perspectives, enabling the office to offer comprehensive judgment through its evaluations.

6.1 IEO staffing

The structural division of the office across four sections19 operated successfully in 2019. To make sure that evaluations draw on diverse insights, colleagues work within and across sections. A staff of 33 comprised 25 professional staff and 8 general service staff.

During 2019, a comprehensive open recruitment took place for 10 new staff positions, including 7 professional and 3 General Service posts. The recruitment brought in new experiences, languages and professional skills that will further strengthen the office’s work.

Professional staff members currently come from 27 countries and speak more than 15 languages. They have an average of 17 years of experience in development and evaluation across a range of development organizations, including the Asian Development Bank, African Development Bank, World Bank, International Monetary Fund, Australian Department for Foreign Affairs and various United Nations entities. The latter include the United Nations Environment Programme, UNICEF, the United Nations Office on Drugs and Crime, the International Atomic Energy Agency, UNFPA, the International Fund for Agricultural Development (IFAD), the Food and Agriculture Organization, GEF, the Department of Peacekeeping Operations, the Joint Inspection Unit, the United Nations Secretariat and a considerable number of UNDP country offices across all regions.

IEO structure
In 2019, the IEO’s sectional structure and the dynamic interaction among sections allowed it to achieve full evaluation coverage of all country offices preparing to start a new programme cycle.

One of the IEO’s goals has been to secure the team’s institutional memory by keeping a stable cadre of evaluators who bring considerable professional knowledge, and can offer comprehensive understanding of multilateral and bilateral aid and particularly UNDP. The creation of seven new professional posts has further bolstered the team’s stability. The office will continue to strengthen the professional capacity of its staff and take all strategic measures to secure their retention and protect organizational investments.

Organizational commitments
The IEO is highly committed to UNDP’s organizational goals. Several strategies uphold compliance with all relevant procedures, including mandatory training and timely completion of performance management and development assessments. The IEO has proactively disseminated to staff and consultants the organizational policies, standards and reporting lines that apply to zero tolerance of sexual exploitation, harassment, discrimination and abuse of authority.

Progress on commitment to UNDP’s environmental goals includes significant strides towards sustainable operations aligned with the ‘Greening UNDP Moonshot’. The office has worked with staff to devise strategies to reduce personal and professional carbon footprints. Best practice waste management and energy-saving activities have increased and improved recycling, reduced waste and lowered energy use across the office. Steps to reduce travel encompass conducting evaluation stakeholder workshops virtually rather than in-country, and combining country missions.


6.2 IEO finances

Finances
In 2019, the IEO spent $10.9 million on evaluations and other institutional activities (including staffing and rent). This includes a supplementary allocation in the last quarter to finance the extraordinary costs (including security) resulting from the ICPEs conducted in several crisis countries.

The office continues to partner strategically and selectively with external development agencies and governments in advancing the evaluation mandate and function beyond the core work programme. In 2019, the IEO deepened its partnerships with the Norwegian and Swiss governments to support the national evaluation capacities diagnostic tool and decentralized evaluation, among other areas. Assistance for participants at the National Evaluation Capacities Conference was provided by Denmark, Germany, UN Women, UNICEF and the World Bank. The IEO does not solicit external funding given its core programme principle of independence.

In 2020, the IEO expects to receive a financial allocation of $14.7 million derived from preliminary estimates of UNDP’s combined core and non-core delivery volume of $4.9 billion during 2019. This sum is based on the revised Evaluation Policy, which stipulates an increase from 0.2 percent to 0.3 percent of programme delivery. The revised policy allows additional funds for the IEO to support decentralized evaluation across UNDP, including through the establishment of IEO regional offices.
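As a quick check of the arithmetic implied by these figures (assuming the allocation is computed directly as a share of the estimated delivery volume):

$$0.3\% \times \$4.9\ \text{billion} = 0.003 \times \$4{,}900\ \text{million} = \$14.7\ \text{million}$$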

6.3 Programme of work in 2020 and 2021

Corporate/thematic evaluations
As detailed above, 2020 and 2021 will be busy years for the corporate and thematic evaluation section, with several evaluations planned for completion. Building on the approved IEO multi-year evaluation plan, Executive Board submissions are as follows:

Executive Board session | IEO corporate/thematic report tabled
Annual session – June 2020 | Annual Report on Evaluation (information); Common chapter evaluability study (information)
Second session – September 2020 | UNDP’s contribution in middle-income countries (decision); UNDP’s contribution to climate resilience (decision)
First regular session – January 2021 | Conflict prevention and recovery (decision); Human migration and 3RP (decision)
Annual session – June 2021 | Strategic Plan 2018-2021, GEF Small Grants Programme (decision); Annual Report on Evaluation (information)
Second regular session – September 2021 | UNDP’s engagement with the private sector (decision); IEO multi-year evaluation plan (informal)

Country programme evaluations
The commitment to 100 percent evaluation coverage of country programmes prior to the Executive Board’s consideration of new country programme documents means 2020 will be another year of high volume and activity for the IEO. The office will produce 26 ICPE reports for country programme documents due in 2021. Against a backdrop of limited human and financial resources, more flexible, faster and targeted evaluations are required. These will continue to measure UNDP’s progress and achievement of results, and provide credible evaluative evidence to strengthen existing programmes and enhance new country programme documents.


Accordingly, in 2020, the IEO will initiate a fresh approach to the ICPEs. In several countries, an Independent Country Programme Review (ICPR) will complement the standard ICPE. The review process has been developed based on the findings and recommendations of a 2019 assessment of the ICPE methodology as well as lessons learned from ICPE implementation in 2019.

The ICPR approach gives the IEO an opportunity to differentiate its approach and efficiently allocate resources across its evaluation portfolio and annual workplan. The review will provide a rapid, independent assessment of implementation of the country programme document, which will support development of the next country programme, and provide the Executive Board with an overview of progress on agreed outputs and outcomes in the current programme period. Criteria established by the IEO and discussed with UNDP bureaux detail which country offices will receive ICPEs and ICPRs in the future.

Capacity development
Throughout 2020, the IEO will continue to support the strengthening of the evaluation function across UNDP, in partnership with the Bureau for Policy and Programme Support and regional bureaux. Plans call for rolling out two online evaluation courses, conducting additional evaluation webinars, providing training for the Regional Bureau for Latin America and the Caribbean and associated country offices, and updating the Evaluation Guidelines and evaluation quality assessment process.

The IEO will also further engage with UNDP management (bureau directors, Resident Representatives and Deputies) in strengthening the evaluation function in line with Organization Performance Group commitments in 2019. The provision of additional resources to the office to establish regional outposts, as per the revised Evaluation Policy, will lend fresh impetus to engagement with regional bureaux and country offices. The office is developing a strategic plan to support decentralized evaluation and country offices as well as resource allocation, and tentatively plans to put support in place from 2021.

Going forward, the IEO envisages investment and innovation in several of its workstreams, including the use of information and communication technology for analytical functions. The learning potential of the ERC platform is being enhanced with a full set of historical evaluations ‘tagged’ to the SDGs and their targets. Machine-learning and artificial intelligence tools are being developed to further unlock the analytical potential embedded in the cumulative stock of more than 4,500 evaluations.
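To illustrate the kind of automated tagging described above, the sketch below shows a minimal keyword-based approach to linking evaluation summaries to SDGs. It is an illustrative example only, not the IEO’s actual tool; the keyword lists, the function name tag_sdgs and the sample text are assumptions made for the sketch.

```python
from typing import Dict, List

# Hypothetical keyword map for a few SDGs (illustrative; a production tool
# would use richer vocabularies or a trained text classifier).
SDG_KEYWORDS: Dict[str, List[str]] = {
    "SDG 13 (climate action)": ["climate", "adaptation", "resilience"],
    "SDG 14 (life below water)": ["ocean", "marine", "fisheries"],
    "SDG 15 (life on land)": ["forest", "biodiversity", "land degradation"],
}

def tag_sdgs(summary: str) -> List[str]:
    """Return the SDGs whose keywords appear in an evaluation summary."""
    text = summary.lower()
    return [sdg for sdg, words in SDG_KEYWORDS.items()
            if any(word in text for word in words)]

if __name__ == "__main__":
    sample = ("The project strengthened climate adaptation planning "
              "and reduced land degradation in pilot districts.")
    print(tag_sdgs(sample))
    # ['SDG 13 (climate action)', 'SDG 15 (life on land)']
```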

Lastly, the office will enhance and expand its national evaluation capacity work during 2020 with a number of activities to follow up the 2019 National Evaluation Capacities Conference. These include the publication of conference proceedings, continued collaboration with the IPDET to offer courses on evaluation in the UN system, and development of activities under the Global Evaluation Capacity Initiative in close cooperation with the World Bank’s Independent Evaluation Group.

IEO Directorate transition
In early 2020, the IEO’s Director, Indran Naidoo, came to the end of his term after nearly eight years with the office and UNDP (2012 to 2020). The IEO’s new Director, Oscar A. Garcia, was previously the Director of the Independent Office of Evaluation at IFAD in Rome.



ANNEXES


Annex 1: Snapshot of all decentralized evaluations in 2019

Table 1. Evaluation planning versus implementation, 2019, numbers and budgets
Note: 2019 planned evaluation numbers and budgets are from the ERC, 1 February 2019.

Region   Planned evaluations (1 February 2019)   Completed evaluations (5 February 2020)   Percentage completed   Actual expenditure (US$)

Africa 108 59 55% 1,724,356

Arab States 71 41 58% 964,104

Asia and the Pacific 144 89 62% 2,456,041

Europe and the CIS 82 49 60% 938,815

Latin America and the Caribbean 102 45 44% 913,291

Global 24 7 29% 249,783

Total 531 290 55% 7,246,390

Total budget 16,790,979 7,246,390 43%

Table 2. Number of decentralized evaluations completed by type, 2016 to 2019

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 127 166 151 132 576 46%

UNDP GEF evaluations 100 120 146 126 492 39%

Outcome evaluations 26 25 21 15 87 7%

UNDAF and other evaluations 36 31 20 17 104 8%

Total number of evaluations 289 342 338 290 1,259


Table 3. Decentralized evaluation budgets by type, 2016 to 2019, in US$

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 3,115,724 3,887,963 3,744,740 3,084,039 13,832,466 41%

UNDP GEF evaluations 2,464,934 3,337,466 3,629,441 3,103,012 12,534,853 38%

Outcome evaluations 866,443 835,881 445,073 509,899 2,657,296 8%

UNDAF and other evaluations 1,462,813 1,453,531 890,355 549,440 4,356,139 13%

Total evaluation budget 7,909,914 9,514,841 8,709,609 7,246,390 33,380,754  

Table 4. Number of decentralized evaluations completed by region, 2016 to 2019

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

Africa 111 117 113 59 400 32%

Arab States 15 27 30 41 113 9%

Asia and the Pacific 55 55 66 89 265 21%

Europe and the CIS 47 68 55 49 219 17%

Latin America and the Caribbean 52 57 56 45 210 17%

Global 9 18 18 7 52 4%

Total number of evaluations 289 342 338 290 1,259



Table 5. Decentralized evaluation budgets by region, 2016 to 2019, in US$

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

Africa 3,282,314 3,400,102 3,213,369 1,724,356 11,620,141 34.8%

Arab States 296,002 707,282 922,827 964,104 2,890,215 8.7%

Asia and the Pacific 1,796,124 1,868,297 1,979,901 2,456,041 8,100,363 24.3%

Europe and the CIS 980,179 1,131,401 909,901 938,815 3,960,296 11.9%

Latin America and the Caribbean 1,194,242 1,445,367 1,146,844 913,291 4,699,744 14.1%

Global 361,053 962,392 536,767 249,783 2,109,995 6.3%

Total evaluation budget 7,909,914 9,514,841 8,709,609 7,246,390 33,380,754 100.0%

[Figure 1. Quality assessment of decentralized evaluations in 2017, 2018 and 2019: stacked bars showing the share of evaluations rated highly unsatisfactory, unsatisfactory, moderately unsatisfactory, moderately satisfactory, satisfactory and highly satisfactory in each year]


Annex 2: Africa snapshot of decentralized evaluations in 2019

Table 1. Number of decentralized evaluations completed, 2016 to 2019

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 44 56 51 19 170 43%

UNDP GEF evaluations 36 29 42 33 140 35%

Outcome evaluations 13 14 9 2 38 10%

UNDAF and other evaluations 18 18 11 5 52 13%

Total number of evaluations 111 117 113 59 400

Table 2. Decentralized evaluation budgets, 2016 to 2019, in US$

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 1,217,759 1,607,557 1,278,414 459,987 4,563,717 39%

UNDP GEF evaluations 991,511 891,545 1,292,356 980,641 4,156,053 36%

Outcome evaluations 408,667 359,007 206,585 81,148 1,055,407 9%

UNDAF and other evaluations 664,377 541,993 436,014 202,580 1,844,964 16%

Total evaluation budget 3,282,314 3,400,102 3,213,369 1,724,356 11,620,141

[Figure 1. Quality assessment of decentralized evaluations in Africa, 2017, 2018 and 2019: stacked bars showing the share of evaluations rated highly unsatisfactory through highly satisfactory in each year]


Annex 3: Arab States snapshot of decentralized evaluations in 2019

Table 1. Number of decentralized evaluations completed, 2016 to 2019

2016   2017   2018   2019   2016-2019 total   Percentage

UNDP project evaluations 10 24 18 26 78 69%

UNDP GEF evaluations 3 3 11 9 26 23%

Outcome evaluations 1 0 1 4 6 5%

UNDAF and other evaluations 1 0 0 2 3 3%

Total number of evaluations 15 27 30 41 113

Table 2. Decentralized evaluation budgets, 2016 to 2019, in US$

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 140,062 595,314 630,714 532,028 1,898,118 66%

UNDP GEF evaluations 56,590 111,968 269,713 197,326 635,597 22%

Outcome evaluations 29,350 - 22,400 150,000 201,750 7%

UNDAF and other evaluations 70,000 - - 84,750 154,750 5%

Total evaluation budget 296,002 707,282 922,827 964,104 2,890,215

[Figure 1. Quality assessment of decentralized evaluations in the Arab States, 2017, 2018 and 2019: stacked bars showing the share of evaluations rated highly unsatisfactory through highly satisfactory in each year]


Annex 4: Asia and the Pacific snapshot of decentralized evaluations in 2019

Table 1. Number of decentralized evaluations completed, 2016 to 2019

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 19 21 29 39 108 41%

UNDP GEF evaluations 24 28 33 47 132 50%

Outcome evaluations 3 4 1 0 8 3%

UNDAF and other evaluations 9 2 3 3 17 6%

Total number of evaluations 55 55 66 89 265

Table 2. Decentralized evaluation budgets, 2016 to 2019, in US$

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 569,740 602,016 808,179 1,089,250 3,069,185 38%

UNDP GEF evaluations 549,328 913,813 944,391 1,230,251 3,637,783 45%

Outcome evaluations 166,284 292,468 23,000 - 481,752 6%

UNDAF and other evaluations 510,772 60,000 204,331 136,540 911,643 11%

Total evaluation budget 1,796,124 1,868,297 1,979,901 2,456,041 8,100,363

[Figure 1. Quality assessment of decentralized evaluations in Asia and the Pacific, 2017, 2018 and 2019: stacked bars showing the share of evaluations rated highly unsatisfactory through highly satisfactory in each year]


Annex 5: Europe and the CIS snapshot of decentralized evaluations in 2019

Table 1. Number of decentralized evaluations completed, 2016 to 2019

2016   2017   2018   2019   2016-2019 total   Percentage

UNDP project evaluations 20 34 19 27 100 46%

UNDP GEF evaluations 20 27 31 16 94 43%

Outcome evaluations 5 6 5 4 20 9%

UNDAF and other evaluations 2 1 0 2 5 2%

Total number of evaluations 47 68 55 49 219

Table 2. Decentralized evaluation budgets, 2016 to 2019, in US$

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 280,057 428,758 271,524 492,178 1,472,517 37%

UNDP GEF evaluations 435,776 563,249 555,381 319,865 1,874,271 47%

Outcome evaluations 148,232 109,406 82,996 93,972 434,606 11%

UNDAF and other evaluations 116,114 29,988 - 32,800 178,902 5%

Total evaluation budget 980,179 1,131,401 909,901 938,815 3,960,296

[Figure 1. Quality assessment of decentralized evaluations in Europe and the CIS, 2017, 2018 and 2019: stacked bars showing the share of evaluations rated highly unsatisfactory through highly satisfactory in each year]


Annex 6: Latin America and the Caribbean snapshot of decentralized evaluations in 2019

Table 1. Number of decentralized evaluations completed, 2016 to 2019

2016   2017   2018   2019   2016-2019 total   Percentage

UNDP project evaluations 26 25 22 16 89 42%

UNDP GEF evaluations 16 29 25 20 90 43%

Outcome evaluations 4 1 5 5 15 7%

UNDAF and other evaluations 6 2 4 4 16 8%

Total number of evaluations 52 57 56 45 210

Table 2. Decentralized evaluation budgets, 2016 to 2019, in US$

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 577,053 553,518 422,342 321,183 1,874,096 40%

UNDP GEF evaluations 401,729 755,391 505,600 362,929 2,025,649 43%

Outcome evaluations 113,910 75,000 110,092 184,779 483,781 10%

UNDAF and other evaluations 101,550 61,458 108,810 44,400 316,218 7%

Total evaluation budget 1,194,242 1,445,367 1,146,844 913,291 4,699,744

[Figure 1. Quality assessment of decentralized evaluations in Latin America and the Caribbean, 2017, 2018 and 2019: stacked bars showing the share of evaluations rated highly unsatisfactory through highly satisfactory in each year]


Annex 7: Snapshot of global/headquarters-based decentralized evaluations in 2019

Table 1. Number of decentralized evaluations completed, 2016 to 2019

2016   2017   2018   2019   2016-2019 total   Percentage

UNDP project evaluations 8 6 12 5 31 60%

UNDP GEF evaluations 1 4 4 1 10 19%

Outcome evaluations 0 0 0 0 0 0%

UNDAF and other evaluations 0 8 2 1 11 21%

Total number of evaluations 9 18 18 7 52

Table 2. Decentralized evaluation budgets, 2016 to 2019, in US$

2016   2017   2018   2019   2016-2019 total   2016-2019 percentage

UNDP project evaluations 331,053 100,800 333,567 189,413 954,833 45%

UNDP GEF evaluations 30,000 101,500 62,000 12,000 205,500 10%

Outcome evaluations - - - - - -

UNDAF and other evaluations - 760,092 141,200 48,370 949,662 45%

Total evaluation budget 361,053 962,392 536,767 249,783 2,109,995 -

[Figure 1. Quality assessment of global/headquarters-based decentralized evaluations, 2017, 2018 and 2019: stacked bars showing the share of evaluations rated highly unsatisfactory through highly satisfactory in each year]


Annex 8: Average budgets for evaluation

Table 1. Average budgets for evaluations in 2019, in US$

Average budgets   Africa   Arab States   Asia and the Pacific   Europe and the CIS   Latin America and the Caribbean   Global   All

UNDP project evaluations 24,210 20,463 27,929 18,229 20,074 37,883 23,364

UNDP GEF evaluations 29,716 21,925 26,176 19,992 18,146 12,000 24,627

Outcome evaluations 40,574 37,500 - 23,493 36,956 - 33,993

UNDAF and other evaluations 40,516 42,375 45,513 16,400 11,100 48,370 32,320

All evaluations 29,226 23,515 27,596 19,159 20,295 35,683 24,988

Source: Based on evaluation numbers and budget data in the ERC, accessed 5 February 2020.
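For example (a check against the Annex 2 figures for Africa), the average UNDP project evaluation budget is simply the 2019 expenditure divided by the number of evaluations completed:

$$\frac{\$459{,}987}{19} \approx \$24{,}210$$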

Table 2. Average budgets for evaluations in 2016, 2017, 2018 and 2019, in US$

Average budgets 2016 2017 2018 2019 2016-2019

UNDP project evaluations 24,533 23,421 24,800 23,364 24,015

UNDP GEF evaluations 24,649 27,812 24,859 24,627 25,477

Outcome evaluations 33,325 33,435 21,194 33,993 30,544

UNDAF and other evaluations 40,634 46,888 44,518 32,320 41,886

All evaluations 27,370 27,821 25,768 24,988 26,514

Source: Based on evaluation numbers and budget data in the ERC, accessed 5 February 2020.


Annex 9: Quality assessment of decentralized evaluations in 2017, 2018 and 2019

Figure 1. Quality assessment by region, 2017 to 2019, numbers

Region   Highly satisfactory   Satisfactory   Moderately satisfactory   Moderately unsatisfactory   Unsatisfactory   Highly unsatisfactory   Total

Africa 0 58 112 47 12 0 229

Arab States 0 13 36 23 11 0 83

Asia and the Pacific 1 28 89 22 8 0 148

Europe and the CIS 1 25 74 22 8 0 130

Latin America and the Caribbean 0 20 64 28 9 0 121

Global 2 15 22 1 0 1 41

Total 4 159 397 143 48 1 752

Percentage 1% 21% 53% 19% 6% 0%

Figure 2: Quality assessment by evaluation type, 2017 to 2019, numbers

Evaluation type   Highly satisfactory   Satisfactory   Moderately satisfactory   Moderately unsatisfactory   Unsatisfactory   Highly unsatisfactory   Total

UNDP project evaluations 0 88 199 96 41 1 425

UNDP/GEF evaluations 1 37 153 21 1 0 213

Outcome evaluations 0 15 26 15 3 0 59

UNDAF and other evaluations 3 19 19 11 3 0 55

Total 4 159 397 143 48 1 752

Percentage 1% 21% 53% 19% 6% 0%

Source: Based on the IEO quality assessment data in the ERC, accessed 15 March 2020.


Annex 10: Monitoring and evaluation capacity, 2014 to 201920

Table 1. Global monitoring and evaluation capacity, 2015 to 2019

2015 2016 2017 2018 2019

Number of full-time M&E specialists 83 76 90 91 103

Number of full-time regional M&E specialists 13 12 10 8 9

Total 96 88 100 99 112

Share of country offices with full-time M&E capacity, percentage 52 56 80 51 66

Source: M&E focal point data reported by country offices through regional bureaux (January 2020).


Annex 11: Key performance indicators

Metric | 2018 | 2019, third quarter | Target

A. Completion of IEO workplan

Percentage of corporate/thematic evaluations that went to the Executive Board on time and were completed | 100% (3 out of 3) | 100% (on track) (1 out of 1) | 100%

Number of ICPEs completed | 93% (14 out of 15)* | 95% (38 out of 40)** | 100%

Percentage of decentralized evaluations assessed | 100% (225) | 0% (on track) | 100%

B. Efficiency

Corporate/thematic evaluations average cost | $115,000 | $31,491 | $275,000

Corporate/thematic evaluations average duration (months) | 11.5 | 5 | 12

ICPE average cost | $59,585 | $73,100 | $75,000

ICPE average duration (months) | 11 | 8.5 | 7

Human resources (staff members)

Ratio of male-to-female staff members | Male: 37%, Female: 63% | Male: 36%, Female: 64% | Gender parity

Ratio of male-to-female staff members per level | Management team: 50%; General Service: 78% female; Professional: 57% female; Director’s team: 100% male | Management team: 50%; General Service: 75% female; Professional: 50% female; Director’s team: 100% male | Gender parity

Geographical representation | All regions represented, with 29 nationalities and five UN languages

Footnotes:

(*) The country programme document for Bosnia and Herzegovina was extended for one year.

(**) Algeria and Libya evaluations were deferred to 2020 due to country programme document extensions.


Annex 12: Evaluation Advisory Panel 2019 summary letter

Dear Indran,

As in earlier years, the IEO Evaluation Advisory Panel has prepared a Report following the 2019 Annual Meeting. As this coincides with the end of the term of current EAP members together with your own term of office as IEO Director, we think it right to situate our report within a longer-term perspective.

1. As this report coincides with the end of the current EAP term, it offers an opportunity to reflect on the last 6 years as well as to look forward to the challenges that IEO is likely to face in the coming years.

2. The EAP was established in 2013 ‘to provide support and advice to the Independent Evaluation Office’ and in particular to:

• Recommend improvements to the overall coherence and consistency of the UNDP Evaluation Office approach, work programme and methodologies;

• Review key deliverables, including guidance covering methodologies and procedures, and specific evaluation documents: terms of reference, draft and final reports; and

• Advise the Evaluation Office on ways to raise its prominence, including improvements to knowledge sharing platforms and dissemination strategies.

3. One benefit of EAP continuity has been the opportunity to become familiar with IEO systems, personnel and workstreams. This has allowed EAP members to work closely and effectively with IEO evaluators on specific assignments as well as to contribute to strategic thinking and policy development when requested. We hope that the undoubted benefits of continuity will be recognised and built into any future EAP arrangements.

4. Since 2013 we have observed an increasing professionalisation of IEO staff; greater clarity of roles and expectations; and a deepening of the evaluation culture within IEO. It would be inappropriate to claim that these improvements are the result of EAP inputs; however, we have been gratified by the extent to which EAP-initiated frameworks, methodological suggestions and other advisory inputs have become embedded in IEO practice.

5. Since the inception of EAP, its inputs as requested by the IEO Director have mirrored the evolving profile and priorities of IEO within the remit set out by UNDP’s Executive Board. There have also been enduring themes throughout this time, such as:

• the need to adapt evaluation approaches in the face of the growing complexities of development, in particular the challenges of climate change; the salience of the 2030 Agenda; and the importance of ensuring that development programmes are aligned with the priorities of developing countries and stakeholders in the global south

• the need to update IEO’s own methodological capacities given contemporary developments in theory-based evaluation, qualitative and quantitative methods, and new data sources and methods of data capture

• ensuring that evaluative conclusions are supported by evidence; and that evaluative judgements are rooted in the values and mission of UNDP and the UN system


• balancing IEO’s accountability and learning remit in the context of IEO’s independence and its continuing obligations to strengthen UNDP’s Country Programmes and enable developing country learning

• strengthening the quality of UNDP’s decentralized evaluations as an important input for IEO as well as UNDP’s Country Offices, whilst continuing to maintain IEO’s focus and independence

Because these are ongoing challenges rather than problems that can simply be ‘solved’ on a once-and-for-all basis, we anticipate that these enduring themes will also need to be on the agenda of any future EAP, which an incoming IEO Director may wish to set up.

1. In recent years IEO has experienced major changes in its work-programme and working practices following the decision to move to 100% coverage of Country Programmes. This has had major impacts on the volume of IEO work; the workload of IEO staff; and on the proportion of Country, Thematic and Corporate evaluations in IEO’s overall work-programme.

2. The increased volume of ICPE work has been accompanied by understandable streamlining and simplification of tools, data gathering methods, data analysis and feedback in-country. This has resulted in less time for engagement with COs and national stakeholders; and a greater emphasis on accountability rather than support for learning and capacity development.

3. Following the completion of an ‘early-stage’ review by EAP of ICPE design and implementation, the EAP identified the urgent need to address these problems and in particular the possibility of:

• Greater differentiation between the resources invested in different ICPEs to reflect UNDP’s strategic priorities, the maturity and strengths of existing CPs, and the needs of country stakeholders

• Rebalancing IEO’s portfolio to increase the number of cluster evaluations and Thematic and Corporate evaluations, thus maximising the potential for synergies with ICPEs

4. In our most recent deliberations, EAP highlighted a number of near-horizon issues that are likely to shape IEO efficiency and effectiveness and therefore need to be addressed. These include:

• Anticipated restructuring that would accompany the UN Reform agenda, which would be likely to change IEO’s remit, requiring, for example, much closer co-working with evaluation functions in other UN agencies, as well as the need to further develop methods able to demonstrate UNDP’s contribution and value-added alongside its more direct impacts

Signatories

Elliot Stern Michael Bamberger Rachid Benmokhtar Bagele Chilisa

Osvaldo Feinstein Paulo Jannuzzi Zenda Ofir Ray Rist

Olga Schetinina Thomas Schwandt Daniel Weiner Chen Zhaoying


Annex 13: Evaluation Advisory Panel summary of work 2019

EAP members in 2019   Country   Assignments completed in 2019

Michael Bamberger United Kingdom EAP Annual Meeting; ICPEs in Afghanistan, Bangladesh, El Salvador, Panama, Somalia, Uruguay and Venezuela; ICPE review project, middle-income countries

Rachid Benmokhtar Morocco EAP Annual Meeting, ICPE Bahrain, ICPE review project

Bagele Chilisa Botswana EAP Annual Meeting; ICPEs in Burkina Faso, Eswatini, Somalia, Uganda and Zimbabwe; 6th International Conference on National Evaluation Capacities

Osvaldo Feinstein Argentina EAP Annual Meeting; ICPEs in Argentina and Guinea-Bissau; ICPE review project, climate resilience; Evaluation Policy review

Paulo Jannuzzi Brazil EAP Annual Meeting, ICPEs in Mozambique and Panama

Zenda Ofir South Africa EAP Annual Meeting; ICPEs in China, Côte d’Ivoire, Ethiopia, Guinea-Bissau and Iraq; ICPE review project, climate resilience

Ray Rist United States of America EAP Annual Meeting, ICPE review project, other projects

Olga Schetinina Ukraine EAP Annual Meeting; ICPE review project, Regional Bureau for Europe and the CIS cluster; ICPE 3RP (Turkey)

Thomas Schwandt United States of America EAP Annual Meeting, ICPEs on Indian Ocean Island States (Maldives, Mauritius and Seychelles), ICPE review project

Elliot Stern United Kingdom EAP Annual Meeting, ICPE review project (design, rapid review, workshop and final report)

Daniel Weiner United States of America EAP Annual Meeting, ICPEs in Argentina and Iraq

Chen Zhaoying China EAP Annual Meeting; ICPEs in Afghanistan, China and Eswatini; ICPE review project


Endnotes

1 Some ICPEs planned for 2020 have been deferred to later years as country programme documents have been extended. The COVID-19 pandemic may lead to further deferments.

2 The ICPE for Algeria was postponed from 2019.

3 References to Kosovo are in the context of United Nations Security Council resolution 1244 (1999).

4 A 12th ICPE of Turkey was not included in this cluster. It was evaluated separately, with results contributing to the Syrian refugee response evaluation.

5 The full report is available at: http://web.undp.org/evaluation/evaluations/adr/iraq.shtml.

6 As analysed by the evaluation team, the Funding Facility for Stabilization is the largest stabilization programme to date, even considering multiproject and multi-partner stabilization programmes.

7 The full report and management response are available at: http://web.undp.org/evaluation/evaluations/thematic/poverty-ldc.shtml.

8 It recognizes the need for greater collaboration across the conflict prevention, governance, disaster risk reduction and climate change adaptation areas of work in order to provide countries with more integrated and holistic approaches to resilience. It also acknowledges the need for tailored responses to natural hazards, humanitarian emergencies, and other forms of shocks and crisis.

9 See: https://www.undp.org/content/undp/en/home/news-centre/news/2017/l-allemagne-et-le-pnud-unissent-leurs-forces-pour-la-stabilisati.html.

10 Access the National Evaluation Capacities Conference at a glance: http://web.undp.org/evaluation/documents/NEC/2019/NEC2019_Brief.pdf.

11 See DP/2018/4.

12 Staff time allocations for evaluation and additional evaluation costs are self-reported through the results-oriented annual report. Staff costs for evaluation are calculated by UNDP based on these self-reported figures. Evaluation implementation costs are taken from the ERC and are also self-reported and entered by programme units.

13 Data taken from UNDP’s transparency portal, https://open.undp.org/, 10 February 2020.

14 SDG 12, responsible consumption and production; SDG 13, climate action; SDG 14, life below water; and SDG 15, life on land.

15 Of the 628 evaluations undertaken in 2019 and 2018, 468 (75 percent) were tagged in the ERC with their corresponding SDG alignment. This tagging and linking is an ongoing process. Since 209 evaluations were tagged with more than one SDG, they appear under several SDGs in our analysis.

16 HS=highly satisfactory, meets and exceeds UNDP requirements; S=satisfactory, fully meets UNDP requirements with minor shortcomings; MS=moderately satisfactory, partially meets UNDP requirements with a number of shortcomings; MU=moderately unsatisfactory, more than one parameter was unmet, with significant shortcomings; U=unsatisfactory, most parameters were not met and there were major shortcomings; HU=highly unsatisfactory, none of the parameters were met and there were severe shortcomings.

17 UNDAF evaluations and GEF midterm reviews are not quality assessed.

18 Including one junior professional officer from Italy.

19 Country programme evaluation section, corporate and thematic evaluation section, capacity development section and operation section.

20 Staff dedicated to M&E on a full-time basis as reported by country offices and regional bureaux.


Photo Credits

Page 18: UNDP Mozambique

Page 17: UNDP Georgia

Page 13: UNDP Kazakhstan

Page 31: 2019 NEC Conference Photo credit: IEO/UNDP

Page 29: UNDP Iraq Photo credit: Claire Thomas

Page 26: UNDP Armenia

Page 40: UNDP Ethiopia

Page 6: UNDP Panama

Page 23: UNDP Bangladesh

Page 35: UNDP Seychelles

Page 44: UNV and UNDP Serbia Photo credit: Momira Markovic

Page 46: UNCDF

Page 47: UNDP Azerbaijan

Page 52: UNDP Colombia


Independent Evaluation Office United Nations Development Programme One UN Plaza, DC1-20th Floor, New York, NY 10017, USA Tel. +1(646) 781 4200

⁄ undp.org/evaluation

⁄ UNDP_Evaluation

⁄ ieoundp

⁄ evaluationoffice

Independent Evaluation Office

United Nations Development Programme