
Assessing Multilateral Organisation Effectiveness: A Comparative Analysis of Data Collection Tools

Draft

A discussion paper presented to the Multilateral Organisation Performance Assessment Network (MOPAN) Working Group in Stockholm on June 11th, 2007.

Submitted to: Andrew Clark, A/Director

Policy and Strategic Planning Division Multilateral Program Branch, CIDA

Prepared by Werner Meier, Director

http://www.RBMG.ca

Wednesday, June 6, 2007


LIST OF ACRONYMS

AfDB - African Development Bank
BO - Bilateral Organization
COMPAS - Common Performance Assessment System
CIDA - Canadian International Development Agency
DAC - Development Assistance Committee
Danida - Danish International Development Agency
DFID - UK Department for International Development
FAO - Food and Agriculture Organization of the United Nations
FARP - Framework for Assessing Relevance and Performance (MFA - Sweden)
IFI - International Financial Institution
M&E - Monitoring and Evaluation
MDB - Multilateral Development Bank
MDG - Millennium Development Goals
MEFF - Multilateral Effectiveness Framework (DFID)
MES - Multilateral Effectiveness Summary (DFID)
MES-BSC - Multilateral Effectiveness Summary Balanced Scorecard (DFID)
MERA - Multilateral Effectiveness and Relevance Assessment (CIDA)
MfDR - Managing for Development Results
MFA - Ministry of Foreign Affairs
MMS - Multilateral Monitoring System (Netherlands)
MO - Multilateral Organization
MOPAN - Multilateral Organizations Performance Assessment Network
MPA - Multilateral Performance Assessment (frameworks)
NPM - New Public Management
ODA - Official Development Assistance
OECD - Organization for Economic Cooperation and Development
PCM - Project Cycle Management
PFM - Public Financial Management
PMF - Performance Management Framework (Danida)
PSIA - Poverty and Social Impact Analysis
RAP - RBM Assessment Program (Danida)
RBM - Results-Based Management
SIDA - Swedish International Development Cooperation Agency
UNAIDS - Joint United Nations Programme on HIV/AIDS
UNDP - United Nations Development Programme
UNFPA - United Nations Population Fund
UNICEF - United Nations Children's Fund
WHO - World Health Organization


TABLE OF CONTENTS

LIST OF ACRONYMS
TABLE OF CONTENTS
1. Introduction
   1.1 Background
   1.2 Rationale
   1.3 Purpose
   1.4 Limitations
2.0 Inventory of Assessment Approaches
   2.1 Sorting Apples from Oranges
   2.2 Established Corporate Approaches
   2.3 Recent Survey Initiatives
   2.4 New Approach Experimental Models
3.0 Comparative Analysis
   3.1 Purpose
   3.2 Design
   3.3 Overall Methodology
   3.4 Key Performance Indicators and/or Questions
   3.5 Transparency and Resource Allocation
   3.6 General Strengths and Weaknesses
4.0 Prospective for a Common Approach
   4.1 New Public Management Context
   4.2 Rationale for Assessing Multilateral Effectiveness
   4.3 A Harmonised Approach
   4.4 Design Principles for a Common Assessment Approach
   4.5 Proposed Common Assessment Approach
ANNEX A: LEXICON OF TERMS USED
ANNEX B: SUMMARY OF MOPAN SURVEY METHODOLOGY
ANNEX C: SUMMARY OF THE MEFF METHODOLOGY
ANNEX D: SUMMARY OF THE MES METHODOLOGY
ANNEX E: SUMMARY OF THE MERA METHODOLOGY
ANNEX F: SUMMARY OF THE SCORECARD METHODOLOGY
ANNEX G: SUMMARY OF THE PMF METHODOLOGY
ANNEX H: SUMMARY OF THE RBM ASSESSMENT METHODOLOGY
ANNEX I: SUMMARY OF THE FARP METHODOLOGY
ANNEX J: CROSS-CASE COMPARISON TABLE


1. Introduction

1.1 Background

The Multilateral Organization Performance Assessment Network (MOPAN) is a network of nine [1] like-minded donor countries having a common interest in:

(a) sharing information and mutually drawing on their experience in the monitoring and assessment of the work and performance of multilateral organisations;

(b) conducting annual surveys on MOs through their embassies and country offices (the Annual MOPAN Survey); and

(c) carrying out joint evaluations of MOs.

The MOPAN member countries jointly conduct an annual in-house survey of multilateral partnership behaviour in developing countries (partnerships with national governments, civil society and other bilateral and multilateral development agencies). The Survey is based on the perceptions of MOPAN member embassies or country offices, arising from their day-to-day contacts with multilateral organizations. The MOPAN Annual Survey [2] is not an evaluation and does not cover actual development results achieved.

MOPAN members are using the results of the Annual Survey for their own accountability on multilateral financing and as input: (a) into their policy towards the multilateral organizations concerned; (b) to strengthen their participation in the governance of these organisations; (c) for their joint advocacy work; and (d) to contribute to wider debates on aid effectiveness.

1.2 Rationale

At MOPAN meetings in 2006 there was consensus that members required further information on the effectiveness [3] of individual multilateral organisations (MOs) beyond the Annual Survey. This was needed in order to meet growing demands for accountability and to make more informed policy choices as multilateral budgets increased.

MOPAN members therefore agreed to establish a Working Group to begin moving towards a common approach for the assessment of multilateral effectiveness which would involve initiatives beyond the Annual Survey. The objectives of the Working Group are two-fold:

- The first is to compare the questions and indicators used in any existing methods to assess agency performance and to develop a common core set that can be used jointly;

- The second is to work with the multilaterals to improve their self-reporting, based on a clearer understanding of the information required to meet accountability and reporting requirements.

A number of MOPAN members have already developed questionnaires, checklists and scorecards for assessing the organisational effectiveness of their multilateral partners. It was unclear, however, to what extent any of these could serve as the basis for a more harmonised approach in the future, or how multilateral reporting maps onto the information requirements of different donors. The consultant was therefore appointed by the Working Group to provide some independent advice.

[1] Current MOPAN members are Austria, Canada, Denmark, Finland, France, the Netherlands, Norway, Sweden, Switzerland and the United Kingdom. Ireland is currently an observer.
[2] A summary description of the survey methodology is provided in Annex B.
[3] A lexicon of terms used in this paper, e.g., effectiveness, is provided in Annex A, based on the OECD-DAC "Glossary of Key Terms in Evaluation and Results Based Management".


1.3 Purpose

The Consultant was mandated to assist the MOPAN Working Group in fulfilling its first objective by initially conducting a comparative analysis of existing bilateral donor assessment approaches and recommending how a more harmonised approach might be developed and applied by Working Group members in the future. The specific purpose of this discussion paper is therefore:

- To compare existing bilateral donor approaches to assessing multilateral organisation effectiveness and to provide advice on the feasibility of developing a common approach.

All available documentation provided by the members of the MOPAN Working Group was reviewed, and telephone interviews were conducted with representatives of MOPAN member organisations. The indicators/questions employed in the assessment tools and survey questionnaires which constituted or formed part of the various assessment approaches were analysed using a concept mapping technique. This grouping and mapping process allowed the consultant to identify the priority areas of assessment of each approach, as well as the common priorities for assessing multilateral effectiveness. Few gaps were identified, given the comprehensiveness of some of the assessment tools and the range of assessment priorities.

1.4 Limitations

The limitations affecting the quality of the comparative analysis presented in this report are relatively common for this type of work. The allocated levels of effort to conduct the research and analysis, as well as the timelines, were fixed parameters within which the work had to be completed. Access to MOPAN members and the frequency of consultation opportunities during the research process were also limited, as was the amount of available documentation on some of the assessment approaches reviewed, especially the more recent initiatives. Lastly, the Consultant was appointed by the Working Group to provide some independent advice, which will obviously be influenced by past experience and professional preferences.

2.0 Inventory of Assessment Approaches [4]

2.1 Sorting Apples from Oranges

Establishing an inventory of multilateral assessment approaches has been akin to sorting apples from oranges and has led to some surprises as the work has progressed. A number of MOPAN members had been identified as having already developed "questionnaires, checklists and scorecards" for assessing the organisational effectiveness of their multilateral partners. Examples included the Dutch "Multilateral Monitoring Survey", the Danish "Performance Management Framework" and the British "Multilateral Effectiveness Framework". While at first blush they may appear to be simple data collection instruments, this is not the case for all, since they range in character from well established corporate approaches to the institutional effectiveness "checklist". Based on implementation experience and lessons learned, in some cases from repeated cycles of data collection, a second generation of assessment approaches has also been developed. Comparisons are rendered even more challenging with the inclusion of relatively new bilateral donor initiatives, such as the Canadian "Multilateral Effectiveness and Relevance Assessment" and the even more recent Swedish "Framework for Assessing Relevance and Performance of Multilateral Organisations", which have limited supporting documentation and have only recently completed a first cycle of data collection.

[4] Summary descriptions of the assessment approaches included are provided in Annexes C-I.


Returning to apples and oranges: yes, both are classified as fruit, but prudence must be exercised to avoid a false analogy, such as one in which an "apple" is faulted for not being a good "orange". For this reason the assessment approaches being reviewed have been grouped according to some common characteristics, described below.

2.2 Established Corporate Approaches

The primary criteria for this grouping of established corporate approaches are the number of data collection cycles that have been undertaken and the extent of corporate involvement in the assessment. The Danida Performance Management Framework (PMF) fulfils both criteria since it was first introduced in 2003, has completed a number of annual data collection cycles and systematically involves Ministry of Foreign Affairs personnel at Headquarters (HQ), Regional Delegations and Embassies, as well as soliciting the views of other relevant government departments. It should also be noted that the multilateral effectiveness assessment is only one component within the PMF, which serves as a management framework for bilateral cooperation programming as well. The Danida Annual Performance Report follows the PMF structure and criteria, providing comprehensive performance information. The Netherlands Ministry of Foreign Affairs Scorecard also fulfils both criteria since it was first introduced in 2004 and has completed annual data collection cycles since. It too is a corporate effort which systematically solicits the views of HQ and Embassy staff, as well as relevant policy departments. While the Scorecard has until recently been a confidential document, the Embassy field survey report is posted on the corporate intranet, so that it can be available to everyone at the Ministry.

2.3 Recent Survey Initiatives

The primary criterion characterising this group of recent survey initiatives is that they have undertaken only one cycle of data collection since inception and appear to rely on a single survey instrument for data collection. The Swedish Framework for Assessing Relevance and Performance (FARP) [5] completed its first round of data collection in 2007 using an Embassy staff survey instrument, while the CIDA Multilateral Effectiveness and Relevance Assessment (MERA) was piloted in late 2006 using a data collection template for use by the responsible Multilateral Desk Officer. In neither case have the reports been fully completed. Included in this group is the DFID Multilateral Effectiveness Framework (MEFF); even though it was first deployed in 2004, it has not been used since. It also relies primarily on a single data collection instrument, referred to as the "checklist" by donor and multilateral organisation staff alike. Unlike the other two assessment approaches in this grouping, the MEFF methodology is very well documented, with a thorough analysis not only of the findings but also of the feedback and lessons learned.

2.4 New Approach Experimental Models

What characterises this experimental models grouping is the innovative and unique nature of the assessment approaches, their relative newness and their second-generation status. The Danida RBM Assessment Program was launched in 2006 as the "New Approach". Five (5) highly

[5] The title of the Swedish assessment approach was based on an unofficial translation and is not final.


focussed and rigorous RBM assessments have been completed of the following UN agencies: UNDP, UNFPA, IFAD, OHCHR and UNICEF. While an integral part of the overall Danida PMF methodology, the implementation of these assessments remains somewhat outside the Danida corporate structure, given that Dalberg Global Development Advisors has been commissioned to undertake them. The RBM assessment methodology is very well documented and the assessment reports are equally comprehensive and insightful. First introduced in a June 2006 DFID Fact Sheet, the Multilateral Effectiveness Summaries (MES) were also presented as a "New Approach". The MES employs a Balanced Scorecard (BSC) approach, the indicator data fields of which are populated from secondary sources, e.g., the MEFF, the MOPAN Survey, the Paris Donor Survey, etc. It is unclear whether this is an assessment approach that will be used by DFID going forward, or whether it is a proposal for consideration by other like-minded donors in the spirit of harmonisation. Nevertheless, some data collection has apparently taken place recently and Effectiveness Summaries for sixteen (16) MOs will be published in July 2007.

3.0 Comparative Analysis

3.1 Purpose

In the 2006 CIDA survey, bilateral donors identified accountability and learning, i.e., identifying strengths and weaknesses for corrective action, as the two most important reasons for undertaking multilateral assessments. An analysis of the stated purpose of each of the assessment approaches, presented in the table below, does not, however, reveal the same consistency of messaging.

PMF (Danida) - Inception: 2003. Frequency: annual. MOs assessed: 3 MOs/yr/country across 15 countries. Purpose: to (a) enhance the quality of development cooperation, (b) improve management and continuous learning, and (c) strengthen accountability.

Scorecard (Netherlands) - Inception: 2004. Frequency: annual. MOs assessed: 25 MOs/yr. Purpose: to assess the relevance of MOs to Dutch policy objectives, their performance and their adoption of RBM.

FARP (Sweden) - Inception: 2007. Frequency: once. MOs assessed: 28 MOs/yr. Purpose: to inform the budget decision-making process and to provide an analytical basis for strategic work with the MOs.

MERA (CIDA) - Inception: 2006. Frequency: once. MOs assessed: 19. Purpose: to determine effectiveness and relevance with a view to directing future resources to enhance MO performance.

MEFF (DFID) - Inception: 2004. Frequency: once. MOs assessed: 23. Purpose: to provide information for the agency's (a) Public Service Agreement reporting, (b) Institutional Strategies and (c) financing strategies.

RBM Assessment (Danida) - Inception: 2006. Frequency: upon request. MOs assessed: 5. Purpose: to provide support to (a) strengthen MOs' RBM capacity and (b) enable bilateral donors to depend on MO results reporting for assessments.

MES-BSC (DFID) - Inception: 2007. Frequency: once. MOs assessed: 23. Purpose: to provide information for the agency's (a) Public Service Agreement reporting, (b) Institutional Strategies and (c) financing strategies.


The PMF purpose statement is the exception, as it focuses on learning and improvement, while the adjunct RBM Assessment, as well as the Scorecard, also include an RBM-strengthening rationale. These assessment approaches appear to have a dual measurement and management purpose. Accountability is also a prominent message in the purpose statements of the PMF, MEFF and MES, but is not explicitly stated for any of the other assessment approaches. What is explicit in the latter two cases, as well as for the FARP and MERA, is the intention to use the assessment to "inform" or "direct" funding decisions. Relevance to national policy priorities and/or international development goals is also implied in the Scorecard and MERA statements. In summary, the impression given by this cursory review of the above purpose statements is that in the majority of cases the performance measurement function is viewed as a means of demonstrating accountability and influencing funding decisions rather than of encouraging organizational learning and change.

3.2 Design

Key to understanding the various assessment approaches are their overarching design characteristics, particularly the criteria and/or perspectives which are used to structure data collection, analysis and even reporting. An analysis of the design characteristics of each of the assessment approaches, presented in the table below, reveals variations around three principal themes: relevance to national and international development priorities, internal performance (i.e., institutional effectiveness) and external performance (i.e., development effectiveness).

PMF (Danida) - Assessment criteria/perspectives: (a) Danida corporate level; (b) MO HQ level; (c) country (field) level. Number of criteria: 3. Priorities addressed: Int'l: MDGs, PRSPs and cross-cutting issues, e.g., GE, HRDGG, ENVIR.

Scorecard (Netherlands) - Assessment criteria/perspectives: (a) internal performance, (b) external performance and (c) relevance. Number of criteria: 3. Priorities addressed: Int'l: MDGs and Paris Declaration; Dutch priorities: e.g., PSD, GGDD, anti-corruption, etc.

FARP (Sweden) - Assessment criteria/perspectives: relevance: (a) goal congruence and (b) role relevance; agency performance: (c) internal and (d) external performance. Number of criteria: 4. Priorities addressed: Int'l: MDGs and Paris Declaration; Swedish priorities: HRGGDD, GE, NR, economic growth, social development and security, etc.

MERA (CIDA) - Assessment criteria/perspectives: (a) relevance, (b) results, (c) managing of the institution. Number of criteria: 3. Priorities addressed: Int'l: MDGs and Paris Declaration; Canadian priorities: GE, ENVIR and RBM.

MEFF (DFID) - Assessment criteria/perspectives: eight organizational systems and their focus on (a) internal performance, (b) country-level results and (c) partnerships. Number of criteria: 8. Priorities addressed: Int'l: MDGs and Paris Declaration: partnership and harmonisation.

RBM Assessment (Danida) - Assessment criteria/perspectives: RBM system elements: (a) architecture, (b) process and (c) incentives/resource allocation. Number of criteria: 3. Priorities addressed: N/A.

MES-BSC (DFID) - Assessment criteria/perspectives: four quadrants of the Balanced Scorecard: (a) Building for the Future, (b) Managing Resources, (c) Partnerships, (d) Country/Global Performance. Number of criteria: 4. Priorities addressed: Int'l: MDGs and Paris Declaration: partnership and harmonisation.


The inclusion of relevance as an assessment criterion is either explicit or implicit for the PMF, Scorecard, FARP and MERA. As usual, there are two parts to an assessment of relevance: relevance of the MOs' development programming to national (donor country) policy priorities, and/or relevance to internationally agreed development priorities. The specific global, sectoral or thematic priorities are generally identified in the supporting documentation or built into the data collection tools used, and have been presented in the table above.

Internal performance is the most common assessment criterion among all the approaches and is used consistently, but to varying degrees depending on the importance accorded to it. For example, the MEFF has been critiqued for having been too focused on internal performance issues related to the eight organizational systems. The MERA refers to it as "managing the institution", the MES-BSC has two internally focussed quadrants, "Building for the Future" and "Managing Resources", while other approaches make reference to assessing "institutional effectiveness". The RBM Assessment could likewise be viewed as being exclusively dedicated to examining internal RBM systems and processes, although these are widely considered precursors of development effectiveness, i.e., external performance.

The internal-external performance dichotomy also appears to be employed synonymously with the MO headquarters versus country-level distinction, external performance supposedly being equated with the country level. The PMF, MEFF and MES-BSC all make specific reference to "country level" results; however, this is not to be mistaken for an assessment of actual development results. In the case of the PMF, Scorecard and FARP, data collection instruments have been specifically designed to collect information from Embassy staff at the country level "about" MO performance, e.g., harmonization, partnership and achievement of development results; the provenance of the data is thus mistakenly equated with its properties. In the case of the MES-BSC, as well as the aforementioned, the majority of the "country level" performance questions are at best "proxy" indicators of MO development effectiveness. Consequently, the internal-external performance dichotomy becomes somewhat confusing.

3.3 Overall Methodology

The caveat for this comparative analysis of overall methodologies is that it is based on available information. As previously mentioned, some of the assessment approaches are very well documented in terms of their methodologies and some are not, given their relative "newness". Nevertheless, there are some important similarities in methodology among most of the assessment approaches, with the exception of the RBM Assessment (Annex H) and the MES-BSC (Annex D), which are quite unique. For this reason, please refer to the appropriate annexes for the latter two and to the data table below for a comparison of the former.

The PMF and Scorecard are bona fide corporate approaches which use multiple lines of evidence, including an Embassy staff survey, to nourish their annual assessments with data. An MFA Desk/Policy Officer, presumably one responsible for each MO, serves as the focal point and acts as the "team leader" for the assessment. They fulfill the coordination and quality assurance functions and are supported by other staff responsible for compiling the staff questionnaires and Embassy survey data, disaggregating it by MO and providing MO-specific summary reports. The findings from documentary content analysis of MO-specific annual reports, evaluation reports or donor evaluation reports all have to be systematically integrated. The Desk/Policy Officer acts as the crucible for synthesizing the data, albeit mostly qualitative, and transforming it into performance information.
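To make the disaggregation step concrete, the sketch below (in Python, using pandas) shows how per-respondent Embassy survey ratings could be grouped by MO and question to produce the kind of MO-specific summary referred to above. It is purely illustrative: the column names, ratings and scale are hypothetical and are not drawn from any MOPAN member's actual survey or system.

```python
# Illustrative sketch only (not any MOPAN member's actual system): disaggregating
# Embassy survey responses by multilateral organisation so that a Desk/Policy
# Officer receives an MO-specific summary. Column names and data are hypothetical.
import pandas as pd

# Each row is one Embassy respondent's rating of one MO on one survey question (1-5 scale).
responses = pd.DataFrame({
    "mo":       ["UNDP", "UNDP", "UNICEF", "UNICEF", "UNDP"],
    "country":  ["Ghana", "Zambia", "Ghana", "Ghana", "Ghana"],
    "question": ["harmonisation", "harmonisation", "partnership", "harmonisation", "partnership"],
    "rating":   [4, 3, 5, 4, 2],
})

# Disaggregate by MO and question: mean rating and number of responses per cell.
summary = (responses
           .groupby(["mo", "question"])["rating"]
           .agg(mean_rating="mean", n_responses="count")
           .reset_index())

print(summary)  # one MO-specific block per organisation, ready for synthesis
```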


PMF (Danida) - Overall methodology: assessment led by an MFA Desk Officer using multiple lines of evidence, including the Embassy staff survey and the RBM assessment. Data sources: multiple sources: MFA Desk Officers, Regional Delegations (NY, GEN, WASH, ROME), Embassy staff, MO staff and managers (RBM Assessment). Number of KPI/Qs: 21. Type of KPI/Qs: mixed. Rating scales used: no. Scores used to compare: no.

Scorecard (Netherlands) - Overall methodology: Scorecard updated by an MFA Policy Officer, heavily based on opinion surveys and other empirical data. Data sources: HQ experiences, policy department perceptions, MMS Survey, MOPAN Survey, other findings from the field and analysis of MO evaluations. Number of KPI/Qs: 15. Type of KPI/Qs: mostly evaluative. Rating scales used: yes. Scores used to compare: no.

FARP (Sweden) - Overall methodology: assessment by an MFA Desk Officer who leads the team, drawing on evaluation reports and the informed opinions of, e.g., constituency offices, Delegations and field staff. Data sources: mandatory: survey of 20 Embassies, HQ input, SIDA thematic input, analysis of audit reports and annual reports; additional: MOPAN reports, MO evaluations, peer reviews, other donors. Number of KPI/Qs: 33. Type of KPI/Qs: evaluative. Rating scales used: yes. Scores used to compare: no.

MERA (CIDA) - Overall methodology: assessment conducted by CIDA MPB Directorate staff drawing on personal perceptions and available empirical data. Data sources: CIDA staff, available documentation and MO performance reports. Number of KPI/Qs: 26. Type of KPI/Qs: evaluative. Rating scales used: yes. Scores used to compare: yes.

MEFF (DFID) - Overall methodology: assessment initially conducted by DFID staff with various degrees of input from MOs, ending with a final dialogue meeting. Data sources: DFID staff, with various degrees of input from MO staff and managers. Number of KPI/Qs: 72. Type of KPI/Qs: empirical. Rating scales used: yes. Scores used to compare: yes.

RBM Assessment (Danida) - Overall methodology: assessment of MOs' RBM systems at HQ and in the field, conducted by consultants in a highly collaborative manner. Data sources: multiple sources: MO documentation, staff interviews, on-site observations, demos of online data systems and discussion workshops. Number of KPI/Qs: 33. Type of KPI/Qs: mixed. Rating scales used: no. Scores used to compare: no.

MES-BSC (DFID) - Overall methodology: assessment initially conducted by DFID staff using various secondary data sources. Data sources: secondary sources: MEFF, MOPAN Survey, Paris Survey, DAC peer reviews, MO performance and evaluation reports. Number of KPI/Qs: 40. Type of KPI/Qs: mixed. Rating scales used: yes. Scores used to compare: no.


The MERA and the FARP have similar, or potentially similar, corporate approaches to the PMF and Scorecard, but are perhaps still too recent to have well established data gathering and analysis procedures and processes in place, which generally come after repeated annual cycles. A well structured process of mandatory and additional data sources has been outlined for the FARP, which will no doubt add to the rigor of this assessment in the long term. On the other hand, it appears that the MEFF, and to a certain extent the MERA, depend inordinately on the opinion of the responsible Desk Officer, initially or otherwise, to complete the summary "checklist" or "template". The extent of counter-verification of the assessment with the MO also varies from none to extensive, depending on the circumstances and the preparedness of the MO to get involved. Both the FARP and the MERA have the MO assessments reviewed internally by management bodies as a form of face validity check and, presumably, to smooth out any extreme opinions or question any outlying data points.

3.4 Key Performance Indicators and/or Questions

There are collectively almost 250 key performance indicators and/or questions (KPI/Qs) being used to assess multilateral effectiveness, based on the above data table. Most or all of the KPI/Qs for the Scorecard, FARP and MERA are evaluative in their formulation, i.e., they require the respondent to express an informed opinion or to exercise expert judgment based on available information. While this methodology has merit, there are issues of trustworthiness when respondents change over time, e.g., new inexperienced staff, or personality conflicts arise. [6] It should also be noted that the addition of rating scales to evaluative questions does not enhance their trustworthiness, unless of course the scales are criterion-referenced. The MEFF provides a good example of the use of empirical KPI/Qs in conjunction with a rating scale that requires supporting evidence in the form of text segments. Unfortunately, the specificity of such an assessment approach requires the use of many more KPI/Qs in order to cover the same breadth of evaluation topics or assessment priorities.

Ideally, a mixed set of evaluative and empirical KPI/Qs would generate the qualitative and quantitative data that, if properly cross-referenced or triangulated, would produce credible performance information. Based on the limited information available, such appears to be the case for the Danida PMF when combined with the RBM Assessment, as well as for the DFID MES-BSC, which incorporates MEFF data. In both cases there is a balance between evaluative data obtained from various opinion surveys and empirical data based on observations, document content analysis and other factual information.

Determining the extent of overlap and gaps in the selection of the KPI/Qs is part of this comparative analysis and was accomplished using a concept mapping technique. First, the KPI/Qs were extracted from the primary data collection tools used in each assessment approach. They were then pooled and sorted based on easily recognized common characteristics. This technique resulted in the identification of four priority areas and a number of sub-priority clusters for MO assessment, which serve as a frame of reference for determining the extent of overlap and gaps. The structure of the frame of reference is open to interpretation and the labels used are a matter of personal preference. The final product of the concept mapping is the distribution table presented below, where the KPI/Qs are mapped against this frame of reference.

[6] The terms merit and trustworthiness are used as synonyms for validity and reliability, respectively, when discussing the characteristics of qualitative data. "Fourth Generation Evaluation" by Yvonna Lincoln and Egon Guba (1989).
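As a purely illustrative sketch of the tallying step in the concept mapping just described, the Python fragment below pools KPI/Qs that have been hand-coded to priority areas and computes, for each assessment approach, the count and share of its KPI/Qs falling in each area, i.e., the kind of figures shown in the distribution table below. The coded records are invented for illustration and do not reproduce the actual KPI/Qs.

```python
# Illustrative tally of hand-coded KPI/Qs by priority area, per assessment approach.
# In practice, each KPI/Q extracted from a data collection tool would be assigned to
# one sub-priority cluster during the sorting step; the records here are hypothetical.
from collections import Counter, defaultdict

coded_kpis = [
    # (assessment approach, priority area, sub-priority cluster)
    ("MEFF", "STRATEGIC MANAGEMENT",   "Corporate Governance"),
    ("MEFF", "OPERATIONAL MANAGEMENT", "Financial Resources"),
    ("MEFF", "KNOWLEDGE MANAGEMENT",   "Performance Reporting"),
    ("PMF",  "STRATEGIC MANAGEMENT",   "Corporate Strategy"),
    ("PMF",  "OPERATIONAL MANAGEMENT", "Human Resources"),
]

counts = defaultdict(Counter)           # approach -> Counter of priority areas
for approach, area, _cluster in coded_kpis:
    counts[approach][area] += 1

for approach, tally in counts.items():
    total = sum(tally.values())
    for area, n in tally.items():
        print(f"{approach:5s} {area:25s} {n:3d}  {n / total:5.0%}")
```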


From the perspective of individual assessment approaches, the gaps are self-evident, while from a more collective perspective there are only two obvious cases. The Danida RBM Assessment (RAP) suggests the importance of examining MO use of corporate indicators and targets. The two DFID assessment approaches (MEFF and MES-BSC) suggest the importance of examining quality control and quality assurance processes put in place by MOs prior to the approval of new development investments, as well as management systems for assessing and tracking the quality of country portfolio investments. The implicit suggestion, with which I would concur, is that the related KPI/Qs are good proxy or predictive indicators of MO effectiveness.

The extent of overlap is equally self-evident. It should be noted that the three donor "harmonization" categories could be collapsed, or alternatively, more differentiation could be introduced in the formulation of the KPI/Qs grouped under "harmonization - general". In any case, the next step in a concept mapping process is prioritization, where the stakeholders would rank the importance of each indicator within the sub-priority clusters. This step would serve to identify those KPI/Qs collectively most valued by the stakeholders, which would then be used to populate a common assessment tool.
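The prioritization step could be supported by a simple rank-aggregation routine along the lines of the sketch below, which averages the ranks assigned by stakeholders to each KPI/Q within one sub-priority cluster and retains the top-ranked items for a common tool. The stakeholder rankings, KPI/Q names and cut-off are all hypothetical.

```python
# Illustrative rank aggregation for the prioritization step: each MOPAN member ranks
# the KPI/Qs within one sub-priority cluster (1 = most important); the KPI/Qs with the
# best (lowest) mean rank would populate the common assessment tool. Data is invented.
from statistics import mean

rankings = {  # stakeholder -> rank given to each KPI/Q in one cluster
    "Denmark":     {"kpi_audit_follow_up": 1, "kpi_eval_coverage": 2, "kpi_eval_independence": 3},
    "Netherlands": {"kpi_eval_coverage": 1, "kpi_audit_follow_up": 2, "kpi_eval_independence": 3},
    "Sweden":      {"kpi_eval_independence": 1, "kpi_eval_coverage": 2, "kpi_audit_follow_up": 3},
}

kpis = {k for ranks in rankings.values() for k in ranks}
mean_rank = {k: mean(ranks[k] for ranks in rankings.values()) for k in kpis}

# Keep the two collectively most valued KPI/Qs in this cluster (cut-off is arbitrary).
shortlist = sorted(mean_rank, key=mean_rank.get)[:2]
print(shortlist, mean_rank)
```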

KPI/Q distribution by priority assessment area and assessment approach (counts are listed in the order: PMF, MMS Scorecard, FARP, MERA, MEFF, RAP, MES-BSC)

STRATEGIC MANAGEMENT
- Corporate Governance: 1, 2, 10, 4, 9, -, 2
- Corporate Strategy: 6, 4, 5, 6, 7, 2, -
- Country Strategies: 1, 4, 1, 2, -, -, -
- Corporate Indicators/Targets: -, -, -, -, -, 7, -
- Total number of KPI/Qs: 8, 10, 16, 12, 16, 9, 2
- Percentage of total KPI/Qs: 38%, 67%, 48%, 46%, 22%, 27%, 5%

OPERATIONAL MANAGEMENT
- Financial Resources: 6, 1, 2, 3, 8, 4, 6
- Human Resources: 3, -, 3, -, 9, 3, 2
- Quality at Entry: -, -, -, -, 4, -, 1
- Country Portfolio: -, -, -, -, 5, -, 3
- Total number of KPI/Qs: 9, 1, 5, 3, 26, 7, 12
- Percentage of total KPI/Qs: 43%, 7%, 15%, 12%, 36%, 21%, 31%

RELATIONSHIP MANAGEMENT
- Harmonisation w/ Multilaterals: -, 1, -, 1, -, -, -
- Harmonisation w/ Bilaterals: -, 1, 1, 1, -, -, -
- Harmonisation - General: 1, -, 2, 1, 6, -, 4
- Alignment w/ Partners: 1, -, 2, -, 2, -, 13
- Total number of KPI/Qs: 2, 2, 5, 3, 8, 0, 17
- Percentage of total KPI/Qs: 10%, 13%, 15%, 12%, 11%, 0%, 44%

KNOWLEDGE MANAGEMENT
- RBM Stewardship: 1, 2, 1, 2, 2, -, 1
- Performance Monitoring: -, -, 1, -, 7, 4, 2
- Evaluation Management: 1, -, -, 4, 1, 1, 4
- Performance Reporting: -, -, 1, 2, 10, 8, -
- Lessons Learned: -, -, 4, -, 2, 4, 1
- Total number of KPI/Qs: 2, 2, 7, 8, 22, 17, 8
- Percentage of total KPI/Qs: 10%, 13%, 21%, 31%, 31%, 52%, 21%

Total number of KPI/Qs across all areas: 21, 15, 33, 26, 72, 33, 39


3.5 Transparency and Resource Allocation

Based on the information presented in the data table below, it appears that several bilateral donors intend to use the findings of their MO assessment exercises to inform decision making with regard to resource allocations. At the same time, a good number of the same donors do not share, or do not intend to share, their assessment findings with the MOs, in country or otherwise. Furthermore, there are several that do not make even the summary report publicly available.

The relationship between transparency and resource allocation is a critical one for donors to manage if they want to avoid excessive "gaming" and enhance the level of confidence and trustworthiness accorded to the assessment findings. The Monterrey Conference cited transparency as a key principle underlying reforms in the international financial architecture to enhance stable financing for development and poverty eradication: "It is important to promote measures in source and destination countries to improve transparency and the information about financial flows." [7] The assessment approaches which are conducted and reported in an open and transparent manner, e.g., the MEFF, RBM Assessment and MES-BSC, can more easily justify the use of the findings for resource allocation purposes.

PMF (Danida) - Report shared with MOs in country: no. Summary report publicly available: yes. Report used for resource allocation: not mechanically or systematically, but can influence senior management decision-making.

Scorecard (Netherlands) - Report shared with MOs in country: no. Summary report publicly available: no; a summary report on each MO based on MMS data is given to the Policy Officer, which is then inputted into the Scorecard, which has not been a public document. Report used for resource allocation: not mechanically or systematically, but can influence policy dialogue and funding decisions.

FARP (Sweden) - Report shared with MOs in country: no, but the intention is to provide feedback, in one form or another, to the MO during high-level consultations. Summary report publicly available: not yet decided for the future; for the first year it will not be. Report used for resource allocation: intended to guide budget allocations and funding decisions, but not mechanically so; will influence policy dialogue.

MERA (CIDA) - Report shared with MOs in country: no. Summary report publicly available: no. Report used for resource allocation: yes, to reward good performance with additional funds.

MEFF (DFID) - Report shared with MOs in country: yes. Summary report publicly available: yes. Report used for resource allocation: yes.

RBM Assessment (Danida) - Report shared with MOs in country: yes. Summary report publicly available: yes. Report used for resource allocation: no.

MES-BSC (DFID) - Report shared with MOs in country: N/A. Summary report publicly available: Effectiveness Summaries for 16 MOs will be published in July 2007. Report used for resource allocation: no; rather to inform decision making and to ensure senior managers have a consistent picture of each multilateral.

[7] Monterrey Consensus on Financing for Development (March 2002), p. 10.


3.6 General Strengths and Weaknesses

The strength of any assessment approach, or evaluation for that matter, lies in the extent to which it is designed to fulfill both its measurement and management functions. Measurement requires multiple lines of evidence built into the design so as to solicit an appropriate balance of evaluative and empirical data from multiple data sources. Opportunities to compare similar data sets received from different data sources, conduct face validity checks, or discuss self-assessment findings are evaluation techniques that enhance the reliability and trustworthiness of the data, findings and ultimately the conclusions. Management requires that attention be paid to MO stakeholder engagement in the assessment process to enhance the credibility and utility of the findings for the multilateral organisation. An appropriate balance must be struck between measurement and management if the dual objectives of accountability and organisational learning are to be respectively achieved.

While the established corporate approaches appear to have some of these characteristics, their weaknesses lie in an over-reliance on opinion-based data and a lack of transparency. With the exception of the RBM Assessment, the remainder appear to favour a measurement approach, but are for the most part still in the early stages of development and could benefit from a more balanced and rigorous data collection and analysis methodology, or at least one that is more explicitly documented. Despite their level of complexity, the MES-BSC and RBM Assessment have the greatest potential for evolving into a common approach for assessing multilateral effectiveness.

PMF (Danida) - Measurement or management: mixed measurement and management. Complexity of use: complex given the three (3) levels and the variety of data sources and collection methods; the Embassy survey is fairly simple but time consuming to analyse for individual MOs. Sustainability: yes.

Scorecard (Netherlands) - Measurement or management: mixed measurement and management. Complexity of use: the Scorecard is somewhat complex because of the variety of data sources; the MMS is easy for Embassies to answer but time consuming to analyse for individual MOs. Sustainability: yes.

FARP (Sweden) - Measurement or management: measurement. Complexity of use: fairly simple survey approach, but somewhat complex given it was the first time; also rather time consuming. Sustainability: yes.

MERA (CIDA) - Measurement or management: measurement. Complexity of use: simple for Desk Officers to answer the questions, with no additional primary data collection requirements. Sustainability: yes.

MEFF (DFID) - Measurement or management: measurement. Complexity of use: fairly simple to complete the checklist, even for MOs; however, data analysis and reporting become complex and time consuming. Sustainability: not for a single donor.

RBM Assessment (Danida) - Measurement or management: mixed measurement and management. Complexity of use: complex and time consuming to conduct, requiring between 60 and 100 person-days of effort for two people over a three (3) month period. Sustainability: not for a single donor.

MES-BSC (DFID) - Measurement or management: measurement. Complexity of use: somewhat complex to compile the secondary data given the variety of data sources and timing; perhaps less time consuming to do the data analysis for individual MOs. Sustainability: not for a single donor.


4.0 Prospective for a Common Approach

4.1 New Public Management Context

The origins of the New Public Management (NPM) in the benchmark cases of the United Kingdom, Australia, New Zealand, the USA and eventually Canada can be traced to shifts in economic policy toward fiscal austerity due to burgeoning national deficits and unsustainable government operating budgets. The over-bureaucratisation of government and organizational inefficiency in the public sector took center stage on government policy agendas during the 1980s and 1990s, and for quite some time thereafter in many other countries. This led directly to changes in public management policymaking, institutional rules and government systems in the areas of expenditure planning and financial management, embracing the concepts of economy, efficiency and effectiveness. The agenda-setting process in public policy, including the adoption of performance-based budgeting, was greatly affected by this "conjoncture économique", seeing major cutbacks in government spending, especially in the social support and healthcare delivery sectors. The long-term effects, at least based on the Canadian experience, have not been commendable, and the growing consensus in the public administration literature is that the NPM has not lived up to the hype.

While the genesis of the NPM in Western Europe may have been somewhat different from that in the Anglo-American benchmark cases, one of its defining characteristics, a more results-oriented approach to public management, has remained intact. As the budget cutting and slashing subsided, there has been a fundamental shift in emphasis from efficiency, e.g., "doing more with less", to effectiveness, i.e., actually "achieving expected results". This orientation toward the achievement of results, rather than merely managing funds and following correct procedures, has been a positive outcome of the NPM era; the "managing for results", rather than "managing by results", language reflects this post-NPM thinking. However, managing for results requires a fundamental change in organizational management culture which, based on recent experience, is best nurtured in a climate of openness, transparency and partnership.

4.2 Rationale for Assessing Multilateral Effectiveness

Like any other RBM initiative, the key incentives for bilateral donors to assess the effectiveness of multilateral organisations are: 1) accountability and 2) organisational learning, not only for the bilateral donors but also for the multilateral organisations involved. Resource allocation or performance-based budgeting (PBB) should be of lesser immediate importance, especially given the nascent development of solid RBM systems and processes within most MOs, as well as the weight of the political considerations when determining financial contributions.

Two important dynamics have to be kept in mind. First, just because an MO has not developed a results-based performance reporting capacity does not mean that it is not achieving development results. To reduce funding in such circumstances would be an injustice to the intended beneficiaries and clearly counter-development. Second, just because an organisation is, or is not, achieving development results does not mean that political considerations should not take precedence. This is the nature of democratic societies, where performance information is only one of several considerations in what is inevitably a political decision. For at least these two reasons, it would be inappropriate to elevate resource allocation any higher than a distant third position behind accountability and organisational learning as a purpose for assessing multilateral effectiveness; at least until all other options have been exhausted.


Notwithstanding the above, assessing multilateral effectiveness will really only be cost-effective in the long term if there are going to be consequences for either: 1) not establishing the necessary RBM culture to enable a determination of development effectiveness, or 2) clearly demonstrated ineffectiveness. What other consequences would be appropriate in the short term, and how long does one wait for signs of organisational learning before using intimidation and eventually budget cuts? These decisions clearly require management judgement on a case-by-case basis, and hopefully in concert with other like-minded donors; thus the need for a harmonised approach.

4.3 A Harmonised Approach

While the Paris Declaration focuses primarily on donor harmonisation of common arrangements at the country level to reduce donor and partner transaction costs and improve aid effectiveness, its spirit, intent and commitments apply equally to multilateral development programming. Of particular importance is the donor commitment to "work together to harmonise separate procedures". However, the recent proliferation of bilateral donor initiatives to assess multilateral effectiveness has increased donor transaction costs with every additional survey questionnaire forwarded to Embassy staff and the additional resources required to analyse and report on the findings. Similarly, MO transaction costs will also increase with every new "checklist", survey, information requirement and high-level management debriefing requested by donors in order to complete their assessments.

One of the lessons learned from the MEFF experience [8] was the high transaction costs that these multilateral assessments incurred for both donors and MOs alike. The feedback provided by MOs (Box 8) speaks to the need for a common assessment approach. Progressive bilateral donors with several years of experience are reconsidering their assessment approaches: "As the MOPAN survey has developed into a useful and seemingly sustainable tool, the added value of the annual stand-alone assessment by Danish Embassies should be considered. A harmonized approach to assessing both country-level and corporate performance of multilateral organisations seems to be the way forward." [9] There are sufficient similarities and commonalities in the assessment approaches reviewed in this report that the questions being asked should no longer be "why" or "if", but rather "how".

[8] "The MEFF Methodology: A Review of DFID's Multilateral Effectiveness Framework" by Allison Scott, DFID, IDAD, March 2006.
[9] Danida Annual Performance Report 2007, p. 62.


4.4 Design Principles for a Common Assessment Approach

An appropriate starting point for a discussion on the design principles for a common assessment approach would be the five Managing for Development Results (MfDR) principles agreed upon during the Second Roundtable on Managing for Results in 2004. [10]

4.4.1 Focusing the Dialogue on Development Results

If the mandates, mission statements, corporate and country strategies of multilateral organisations all espouse the achievement of development results, then why are there still key deficiencies in MO performance reporting that hinder effective performance assessment? A recent survey of bilateral donors [11] identified the following deficiencies, in order of importance: 1) poor reporting on development results, 2) poor reporting on country results, 3) unclear application of lessons learned, 4) poor reporting on capacity development results, 5) absence of sufficient on-line data, and 6) limited monitoring and evaluation reporting. In a context of managing for results, credible performance reporting is a by-product of good management, and conversely, its absence is indicative of the organisation's results-based management culture. Perhaps the answer to the above question lies in the continued adherence to conventional notions of accountability.

Conventional definitions of accountability, whether of mandate fulfillment or contract execution, emphasize compliance with agreed rules and standards in conducting the work of the organization, the prudent use of public funds, fair and honest reporting, etc. While not unimportant, results-oriented management would focus less on input and process management and more on achieving and demonstrating the achievement of results. In 1998, the Government of Canada adopted the following definition of accountability, which may be of use:

Accountability is a relationship based on the obligation to demonstrate and take responsibility for performance in light of agreed expectations. [12]

This definition works well in non-hierarchical relationships and decentralized circumstances where there is a need to balance greater decision-making flexibility and autonomy with enhanced accountability for results. Most important to this discussion is the "obligation to demonstrate" performance, i.e., to report on the achievement of development results. The associated indicators of effective accountability are: clarity of roles and responsibilities, clarity of performance expectations, balance of expectations and capacities, credibility of reporting, and the reasonableness of review and adjustments.

Why this seemingly esoteric discussion on accountability? Simply because the dialogue on development results has to occur within a context of corporate governance to be effective. Since bilateral donors are shareholder representatives on the governance bodies of multilateral organizations, they have, as such, an oversight responsibility to hold the organisation accountable.

Design principle:

1. The assessment approach should generate relevant and credible information to support bilateral donors in fulfilling their governance responsibilities for holding multilateral organizations accountable for managing for results.

[10] "MfDR Principles in Action: Sourcebook on Emerging Good Practices" (OECD 2005).
[11] "Bilateral Methodologies for Assessing Multilateral Performance: Survey Findings" (CIDA 2006).
[12] "Modernizing Accountability Practices in the Public Sector", a joint paper by the Office of the Auditor General of Canada and the Treasury Board Secretariat of Canada (1998). http://www.tbs-sct.gc.ca/rma/account/oagtbs02_e.asp#Current


4.4.2 Aligning Monitoring and Evaluation with Results

One of the common observations from implementing results-based management is captured in the saying "what gets measured gets managed", and therein lie the risks associated with designing an assessment tool. Bilateral donors, among many others, would like to see MOs align their programming, monitoring and evaluation systems with relevant development results. As the quotation in the text box below suggests, to reduce the transaction costs associated with conducting assessments of multilateral effectiveness it would be preferable to be able to rely on the performance information provided in the annual reports. Applied proactively, the above aphorism implies an emphasis on assessing the status of RBM system development and the quality of annual performance reports. As this report has already pointed out, Danida's RBM Assessment approach has made considerable strides in developing a methodology which serves the dual purpose of generating credible assessment information while focussing MO management's attention on areas of critical long-term importance to bilateral donor assessment objectives.

Design principle:

2. The assessment approach should generate useful information to support multilateral organizations in their efforts to foster a results-based management culture and develop the requisite monitoring and evaluation systems to facilitate organisational learning, improve development effectiveness and produce credible performance reports.

4.4.3 Keeping Measurement and Reporting Simple

A key lesson learned in implementing results-based management has been "keep it simple", especially in the initial design stages of a new performance monitoring and assessment system. Such advice is not new and has been expressed by the OECD and the World Bank in several publications, as reported by Andreas Obser: "[O]ne way of doing so is to replace enthusiasm by practicality and to reduce complexity by identifying 'minimum conditions' for credible and quality multilateral organisation performance assessments 'under real-world constraints'." [13] The 2004 MEFF experience provides some guidance on this topic: while the staff time, for DFID and MO staff alike, to complete the "checklists" was considered reasonable, it was the data analysis and interpretation which took a considerable amount of time and resources. While on-line survey technology may offer a partial solution, it is still important to focus on the "need to know" as opposed to the "nice to know" assessment information. There will always be opportunities to improve, refine and add indicators after subsequent rounds of data collection.

Design principle:

3. The assessment approach should apply the "keep it simple" principle, keeping the initial number of indicators reasonable and designing the data collection tools with a view to facilitating data analysis and interpretation.

[13] Multilateral Organisations Performance Assessment: Opportunities and Limitations for Harmonisation among Development Agencies (GDI 2007).

"The focus on and assessment of results-based management in multilateral organisations needs to continue in the coming years to allow Denmark and other donors to fully rely on the organisation's own results-reporting systems. It could be useful to concentrate on those organizations that are relative laggards in the area of RBM, and to do so through a stronger, more inclusive multi-donor effort. It is necessary for donors to join forces in order to ensure harmonisation of reporting requirements as well as a common understanding of results systems." (Danida Annual Performance Report 2007, p. 61)


4.4.4 Managing For, not by, Results

While many issues could be raised for discussion under this principle, suffice it to say that most of the assessment approaches reviewed in this report were exercises in pure measurement. For example, the MEFF was designed as a measurement "checklist" to be initially completed by the responsible DFID Desk Officer. However, the flexibility shown in its administration garnered some interesting feedback when the "checklist" was jointly completed or used as a self-assessment tool. Apparently, this application of the "checklist" generated a more accurate assessment in some cases and, more importantly, prompted the MOs to engage in a corporate process of self-reflection. This is a significant accomplishment, even if it only occurs once a year.

Design principle:

4. The assessment approach should be designed as a collaborative management exercise, as opposed to a bilateral donor measurement exercise.

4.4.5 Using Results Information for Learning and Decision Making

Bilateral donor agencies can collectively create greater demand for more credible and balanced performance information with a common assessment approach. Not only is there strength in numbers, but speaking with a common voice about the same results information can serve as a powerful incentive for MOs to take the assessment findings seriously. Unlike some of the assessment approaches reviewed in this report, a common approach should include opportunities to collectively review the assessment findings in an atmosphere of openness and transparency, to discuss differences of perspective or opinion without recrimination, and to engage in collaborative learning. Depending on the circumstances, final dialogue meetings or high-level annual bilateral consultations with the MOs could serve to develop or review plans of action to redress weaknesses and establish milestones. Opportunities for active management decision making by the multilateral organisations should be included as part of the assessment process in order to foster an RBM culture and develop the requisite systems to meet bilateral donor demand for credible performance information. The Danida RBM Assessments appear to have successfully incorporated this action-planning component into the assessment design by not only presenting recommendations but also working with management to identify reasonable timelines and resource requirements for their implementation.14

Design principle:

5. The assessment approach should be designed to engage multilateral organisations in a learning and improvement process.

14 "Assessing Results Management at UNICEF" (Dalberg 2007).

"The process showed a high degree of collaboration, trust and mutuality. UNIFEM would want to convey great appreciation for the process, which is consistent with DFID's overall approach of valuing learning and self-assessment as a part of enhancing effectiveness." - UNIFEM on the MEFF experience

"We welcome this assessment very much as in the process it has helped us to be more self-critical, as well as given us the confidence to continue to build on the institutional reforms that will make the ADB truly more effective as a development partner in the delivery of aid to our RMCs." - AfDB on the MEFF experience


4.5 Proposed Common Assessment Approach

4.5.1 Shareholder versus Stakeholder

Bilateral donors have been referred to in this report as shareholder representatives, that is, representatives of the taxpayers in their respective constituencies who contribute the funds to support the multilateral organizations. As shareholder representatives they sit on the governance bodies of these organizations and presumably hold them accountable not only for the prudent stewardship of the funds entrusted to them, but also for their performance, as well as the demonstration thereof. The distinction between the role of a shareholder and that of a stakeholder presented below was made quite cogently in an "Independent Evaluation of SDC's Interaction with the UNDP" commissioned by the Swiss Agency for Development and Cooperation (SDC) (August 2003).

Shareholder: As a member country of the UNDP and through its financial contribution, Switzerland is a shareholder that contributes to and shares responsibility with other member countries for UNDP as a whole. Under the shareholder perspective, the organisation as a whole is in the foreground.

Stakeholder: As a stakeholder, Switzerland has development objectives and other national objectives. Under the stakeholder perspective, it is this Swiss agenda that is in the foreground. 15

The common assessment approach proposed hereinafter will require bilateral organizations to forgo their stakeholder role in favour of a shareholder role. The reasoning is quite simple: a common assessment approach must put the multilateral organisation in the foreground, while the uniquely national development objectives of each bilateral donor country are held in abeyance. Otherwise, the prospect of coming to agreement on a common assessment tool that also meets the "keep it simple" principle would be quite challenging.

4.5.2 A Common Set of Indicators

The comparative analysis of the assessment tools under review included a combined total of almost 250 KPI/Qs, with the MEFF checklist representing the largest sub-group of 72 empirically formulated questions. The concept mapping technique revealed a significant degree of commonality in four priority areas: strategic management, operational management, relationship management and knowledge management. Coincidentally, these areas also map relatively well against DFID's new MES-BSC approach, i.e., managing resources, partnerships, building for the future, and country/global performance. Provided with a facilitated priority-setting exercise, and given a pool of over 250 KPI/Qs to choose from, the MOPAN members should be able to agree on a core set of common indicators to populate a common assessment tool. More important than the actual indicators selected is the process that is used to select them. A participatory decision-making process that builds consensus, commitment and ownership of the indicators selected would be recommended, for all the same reasons that such processes are used in country development programming.

15 Authors: Jens-Eric Torp, TB Consult, Copenhagen and Fritz Sager, Büro Vatter, Berne.


4.5.3 Multiple Lines of Evidence

The most common refrain among MOs regarding bilateral donor initiatives to assess multilateral effectiveness, aside from their not being harmonized, is that they are superficial, subjective and insufficiently sensitive to specific contexts, i.e., organisational and country. While there is some merit in these comments, especially with respect to primarily single-source and single-instrument methodologies, there is no reason to expect the same if multiple lines of evidence are built into the methodology of a common assessment approach. The present proposal would build on the strengths of the existing assessment approaches and use modified tools to collect data from at least three data sources representing different perspectives on the same indicators of interest. It would avoid seeking the same indicator data from all data sources, for reasons of data validity and in the interest of keeping the approach simple. The following data sources would be used depending on the priority areas for assessment, as per the Preliminary Performance Assessment Plan presented below:

• Multilateral Organisation Headquarter Staff (MO HQ Staff)
• Multilateral Organisation Country Office Field Staff (MO Field Staff)
• Multilateral Organisation Site Visit (MO HQ & Field)
• Donor Staff in Headquarter Offices and Delegations (Donor HQ-Del. Staff)
• Donor Country Office Field Staff (Donor Field Staff)
• Country Partner Staff

Preliminary Performance Assessment Plan
(Priority Area: Data Source - Data Collection Technique - Frequency)

Strategic Management
1) MO HQ Staff - Simplified Self-Assessment Checklist - Annual
2) Donor HQ-Del. Staff - Simplified Checklist - Annual
3) MO HQ & Field - RBM Assessment - As Required

Operational Management
1) MO HQ Staff - Simplified Self-Assessment Checklist - Annual
2) Donor HQ-Del. Staff - Simplified Checklist - Annual
3) MO HQ & Field - RBM Assessment - As Required

Relationship Management
1) MO Field Staff - Modified MOPAN Self-Assessment Checklist - Cyclical
2) Donor Field Staff - Modified MOPAN Survey - Cyclical; Paris Donor Survey - Annual
3) Country Partner Staff - Modified MOPAN Survey - Cyclical

Knowledge Management
1) MO Field Staff - Modified MOPAN Self-Assessment Checklist - Cyclical
2) Donor Field Staff - Modified MOPAN Survey - Cyclical
3) MO HQ & Field - RBM Assessment - As Required
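Purely to illustrate how such a plan could be kept in a machine-readable form for planning and tracking purposes, the sketch below encodes the table above as a simple data structure. The type and variable names (AssessmentLine, ASSESSMENT_PLAN) are hypothetical and do not refer to any existing MOPAN system.

from dataclasses import dataclass

@dataclass(frozen=True)
class AssessmentLine:
    """One line of evidence for a given priority area (hypothetical structure)."""
    source: str      # e.g. "MO HQ Staff"
    technique: str   # e.g. "Simplified Self-Assessment Checklist"
    frequency: str   # "Annual", "Cyclical" or "As Required"

# Encoding of the Preliminary Performance Assessment Plan above.
ASSESSMENT_PLAN = {
    "Strategic Management": [
        AssessmentLine("MO HQ Staff", "Simplified Self-Assessment Checklist", "Annual"),
        AssessmentLine("Donor HQ-Del. Staff", "Simplified Checklist", "Annual"),
        AssessmentLine("MO HQ & Field", "RBM Assessment", "As Required"),
    ],
    "Operational Management": [
        AssessmentLine("MO HQ Staff", "Simplified Self-Assessment Checklist", "Annual"),
        AssessmentLine("Donor HQ-Del. Staff", "Simplified Checklist", "Annual"),
        AssessmentLine("MO HQ & Field", "RBM Assessment", "As Required"),
    ],
    "Relationship Management": [
        AssessmentLine("MO Field Staff", "Modified MOPAN Self-Assessment Checklist", "Cyclical"),
        AssessmentLine("Donor Field Staff", "Modified MOPAN Survey", "Cyclical"),
        AssessmentLine("Donor Field Staff", "Paris Donor Survey", "Annual"),
        AssessmentLine("Country Partner Staff", "Modified MOPAN Survey", "Cyclical"),
    ],
    "Knowledge Management": [
        AssessmentLine("MO Field Staff", "Modified MOPAN Self-Assessment Checklist", "Cyclical"),
        AssessmentLine("Donor Field Staff", "Modified MOPAN Survey", "Cyclical"),
        AssessmentLine("MO HQ & Field", "RBM Assessment", "As Required"),
    ],
}

# Example: list the data sources that would be consulted every year.
annual_sources = {line.source for lines in ASSESSMENT_PLAN.values()
                  for line in lines if line.frequency == "Annual"}
print(sorted(annual_sources))

Such an encoding would make it straightforward to check, for any assessment year, which instruments are due and which sources each priority area relies on.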

4.5.4 Assessment Process

Strategic and operational management lend themselves well to a rapid empirical assessment that can be done by donor HQ-Del. staff and MO HQ staff using a criterion-referenced "checklist" approach once a year. An annual meeting to compare notes and discuss differences would serve as a validity check and is recommended.

Relationship management lends itself well to qualitative assessment. The MOPAN survey process for donors would be continued, but expanded to cover some areas of knowledge management related to managing for results at the country level. The Paris Donor Survey could be used as a reliability check on overlapping indicators. In addition, comparative perspectives on the same indicator set from the MO Field Staff and Country Partner Staff would be advisable. Again, a meeting to compare notes and discuss differences between the MOPAN donor representatives in-country and the MO Field Staff would serve as a validity check and is recommended. Since it is clearly not feasible to assess every MO every year in every country, the assessment process would follow the current cyclical pattern of selecting a group of MOs and the countries on which the assessments would focus for that year.

Knowledge management lends itself best to a mixed empirical and evaluative assessment requiring a much broader range of data-gathering techniques, as employed in the conduct of the Danida RBM Assessments described above. This intensive examination of an MO's RBM culture and systems would be employed on an "as required" basis, depending on known factors regarding the MOs included in the annual assessment. For example, MOs known to have been slow to adopt the RBM approach should undergo an RBM assessment at the earliest opportunity. It would not, however, have to be repeated for a few years thereafter, to allow the organizational change process to take place. Those MOs already capable of some evidence-based reporting on development outcomes could undergo a lighter version of the RBM assessment process. Once an MO's performance reporting meets the agreed-upon quality standards, there would be no further need for an RBM assessment.

These are the proposed initial building blocks of a common assessment process which understandably could change in light of new concerns or as conditions and priorities change.

4.5.5 Consider Using the Balanced Scorecard

The Balanced Scorecard is a strategic management approach developed in the early 1990s by Dr. Robert Kaplan of the Harvard Business School and Dr. David Norton. It builds on key concepts of earlier private sector management ideas such as total quality management, including customer-defined quality, employee empowerment, continuous improvement and organisational learning. When adapted appropriately to the public sector, it can be used effectively to visualise assessment priorities and related performance information, and to track transition management and performance.16

The DFID application of the Balanced Scorecard for its Quarterly Management Board Reports is a foreshadowing of how this approach could be used in the context of assessing multilateral effectiveness. Again, quite by coincidence, this consultant had arrived at the same conclusion prior to seeing the DFID MES-BSC and had prepared a model for consideration by the MOPAN members (see below). While both models are worthy of serious consideration, what is of greater importance is the process that the MOPAN members will undertake: whether they modify, adapt and customise one of these models, or engage in a creative process and start anew, the aim is to ensure the requisite buy-in and commitment of its members to a common assessment tool.

16 "Balanced Scorecard: Step-by-step for government and nonprofit agencies", Paul R. Niven (2003).


[Figure: The MOPAN Balanced Scorecard. A model proposed by the consultant, with MfDR at the centre and four quadrants:
- Strategic Management: Governance; Corporate Strategy; Country-Level Strategies; Corporate Indicators/Targets
- Knowledge Management: Organisational Stewardship; Performance Monitoring; Evaluation Management; Performance Reporting; Lessons Learned
- Relationship Management: Donor Harmonisation with Multilaterals and Bilaterals; Alignment with Country Partners
- Operational Management: Financial Resources; Human Resources; Quality @ Entry; Country Portfolio]


ANNEX A: LEXICON OF TERMS USED

The key terms used in this paper17 are defined as follows:

Accountability: Obligation to demonstrate that work has been conducted in compliance with agreed rules and standards or to report fairly and accurately on performance results vis-a-vis mandated roles and/or plans. This may require a careful, even legally defensible, demonstration that the work is consistent with the contract terms. Note: Accountability in development may refer to the obligations of partners to act according to clearly defined responsibilities, roles and performance expectations, often with respect to the prudent use of resources. For evaluators, it connotes the responsibility to provide accurate, fair and credible monitoring reports and performance assessments. For public sector managers and policy-makers, accountability is to taxpayers/citizens.

Development Intervention: An instrument for partner (donor and non-donor) support aimed to promote development. Note: Examples are policy advice, projects and programs.

Development objective: Intended impact contributing to physical, financial, institutional, social, environmental, or other benefits to a society, community, or group of people via one or more development interventions.

Effect: Intended or unintended change due directly or indirectly to an intervention.

Effectiveness: The extent to which the development intervention's objectives were achieved, or are expected to be achieved, taking into account their relative importance. Note: Also used as an aggregate measure of (or judgment about) the merit or worth of an activity, i.e. the extent to which an intervention has attained, or is expected to attain, its major relevant objectives efficiently in a sustainable fashion and with a positive institutional development impact. Related term: efficacy.

Impacts: Positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.

Institutional Development Impact: The extent to which an intervention improves or weakens the ability of a country or region to make more efficient, equitable, and sustainable use of its human, financial, and natural resources, for example through: (a) better definition, stability, transparency, enforceability and predictability of institutional arrangements and/or (b) better alignment of the mission and capacity of an organization with its mandate, which derives from these institutional arrangements. Such impacts can include intended and unintended effects of an action.

Lessons learned: Generalizations based on evaluation experiences with projects, programs, or policies that abstract from the specific circumstances to broader situations. Frequently, lessons highlight strengths or weaknesses in preparation, design, and implementation that affect performance, outcome, and impact.

Outcome: The likely or achieved short-term and medium-term effects of an intervention's outputs.

Performance: The degree to which a development intervention or a development partner operates according to specific criteria/standards/ guidelines or achieves results in accordance with stated goals or plans.

Results: The output, outcome or impact (intended or unintended, positive and/or negative) of a development intervention. Related terms: outcome, effect, impacts.

17 From the "Glossary of Key Terms in Evaluation and Results Based Management" published by the Organisation for Economic Co-operation and Development (OECD), Development Assistance Committee (DAC) Working Party on Aid Evaluation. OECD, Evaluation and Aid Effectiveness #6, May 28, 2002.


ANNEX B: SUMMARY OF MOPAN SURVEY METHODOLOGY18

Background
Since 2003, the Annual Survey has covered 3-4 multilateral organisations and has been conducted in 8-10 countries each year. The first Annual Survey was implemented in 2003 as a pilot, followed by three full-fledged Annual Surveys in 2004, 2005 and 2006.19 The organizations selected for the 2007 survey are the UNDP, WHO/PAHO and AfDB. The survey will be conducted in Benin, Egypt, Ethiopia, Mali, Senegal, Zambia, Bangladesh, Serbia, Bolivia, Brazil and Nicaragua.

Purpose
The objectives of the MOPAN Annual Survey may be summarized as follows:

• better information and understanding of multilateral organizations, their roles and performance by decision-makers concerned, parliamentarians and the general public in the MOPAN member countries;

• better informed dialogue with the multilateral organizations, both at headquarters and the country level; and

• improved overall performance of multilateral organizations at the country level.

Design Characteristics
The design of the MOPAN Survey has two parts. The first part focuses on the quality of the MOs' partnership behaviour towards national stakeholders (government, NGOs, private sector). The questions in this part of the survey solicit opinions on the following topics: policy dialogue, capacity development, advocacy, and alignment with national poverty reduction strategies, policies and procedures. The second part of the survey focuses on the quality of the MOs' partnership behaviour towards other international development agencies. The questions in this part of the survey solicit opinions on the following topics: information sharing, interagency coordination and harmonisation.

Methodology
The MOPAN Annual Survey is light and rapid, with minimal transaction costs. Participating MOPAN member embassies and country offices fill in a questionnaire on each of the multilateral organizations surveyed. Many MOPAN members work with the MOs through co-financing or participation in joint initiatives and donor meetings, and so have an opportunity to directly observe their partnership behaviour. The questionnaires are followed by joint discussions of the answers among MOPAN members at the country level (country teams). The joint group discussion provides a mechanism for testing individual views and forming a collective opinion. From these inputs, the country teams establish country reports, which are then aggregated into a Synthesis Report. This report is shared with the relevant multilateral organizations for feedback before its public release. The MOPAN Survey methodology has evolved over the years and now includes a series of tools and steps in what has become a well understood process:

• The Agency Template describes key aspects of each multilateral organization and includes information on mandate, structure and background. Templates are prepared by the MOPAN headquarters groups and provided to assist the country teams.

18 This is a summary of the essential components of the methodology, adapted from the CIDA MOPAN web site and combined with the consultant's interview notes.

19 The MOs selected for the 2006 MOPAN survey were UNICEF, ILO and ADB. The exercise took place in the following countries: Burkina Faso, Kenya, Mozambique, Uganda, Indonesia, Nepal, Pakistan, Sri Lanka, Colombia, and Guatemala.


• Standard questionnaires are used to help each participating staff member to document his/her opinions on the multilateral organization's performance. The questionnaire results are fed into the country and synthesis reports and are used as a basis for the group discussions by the MOPAN in-country teams.

• Country team discussions are used to share the individual knowledge and views of the country team members and to "pool" the perceptions collected through the standard questionnaires. A collective overview of the organization emerges, and the country team is able to expand its knowledge of the organization as well as develop a feeling of ownership of the results.

• The Country Report is a summary of the findings of the country team discussions. The report also includes information about the group process in reaching its consensus. The use of local consultants in the country report process is discouraged. This country report is generally discussed with the agencies concerned before finalization.

• The Synthesis Report is built up centrally by analyzing and summarizing the country reports for each multilateral organisation. The aggregate questionnaire data is analysed and used in the synthesis report.

• In an effort to promote transparency and dialogue among stakeholders, MOPAN has asked multilateral organizations to provide their feedback on survey conclusions, and post it on their websites together with the MOPAN Synthesis Report.


ANNEX C: SUMMARY OF THE MEFF METHODOLOGY20

Background
In 2003-04, the DFID International Division (ID) established a Multilateral Effectiveness Framework (MEFF) for assessing the organisational effectiveness of the multilaterals that it funds centrally. It took 21 months to develop and administer the MEFF with twenty-three organisations, but the exercise has not been repeated since.

Purpose
The main objectives of the MEFF are to:

• Provide an information and monitoring system that would support DFID's reporting on its Public Service Agreement (PSA) objectives;

• Provide inputs to DFID's corporate engagement with multilaterals via Institutional Strategies (ISs); and

• Provide inputs to future financing decisions.

Design Characteristics
The MEFF design focuses on eight corporate management systems, i.e., corporate governance, corporate strategy, resource management, operational management, quality assurance, staff management, M&E lesson learning, and reporting, from three perspectives: internal performance, focus on country-level results, and partnership. It is designed to be applicable to all MOs, but recognizes the need to differentiate among different groupings: multilateral development banks, UN development organizations, UN standard-setting organizations, humanitarian organizations, coordinating organizations and the European Union Commission.

Methodology
The MEFF uses three main assessment instruments: a checklist of indicators, expressed as questions; a scorecard rating the data in the checklists; and a summary report. These tools, associated processes and reports are described below:

• The Checklist is the main assessment instrument. It is designed as a matrix, with the eight organisational systems listed horizontally and the three perspectives vertically. This gives a total of 24 cells in the matrix, in each of which there are up to four questions.

THE MEFF CHECKLIST
(number of questions per cell: Focus on Internal Performance / Focus on Country Level Results / Focus on Partnership)

Corporate Governance: 3 / 3 / 3
Corporate Strategy: 3 / 3 / 3
Resource Management: 4 / 3 / 2
Operational Management: 4 / 2 / 3
Quality Assurance: 4 / 3 / 2
Staff Management: 4 / 2 / 3
M&E Lesson Learning: 3 / 3 / 3
Reporting: 3 / 3 / 3

20 This is a summary of the essential components of the methodology, adapted from "The MEFF Methodology: A review of DFID's multilateral effectiveness framework" by Allison Scott, DFID, IDAD, March 2006, and combined with the consultant's interview notes.


The questions in the Checklist avoided evaluative language that would require a subjective judgement on the part of the assessor. The questions were deliberately factual in nature: they asked whether a system or practice was in place, not in place, or under development. However, simple Yes or No questions were avoided. Guidelines were developed to provide key definitions and rules for completing the assessment. The answers were entered electronically into the checklist in the form of a short piece of text that supplied factual evidence in response to the question. Where possible, information sources were to be cited. The MEFF information base was thus textual and qualitative.

• The Scorecard summarised and illustrated the information generated from the checklists. It is a simple traffic light system: green - the system or practice is in place; amber - it is under development; red - it is not in place. A blue score was entered if there was insufficient information available, and the cell was left blank if the question was not relevant to the agency. While the Scorecard colour system is criterion-referenced to a certain extent, it nevertheless leaves considerable room for interpretation.

The Scorecard provided three levels of information. At the most disaggregated level (level 3), it contained the scores for each question in the matrix. At level 2, the scores were aggregated within each organisational system and by perspective, by registering the average or predominant score in the category, with a small chip to reflect the presence of higher or lower scores. At level 1 they were aggregated by perspective. The Guidelines provided rules for attributing scores and aggregating them to levels 2 and 1.
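The mechanics of this aggregation can be sketched in a few lines of code. The fragment below is a minimal, hypothetical reading of the level 2 rule only, assuming a predominant-colour rule with ties broken towards the more cautious score and small "chips" flagging the presence of higher or lower scores; the actual rules in the DFID Guidelines are not reproduced in this paper.

from collections import Counter

# Traffic-light scores ordered from weakest to strongest.
ORDER = {"red": 0, "amber": 1, "green": 2}

def aggregate_level2(question_scores):
    """Aggregate question-level (level 3) scores into a level 2 summary.

    Returns the predominant colour plus flags indicating whether any
    individual scores sit above or below that predominant colour
    (the small "chips" on the scorecard).
    """
    # Ignore 'blue' (insufficient information) and blank (not relevant) entries.
    rated = [s for s in question_scores if s in ORDER]
    if not rated:
        return {"predominant": "blue", "above": False, "below": False}

    counts = Counter(rated)
    # Most common colour; ties broken in favour of the lower (more cautious) score.
    predominant = min(counts, key=lambda c: (-counts[c], ORDER[c]))
    return {
        "predominant": predominant,
        "above": any(ORDER[s] > ORDER[predominant] for s in rated),
        "below": any(ORDER[s] < ORDER[predominant] for s in rated),
    }

# Example: one organisational system under one perspective.
print(aggregate_level2(["green", "amber", "green", "blue"]))
# {'predominant': 'green', 'above': False, 'below': True}

The same logic could be applied a second time across organisational systems to arrive at the level 1 summary by perspective.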

MULTILATERAL EFFECTIVENESS SCORECARD
Dream Development Agency - Level 2 Summary

[Figure: an illustrative level 2 scorecard. Rows list the eight organisational systems (1 - Corporate Governance; 2 - Corporate Strategy; 3 - Resource Management; 4 - Operational Management; 5 - Quality Assurance; 6 - Staff Quality; 7 - Monitoring, Evaluation and Lesson Learning; 8 - Reporting). Columns group the scores by perspective (Internal Performance, Country-Level Results Focus, Partnerships Focus) for 2004, 2005 and 2006. Legend: green - the system is in place; amber - the system is not yet in place, but improvements are underway; red - no system in place yet. Each cell shows the predominant colour at level 2, with small chips indicating scores above or below the predominant colour.]


• Preparatory Meetings were held during the first three months with the MOs to explain the assessment process and to provide an opportunity for discussion of any questions or concerns. Each MO was provided with the full set of documents (approach paper, checklist and scorecard templates, guidelines).

• The First Draft checklist and scorecard were completed by the DFID desk officers. There then followed a period of consultation with the agencies on these drafts, more intensive in some cases than others. The MOs responded in various ways: they awaited the first draft and then commented; they implemented the checklist themselves with a view to comparing versions (e.g. WFP, AfDB); or they completed it jointly with DFID staff (e.g. IFAD, UNIFEM). In many cases, DFID lacked sufficient information to complete the checklist and had to request information from the MOs. Sometimes this took the form of the MOs sending relevant reports; at other times they suggested text for the checklist.

• While quality assurance was provided throughout the implementation process, the final adjustment to scores was minimal. The final step was to assess scoring consistency between assessments. DFID examined all the scores in the light of the evidence provided in the checklists and consistency of scoring across the 22 assessments. They found a small number of questionable scores (4.7% of all 1,574 scores), which were concentrated in only five MOs. Questionable scores mostly arose on the boundary between green and amber scores, usually because of a tendency to consider not just whether a system was in place, but whether it was working well. In the interest of accuracy and cross-agency consistency, the questionable scores were adjusted. This mainly had the effect of raising scores, but there were also several cases where they were lowered. Because of the small numbers involved, the overall impact of the adjustment was small both for individual MOs and in aggregate.

• The Final Dialogue Meeting was usually attended by senior management staff (Heads of Departments) of the MOs, depending on how seriously they took the exercise. It involved detailed discussion of the textual answers as well as the scores. Although this was a consultation, not a negotiation, in most cases it was easy to reach agreement. Undoubtedly, some MOs were concerned to maximise the scores, but many were very self-critical and actually reduced DFID's scores.

• The Summary Report provided an overview of the multilateral organisation's strengths and weaknesses on each of the three perspectives, together with relevant background information on its structure and mandate, and any recent performance-related reforms. It also identified three areas that would be used for future monitoring of the organisation's effectiveness.


ANNEX D: SUMMARY OF THE MES METHODOLOGY21

Background
In 2006, DFID devised a "new approach" to assessing multilateral effectiveness. Rather than attempting to create, or recreate, an all-encompassing single tool to measure agency effectiveness, DFID developed a system which collates existing information from a variety of data sources and presents it within a coherent framework. This is the "Multilateral Effectiveness Summary".

Purpose
The MES will give users a general overview of an organisation's effectiveness. It will include basic data on the organisation, the context in which the organisation is working, as well as a summary assessment of effectiveness. The summaries are aimed at providing senior managers and staff with a concise summary of all available information on the effectiveness of a particular multilateral organisation. While the information will help inform management decision making, the MES is not designed to produce a ranking of MOs by their effectiveness or to provide a basis for expenditure decisions; rather, it is intended to ensure senior managers have a consistent picture of each multilateral.

Design Characteristics
In developing the MES, DFID has adapted the Balanced Scorecard (BSC) approach (see below), tailoring it to the specific context in which multilateral organisations operate. The DFID BSC used to assess MO effectiveness focuses on four broad areas linked to the results chain.

Building for the Future - "How is the organisation building for the future through sharing information, learning and innovating?" This area considers whether the organisation is committed to a culture of continual learning and improvement, how it manages information and its results and whether it is investing in its staff.

Managing Resources - "How is the organisation managing its activities and processes?" This area looks at how the organisation uses its financial and human resources. It includes indicators such as disbursement ratios, resource allocation criteria, staff recruitment, postings and promotions processes and degree of decentralisation.

Partnerships - "How is the organisation engaging with other development partners?" This area looks at how the organisation works with other development partners, including developing country governments themselves. It encompasses Paris Declaration Indicators on alignment and harmonisation.

Country/Global Performance - "How well is the organisation performing either at the country level and/or at the global level?" The final area considers any information which is available on actual performance either at the country level or at the global level. This includes the results of any evaluations, by the organisation itself or by other organisations, information on portfolio quality, any data on programme targets and any other indicators on outcomes and impact.

These four areas represent the quadrants of the BSC and can be seen as broadly following the results chain from inputs to outputs to outcomes and impact. They focus on the MOs' institutional effectiveness and provide proxy measures of developmental effectiveness.

21 This is a summary of the essential components of the methodology, adapted from two DFID practice papers, "Factsheet" (June 2006) and "Methodology Note" (November 2006), and combined with the consultant's interview notes.


Methodology
While the conceptual framework and the tools designed for the MES are well documented, the data collection and analysis processes are less well described. It appears that the MES will be completed by DFID staff using secondary data from a variety of sources, including the MEFF, the MOPAN Annual Survey, the Paris Declaration indicators, as well as the MOs' own performance reports. Below are the key tools and reports that would be used to support the MES preparation process.

• The Balanced Scorecard that DFID has adapted reflects its own business needs and is used for its own Quarterly Management Report. DFID proposes to use essentially the same framework, as presented below, to assess the effectiveness of multilateral organizations.

DFID Quarterly Management Board Report Framework

[Figure: the DFID Quarterly Management Board Report framework. At the centre: DFID Mission - poverty reduction through partnerships. Four quadrants: Managing Resources ("How well are we planning and managing our resources?"); Managing External Relationships ("Are we managing our relationships effectively?"); Building for the Future ("Are we delivering our people and organisation for the future?"); Delivery of Results ("Are we delivering against our PSA targets?").]

• A Traffic Light Assessment will be made in the future based on the information collated for each of the above four quadrants. As far as possible, the ratings would be based on objective evidence from the published data sources. However, in cases where the evidence base is weak, the judgements of desk officers working with the particular multilateral organisations will also inform the overall assessment.

• The Multilateral Effectiveness Summaries are short publications which will have a standardised format as follows: MO tombstone data, the BSC/Traffic Lights Assessment, a Disclaimer (below), the findings on the four areas of the BSC and data references.

*Disclaimer
This Effectiveness Summary is a tool designed to simply present the latest available published information on multilateral X's effectiveness. It will help inform our policy decisions but will not be used as the only basis to make policy recommendations on funding. It is intended for internal use only. Although the balanced scorecard assessment aims to be as objective as possible, the colours reflect our judgement of what we think the information tells us. It should be noted that the amount of information available and the quality and reliability of that information varies considerably, so there is a limit to which we can use the summaries for comparative purposes.


ANNEX E: SUMMARY OF THE MERA METHODOLOGY

Background
The Canadian International Development Agency (CIDA) has developed a standard template of questions called the Multilateral Effectiveness and Relevance Assessment (MERA) system. It is relatively new, having been used only once, in 2006.

Purpose
The main objectives of the MERA are:

• To better inform policy and financial allocation decisions with more solid evidence. Ultimately, we aim at getting more predictable multi-year funding, and an adequate balance between core and responsive/earmarked funding;

• To better exercise our accountability and improve our reporting to Canadian citizens;

• For more effective Board meetings and better identification of areas requiring improvements; and

• To deliver on the MPB commitment to complete the multilateral effectiveness review, as part of the Agency Aid Effectiveness agenda.

The findings will be used as a basis for making evidence-based policy decisions and financial allocations. They will help identify areas in multilateral organizations requiring improvement. They will also improve Branch accountability, the effectiveness of our Board interventions, and reporting to Canadians. This should eventually allow CIDA to link funding decisions to the effectiveness of the organizations.

Design Characteristics
The MERA provides a framework for comparing multilateral organisations receiving core funding from the Multilateral Programs Branch, CIDA. Its design is organised around three main criteria: relevance, effectiveness, and management. Each MO is reviewed through a set of questions covering these three criteria. The "relevance" criterion includes questions regarding the MO's strategic coherence with the MDGs, Canadian development priorities, and its role within the multilateral aid architecture. The "effectiveness" criterion includes questions regarding the MO's own performance rating, achievement of sustainable results, and the effectiveness of its advocacy and norm-setting work. The "managing of the institution" criterion includes questions regarding governance and financial management, ability to manage for development results, compliance with aid effectiveness principles, and ability to mainstream gender equality and environment considerations. The majority of the questions are evaluative in nature, requiring the assessor to make an informed judgement based on the available information. MOs are rated on a normative five-point numeric scale on each of the three criteria.

Methodology
Assessments are conducted by the responsible MPB Desk Officers, who draw on multiple data sources: two surveys (MOPAN and a field survey on multilateral institutions), reporting from multilateral organizations and their evaluations, external evaluations commissioned by CIDA or conducted jointly with other donors, and due diligence. Individual assessments are reviewed and validated by the Branch Management Group to mitigate any extreme opinions. The process is admittedly subjective, using mostly evaluative questions with some empirical questions. The next step is said to be a review of the current level of funding to these multilateral organisations and a proposal of new allocations, as required, to address weaknesses. The following standard template is applicable to all multilateral organisations.


MERA Standard Template

Sources of information (CIDA Desk Officers can draw from):
• MOPAN surveys
• Other donor surveys
• CIDA field surveys
• Reports from the MO itself
• Evaluation reports

Relevance:
• Links to MDGs: general link between the MO's mandate and the MDGs; specific MDGs targeted by the MO
• Role in the multilateral architecture: how the MO fits in the multilateral development system (region/sector specific, UN, IFI or other); kind of aid (loans, grants, budget support, technical assistance); partnership with another MDI; role within the MDI system in a particular area
• Support to IPS sectors: sectoral breakdown of the most recent year of programming

Effectiveness:
• Overall governance: general overview of the MO; donor or recipient driven; resident board or executive council; independent evaluators' impression of governance; general reputation; personal insights of the evaluator
• Efficient use of resources: operating costs as a percentage of aid delivery; strengths and weaknesses; factors which hinder efficiency
• Donor harmonization and alignment (Paris Declaration): work towards the objectives of the Paris Declaration; harmonizing/aligning with other UN organizations, IFIs and bilaterals
• Managing for results: extent of RBM systems; reporting systems; quality of reporting, monitoring and evaluation systems
• Gender mainstreaming: policy; implementation
• Environment mainstreaming: policy; implementation

Improvement measures (by the institution and by MPB):
• Status of donor- or MO-driven specific management improvements (e.g. implementing a report, instituting a new policy, efforts to improve RBM)
• General improvements (e.g. ensuring a strong voice on the Board on key issues)

Overall assessment: description of how relevant and effective the institution is, both in its own right and compared with other MDIs and donors.
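Purely as an illustration of the kind of record this template produces, the sketch below represents a single MERA assessment with the three criteria rated on a five-point scale. The class name, field names and scores are hypothetical and are not part of CIDA's actual MERA system.

from dataclasses import dataclass, field

@dataclass
class MERAAssessment:
    """Hypothetical record of one MERA assessment (illustrative only)."""
    organisation: str
    # Each of the three MERA criteria is rated on a 1-5 scale.
    relevance: int
    effectiveness: int
    management: int
    notes: dict = field(default_factory=dict)  # free-text evidence per criterion

    def weak_areas(self, threshold: int = 3) -> list:
        """Return criteria scoring below the threshold, i.e. candidate areas for improvement."""
        scores = {"relevance": self.relevance,
                  "effectiveness": self.effectiveness,
                  "management": self.management}
        return [name for name, score in scores.items() if score < threshold]

# Example usage with invented scores.
example = MERAAssessment(
    organisation="Example MO",
    relevance=4,
    effectiveness=3,
    management=2,
    notes={"management": "RBM systems under development; weak evaluation coverage."},
)
print(example.weak_areas())  # ['management']

A record of this kind would make it easy to track changes in ratings between assessment rounds and to flag, in a consistent way, the areas requiring improvement that the Branch Management Group reviews.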


ANNEX F: SUMMARY OF THE SCORECARD METHODOLOGY22

Background
The Netherlands Ministry of Foreign Affairs (MFA) has been conducting annual performance assessments of the most important MOs at the country level in the 36 Dutch partner countries since the Beoordelingskader (BOK) questionnaires of 2002 and 2003, which were sent to embassy staff, permanent mission representatives and constituency office staff. In 2003 the BOK was replaced by the MMS (Multilateral Monitoring System), and in 2004 the Scorecard method was introduced. The Scorecard method is considerably more detailed and more robust than the way MOs were assessed in the past. While the Scorecard has been an internal document of the Dutch MFA for some time, it has recently been used more openly.

Purpose
The Scorecard plays an important role in informing the MFA's policy towards the MOs concerned, including general policy dialogue and interventions at governing council meetings, and may also form the basis for professionalizing financial relationships with the MOs by influencing the amount and nature of the Netherlands' voluntary contributions. It is not used to draw firm and unequivocal conclusions, because there is a subjective element in the weighting of criteria. In addition, considerations other than MO effectiveness also play a part in decision-making on how Dutch resources are allocated, such as the perceived role of the MOs in the multilateral system.

Design Characteristics
Drawing up the Scorecards is not a one-off exercise and must be interpreted from a multi-year perspective. The Scorecard reflects the quality of the MO's performance over time, the extent to which it has adopted a results-based approach and how relevant its programming has been to the Netherlands and to Dutch policy objectives. Three assessment criteria are used, which focus on: a) internal performance, such as results-based management (RBM), financial management and regularity, and anti-corruption activity; b) external performance, such as focus on core mandate and coordination/cooperation; and c) relevance to international development cooperation priorities, i.e. the MDGs. A five-point normative Likert scale is used to rate MO performance and relevance. Various data sources are used in completing the Scorecard, including experiences at MFA headquarters and other Dutch Ministries, the perceptions of policy departments, data collected from the MMS and MOPAN surveys, other findings from the field and the MO's own evaluation reports.

Methodology
While the Scorecard template remains available only in Dutch and the methodology for completing it remains undocumented and somewhat unfamiliar to most, the MMS is much better known, which may have led to the misconception that the latter constitutes the Dutch MO assessment process. What is known is that the Scorecard assessments are conducted by the responsible MFA Policy Officer, who solicits the opinions of informed staff in other Ministries and draws on the available data generated from the MMS and the MOPAN surveys, as well as content analysis of MO strategy documents, evaluation reports and performance reports.

22 This is a summary of the essential components of the methodology, adapted from the "Fact Sheet for Multilateral Monitoring" (April 2007) and combined with the consultant's interview notes.

The MMS is a perception survey in which Embassy staff are asked to update the previous performance assessment of the MOs active in their respective countries. For methodological uniformity, Embassy staff describe and rate the MOs based on the same criteria as used in the Scorecard. They rate their level of satisfaction on fifteen (15) questions using the following normative rating scale: + satisfactory, - unsatisfactory, and ? don't know. The perceptions gained are subjective, but taken together they give an impression of how the MOs perform at the country level. So, instead of objectivity and automatic conclusions, the MMS provides collective subjectivity. Since perceptions do not appear to change significantly over a year, the MMS is updated every other year, which makes it possible to identify trends in an MO's performance, both nationally and internationally. The regional departments and the United Nations and International Financial Institutions Department (DVF) check the MMS data for consistency. In the case of differing views, the mission is contacted and asked for additional information. DVF places the results on the intranet, so that they are available to everyone at the Ministry. The 2005 MMS round at country level has been completed and can be found on the MFA intranet site. The next MMS round took place in 2007; results are being received and processed at the moment. The updated MMS data is integrated into the Scorecard Report. A recently updated Scorecard Report contains the following headings and information:

Netherlands Scorecard

GENERAL BACKGROUND
Aim and mandate (narrative description)
Working method and activities (narrative description)
Financial scope and contributions (narrative description)
Dutch representation (narrative description)
Trends among other donors (narrative description)

PERFORMANCE REVIEW
Cluster 1: Internal performance
1. Effectiveness of administration and management (rated assessment and description)
2. Transparency (rated assessment and description)
3. Monitoring and evaluation (rated assessment and description)
4. Results-based management (rated assessment and description)
5. Financial management and legitimacy (rated assessment and description)

Cluster 2: External performance
1. Focus on core mandate (rated assessment and description)
2. Coordination and cooperation with multilaterals (rated assessment and description)
3. Coordination and cooperation with bilaterals (rated assessment and description)
4. Implementation of policy at country level (rated assessment and description)

Cluster 3: Relevance
A. General relevance of the institution (rated assessment and description)
B. Specific relevance for the priorities in "Mutual interests, mutual responsibilities"
   1. MDGs
      a) General (rated assessment and description)
      b) HIV/AIDS (rated assessment and description)
      c) Reproductive health (rated assessment and description)
      d) Basic education (rated assessment and description)
      e) Water and sanitation (rated assessment and description)
      f) Environment (rated assessment and description)
      g) Gender (rated assessment and description)
   2. Security and stability (rated assessment and description)
   3. Private sector development (rated assessment and description)
   4. Africa (rated assessment and description)
   5. Good governance (rated assessment and description)
   6. Regional approach (rated assessment and description)
   7. Coherence, trade and debt policy (rated assessment and description)

CONCLUSION (narrative description)


ANNEX G: SUMMARY OF THE PMF METHODOLOGY23

Background
In 2003, Danida experienced a corporate restructuring which included the decentralisation of responsibilities for bilateral and multilateral programming and monitoring to the field. While the Copenhagen Headquarters retains responsibility for policy development and guidance, the permanent missions cover the MO Board Meetings. The Performance Management Framework (PMF) was developed by Danida HQ in response to this increasingly decentralized management structure and the consequent need for a performance monitoring and management tool. Since 2004, assessments of MO effectiveness have been conducted on an annual basis.

Purpose
The objectives of the PMF are to:

• enhance the quality of Danish development cooperation through stronger focus on results;

• improve management and continuous learning, through better information and reporting; and

• strengthen accountability through performance assessments and measurement in the context of an increasingly decentralized management structure.

The PMF systematically focuses on results with a view to providing performance information for management decisions that optimise value for money and ensure the prudent use of human and financial resources in support of the overall objective of poverty reduction. The performance information creates a basis for timely and relevant decisions and/or corrective action at all levels in Danida and hence improves the internal management and decision-making process. By encouraging the formulation of clearer objectives, better documentation of results and systematic monitoring of progress towards achieving these, the PMF helps Danida and partners to strengthen the effectiveness of policies and programmes.

Design Characteristics
The PMF is not an assessment tool per se, but rather a corporate framework and a set of specific tools for managing performance. The framework has two major parts: Bilateral Development Cooperation and Multilateral Development Cooperation. The PMF for multilateral cooperation has three levels of analysis: 1) Danida Corporate level, 2) MO HQ level, and 3) MO Country level. At the Danida Corporate Level, the PMF enables Danida to assess:

1. Overall progress towards reaching the aim of poverty reduction through the multilateral organisations.

2. The multilateral organisations' contribution towards objectives regarding cross-cutting issues of gender equality, environmental concerns, as well as promotion of human rights, democracy and good governance.

3. The multilateral organisations' contribution towards improving aid effectiveness, including support for processes of alignment and poverty reduction strategies in partner countries, as well as progress in donor coordination and harmonization.

23 This is a summary of the essential components of the methodology, adapted from "Performance Management Framework for Danish Development Cooperation 2006-07", prepared by the Quality Assurance Department, KVA (December 2005), and combined with the consultant's interview notes.

At the MO HQ Level, the PMF is based on the "organisation strategies" for each of the major multilateral organisations. They are formulated by Danish multilateral Representatives and relevant departments in the MFA in very close consultation with the MO, and define the strategic priorities for the Danish cooperation with the organisation. The agreed objectives and priorities are to be taken from the strategic document of the MO to ensure full alignment and to facilitate subsequent reporting. While organisation strategies cover a three- to four-year period, annual action plans define indicators and targets to be achieved during each year. Danida draws on the information generated in the context of high-level consultations and on the assessment forms filled out by multilateral Representations and relevant departments of the MFA. Also, in line with international commitments to reducing the transaction costs of development cooperation, Danida is interested in relying more on performance reporting provided by the multilateral agencies themselves. However, it is a precondition for this approach that reliable information is provided by the MOs. The Danida RBM Assessment program is an initiative that addresses the current weaknesses in MO reporting on development results.

At the Country Level, the PMF may vary from one country to another, and various systems to assess country-level performance are established both by the MOs themselves and by the bilateral donors. Danida participates, together with like-minded donors, in the joint donor assessments of multilateral organisations' efforts at the country level (MOPAN), which usually focus on three organisations in 8-10 countries every year. To supplement this information, Embassies in programme countries draw up annual assessments of the field-level performance of the three (3) most important MO partners operating in the country.

Methodology
The PMF includes a series of tools for its quality assessment of MOs, including:

• Organisation strategies govern the Danish cooperation with the multilateral organisations receiving more than DKK 20 million annually, as well as organisations that are strategically important to Danish multilateral cooperation. These strategies outline the overall Danish priorities vis-à-vis the organisation, and progress in meeting these objectives is measured annually through action plans which contain indicators of performance in pre-determined areas.

• High-level Consultations between the MFA and each of the multilateral organisations take place at least biannually to assess the organisation's progress towards implementing the Danish strategy for the organisation and issues relating to the organisation's handling of all general and earmarked contributions provided by Denmark. The minutes of these meetings constitute important monitoring information.

• Perception analyses of the general performance of multilateral organisations are made by multilateral representations and relevant departments in the MFA.

• Joint donor assessments of multilateral organisations' efforts at country level (MOPAN) are carried out annually in a joint effort between Denmark and like-minded donors.

• Perception analyses of the performance of multilateral organisations at country level are carried out by Embassies in programme countries.


- Evaluations/assessments of multilateral organisations' performance reporting systems are carried out jointly with other donors to provide the basis for support to strengthening the organisations' capacity to provide high-quality performance information themselves.

Overview of the PMF for Multilateral Development Cooperation

Danida Corporate Level
- Focus of the PMF: Measure performance and verify results on overall goals, objectives and indicators in key policy documents.
- Key Management Documents: Partnership 2000; MDGs; various sector- or issue-specific policies and strategies.
- Methodology & Data Sources: Annual Report on Action Plans; MO Assessments (HQ & field); Evaluations; MOPAN Survey.
- Reporting and Responsibilities: Annual Performance Report (KVA); Annual Report of Danida (UDV).

MO HQ Level
- Focus of the PMF: Monitor performance against the objectives, targets and indicators set out in the MO's own vision and strategy, and against those set out in Danida's Organisation Strategies (a subset of the former).
- Key Management Documents: MO strategies and action plans; Organisation Strategies and action plans (Danida's).
- Methodology & Data Sources: Multilateral and relevant departments' MO Assessments; RBM Assessment; Evaluations; minutes from high-level consultations and the annual Report on Action Plans.
- Reporting and Responsibilities: Annual Performance Report (KVA); Report to the Board of Danida; RBM Assessment Report (KVA); Evaluation Reports (EVAL).

Country Level
- Focus of the PMF: Monitor performance at country level against the objectives, targets and indicators set out in the MO's own vision and strategy.
- Key Management Documents: MO strategies and action plans.
- Methodology & Data Sources: Embassy staff assessment of MOs; MOPAN Survey.
- Reporting and Responsibilities: Annual Performance Report (KVA); Assessment Reports (KVA).


ANNEX H: SUMMARY OF THE RBM ASSESSMENT METHODOLOGY24

Background
In line with international commitments to reduce the transaction costs of development cooperation, Danida was interested in relying more on performance reporting provided by the MOs themselves. Consequently, the Danida PMF included among its data collection techniques evaluations/assessments of multilateral organisations' performance reporting systems, both to determine the robustness of the MOs' performance reporting systems and to strengthen their capacity to provide high-quality performance information. In 2005, the consulting firm Dalberg Global Development Advisors was contracted to undertake RBM Assessments of a select number of UN agencies, i.e., UNDP, UNFPA and UNICEF.

Purpose
The goals of the RBM assessment are to map, analyse and provide recommendations for strengthening an MO's RBM system. In short, the assessment seeks to identify both what RBM tools are in place and how effectively they are used in pursuit of improved organizational effectiveness and efficiency in delivering development results.

Design Characteristics
The RBM Assessment is designed to examine three key elements in an RBM system, as described in the conceptual framework below.

The assertion is that all three of the above elements need to be sufficiently robust for the organization to have a results-based management culture. A robust RBM architecture, replete with logic models, strategic outcomes, indicators, data collection tools and information management systems, is necessary but insufficient without the human processes that facilitate data collection, analysis, and the use of findings for management decision making and program delivery adjustments. These human processes, in turn, remain weak and ineffective without staff incentives and disincentives to make the RBM system work, and without a public demand by senior management for credible and reliable performance information to consider when making resource allocation decisions.

24 This is a summary of the essential components of the methodology adapted from "Expanding the RBM Assessment Program" prepared by the Netherlands MFA (June 2006) and combined with the consultant's interview findings.


Methodology
The methodology for conducting an RBM assessment must go beyond a systems-theory approach and examine the corporate management culture to determine whether it is sufficiently supportive of managing for results. Assessing the three key elements of the RBM system is nevertheless a good first step, as described in the table below.

However, the methodology must also ensure senior management buy-in and engage the MO in a participatory process during the subsequent steps depicted in the diagram below.


ANNEX I: SUMMARY OF THE FARP METHODOLOGY25

Background
A framework for assessing the relevance and performance of multilateral organisations was designed and launched in early 2007 by the Swedish Ministry of Foreign Affairs in response to demands for improved strategic action, better results orientation, and greater monitoring and accountability for multilateral effectiveness. The need for such a framework and its main criteria were spelled out in the Strategy for Multilateral Development Cooperation, adopted by the Government on April 3, 2007. The framework was developed so that it could feed into the budget process for 2008. Central to the assessment framework is an assessment template (scorecard), which was developed under heavy time pressure and without sufficient testing; it will therefore be subject to further refinement for next year's budget process. Twenty-eight MOs (UN organizations, development banks, the EU and vertical funds) were assessed during this first cycle.

Purpose
The annual structured assessment procedure serves two main purposes, namely to:

• guide decisions on allocations of funds to MOs,

• provide an analytical basis for strategic dialogue with the MOs, and provide the basis for developing specific organizational strategies for those MOs.

The assessments will also feed into the Government's annual reporting on development cooperation to the Swedish Parliament.

Design Characteristics
The template for assessing MOs receiving Swedish funding is qualitative and structured. The principal criteria in the assessment template are relevance and effectiveness. Relevance refers to the compatibility of the organisation's activities with Swedish development goals and to the role of the organisation in the international aid architecture. Effectiveness refers to whether the organisation contributes to the relevant goals set and whether its activities are organised so as to lead to results and to employ aid resources effectively (i.e. internal and external effectiveness). The idea is to provide a snapshot of where the organisation stands and where it is moving. There is, however, a special section allowing for comments on reform processes and positive/negative trends.

The first section, on relevance of goals, is structured along the lines of priorities established in Sweden's Policy for Global Development and allows for four categories of response (concerning degree of compatibility, from "very high" to "none").

The other sections allow for only three categories of response ("yes", "no", "unable to assess"). For each of the four sections an overall assessment is made in terms of "very relevant/effective", "relevant/effective", or "not relevant/effective", and limited space is provided to comment on the assessment made. The second section concerns how the organisation fits into the international aid architecture. The third examines internal effectiveness by asking questions about the organisational structure, RBM focus, existing monitoring and evaluation systems, and the transparency/disclosure policy, including reporting on auditing. The fourth examines external effectiveness by asking questions on selectivity, cooperation with others (i.e. in line with the Paris Declaration), and the degree of focus at country level. Finally, a fifth section allows for comments on planned or on-going reforms and other expected changes (or trends).

25 This is a summary of the essential components of the methodology based on a recent unofficial translation of the country-level questionnaire and combined with the consultant's interview findings.
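To make the template structure more concrete, the sketch below models one possible representation of such an assessment in Python: each section carries its own set of allowed response categories and an overall section rating. The section titles, response options and rating labels are drawn from the description above; the class, its fields and the example question are illustrative assumptions rather than part of the actual Swedish template.

```python
# Illustrative sketch only (not the actual Swedish template): a minimal data
# model for a FARP-style assessment, assuming the section structure and
# response categories described above.
from dataclasses import dataclass, field
from typing import Dict, List

# Section-level ratings named in the text.
SECTION_RATINGS = ["very relevant/effective", "relevant/effective", "not relevant/effective"]

@dataclass
class Section:
    title: str
    allowed_responses: List[str]              # per-question response categories
    responses: Dict[str, str] = field(default_factory=dict)
    comment: str = ""                         # limited space for comments
    overall_rating: str = ""                  # one of SECTION_RATINGS

    def answer(self, question: str, response: str) -> None:
        """Record a response, enforcing the section's allowed categories."""
        if response not in self.allowed_responses:
            raise ValueError(f"{response!r} is not a valid response for {self.title!r}")
        self.responses[question] = response

# Section 1 uses four degrees of compatibility (only the endpoints are named
# in the text; the two middle labels below are assumptions). The remaining
# sections use yes / no / unable to assess.
template = [
    Section("Relevance of goals", ["very high", "high", "low", "none"]),
    Section("Role in the international aid architecture", ["yes", "no", "unable to assess"]),
    Section("Internal effectiveness", ["yes", "no", "unable to assess"]),
    Section("External effectiveness", ["yes", "no", "unable to assess"]),
]

# Example usage with an invented question.
template[2].answer("Does the organisation apply results-based management?", "yes")
template[2].overall_rating = "relevant/effective"
```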

As an annex to the assessment, staff filling in the template were asked to summarise the most important findings from other existing evaluations and similar sources on which the analysis is based. Examples of such sources of information are internal evaluations of the organisation concerned, MOPAN studies, and evaluations done by Sida and other development partners. A special template was also designed and sent out to 20 embassies ahead of the overall assessment in order to get a perspective from the field. This template posed questions regarding 11 different MOs. A Likert-scale methodology was used, i.e. the questions were actually statements to which one could indicate, along a five-point scale, the degree of compatibility.
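As a rough illustration of how such five-point responses might be summarised per organisation, the sketch below averages invented Likert scores across embassies; it is a hypothetical example under the assumptions stated in the comments, not a description of the MFA's actual aggregation procedure.

```python
# Hypothetical illustration (not the MFA's actual procedure): summarising
# five-point Likert responses (1 = lowest, 5 = highest degree of agreement)
# per multilateral organisation across responding embassies.
from collections import defaultdict
from statistics import mean
from typing import Dict, List, Tuple

# (embassy, MO, score) triples; the data below are invented for illustration.
responses: List[Tuple[str, str, int]] = [
    ("Embassy A", "UNDP", 4),
    ("Embassy A", "UNICEF", 5),
    ("Embassy B", "UNDP", 3),
    ("Embassy B", "UNICEF", 4),
]

scores_by_mo: Dict[str, List[int]] = defaultdict(list)
for embassy, mo, score in responses:
    scores_by_mo[mo].append(score)

for mo, scores in sorted(scores_by_mo.items()):
    print(f"{mo}: mean score {mean(scores):.1f} across {len(scores)} embassy responses")
```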

A second annex to the assessment provides statistical data on the total resources the organisation has disbursed over the last five-year period. Sweden's contribution over the same period, Sweden's ranking as a donor, and total donor contributions are also included.

Methodology
A special project group within the Ministry of Foreign Affairs has been responsible for developing the assessment template. The group also developed the special template sent out to 20 selected embassies prior to the overall assessment period, so that it could serve as an input. Given the political leadership's expectation of having something to show already in the budget process for 2008, the templates were developed without a sufficient quality assurance process. The current template is thus somewhat of an experiment, from which many lessons will be drawn.

The overall assessment template was filled in through a consultative process led by the responsible desk officer at the Ministry in Stockholm. Input from the template sent to the embassies was collected by the project group and compiled by organisation, as a service to the team. The team leader's task was then to coordinate the completion of the assessment with his or her peers working specifically with the organisation, i.e. staff at Sida, Sweden's UN Representations and other Ministries. In theory there were four weeks to undertake the assignment; in practice, given Easter and other holidays on top of the regular workload, it was much less.

The completed assessments were sent to the project group for quality assurance. Three aspects were considered particularly important: completeness, internal consistency between comments and ratings, and whether the overall assessments seemed sufficiently candid. The project group, together with division heads, went through all assessments to examine these aspects. Although the assessments were of much better quality than anticipated, a number of team leaders were asked to take a second look at their assessments. The final assessments were then sent to a special reference group consisting of heads of departments to discuss whether the assessments were of sufficiently good quality and how they could be used in this year's national budget allocation process. This is where the process stands today. Given the methodological weaknesses and the novelty of the whole process, it is probably safe to say that the assessments will mainly serve as a guide for dialogue with the MOs and will have less of an impact on budget allocations.


ANNEX J: CROSS-CASE COMPARISON TABLE OF MULTILATERAL ASSESSMENT APPROACHES

NAME (Bilateral Donor): MEFF (DFID); MES-BSC (DFID); MERA (CIDA); Scorecard (MFA - Netherlands); PMF (DANIDA); RBM Assessment (DANIDA); FARP (MFA - Sweden).

Background and Purpose

Inception Year: MEFF 2004; MES-BSC 2007; MERA 2006; Scorecard 2004; PMF 2003; RBM Assessment 2006; FARP 2007.

Frequency: MEFF Once; MES-BSC Once; MERA Once; Scorecard Annual; PMF Annual; RBM Assessment Upon request; FARP Once.

# of MOs (/year): MEFF 23; MES-BSC N/A; MERA 19; Scorecard 25 MOs/year; PMF 3 MOs/yr/country across 15 countries; RBM Assessment UNDP, UNFPA, IFAD, OHCHR, UNICEF; FARP 28 MOs/year.

Objectives:
- MEFF: Provide information for the agency's (a) Public Service Agreement reporting, (b) its Institutional Strategies, and (c) its financing strategies.
- MES-BSC: To ensure senior managers have a consistent picture of the effectiveness of each MO and to help inform management decision making.
- MERA: To determine effectiveness and relevance with a view to directing future resources to enhance MO performance.
- Scorecard: To assess the relevance of MOs to Dutch policy objectives, their performance and their adoption of RBM.
- PMF: To (a) enhance the quality of development cooperation, (b) improve management and continuous learning, and (c) strengthen accountability.
- RBM Assessment: Provide support to (a) strengthen MOs' RBM capacity and (b) enable bilateral donors to depend on MO results reporting for assessments.
- FARP: To inform the budget decision-making process and to provide an analytical basis for strategic work with the MOs.

Design Characteristics

Assessment Criteria and/or Perspectives:
- MEFF: Eight organizational systems and their focus on (a) internal performance, (b) country-level results and (c) partnerships.
- MES-BSC: Four quadrants of the Balanced Scorecard: (a) Building for the Future, (b) Managing Resources, (c) Partnerships and (d) Country/Global Performance.
- MERA: (a) relevance, (b) results, (c) managing of the institution.
- Scorecard: Scorecard criteria: (a) internal performance, (b) external performance, and (c) relevance.
- PMF: (a) Danida corporate level; (b) MO HQ level; (c) country level (field level).
- RBM Assessment: RBM system: (a) architecture, (b) process, and (c) incentives / resource allocations.
- FARP: Relevance: (a) goal congruence and (b) role relevance; Agency performance: (c) internal performance and (d) external performance.

# of Criteria: MEFF 8; MES-BSC 4; MERA 3; Scorecard 3; PMF 3; RBM Assessment 3; FARP 4.

International and National Priorities:
- MEFF: Intn'l: MDGs and Paris Declaration; addressed partnership and harmonisation.
- MES-BSC: Intn'l: MDGs and Paris Declaration; partnership and harmonisation.
- MERA: Intn'l: MDGs and Paris Declaration; Canadian priorities: GE, ENVIR, and RBM.
- Scorecard: Intn'l: MDGs and Paris Declaration; Dutch priorities: e.g., PSD, GGDD, anti-corruption, etc.
- PMF: Intn'l: MDGs, PRSPs and cross-cutting issues, e.g., GE, HR, DGG, ENVIR.
- RBM Assessment: N/A.
- FARP: Intn'l: MDGs and Paris Declaration; Swedish priorities: HR, GG, DD, GE, NR, economic growth, social development and security, etc.

Methodology

Overall Methodological Approach:
- MEFF: Assessment initially conducted by DFID staff with various degrees of input from MOs, ending with a final dialogue meeting.
- MES-BSC: Assessment initially conducted by DFID staff using various secondary data sources.
- MERA: Assessment conducted by CIDA MPB Directorate staff drawing on personal perceptions and available empirical data.
- Scorecard: Scorecard updated by MFA Policy Officer, heavily based on opinion surveys and other available empirical data.
- PMF: Assessment led by MFA Desk Officer using multiple lines of evidence, including the Embassy staff survey and the RBM assessment.
- RBM Assessment: Assessment of MOs' RBM systems at HQ/field conducted by consultants in a highly collaborative manner.
- FARP: Assessment by MFA Desk Officer who leads the team, drawing on the informed opinions of, e.g., constituency offices, Delegations and field staff.

Data Sources:
- MEFF: DFID staff with various degrees of input from MO staff and managers.
- MES-BSC: Secondary data sources: MEFF, MOPAN Survey, Paris Survey, DAC peer reviews, MO performance and evaluation reports.
- MERA: CIDA staff, available documentation and MO performance reports.
- Scorecard: Scorecard sources: HQ experiences, policy department perceptions, MMS Survey with 15 questions, MOPAN Survey, other findings from the field and analysis of MO evaluations.
- PMF: Multiple sources: MFA Desk Officers, Regional Delegations (NY, GEN, WASH, ROME), Embassy staff, MO staff and managers (RBM Assessment).
- RBM Assessment: Multiple sources: MO documentation, staff interviews, on-site observations, testing or viewing demonstrations of online systems/reporting; group discussions and workshops with MO management and staff.
- FARP: Mandatory: survey of 20 Embassies; Delegation/HQ input; SIDA thematic input; analysis of audit reports and annual reports. Additional: MOPAN reports; MO evaluations, peer reviews and assessments by other donors.

# of KPI/Qs: MEFF 72; MES-BSC 40; MERA 26; Scorecard 15; PMF 21; RBM Assessment 33; FARP 33.

Type of KPI/Qs: MEFF Empirical; MES-BSC Mixed; MERA Evaluative; Scorecard Mostly evaluative; PMF Mostly evaluative; RBM Assessment Mixed; FARP Evaluative.

Rating Scales or Tools Used:
- MEFF: Yes - 3-point/colour Traffic Lights Assessment.
- MES-BSC: Yes - being developed: BSC with 5-point/colour Traffic Lights Assessment, but won't be applied until 2008.
- MERA: Yes - normative numeric 1-5 for each of the 3 criteria.
- Scorecard: Yes - Scorecard uses normative 4-5 point rank-ordered scales; MMS rating scale: + satisfactory, - unsatisfactory, ? don't know.
- PMF: Yes - Embassy survey uses a normative rank-ordered 5-point satisfaction scale: VS, S, US, VUS.
- RBM Assessment: No.
- FARP: Yes - 3-point rank-ordered scale to rate relevance and effectiveness; for each sub-indicator, generally 3 options: "yes", "no" or "unable to assess".

Scores Used to Compare: MEFF Yes - coloured dashboard; MES-BSC No; MERA Yes; Scorecard No; PMF No; RBM Assessment No; FARP No.

Incentives and Resource Allocations

Report Shared with MOs in Country: MEFF Yes; MES-BSC N/A; MERA No; Scorecard No; PMF No; RBM Assessment Yes; FARP No, but feedback will be provided at high-level consultations.

Summary Report Publicly Available:
- MEFF: Yes.
- MES-BSC: Effectiveness Summaries for 16 MOs will be published in July 2007.
- MERA: No.
- Scorecard: A summary report on each MO based on MMS data is given to the Policy Officer and then input into the Scorecard, which has not been a public document.
- PMF: Yes.
- RBM Assessment: Yes.
- FARP: Not yet decided for the future; for the first year it will not be.

Report Used for Resource Allocation:
- MEFF: Yes.
- MES-BSC: No; rather to inform decision making and to ensure senior managers have a consistent picture of each multilateral.
- MERA: Yes, to reward good performance with additional funds.
- Scorecard: Not mechanically or systematically, but can influence policy dialogue and funding decisions.
- PMF: Not mechanically or systematically, but can influence senior management decision-making.
- RBM Assessment: No.
- FARP: Not mechanically, but intended to guide budget allocations and funding decisions.

General Strengths and Weaknesses

Measurement or Management: MEFF Measurement; MES-BSC Measurement; MERA Measurement; Scorecard Mixed (measurement and management); PMF Mixed (measurement and management); RBM Assessment Mixed (measurement and management); FARP Measurement.

Complexity of Use:
- MEFF: Fairly simple checklist to complete, even for MOs; however, data analysis/reporting becomes complex and time consuming.
- MES-BSC: Somewhat complex to compile secondary data given the variety of data sources and timing; however, less time consuming for data analysis on individual MOs.
- MERA: Simple for desk officers to answer the questions, with no additional primary data collection requirements.
- Scorecard: Scorecard somewhat complex because of the variety of data sources; the MMS is easy for Embassies to answer but time consuming to analyse for individual MOs.
- PMF: Complex given the three levels, variety of data sources and collection methods; the Embassy survey is fairly simple but time consuming to analyse for individual MOs.
- RBM Assessment: Complex and time consuming to conduct, requiring between 60-100 person-days of effort for two people over a three-month period.
- FARP: Fairly simple survey approach, but somewhat complex given it was the first time; rather time-consuming.

Sustainability: MEFF Not for a single donor; MES-BSC Not for a single donor; MERA Yes; Scorecard Yes; PMF Yes; RBM Assessment Not for a single donor; FARP Yes.