
How to Collect Data that feed into WBI's Aggregate Results

A Guide for WBI Teams Working on Capacity Development Initiatives

DRAFT

World Bank Institute Capacity Development and Results Practice

March 5, 2011


Table of Contents

I. Introduction
A. How WBI teams feed data into WBI's results and Key Performance Indicators (KPIs)
B. How the guide relates to the Capacity Development and Results Framework (CDRF)
C. Why and when should WBI teams use this guide

II. Planning for the collection of results data
A. Features of evidence of results that permit aggregation with WBI's results
B. How to use the suggested data collection tools and approaches
C. Implementation tips on how to collect data at different stages of an initiative

III. How to collect results data as stakeholders empowered via ICOs

IV. How to collect results data on institutional capacity change

Annex A – How this guide relates to WBI's results agenda

Annex B – Frequently-Asked Questions

A. Policy
1. What is a TTL's accountability for demonstrating and reporting results of an initiative?
2. What is WBI's policy on participant feedback questionnaires (Level-1 evaluation)?
3. Which of the tools introduced in this guide are mandatory?
4. Are TTLs expected to report results per WBI's aggregate result indicators?

B. Main concepts and rationale for measuring WBI results in this way
5. How are WBI's results related to the CDRF?
6. Why is supporting evidence for results required?
7. Why do the examples of supporting evidence vary broadly?
8. Why use a mix of self-reports and evidence of results from external sources?
9. Why make some questionnaires anonymous and others by name?
10. What is the difference between a milestone, deliverable, process performance indicator, and result?
11. Is a change in network structure a result or a process?
12. How is "client-operated" defined?
13. How are the ICOs on altered processes and relationships counted in WBI results?
14. Why use a process performance indicator if it won't end up in WBI's results?

C. Practical implementation considerations
15. When should results data be collected?
16. When should WBI teams report results data?
17. How can results of an initiative be reported after the ACS has been closed?
18. Should a TTL report a result if the contribution of the WBI initiative was minimal?
19. Will WBICR screen results before aggregating them to avoid reputational risks?
20. Can results be reported confidentially?
21. What happens if an initiative has results, but no supporting evidence?
22. Which media can be used for evidence of results?
23. Will survey data feed into WBI results if their questions and scales differ from WBI's?
24. Should teams collect and report results in numbers, percentages, or both?

D. Results as "stakeholders empowered through intermediate capacity outcomes"
25. What does "direct" and "indirect" client (stakeholder) mean?
26. Does WBI account for results from direct and indirect clients in the same way?
27. Is any participant who took a course based on WBI materials counted in WBI results?
28. Why not use client survey questions to see if participants enhanced their skills?
29. Why have client survey questions relate to the client and not the group?

E. Results as "development initiatives advanced through increased institutional capacity"
30. What is meant by "development initiatives"?
31. What is meant by "advance"?
32. How are changes in institutional capacities demonstrated?

Acknowledgments

Under the overall guidance of Samuel Otoo, Manager of the World Bank Institute Capacity Development and Results Practice, Violaine Le Rouzic, Joy Behrens, Anne Gillies and Sara Okada produced this guide and its supporting tools. Sharon Fisher edited this guide.

The authors are very grateful to Natalia Agapitova, Une Lee, and Cristina Ling for their useful review and comments, to the members of the Working Group on WBI's Results and Key Performance Indicators for their valuable input that spurred the creation of this guide, and to the WBI management team for creating a strategy and an environment that fosters results.


Acronyms

ACS Activity Completion Summary

AIS Activity Initiation Summary

AUS Activity Update Summary

BTOR Back-to-office Report

CD Capacity Development

CDRF Capacity Development and Results Framework

FAQ Frequently-Asked Question

FY Fiscal Year

ICO Intermediate Capacity Outcome

IRIS Integrated Records Information System

KPI Key Performance Indicator

M&E Monitoring and Evaluation

Plato WBI's Planning Tool

SAP Systems, Applications and Products in Data Processing

TTL Task Team Leader

WBI World Bank Institute

WBICR World Bank Institute Capacity Development and Results Practice

WBILT World Bank Institute Leadership Team


I. Introduction

This data collection guide and its associated tools are designed to help WBI teams in charge of Capacity Development (CD) initiatives1 plan for and then collect information on the outcomes of their initiatives in a way that can be aggregated with and reported among WBI's results. The introduction (Chapter I) describes: a) how teams contribute to WBI results and Key Performance Indicators (KPIs), including what is meant by WBI results, b) how this guide relates to the Capacity Development and Results Framework (CDRF), and c) why and when to use this guide. Chapter II helps teams plan how they will collect and use results data. Chapter III suggests data collection approaches per targeted intermediate capacity outcome. Chapter IV gives ideas on how to demonstrate results in terms of institutional capacity change. More information, including Frequently-Asked Questions, is provided in the annexes.

A. How WBI teams feed data into WBI's results and Key Performance Indicators (KPIs)

The WBI results agenda aims to strengthen WBI's global leadership in the thinking and practice of results-focused capacity development. At the Institute level, WBI monitors its performance through KPIs. Information collected on the KPIs (both qualitative and quantitative) enables WBI to tell its performance story to the World Bank Group, as well as its partners, clients and the general public.

The WBI KPIs comprise indicators structured around the six characteristics of organizational effectiveness of the CDRF. These characteristics are: clarity of mission; achievement of outcomes; operational efficiency; financial viability and probity; communications and stakeholder relations; and adaptability. Under the characteristic "achievement of outcomes," WBI reports on the outcomes of its initiatives using two indicators of WBI's aggregate results2, which are:

1. Direct and indirect stakeholders (a.k.a. clients) empowered through:
- raised awareness
- enhanced knowledge and skills
- improved consensus and teamwork
- strengthened coalitions
- enhanced networks
- new implementation know-how

2. Development initiatives advanced through increased institutional capacity—of which through:
- stronger stakeholder ownership
- more efficient policy instruments
- more effective organizational arrangements

These indicators are expressed as numbers of stakeholders and development initiatives and are complemented by descriptions of the observed changes to which WBI contributed.

It is the responsibility of WBI teams to: a) collect both numbers and descriptions of what changed due in part to their initiatives; b) archive the evidence of results—results meaning the change(s) observed in relation to either of the two indicators above; and c) report the information and data on what changed in the Activity Update Summary and Activity Completion Summary in SAP.

1 Throughout this guide, WBI teams in charge of capacity development initiatives will be referred to as "WBI teams."
2 The guide also refers to the "WBI aggregate results" as "WBI results."
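To make these reporting elements concrete, here is a minimal, purely hypothetical sketch—not a WBI template or system—of the pieces a team ends up assembling for one reported result: the aggregate indicator it feeds, a count, a description of the observed change, and links to the archived evidence. All field names and values below are illustrative.

```python
# Hypothetical sketch only: the data elements a WBI team reports for one result
# (indicator fed, count, description of the change, links to archived evidence).
# Field names and values are illustrative, not a WBI system or template.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ReportedResult:
    indicator: str                 # e.g., "stakeholders empowered through ICOs"
    count: int                     # number of stakeholders or development initiatives
    description: str               # what changed, due in part to the initiative
    evidence_links: List[str] = field(default_factory=list)  # archived supporting evidence

example = ReportedResult(
    indicator="stakeholders empowered through ICOs",
    count=24,
    description="24 participants reported enhanced knowledge and skills in budget analysis.",
    evidence_links=["link to archived tabulation of feedback questionnaires (hypothetical)"],
)
print(example)
```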


WBICR reviews the results information provided by WBI's teams, includes it in WBI's results database (if it meets the required criteria), categorizes it, and provides a variety of aggregate reports on WBI's results, including the WBI aggregate results data that are part of the KPIs.

Chart 1 summarizes the flow of information from the results data collected by WBI teams at the level of an initiative to how the results information becomes part of the overall performance story of WBI.

Chart 1. Flow of WBI results data, from initiative to WBI's aggregate results

[Chart 1 depicts the following flow. The WBI initiative team articulates the results logic in the AIS and Concept Note template; collects information to demonstrate initiative results at the design stage, during implementation, and after implementation (work products by participants, stakeholder documents, analysis of responses to questionnaires, etc.); and reports the results in SAP in the AUS and ACS, including self-ratings, descriptions, and supporting evidence. WBICR reviews the information in the AUS/ACS in SAP, categorizes the results information according to the CDRF in the results database, and aggregates the categorized data to produce reports for WBI's KPIs related to "Achievement of Outcomes." WBI then reports on its aggregate results within the KPIs. The KPIs are grouped into six categories of performance, based on the CDRF's Organizational Arrangements capacity factor: Clarity of Mission; Achievement of Outcomes (agents of change empowered; development initiatives advanced through increased institutional capacity); Operational Efficiency; Financial Viability and Probity; Communications & Stakeholder Relations; and Adaptability.]

B. How the guide relates to the Capacity Development and Results Framework (CDRF)

This data collection guide is part of the CDRF application tools and is therefore closely related to the CDRF in various ways. The CDRF underpins WBI's strategy and supports WBI teams in designing and managing their initiatives for results. This guide follows the adaptive management approach of the CDRF. The data that teams collect following this guide flow into the WBI KPIs, which are articulated around the CDRF's characteristics for effectiveness of organizational arrangements. Within the KPIs, WBI's two aggregate results indicators were built on what WBI teams reported in FY10 in their Program Results Summaries—a template that articulates a program's logic according to the CDRF. The CDRF also provides WBICR with a grid for aggregating the initiative results that WBI teams report based on this guide.

More specifically, the CDRF enables WBI teams to design and implement CD initiatives that produce changes in the capacity of WBI's clients and in their institutional capacities. WBI results are expressed in terms of on whom or on what these capacity changes can be observed. The CDRF also guides the reporting of WBI's results by allowing WBI teams to describe the change process based on the results logic of the initiative. By using the CDRF in design, measurement, and then reporting, WBI teams are better able to explain to others what difference their initiatives are making and to demonstrate how they contribute towards improved capacities of WBI's clients.

Each WBI CD initiative is expected to contribute towards capacity development on two inter-connected levels, and the guide is structured around these levels:

Intermediate Capacity Outcomes (ICOs)

As an intermediate step toward institutional capacity changes, an initiative must improve the capacity of stakeholders (clients) and measure what changes occur. WBI teams need to design initiatives that use a strategic combination of ICOs to leverage broader institutional capacity changes. Chapter III of this guide recommends means to collect results in terms of stakeholders empowered through intermediate capacity outcomes.

Institutional Capacity Changes

Due to improved capacities achieved at the ICO level, WBI clients can be expected to act as agents producing changes in higher-level institutional capacity areas: stakeholder ownership, policy instruments, and organizational arrangements. Chapter IV of this guide provides ideas for collecting results data corresponding to institutional capacity changes.

C. Why and when should WBI teams use this guide

This data collection guide is designed to help WBI teams plan for and then collect the needed measurement information on the outcomes of their initiatives in a way that could be aggregated with and reported among WBI's results. The guide supports WBI teams at several stages of their initiatives, and more generally contributes, to various degrees, to the implementation of several elements of the WBI results agenda.

At the design stage, once the initiative's objectives are specified, the guide helps WBI teams identify results indicators and plan for the related data collection as they fill out their Activity Initiation Summaries and their concept notes. For this, teams should read Chapter II on planning for the collection of results data to understand the overall process. Teams should then go to Chapter III, spot in the left column of Table 2 their intended intermediate capacity outcomes, consider the corresponding data collection approach suggested, and decide on their specific approach. For ideas on how to demonstrate results in terms of institutional capacity change, teams can review the examples in Chapter IV. For additional information, teams can refer to the Frequently-Asked Questions in Annex B.

Based on this plan, at the implementation stage, the guide and its data collection tools help WBI teams produce the instrument(s) that will gather information on the results of the initiative (or any part of the initiative assessed)—starting with baseline data. As results data are collected throughout the initiative, WBI teams report this information in the Activity Update Summary (AUS) and Activity Completion Summary (ACS) in SAP.

By following this guide, the WBI teams contribute to WBI's results agenda in a variety of ways. For more information on how this guide relates to the WBI results agenda see Annex A.

II. Planning for the collection of results data

This chapter helps WBI teams plan how they will collect and use results data (including baseline data) throughout the initiative, using tools and approaches that enable the demonstration of WBI results at the initiative level and in the aggregate. The chapter: a) indicates the features of initiatives' evidence of results that enable their aggregation with WBI's results, b) presents recommended data collection tools and approaches for WBI initiatives, and c) gives practical implementation tips, using examples of how to collect data at different stages of an initiative given a variety of circumstances.

A. Features of evidence of results that permit aggregation with WBI's results

WBI teams are to report the results of their initiatives in their Activity Completion Summaries. The last description section of the ACS template asks for links to the evidence supporting these results. Evidence can take a broad variety of shapes, be it quantitative, qualitative, or both. To enable the aggregation of an initiative's results with WBI's results, the supporting evidence must have common features across WBI. Specifically, the pieces of information that make up the evidence need to have all of the following features:

The evidence needs to indicate a change in the capacity of clients and/or in their institutional capacities. At minimum, a change is an observed difference between before and after the intervention. Evidence of change can be derived from an analysis of several documents (e.g., comparing clients' annual reports) and/or from information collected on one document (e.g., questionnaire asking participants to rate the extent to which the activity helped move forward the formulation of their action plan).

The change should be logically associated with the contribution of the WBI initiative. WBI teams should be able to explain by themselves and/or provide documents that show or logically imply that their initiative contributed to the change in capacity that they report as a result. WBI is concerned with contribution to results—not attribution of results. Therefore, teams only have to provide in their Activity Completion Summary a simple description that would help stakeholders understand the contribution of WBI's initiative to a result.3

The evidence of the change in capacity must come from a documented external source, i.e., participants, clients, internal or external partners, observers, media, or external records. (Information based solely on self-report from WBI will not be aggregated with other WBI results supported by evidence.)

3 This guide focuses on data collection. Other guidelines explain how to describe results in an AUS or ACS.


The information on the change in capacity is provided in a format that can be counted in the same way as WBI's aggregate result indicators.4 In general, raw, unedited, unprocessed information (e.g., the actual questionnaires on which participants answered, unedited videos, copies of client annual reports) strengthens the credibility of the evidence. Teams are encouraged to provide this raw information—whether or not they also have processed pieces of evidence.
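As a purely illustrative sketch (not a WBI tool), and assuming the tabulation rule described later in footnote 7—a respondent counts as empowered on an ICO question if he or she gives a rating of 4 or 5 on a five-point scale—the snippet below shows why raw ratings or a frequency distribution can be aggregated while a stand-alone average cannot:

```python
# Illustrative sketch only: raw ratings can be turned into the number of
# stakeholders empowered (count of 4s and 5s, per the rule in footnote 7),
# whereas an average alone hides that head count. Ratings are hypothetical.

ratings = [5, 4, 3, 5, 4, 2, 5, 4, 4, 5]   # one rating per respondent on an ICO question

average = sum(ratings) / len(ratings)                 # e.g., 4.1 -- not countable by itself
empowered = sum(1 for r in ratings if r >= 4)         # countable: respondents rating 4 or 5
frequency = {score: ratings.count(score) for score in range(1, 6)}

print(f"Average rating: {average:.1f}")
print(f"Respondents empowered (rated 4 or 5): {empowered}")
print(f"Frequency distribution: {frequency}")
```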

The examples provided in this guide and its associated data collection tools are designed to help teams collect evidence of results in a way that could be aggregated in WBI's results. Therefore, teams are encouraged to use this guidance and tools. If WBI teams wish to explore other avenues, they should seek advice from WBICR.

WBI teams are to support the results of their initiatives with evidence for the following reasons:

Monitoring results throughout an initiative informs decisions that help the initiative achieve greater results than if no attention was paid to tracking outcomes.

Monitoring CD results requires using proxy measures, such as observed or reported change in behavior, to indicate whether capacity has been developed. Proxy measures are needed because one cannot determine just by looking at participants the extent to which their capacity is being developed. These proxy measures serve as evidence of results.

Supporting reports of an initiative's results with evidence increases stakeholders' interest in the initiative, as the initiative's stakeholders are themselves interested in seeing what results were achieved in part with their resources.

Supporting evidence also enhances the credibility of the results reported.

While collecting evidence of results requires some effort on the part of WBI teams, using this guide and its associated tools makes the process practical, simple, and inexpensive. More importantly, the process is meant to be helpful, as the guide recommends embedding the collection of results data in the design and implementation of an initiative and using the information to make decisions about next steps.

B. How to use the suggested data collection tools and approaches

This section presents the tools and approaches designed to help WBI teams collect information on the results of their initiatives to feed into WBI's aggregate results. The section explains how to use each tool and approach, starting with the list of data collection tools linked to this guide for teams to customize to their needs before use, and finishing by summarizing other approaches to data collection that are mentioned throughout this guide.

The tools and approaches presented in this chapter are examples made available to WBI teams to facilitate their data collection. Teams decide whether and how they use these tools and approaches. None of them is mandated. Other tools and approaches might be preferred by a team given its circumstances. Initiative results will feed into WBI's aggregate results if their supporting evidence has the features listed in Section A above.

4 For example, one of WBI's results indicators is the number and description of change agents empowered through Intermediate Capacity Outcomes. If a team only reports the average rating, say 4.3, to a participant feedback question related to an ICO, this information could not be aggregated with the rest of WBI's results in terms of number of change agents empowered through ICOs, because from the average alone one cannot tell the number of participants empowered through ICOs. The raw data (or questionnaires) used to compute this average, or the frequency distribution, should be provided. This does not require extra data collection, because the raw data had to be collected for the average to be computed. However, if the data are collected by partners, WBI teams should arrange to get the raw data in some form (e.g., the original questionnaires, questionnaire copies, or a table associating each rating with a question and a participant).

All data collection tools and approaches are designed to collect information that would help WBI teams do the following:

Achieve results—through better targeting by using data that inform the design and implementation of initiatives,

Demonstrate results—by collecting data throughout the initiative that show changes taking place due in part to the WBI initiative (whether directly or indirectly), and

Report on results—by feeding data collected into the initiatives' AUS and ACS in SAP, and other reports that teams may produce.

1. Data collection tools linked to this guide5

Five data collection tools are associated with this guide. They are templates and a toolkit that teams can customize to produce: a) a participant application form; b) a participant feedback questionnaire; c) a reflection and action form; d) a Level-2 evaluation; and e) a follow-up survey.

a) Participant Application Form

Purpose: The Participant Application Form provides the WBI initiative team with response data that they can use to:
- select participants,
- better understand the participant context and refine the delivery as needed,
- archive the responses as evidence of the baseline situation the initiative seeks to address, and
- report on this element of baseline data in the AUS and ACS.

Context: Before a learning deliverable, WBI teams may ask clients who want to participate to complete an application form that includes questions on the issue they wish to improve. WBI teams who choose to use the WBI Participant Application Form should customize the Participant Application Form template.6 If participants are invited rather than selected based on applications, then similar questions can be asked on a participant registration form. If the application or registration form is handled by a partner, the WBI team should review the partner's form to see if it would yield data equivalent to those provided by questions 7 to 11 of the WBI Participant Application Form template. If not, the WBI team should consider adding the missing questions.

b) Participant Feedback Questionnaire

Purpose: Depending on the questions chosen, a WBI team can use the Participant Feedback Questionnaire to:
- measure progress towards the achievement of intermediate capacity outcomes and collect anonymous feedback on other aspects, in order to help guide the initiative's next steps,
- collect results data that will be translated in terms of number of stakeholders empowered through intermediate capacity outcomes,7 and
- report on this element of outcome data in the AUS and ACS, including a link to the archived tabulation provided by WBICR.

5 See several of these tools, including templates and customized examples.
6 See a demonstration version of a customized application form with customizations highlighted.
7 WBI initiative teams collect the data and bring them to WBICR for tabulation. WBICR translates the responses into the number of change agents empowered through intermediate capacity outcomes, based on the number of respondents who gave a "4" or a "5" to ICO-related questions asking for a rating on a five-point scale (5 being the maximum).

Context: Participants are given a customized version of the Participant Feedback Questionnaire template to complete, typically at the end of an event.8 The questionnaire may ask about that event or about any part of the initiative that participants would be able to assess at this point. For example, if an initiative gathers the same participants monthly for years, the WBI team may choose to administer the participant feedback questionnaire after each of the first two gatherings to reflect on each gathering, and then every six months, asking participants to assess their experience over the past six months.

Preparation and administration: To use this questionnaire, WBI teams determine which part(s) of the initiative the questionnaire should assess, and which questions are relevant based on the objective of the assessed part(s) of the initiative and on the team's feedback needs. Teams should then customize9 the questionnaire template to include the questions of interest, prepare the questionnaire forms, administer them to participants, and collect the responses.

This questionnaire collects data anonymously to encourage honest answers. In customizing the Participant Feedback Questionnaire template, teams must keep this questionnaire anonymous. This questionnaire is designed to be administered at the same time as a reflection and action form that asks for answers by name.

Tabulation support provided: WBI teams bring the responses to WBICR for tabulation.

c) Reflection and Action Form

Purpose: The Reflection and Action Form captures information that WBI event participants report about results as they see them, actions they plan to take as a result of their participation, and implementation challenges they foresee.

A WBI team can use the Reflection and Action Form to:
- plan or adjust the content, design, and timing of potential follow-up activities or surveys,
- measure progress towards the achievement of intermediate capacity outcomes based on participants' descriptions of changes (with names attached to the descriptions),
- collect results data that will feed into WBI's aggregate results in terms of descriptions related to stakeholders empowered through intermediate capacity outcomes, and
- report on this element of outcome data in the AUS and ACS, including a link to the archived tabulation provided by WBICR.

Context, preparation and administration: WBI teams must first customize the Reflection and Action Form template.10 Participants are given a customized version of the form to complete typically at the end of an event. As with the Participant Feedback Questionnaire above, the form may ask about that event or about any part of the initiative that participants would be able to assess at this point.

a "5" to ICO-related questions asking for a rating on a five-point scale (5 being the maximum).8 This template replaces the Level-1 evaluation form used in FY04-FY10. The new Participant Feedback Questionnaire template differs from the previous Level-1 evaluation form in various ways, including that this new template must be customized by WBI teams to fit each initiative’s data collection needs.9 See a demonstration version of a customized Participant Feedback Questionnaire with customizations highlighted.10 See a demonstration version of a customized Reflection and Action Form.


It is recommended that: a) the Reflection and Action Form and the Participant Feedback Questionnaire cover the same part of the initiative, and b) the two forms be administered at the same time.11

Tabulation support: WBI teams bring the responses for tabulation to WBICR (if the responses are in a language that the WBICR tabulation team can read).

d) Level-2 Evaluation Toolkit

Purpose: The Level-2 Evaluation Toolkit guides WBI teams on a quantitative way to assess participants' learning during an event, using tests of knowledge systematically developed during learning design and administered at the start and the end of each offering.

WBI teams can use the Level-2 Evaluation Toolkit to:
- plan, design, administer, and process knowledge tests for participants,
- review the results in the context of the course and the initiative,
- adjust the courses (and/or the tests) as needed,
- obtain data that will feed into WBI's aggregate results as the number of stakeholders empowered through intermediate capacity outcomes (specifically enhanced knowledge and skills), and
- report on the number of participants with a learning gain in the description fields of the AUS or ACS in SAP, with a link to the archived tabulation and item analysis files.

Conditions for use and preparation: Level-2 evaluations require a considerably greater investment of time by the CD team (especially the content experts on the team) than the other data collection tools. Also, Level-2 evaluations apply to specific circumstances only. Therefore, teams should only conduct Level-2 evaluations if all of the following conditions are met:
- The main targeted result of the event assessed is the ICO "enhanced knowledge and skills."
- The knowledge and skills targeted for enhancement are set before the content delivery.
- The initiative includes structured learning events that reach large numbers of direct and/or indirect clients through repeated offerings.
- The content of each event assessed is equivalent to at least one week of time.
- All participants learn the same content (as opposed to being split into parallel sessions).
- The team—including the content experts—is willing and able to commit the resources required by the evaluation and intends to make use of its data.12
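As a purely illustrative sketch (not part of the Level-2 Evaluation Toolkit), the snippet below shows the matched pre-test/post-test counting described in Table 2 under "Enhanced knowledge and skills": only participants with both scores are matched, and the reported figure is how many scored higher on the post-test. Participant identifiers and scores are hypothetical.

```python
# Illustrative sketch only: count participants with a learning gain from matched
# pre-test and post-test scores. Identifiers and scores are hypothetical.

pre_scores = {"participant_01": 55, "participant_02": 70, "participant_03": 62}
post_scores = {"participant_01": 80, "participant_02": 68, "participant_03": 75,
               "participant_04": 90}   # no matched pre-test, so excluded

matched = set(pre_scores) & set(post_scores)                       # both tests taken
gained = [p for p in matched if post_scores[p] > pre_scores[p]]    # learning gain

print(f"Matched participants: {len(matched)}")
print(f"Participants with a learning gain (post > pre): {len(gained)}")
```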

e) Follow-Up Survey

Purpose: WBI teams can use a Follow-Up Survey to:
- learn lessons and incorporate them in subsequent CD services as needed,
- collect information on the changes due in part to the initiative that clients can report on after time has elapsed—these changes can relate to intermediate capacity outcomes and institutional capacities and be translated into any of WBI's aggregate results indicators—and
- report on the changes described by the survey respondents in the AUS or ACS in SAP, with a link to the archived completed surveys or tabulation provided by WBICR.

11 The two forms must remain separate because the Participant Feedback Questionnaire is anonymous in order to encourage frank feedback, whereas the Reflection and Action Form asks for responses by name in order to enable follow-up and placement of responses in the context of the participant's organization, if applicable. For more information, see FAQ 9.
12 For a week-long course, the first Level-2 evaluation typically requires about one week of time from the content experts, including the TTL, and one week of an assistant's time. Subsequent evaluations of the same course, using the same or slightly modified tests, take less time.


Preparation and administration: The Follow-Up Survey is designed to be administered several months after the end of a client's involvement with an initiative. WBI teams must customize the follow-up survey template.13

While the four data collection tools presented above require limited effort to collect data (because the clients have a special interest or time scheduled in an event to answer), the administration of a follow-up survey often requires more resources (because clients typically receive the survey when they are focused on other tasks). Any follow-up survey requires a carefully designed strategy to obtain a good response rate. This involves writing cover letters, monitoring responses, and sending reminders to non-respondents. Such data collection efforts can span several months. Teams should take resources into consideration before launching a follow-up survey.14
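As an illustrative sketch only (assuming an email-administered survey and hypothetical addresses), the snippet below shows the basic bookkeeping such a strategy implies: tracking who has responded, computing the response rate, and listing the non-respondents who should receive a reminder.

```python
# Illustrative sketch only: monitor follow-up survey responses, compute the
# response rate, and list non-respondents for a reminder. Addresses are hypothetical.

invited = {"client_a@example.org", "client_b@example.org",
           "client_c@example.org", "client_d@example.org"}
responded = {"client_b@example.org", "client_d@example.org"}

response_rate = len(responded) / len(invited)
needs_reminder = sorted(invited - responded)

print(f"Response rate: {response_rate:.0%}")
print("Send reminders to:", ", ".join(needs_reminder))
```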

Tabulation support: If needed, the WBI team brings the completed surveys to WBICR for tabulation (if the responses are in a language that the WBICR tabulation team can read).

2. Other data collection approaches

Teams can also collect data through means other than the administration of a written instrument that clients complete. Other data collection approaches include: a) gathering documents produced by clients as part of their participation in the initiative (action plans, exercises), b) getting copies of documents not generated through the initiative, but that can be useful to demonstrate initiative results, and c) recording other information from clients, partners, or independent observers.

a) Gathering documents produced by clients as part of their participation in the initiative

Purpose: Teams can gather documents produced by clients as part of their participation in the initiative to show that a change took place, as evidenced in the documents. Depending on the circumstances, changes observed by comparing documents may feed into any of WBI's aggregate results.

Conditions for use: This data collection approach can be used if the initiative includes having participants produce documents that can be compared over time. These may be the same documents that participants progressively develop or refine over the course of an event or an initiative (e.g., real-life action plans, real-life consensus agreements). Or they may consist of different documents (e.g., exercises, quizzes) testing the same knowledge at different times.

In such cases, the WBI teams can collect these documents as part of their strategy to demonstrate results. At minimum, the real-life documents should be collected at the start of the clients' participation in the initiative and at the end of it. If participants did not come with a real-life document to refine during the initiative, WBI teams may use the document produced by the end of the clients' participation in the initiative and explain that the participants started to produce the document during the initiative. This approach to producing a proxy baseline is only valid if the initiative aims to have participants work on a new document that is part of a real-life plan or project on which participants will continue working beyond their attendance at the initiative.

13 See a demonstration version of a customized Follow-Up Survey. This version assumes that the questionnaire would be administered by email; a cover email should introduce the survey. The Follow-Up Survey can also be administered by other means, such as mail, or at a follow-up event. In addition to choosing what to ask, each administration mode requires special customization.
14 For the purpose of collecting results data only, a high response rate is desirable but not required, because teams need not conduct scientific evaluations, which are costly, complex, and can suffer poor credibility when conducted by insiders. Teams should weigh the value of additional responses against the cost (and possible annoyance to recipients) of reminders.

For documents on which participants work for learning purposes during the initiative (e.g., exercises, quizzes), a minimum of two similar documents showing changes between the start and the end of the client participation in the initiative is required.15

Regardless of the type of document, in addition to collecting them, the WBI team needs to explain what changes can be observed between the before and after documents16 and how these changes relate to the WBI initiative. The changes observed, their relation to the initiative, and the links to the archived documents should be reported in the AUS and ACS.

b) Getting copies of documents not generated through the initiative

Purpose: Teams can use copies of documents produced by clients, partners, or other external entities to show that a change took place. These documents may be official minutes of meetings with clients and partners mentioning elements of results, clients' annual reports before and after the initiative, clients' work products (newspaper articles, regulatory reform assessments, etc.) before and after their participation, official lists of courses offered by a university whose effectiveness WBI helped improve, thank-you letters specifically mentioning how clients used the initiative, partners' back-to-office reports (BTORs), media features reporting on the outcome of the initiative, etc. Depending on the circumstances, changes observed through such documents may feed into any of WBI's aggregate results.

How they can be used: Collecting documents not generated through the initiative can be planned or opportunistic. Some of these documents may serve as a baseline (e.g., the minutes of a meeting with clients defining the issue to address); others may show outcomes (e.g., a partner's report mentioning how WBI helped advance a client's project). Evidence of results may be found in a stand-alone document, or by combining or comparing several documents. WBI teams may encounter such documents unexpectedly and should archive them as they arise (even if these are incomplete pieces of evidence at the time, e.g., baseline only) to easily retrieve them when they report results in the AUS and ACS.

In addition to collecting these documents, WBI teams need to highlight the changes which can be observed through documents17, and how these changes relate to the WBI initiative. The changes observed, their relation to the initiative, and the links to the archived documents should be reported in the AUS and ACS.

c) Recording other information from clients, partners, or independent observers

Purpose: Teams can record information from clients, partners, or observers independent from the initiative team to collect accounts of change due in part to the initiative. Depending on the circumstances, changes reported in these records—in any medium—may feed into any of WBI's aggregate results. The recorded information may also help the initiative teams better understand what works or does not work for participants and use these lessons in designing future activities.

15 A proxy baseline in the form of a statement by the WBI team that participants did not know how to perform the exercise at the start of the initiative would not make up sufficient evidence of results.
16 WBI teams may rely on an independent expert review of the documents compared. If teams use resources for expert reviews, they should use a solid evaluation methodology, including random assignment and blind assessment, to make good use of their investment. WBI teams are advised to consult professional evaluators to design the review.
17 WBI teams may rely on an independent expert review comparing documents. If teams use resources for expert reviews, they should use a solid evaluation methodology, including random assignment and blind assessment, to make good use of their investment. WBI teams are advised to consult professional evaluators to design the review.

Conditions for use: Unless encountered by chance, most recorded information is planned for by the WBI team. Good practice is to embed this data collection in the design of an initiative. For example, sessions at various points in the initiative may be designed for participants to share their starting challenges (to collect baseline data) and their experience with applying what they gained from the initiative (to collect outcome information). Knowledge exchange sessions, including follow-up videoconferences, where participants share lessons can be taped to collect results information. A second example is WBI teams starting a blog asking participants to share on the initiative's network their experience in implementing what they learned. A third example is to invite former participants as resource people in subsequent learning events to share their past experience with current participants, and to tape the session. A fourth example is to interview former participants on their experience applying what they gained from the initiative and to use this information both as evidence of results and as material for case studies in future offerings.

In addition to recording such information, WBI teams need to highlight the changes that can be found in these records, and how these changes relate to the WBI initiative. The changes observed, their relation to the initiative, and the links to the archived records should be reported in the AUS and ACS.

C. Implementation tips on how to collect data at different stages of an initiative

Good data collection is adapted to circumstances. This section provides examples of how to collect data according to various circumstances. To use Table 1 below, WBI teams can scan the left columns to see what circumstances might apply to a specific initiative. The right column provides some suggestions on how to proceed with the collection of data that will make up part (or more rarely all) of the evidence of results of the initiative.

Table 1. Examples of how to collect data in different stages and circumstances

Stage: Before delivery
Circumstance: Starting the design of the initiative.
Data collection tips: Follow the 10 steps of the CDRF and document the starting points and progress made throughout the initiative.18

Stage: Before delivery
Circumstance: Clients are partnering with WBI to achieve a result of particular interest to them.
Data collection tips: Discuss with the clients, during the validation of the development goal and identification of capacity gaps, how they plan to monitor progress toward the development goal. The clients (local stakeholders acting as change agents) should own the results. That means that the clients should be the stakeholders most interested in having data showing the starting point of the capacity gap and the progress made toward closing that gap, because they need these data for their own purposes. The WBI team should discuss how the clients would collect the data on the situation that they, with the support of WBI, seek to improve. The WBI team should also agree on having access to these data and make provision to obtain them from the clients and report them among the initiative's results.

Stage: Before delivery
Circumstance: An initiative with a large number of applicants where WBI can set the rules for participation.
Data collection tips: Embed in the design of the initiative a follow-up mechanism through which you can learn what changed in part as a result of the clients' participation in the initiative. (The follow-up mechanism can take a variety of forms, e.g., a session in a subsequent face-to-face event, a dedicated videoconference, a survey, a blog, etc.) Make applicants aware at the time they apply that, if selected/awarded, they would be expected not only to participate in the main event/process for which they are applying, but also in the follow-up, during which they'll be asked to report on what changed as a result of their participation.

Stage: Before and at the end of delivery
Circumstance: Participant selection/registration is taking place.
Data collection tips: Embed the collection of baseline results data as part of a late-stage needs assessment19 using questions 7 to 11 of the Participant Application Form.20 At the end of the activity, distribute a copy of the application/registration form for reference, and ask participants to complete a Reflection and Action Form by name.

Stage: Before and after delivery
Circumstance: Internal or external partners have sustained relations with clients.
Data collection tips: Document the initial request for WBI's contribution and discuss with the partner(s) to help diagnose the client capacity constraints and how WBI could help address these constraints. Document the discussion(s) and any follow-up with the partners, including subsequent request(s). During follow-up with the partner(s), discuss and document what worked and what should be improved. Also ask for examples of how clients used what they gained from the initiative.

Stage: During sustained delivery
Circumstance: The initiative includes an action planning exercise sustained over several missions.
Data collection tips: Collect the relevant BTORs (of WBI staff and of other internal and external stakeholders) over time. These should show the progress in action plans. If a series of BTORs from non-WBI staff shows progression in action plans and mentions the role of the WBI initiative, those BTORs can also be used to show the progress achieved.

Stage: During sustained delivery
Circumstance: The initiative maintains an ongoing web-based network for clients.
Data collection tips: Open an electronic "discussion thread" where participants can share their experience in trying to apply what they gained from the initiative. If needed, send occasional prompts for clients to share their experience in this discussion thread. Monitor all exchanges among clients. When a client mentions a result, document it, and if needed probe for clarification and/or for the right to use the information in a WBI results report.

Stage: During sustained delivery
Circumstance: The initiative plans its last gathering with clients after sustained relations with the same group.
Data collection tips: Dedicate a session during the last deliverable/meeting for clients to describe what they gained from their participation and how they were able to apply it. Videotape the session and review what clients said to identify evidence of results that can be reported.

Stage: During sustained delivery
Circumstance: An activity ends and the team plans to meet these clients again, when taking stock of their results, feedback, and plans would be helpful.
Data collection tips: Administer a customized version of the Participant Feedback Questionnaire and of the Reflection and Action Form. Inform participants that: a) their responses will help you monitor results and potential needs for adjustment in future steps of the initiative, and b) you will also collect more results data later (during subsequent deliveries).

Stage: After delivery
Circumstance: An activity ends and the team has no plans to gather these clients again.
Data collection tips: Administer a customized version of the Participant Feedback Questionnaire and of the Reflection and Action Form. Inform participants that you will be sending follow-up materials (e.g., a newsletter, updates on the content of the activity or related materials), including a survey in which they'll be asked to report on what changed as a result of their participation. Send the Follow-Up Survey (either with other follow-up materials or as a stand-alone piece) a few months after the activity, once clients have had time to take action in relation to what they gained from the activity.

Stage: After delivery
Circumstance: The initiative is fully delivered, but no results data were collected.21 Or: the initiative is fully delivered, but the team feels that further results data could be collected.
Data collection tips: Contact the initiative's internal and external partner(s) to discuss and document what worked and what should be improved. Also ask for examples of how clients used what they gained from the initiative. And/or: send the Follow-Up Survey (either with other follow-up materials or as a stand-alone piece) a few months after the activity, once clients have had time to take action in relation to what they gained from the activity.

Stage: After delivery
Circumstance: You heard of an important policy change inspired by the initiative, but this result is not documented.
Data collection tips: Seek an interview (preferably video- or audio-taped22) with the change agent who participated in your initiative and turn the interview data into a case study to use as content for the next initiative or other materials. And/or: invite the change agent to a subsequent activity to share her/his experience with other practitioners interested in the policy s/he implemented as a result of her/his previous participation in WBI's initiative.

18 Although it is recommended to use the CDRF from the early design stage, WBI teams can start using the CDRF at any point in the process.
19 Preferably, the initiative team would have already assessed the needs at the institutional level. Using the WBI Participant Application Form can complement this broader needs assessment by identifying the specific challenges of potential participants.
20 If no application forms are used, the questions may be asked on a participant registration form.
21 This circumstance is provided as an example that should not be replicated intentionally.
22 For the purpose of results documentation, WBI teams should submit the unedited tape. This does not prevent the team from producing materials based on an edited version of the tapes for other purposes.

III. How to collect results data as stakeholders empowered via ICOs

This chapter provides teams with suggested ways of collecting data that feed into the WBI result "direct and indirect stakeholders23 empowered through Intermediate Capacity Outcomes (ICOs)." The chapter first indicates what teams should do to collect data related to any targeted outcome, and then presents in Table 2 suggestions that relate to particular ICOs.

The CDRF includes six ICOs listed in Chart 2.

Chart 2. List of CDRF's Intermediate Capacity Outcomes

Altered behaviors and actions:
1. Raised awareness
2. Enhanced knowledge and skills

Altered processes and relationships:
3. Improved consensus and teamwork
4. Strengthened coalitions
5. Enhanced networks

New products and services:
6. Increased implementation know-how

To use Table 2 below, WBI teams scan the left column for targeted outcomes relevant to the initiative (or the part of the initiative) for which they want to collect evidence of results. Then, teams read across the "Baseline" and "Evidence of outcomes achieved" columns to plan their data collection. When questions are suggested, teams should retain these questions as they customize the corresponding data collection template.24

In addition, for any targeted outcome listed in the left column, teams are encouraged to follow the good practices below:

Collect baseline data on clients' needs for participating in an initiative in order to better understand and respond to these needs. One way to obtain such data is through a participant application or registration form that asks questions like Q7 to 11 in WBI's Participant Application Form.25

Ask participants to complete a Reflection and Action Form by name along with the participant feedback questionnaire to add valuable contextual information that can feed into WBI's results descriptions and inform teams' planning of their initiative's next steps.

23 Direct change agents received capacity development services at least in part from WBI. Indirect change agents received services based on WBI's input, but without WBI involved in the delivery. In their AUS and ACS, teams should indicate whether the change agents about whom they report intermediate capacity outcomes are direct or indirect change agents. For more information, see FAQs 25 and 26.
24 Conversely, teams should delete from the data collection templates the questions that are not relevant to their targeted outcomes.
25 These questions are helpful, though not required, to provide baseline descriptive results data.


WBI teams need to demonstrate the results of their initiative as a whole. To do so, they may assess one deliverable at a time, a series of the initiative's deliverables, or the entire initiative at once. Good practice is to collect data at least when a client starts and ends her/his involvement with the initiative. If the client is involved with the initiative during one deliverable, then the client should be asked to assess that deliverable only. If the client is involved in several deliverables, e.g., three action planning workshops supported by an electronic network, the client should be asked to assess all these deliverables. Depending on practical considerations, the WBI team may choose to collect results data of several or all of these deliverables at once, or one deliverable at a time.

In any case, clients should clearly understand what they are responding about when completing a data collection tool. For this, WBI teams must customize these tools accordingly. Throughout this guide's associated data collection templates, all terms in italics should be replaced with the term(s) that clients would recognize, as needed. For example, if clients are to assess the entire initiative and if they would call it the "XYZ Program" instead of the WBI jargon "initiative," WBI teams should replace the generic term "activity" found on the data collection templates with "XYZ Program." If clients are to rate one deliverable and if they would call it a "workshop" instead of the WBI jargon "deliverable," WBI teams should replace the term "activity" with "workshop." Likewise, if a question in a WBI data collection template includes the italicized term "issue," WBI teams should replace this term with a brief description of the issue addressed in the part of the activity that clients assess.
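As a purely illustrative sketch (not a WBI tool), the snippet below shows one simple way a team might substitute the italicized placeholder terms of a template question with terms the clients would recognize; the placeholder names and replacement values are hypothetical.

```python
# Illustrative sketch only: replace generic placeholder terms in a template
# question with client-recognizable terms. Placeholders and values are hypothetical.

template = "Extent to which the {activity} helped increase your awareness of {issue}"

customizations = {
    "activity": "XYZ Program",                  # what the clients call the initiative
    "issue": "the regulatory reform process",   # hypothetical issue addressed
}

question = template.format(**customizations)
print(question)
# Extent to which the XYZ Program helped increase your awareness of the regulatory reform process
```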

In addition to the baseline and outcome information mentioned in Table 2, for any outcome reported in the AUS or ACS, teams should help stakeholders understand how these results relate to the WBI initiative, as follows:

For outcomes related to direct activities—i.e., deliverables for which WBI had a direct involvement in the delivery—Plato records, participant lists and documents showing the content of activities related to the results reported are helpful to establish this link.

For outcomes related to indirect activities—i.e., deliverables that WBI teams report as a consequence of their input, but without having been involved in the delivery—WBI teams should describe how the indirect activities for which they report results are a consequence of the initiative's input. If possible, teams should support this description with quotes/documents from external sources. For example, if a trainer trained by WBI delivered a course using the same content, the participant list of the WBI activity (direct), along with the course materials of WBI's activity (direct) and the course materials of the trainer's activity (indirect), are evidence of the link between WBI and the indirect activity.

Table 2. Suggested data collection per targeted intermediate capacity outcomes

Targeted outcome: Raised awareness - Raised client awareness
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "increase." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped increase your awareness of the issue"

Targeted outcome: Improved client attitude
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "more favorable." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped you feel more favorable about the issue"

Targeted outcome: Improved client confidence
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "improve." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped improve your confidence that you can do something"

Targeted outcome: Improved client motivation
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "more motivated." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped you feel more motivated to do something"

Targeted outcome: Improved client understanding
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "improved." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped improve your understanding of the issue"

Targeted outcome: Enhanced knowledge and skills - Helped clients acquire knowledge and skills
Baseline: Test: pre-test score for every client who has a matched post-test score. ("Matching" scores means associating each individual's pre-test score with his/her post-test score.) The test can be designed using WBI's Level-2 evaluation toolkit.
Evidence of outcome achieved: Test: post-test score for every client who has a matched pre-test score. WBI results include the number of clients who scored higher on the post-test than on the pre-test of knowledge and skills.

Targeted outcome: Improved client application of knowledge and skills
Baseline: Client products as they stand before the clients' participation in the initiative.
Evidence of outcome achieved: Client products individually matched to the same clients and comparable to those used for the baseline, as they stand since the clients' participation in the initiative. Include an explanation of what changed and how one can observe the change in the areas targeted by the initiative. And/or: WBI Follow-Up Survey.

Targeted outcome: Improved consensus and teamwork - Helped advance clients' projects through teamwork
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "move forward." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which any improvement in your teamwork (from this activity) helped your project to move forward"

Targeted outcome: Helped clients reach consensus
Baseline: Document(s) showing the views of the clients (by name) before their participation in the initiative, with regard to what the initiative addresses. And/or: an assertion by the WBI team that the clients had not reached the consensus reported as a result (see evidence of outcome) before the initiative.
Evidence of outcome achieved: Consensus document(s) clients signed, with all of the following information (which may be provided by the WBI team):
- Names of the signatories
- Description of the initiative's role in the consensus
- Assertion that the consensus signed was a real-life consensus as opposed to a training exercise/simulation.
(The number of signatories will feed into WBI's results as number of stakeholders empowered through intermediate capacity outcomes. The consensus document(s) will be used in WBI's result descriptions.)

Targeted outcome: Helped advance clients' projects through consensus
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "move forward." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which any progress toward a consensus (from this activity/initiative26) helped your project to move forward"

26 The term "initiative" is added to the term "activity" to indicate that the improvement toward a consensus might have resulted from a part of the initiative that took place beyond just the "activity" assessed in other questions on the same participant feedback questionnaire. Teams should replace "activity/initiative" with the part of the initiative that aimed to improve the consensus—worded in a way that participants would recognize.

Targeted outcome: Strengthened coalitions - Helped advance clients' projects through coalition
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "move forward." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which any strengthening in the coalition (from this activity/initiative27) helped your project move forward"

27 The term "initiative" is added to the term "activity" to indicate that the coalition strengthening might have resulted from a part of the initiative that took place beyond just the "activity" assessed in other questions on the same participant feedback questionnaire. Teams should replace "activity/initiative" with the part of the initiative that aimed to strengthen the coalition—worded in a way that participants would recognize.

Or, for the same targeted outcome:
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "move forward." No additional baseline is required.
Evidence of outcome achieved:
a) If the initiative contributed to create the coalition, use both elements below:
- WBI Participant Feedback Questionnaire's question: "Extent to which the coalition helped your project move forward"
- An assertion by the WBI team that the initiative contributed to create the coalition28
b) If the initiative did not create the coalition, use both of the Participant Feedback Questionnaire questions below29:
- "Did this activity help strengthen your coalition?"
- "Extent to which the coalition helped your project move forward"30

Targeted outcome: Enhanced networks - Helped advance clients' projects through network
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "move forward." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which any enhancement in the network (from this activity/initiative31) helped your project move forward"

28 Existing documents showing the role of the initiative in creating the coalition (e.g., BTOR, official minutes) could also be used.
29 The number of clients answering yes to "Did this activity help strengthen your coalition?" and with a 4 or 5 to "Extent to which the coalition helped your project move forward" will feed into WBI's aggregate results.
30 If teams want to ask these two questions instead of the single question "Extent to which any strengthening in the coalition from this activity/initiative helped your project move forward," they should replace that question on the Participant Feedback Questionnaire with "Extent to which the coalition helped your project move forward."
31 The term "initiative" is added to the term "activity" to indicate that the network enhancement might have resulted from a part of the initiative that took place beyond just the "activity" assessed in other questions on the same participant feedback questionnaire. Teams should replace "activity/initiative" with the part of the initiative that aimed to enhance the network—worded in a way that participants would recognize.

Or, for the same targeted outcome:
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "move forward." No additional baseline is required.
Evidence of outcome achieved:
a) If the initiative contributed to create the network, use both elements below:
- WBI Participant Feedback Questionnaire's question: "Extent to which the network helped your project move forward"
- An assertion by the WBI team that the initiative contributed to create the network32
b) If the initiative did not create the network, use both of the Participant Feedback Questionnaire questions below33:
- "Did this activity help enhance your network?"
- "Extent to which the network helped your project move forward"34

32 Existing documents showing the role of the initiative in creating the network (e.g., BTOR, official minutes) could also be used.
33 The number of clients answering yes to "Did this activity help enhance your network?" and with a 4 or 5 to "Extent to which the network helped your project move forward" will feed into WBI's aggregate results.
34 If teams want to ask these two questions instead of the single question "Extent to which any enhancement in the network from this activity/initiative helped your project move forward," they should replace that question on the Participant Feedback Questionnaire with "Extent to which the network helped your project move forward."

Targeted outcome: Increased implementation know-how - Formulated policies and strategies - Helped expand stakeholder involvement in the process of making a development policy, reform, strategy, law, project, program, or plan
Baseline: Document(s) showing stakeholders' restrained access to the process of making a development policy, reform, strategy, law, project, program, or plan before their involvement in the initiative.
Evidence of outcome achieved: Document(s), e.g., proceedings with participant names, showing participation in the process of making a development policy, reform, strategy, law, project, program, or plan by stakeholders (with previously restrained access) after their involvement in the initiative, with the following information (which may be provided by the initiative team):
- The names of the stakeholders whose involvement was helped by the initiative35
- A description of what the initiative did to help the stakeholders (with previously restrained access) to be involved in the process.36

35 The names of the stakeholders helped may also be on the documents. The number of helped stakeholders will feed into WBI's results as number of change agents empowered through intermediate capacity outcomes. The documents will support the description of what changed as a result of the initiative. WBI results will not be expressed in number of documents. (Documents for which no initiative-related change could be observed will not feed into WBI's aggregate results.)
36 Existing documents showing the role of the initiative in helping clients be involved in the process (e.g., correspondence with stakeholders, official minutes) could also be used.

And/or, for the same targeted outcome:
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "strengthen." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped strengthen your voice in the making of a development policy, reform, strategy, law, project, program, or plan"

Targeted outcome: Helped clients conduct needs assessments for a development policy, reform, strategy, law, project, program, or plan
Baseline:
a) If the initiative has clients use a needs assessment they had started before their participation in the initiative, provide: the clients' needs assessment documents as they stood just before the clients participated in the initiative.
b) If the initiative has clients initiate the needs assessment, provide: an assertion by the WBI team that the clients (whose names and needs assessment documents are provided as part of this initiative's results) started the needs assessment at the initiative.
Evidence of outcome achieved: Needs assessment documents/report with all of the following related information (which may be provided by the WBI team):
- The names of the clients associated with the initiative who conducted the needs assessment37
- A description of what changed in the needs assessment in relation to the initiative
- An assertion by the WBI team that the report was a real-life needs assessment (as opposed to a training exercise or simulation).
And/or, for the same targeted outcome:
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "advance." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped you advance the needs assessment for a development policy, reform, strategy, law, project, program, or plan"

37 The number of clients who conducted the needs assessment will feed into WBI's results as number of change agents empowered through intermediate capacity outcomes. The needs assessment documents/reports will support the description of what changed as a result of the initiative. WBI results will not be expressed in number of documents/reports. Documents/reports collected, but for which no initiative-related change could be observed, will not feed into WBI's aggregate results.

Targeted outcome: Helped clients move forward the formulation/design of a development policy, reform, strategy, law, project, program, or plan
Baseline:
a) If the initiative has clients use documents (formulating a development policy, reform, strategy, law, project, program, or plan) that they had started before their participation in the initiative, provide: the clients' document(s) as they stood just before the clients participated in the initiative.
b) If the initiative has clients create documents (formulating a development policy, reform, strategy, law, project, program, or plan) and intends to use these documents as evidence of results (see evidence of outcome), provide: an assertion by the WBI team that the clients (whose names and documents are provided as part of this initiative's results) started these documents at the initiative.
Evidence of outcome achieved: Client document(s) formulating a development policy, reform, strategy, law, project, program, or plan, with all of the following information (which may be provided by the WBI team):
- Names of the authors for each document38
- Description of what changed in the document(s) in relation to the initiative
- Assertion that each document was a real-life document (as opposed to a training exercise or simulation).
And/or, for the same targeted outcome:
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "move forward." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped move forward the formulation/design of a development policy, reform, strategy, law, project, program, or plan"
And: an assertion by the WBI team that the participant responses to the question above related to a real-life development policy, reform, strategy, law, project, program, or plan on which clients worked during the initiative (as opposed to a training exercise or simulation).

38 The number of authors of documents where an initiative-related change is observed will feed into WBI's results as number of change agents empowered through intermediate capacity outcomes. The documents will support the description of what changed as a result of the initiative. WBI results will not be expressed in number of documents. Documents collected, but for which no initiative-related change could be observed, will not feed into WBI's aggregate results.

Targeted outcome: Helped clients design the M&E plan of a development policy, reform, strategy, law, project, program, or plan
Baseline:
a) If the initiative has clients use M&E plans they had started before their participation in the initiative, provide: the clients' M&E plan(s) as they stood just before the clients participated in the initiative.
b) If the initiative has clients create the M&E plans to be reported as results (see evidence of outcome), provide: an assertion by the WBI team that the clients (whose names and plans are provided as part of this initiative's results) started these M&E plans at the initiative.
Evidence of outcome achieved: Client M&E plan(s) for a development policy, reform, strategy, law, project, program, or plan, with all of the following information (which may be provided by the WBI team):
- Names of the authors for each M&E plan39
- Description of what changed in the plan(s) in relation to the initiative
- Assertion that each plan was a real-life M&E plan (as opposed to a training exercise or simulation).
And/or, for the same targeted outcome:
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "advance." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped advance the design of your plan to monitor and evaluate a development policy, reform, strategy, law, project, program, or plan"
And: an assertion by the WBI team that the participant responses to the question above related to a real-life M&E plan on which clients worked during the initiative (as opposed to a training exercise or simulation).

39 The number of authors of plans where an initiative-related change is observed will feed into WBI's results as number of clients with intermediate capacity outcomes. The plans will support the description of what changed as a result of the initiative. WBI results will not be expressed in number of plans. Plans collected, but for which no initiative-related change could be observed, will not feed into WBI's aggregate results.

Targeted outcome: Helped clients with their proposal to decision-makers for a development policy, reform, strategy, law, project, program, or plan
Baseline: The client question (see evidence of outcome) asks about a change: "Specify the task(s) and/or challenge(s) that you were facing, how the activity helped you with them, and what happened as a result of your actions." No additional baseline is required.
Evidence of outcome achieved: Document(s) describing the proposal to the decision-maker(s) for a development policy, reform, strategy, law, project, program, or plan, together with client answers (preferably by name) to the question:
"Please describe how this activity helped you, if at all, with the proposal for a development policy, reform, strategy, law, project, program, or plan submitted to the decision-maker(s). Specify the task(s) and/or challenge(s) that you were facing, how the activity helped you with them, and what happened as a result of your actions."
The question could be added to the Reflection and Action Form template or to the Follow-Up Survey template, asked in an interview, or asked in a group discussion at a meeting with the clients. The responses should be recorded.

Targeted outcome: Increased implementation know-how - Implemented strategies and plans - Helped clients implement a development policy, reform, strategy, law, project, program, or plan
Baseline:
a) If the initiative helps clients implement a development policy, reform, strategy, law, project, program, or plan after they began its implementation, provide: the clients' implementation documents as they stood just before the clients participated in the initiative.
b) If the initiative helps clients start the implementation of a development policy, reform, strategy, law, project, program, or plan, provide: an assertion by the WBI team that the clients started the implementation based on the initiative's assistance.
Evidence of outcome achieved: Client implementation documents for the development policy, reform, strategy, law, project, program, or plan, with all of the following related information (which may be provided by the WBI team):
- The names of the clients associated with the initiative who took part in the implementation40
- A description of what changed in the implementation of the development policy, reform, strategy, law, project, program, or plan in relation to the initiative
- An assertion that the implementation documents provided as evidence of results are real-life documents (as opposed to a training exercise or simulation).
And/or, for the same targeted outcome:
Baseline: The client question (see evidence of outcome) asks about a change: "Specify the task(s) and/or challenge(s) that you were facing, how the activity helped you with them, and what happened as a result of your actions." No additional baseline is required.
Evidence of outcome achieved: Client answers (preferably by name) to the question:
"Please describe how this activity helped you, if at all, with the implementation of the development policy, reform, strategy, law, project, program, or plan. Specify the task(s) and/or challenge(s) that you were facing, how the activity helped you with them, and what happened as a result of your actions."
The question could be added to the Reflection and Action Form template or to the Follow-Up Survey template, asked in an interview, or asked in a group discussion at a meeting with the clients. The responses should be recorded.
And/or, for the same targeted outcome:
Baseline: The WBI Participant Feedback Questionnaire's question (see evidence of outcome) asks about a change: "move forward." No additional baseline is required.
Evidence of outcome achieved: WBI Participant Feedback Questionnaire's question: "Extent to which this activity helped you move forward the implementation of a development policy, reform, strategy, law, project, program, or plan"41
And: an assertion by the WBI team that the participant responses to the question above related to a real-life development policy, reform, strategy, law, project, program, or plan on which clients worked during the initiative (as opposed to a training exercise or simulation).

40 The number of clients who took part in the implementation will feed into WBI's results as number of change agents empowered through intermediate capacity outcomes. The implementation documents themselves will support the description of what changed as a result of the initiative. WBI results will not be expressed in number of documents. Documents collected, but for which no initiative-related change could be observed, will not feed into WBI's aggregate results.
41 This question should be asked on the Participant Feedback Questionnaire only if participants had the chance to implement at least part of their plans by the time they complete the questionnaire. This would typically be possible if the initiative followed participants over time and if the participant feedback questionnaire asked about more parts of the initiative than one event.

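Several of the counts defined in Table 2, such as the number of clients who scored higher on the post-test than on their matched pre-test, or the count described in footnote 29, can be tabulated from spreadsheet exports. The sketch below is illustrative only; the file layouts and column names are assumptions rather than WBI standards, and the sketch is not a substitute for WBI's Level-2 evaluation toolkit.

```python
# Minimal sketch (assumed CSV layouts and column names, not WBI standards).
import csv

def count_improved(pre_path: str, post_path: str) -> int:
    """Count clients whose matched post-test score exceeds their pre-test score
    (the figure that feeds into WBI's results for 'helped clients acquire
    knowledge and skills' in Table 2)."""
    def load(path):
        with open(path, newline="", encoding="utf-8") as f:
            return {row["participant_id"]: float(row["score"])
                    for row in csv.DictReader(f)}
    pre, post = load(pre_path), load(post_path)
    matched = pre.keys() & post.keys()  # only individually matched pre/post pairs count
    return sum(1 for pid in matched if post[pid] > pre[pid])

def count_coalition_strengthened(responses_path: str) -> int:
    """Count respondents answering 'yes' to the coalition-strengthening question
    and rating 4 or 5 on 'Extent to which the coalition helped your project move
    forward' (see footnote 29 in Table 2). Column names are assumptions."""
    with open(responses_path, newline="", encoding="utf-8") as f:
        return sum(1 for r in csv.DictReader(f)
                   if r["helped_strengthen_coalition"].strip().lower() == "yes"
                   and r["coalition_moved_project_forward"].strip() in ("4", "5"))

# Example use (hypothetical files):
# print(count_improved("pre_test.csv", "post_test.csv"))
# print(count_coalition_strengthened("participant_feedback.csv"))
```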

IV. How to collect results data on institutional capacity change

This chapter helps WBI teams demonstrate the results of their initiatives related to institutional capacity changes. First the chapter briefly defines institutional capacities as in the CDRF. Then Tables 4 and 5 give ideas about how changes in institutional capacities can be observed and documented by WBI teams and how they will be aggregated in WBI's results in terms of "development initiatives42 advanced through increased institutional capacity."

Institutional capacity changes occur (or start to occur) when a strategic combination of intermediate capacity outcomes is planned and accomplished. Stakeholders who have been involved in WBI initiatives and have benefitted in terms of ICOs can help bring about deeper or broader change within their home agencies or institutions. These changes relate to the institutional capacities categorized in the CDRF as stronger stakeholder ownership, more efficient policy instruments, and more effective organizational arrangements. Table 3 lists the three institutional capacities and their 19 characteristics.

Table 3. CDRF's institutional capacities and their characteristics

Stronger stakeholder ownership:
- Commitment of leaders
- Compatibility with social norms and values
- Stakeholder participation in setting priorities
- Stakeholder demand for accountability
- Transparency of information to stakeholders
- Clarity in defining rights and responsibilities

More efficient policy instruments:
- Consistency
- Legitimacy
- Incentives for compliance
- Ease of administration
- Risk for negative externalities
- Suitable flexibility
- Resistance to corruption

More effective organizational arrangements:
- Clarity of mission
- Achievement of outcomes
- Operational efficiency
- Financial viability and probity
- Communications and stakeholder relations
- Adaptability

WBI teams whose initiative aims to help stakeholders advance their development initiatives through increased institutional capacity should review Tables 4 and 5 for ideas on how to collect related data. Table 4 gives one example of change for each of the three institutional capacities of the CDRF; WBI teams may, however, have results related to one or several of the 19 characteristics of those capacities. Because many WBI initiatives aim to improve the effectiveness of organizations, Table 5 gives more examples for one subset of these results: development initiatives at the organizational level advanced through improved effectiveness of organizational arrangements.

42 "Development initiatives" are different from the "capacity development initiatives" managed by WBI teams. Development initiatives are the actions of local agents of change. These actions can be expressed in terms of reforms, programs, projects, or other activities led by agents of change to improve their context. Development initiatives can have different scopes, e.g., national, regional, municipal, organizational level, etc. These development initiatives will be categorized by their scopes and other variables, so one could further understand and analyze the results reported."


Table 4. Examples of how to demonstrate results in terms of development initiatives advanced through increased institutional capacity

CDRF's institutional capacity characteristic for strength of stakeholder ownership: Transparency of information to stakeholders
Example of result: WBI supported client reforms toward budgetary transparency.
What to use as baseline: Statement from an external source showing the lack of budgetary transparency at the start of the clients' participation in WBI's initiative. This can be part of the data collected during the needs assessment.
What to use as evidence of outcome achieved:43 Link to the web page where the budget is posted.

CDRF's institutional capacity characteristic for efficiency of policy instruments: Resistance to corruption
Example of result: WBI helped a government reduce tax evasion by facilitating a process in which policymakers added safeguards against corruption to items of their tax code.
What to use as baseline: The items of the tax code as they were before the policymakers participated in the initiative, and official records of the tax administration showing the level of tax raised in relation to these tax items before the change in the tax code.
What to use as evidence of outcome achieved: The revised items of the tax code as they are after the policymakers participated in the initiative, along with an assessment by an expert independent from the initiative that the changes in the tax code constitute safeguards against corruption, and official records of the tax administration showing the increased level of tax raised in relation to these tax items after the change in the tax code.

CDRF's institutional capacity characteristic for effectiveness of organizational arrangements: Financial viability and probity
Example of result: WBI helped a social entrepreneur scale up an innovative development solution by connecting the entrepreneur with financiers.44
What to use as baseline: Follow-up survey in which the entrepreneur reported retrospectively on the number of clients for the innovative development solution and the income generated by their purchases in the year before the financiers invested in the social enterprise.
What to use as evidence of outcome achieved: Follow-up survey in which the entrepreneur reported on the number of clients for the innovative development solution and the income generated by their purchases a year after the financiers invested in the social enterprise.

43 For all results in terms of "development initiatives advanced through increased institutional capacity," the WBI team should also describe how the changes reported relate to the WBI initiative and, if possible, support the description with evidence. Examples of supporting evidence may be the information collected on the intermediate capacity outcome through which the result was achieved (as in Table 2), correspondence with clients, colleagues in Operations or external partners, etc.
44 This example could also have been featured in Table 5 below, as it relates to an organization's improved organizational arrangement.


Table 5. Examples of how to demonstrate results in terms of development initiatives at organizational level advanced through improved effectiveness of organizational arrangements

Clarity of mission
Example of result: WBI convened practitioners across a region and contributed to their founding an association.
What to use as baseline:
a) If the initiative had clients work on the association's founding document, which they had started before their participation in the initiative, provide: the founding document as it was just before the clients participated in the initiative.
b) If the initiative helped clients start the founding document reported as a result (see evidence of outcome), provide: an assertion by the WBI team that the clients started the founding document at the initiative.
What to use as evidence of outcome achieved:45 The founding document, including the mission statement of the association, with the following information (which may be provided by the WBI team):
- The names of the authors of the document46
- A description of what changed in the founding document in relation to the initiative.

Achievement of outcomes
Example of result: WBI enhanced the capacity of an agency to add a results orientation to its development projects following international standards.
What to use as baseline: Assessment by an international expert—independent from the initiative—of the results orientation of a random sample of projects produced by the agency before its participation in WBI's initiative. (The expert should not know that her/his review will be used to assess the result of WBI's initiative. Also, projects' identifiers, e.g., authors, dates, etc., should be blinded.)
What to use as evidence of outcome achieved: Assessment by an international expert—independent from the initiative—of the results orientation of a random sample of projects produced by the agency after its participation in WBI's initiative. (The expert should not know that her/his review will be used to assess the result of WBI's initiative. Also, projects' identifiers, e.g., authors, dates, etc., should be blinded.)47

45 For all results in terms of "development initiatives advanced through increased institutional capacity" (including the subset on organizations listed in Table 5), the WBI team should also describe how the changes reported relate to the WBI initiative and, if possible, support the description with evidence. Examples of supporting evidence may be the information collected on the intermediate capacity outcome through which the result was achieved (as in Table 2), correspondence with clients, colleagues in Operations or external partners, etc.
46 For this indicator, the names of the authors of the documents would be used to establish the link between the document and the initiative (e.g., if the authors were also the participants in the WBI activity where they founded the association).
47 When a WBI team uses funds for an evaluation (beyond the use of simple tools and approaches otherwise provided in this guide), e.g., an expert review, the WBI team should seek professional expertise in M&E to ensure that the evaluation methodology would meet professional standards, so the findings would be credible.

Operational efficiency
Example of result: WBI helped a school improve the quality of its teaching.
What to use as baseline: The school's average student test scores on national tests before its partnership with WBI's initiative.
What to use as evidence of outcome achieved: The school's average student test scores on national tests soon after the school ended its partnership with WBI's initiative.

Example of result: WBI helped a university expand its services.
What to use as baseline: The official university curriculum before its partnership with WBI's initiative showing the content taught, the delivery mode used, and/or the pedagogical methods used.
What to use as evidence of outcome achieved: The official university curriculum after its partnership with WBI's initiative showing the new content taught, the new delivery mode used, and/or the new pedagogical methods used.

Example of result: WBI built the capacity of a training center to teach new content by having the training center take an increasing role in the joint deliveries.
What to use as baseline: The agenda of the new courses as it was at the start of the partnership with the initiative, showing a limited share of teaching by the training center faculty.
What to use as evidence of outcome achieved: The agenda of the new courses over time, showing an increased share of teaching by the training center faculty.

Example of result: WBI helped a government agency be more efficient logistically.
What to use as baseline: Application documents showing the number of days required to process an application before the agency's partnership with WBI's initiative.
What to use as evidence of outcome achieved: Application documents showing the reduced number of days required to process an application after the agency's partnership with WBI's initiative.

Financial viability and probity
Example of result: WBI connected a start-up company to a financing mechanism.
What to use as baseline: The company's financial statement before its partnership with WBI's initiative.
What to use as evidence of outcome achieved: The company's financial statement soon after its partnership with WBI's initiative.

Example of result: WBI helped a regional institute serve more clients thanks to its expanded use of technology.
What to use as baseline: The institute's annual financial statement with income from participant fees before its partnership with WBI's initiative.
What to use as evidence of outcome achieved: The institute's annual financial statement with income from participant fees after its partnership with WBI's initiative.

Communications and stakeholder relations
Example of result: WBI helped a regional institute improve its credentials and partnership support.
What to use as baseline: The regional institute's annual report listing its partners before its collaboration with WBI's initiative.
What to use as evidence of outcome achieved: The regional institute's annual report listing its new partners after its collaboration with WBI's initiative.

Example of result: WBI developed the capacity of the leadership of an NGO so they could sit at the negotiation table on government policies.
What to use as baseline: The proceedings mention that this was the first time the NGO participated in the consultation (see evidence of outcome). No additional baseline is required.
What to use as evidence of outcome achieved: Proceedings of the consultation event mentioning the NGO's first-time participation and quoting its input.

Example of result: WBI helped a training institute serve a new clientele thanks to its expanded curriculum.
What to use as baseline: The institute's annual report showing the demographic profile of its clients before its partnership with WBI's initiative.
What to use as evidence of outcome achieved: The institute's annual report showing the demographic profile of its clients soon after its partnership with WBI's initiative.

Adaptability
Example of result: WBI taught citizen watch groups to use technology to mobilize immediate action against wrongdoing, and they took action.
What to use as baseline: The blog mentions that this was the first time the groups used technology to mobilize immediate action against wrongdoers (see evidence of outcome). No additional baseline is required.
What to use as evidence of outcome achieved: Blog of the citizen watch groups describing how, for the first time, the groups used technology to mobilize immediate action against wrongdoers.

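For the blinded expert review suggested under "Achievement of outcomes" in Table 5, the review sample could be prepared roughly as sketched below. This is an illustration only, with hypothetical folder names; it copies a random sample of documents under neutral file names so the reviewer does not see authors or dates, but identifying information inside the documents would still need to be removed separately.

```python
# Minimal sketch (hypothetical paths): draw a random sample of project documents
# and copy them under neutral names for an independent, blinded expert review.
import random
import shutil
from pathlib import Path

def prepare_blinded_sample(source_dir: str, out_dir: str, sample_size: int, seed: int = 2011):
    docs = sorted(Path(source_dir).glob("*.pdf"))
    random.seed(seed)                          # record the seed so the draw is reproducible
    sample = random.sample(docs, min(sample_size, len(docs)))
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    key = {}
    for i, doc in enumerate(sample, start=1):
        blinded_name = f"project_{i:02d}.pdf"  # neutral name shown to the reviewer
        shutil.copy(doc, out / blinded_name)
        key[blinded_name] = doc.name           # keep this key separate from the review package
    return key

# Example use (hypothetical folders):
# key_before = prepare_blinded_sample("projects_before", "review_sample_before", 10)
```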

Annex A – How this guide relates to WBI's results agenda

Table 6 lists components of WBI's results agenda in terms of what is expected of WBI, and how this data collection guide relates to these components.

Table 6. How this guide relates to WBI's results agenda

Expectation of WBI: WBI reports on its results at various aggregate levels.
How the guide relates to this expectation: The main purpose of the guide is to help WBI teams in charge of capacity development initiatives collect evidence of results that feed into WBI's KPIs and into various other aggregate levels.

Expectation of WBI: WBI teams demonstrate the effectiveness of an initiative.
How the guide relates to this expectation: The guide helps teams collect part of the data needed to demonstrate the effectiveness of their initiatives. This guide helps teams demonstrate the results of their initiatives, but does not cover other data (e.g., achievement of objectives, timeline, cost) which, along with the results data, help to assess the effectiveness of an initiative against certain benchmarks, such as achieving a result target within a given time, budget, etc.

Expectation of WBI: WBI teams use the CDRF to design and manage their initiatives.
How the guide relates to this expectation: When WBI teams design their initiatives, they should use the CDRF to identify the institutional and intermediate capacities they aim to change or influence. Once these steps are completed, this guide helps teams identify how to collect evidence of results related to the targeted capacity change(s).

Expectation of WBI: WBI teams define the targeted results of their initiatives.
How the guide relates to this expectation: The guide does not help teams define which results their initiatives target. This guide is to be used after teams have defined and articulated their targeted results in documents such as Program Results Summaries, AISs and concept notes. This guide then helps teams collect the results data that correspond to their targeted results.

Expectation of WBI: WBI teams design their initiatives to achieve results.
How the guide relates to this expectation: The guide helps teams with part of one aspect needed to design a results-oriented initiative—the part of its M&E plan related to measuring results. Other parts of an initiative's M&E plan, e.g., process performance indicators, are not covered in this guide.48 This guide does not address other design aspects of the initiative.

Expectation of WBI: WBI teams manage for results.
How the guide relates to this expectation: The guide helps teams manage for results by providing tools to monitor the results achieved by their initiative throughout implementation. Teams can use the collected results data to inform the next steps of their initiatives.

Expectation of WBI: The WBI Leadership Team (WBILT) manages for results.
How the guide relates to this expectation: The guide helps WBI teams report results data in a way that can be aggregated at various levels, e.g., across WBI or by practice, region, business line, donor, institutional capacity, intermediate capacity outcome, etc. After enough data are aggregated, WBILT will be able to use them to manage for results.

Expectation of WBI: WBI advances research on capacity development.
How the guide relates to this expectation: Once enough data have been collected and reported by following this guide, WBI will be able to use these data (along with other relevant performance information) to address research questions on capacity development.

Expectation of WBI: WBI transforms how the world develops capacity.
How the guide relates to this expectation: If other CD institutions use WBI's approach to measuring their organizations' results, they could benefit from this guide as WBI does. Use of the CDRF for the design and measurement of capacity development initiatives by many agencies would eventually enable comparisons across organizations.

Expectation of WBI: WBI demonstrates how it integrates with World Bank Operations.
How the guide relates to this expectation: This guide does not cover how WBI integrates with World Bank Operations. However, the guide will help WBI collect data that could be used to associate results with partner regions, and to compare the results of WBI initiatives that partnered with Operations with those that did not.

48 See definition of process performance indicators in FAQ 10.
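
As an illustration of the aggregation mentioned in Table 6 (across WBI or by practice, region, business line, donor, etc.), reported results could be tallied roughly as follows. The sketch assumes a spreadsheet export with hypothetical column names and does not describe WBI's actual reporting systems.

```python
# Minimal sketch (assumed column names, not WBI's actual reporting system):
# aggregate the number of stakeholders empowered through intermediate capacity
# outcomes by practice and region.
import pandas as pd

results = pd.read_csv("initiative_results.csv")   # hypothetical export, one row per reported result
summary = (results
           .groupby(["practice", "region"], as_index=False)["stakeholders_empowered"]
           .sum())
print(summary)
```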


Annex B – Frequently-Asked Questions

A. Policy

1. What is a TTL's accountability for demonstrating and reporting results of an initiative?
Throughout the initiative, the TTL is accountable for the following:
- Collecting simple planned and opportunistic evidence of results (including before and after data) and filing them in the World Bank's official electronic archival system (e.g., IRIS)
- Updating the AUS in SAP, including milestones, progress description, and the archival system's link to the evidence49
- Completing an ACS in SAP, including ratings, milestones, achievement description, and the archival system's link to the evidence.

2. What is WBI's policy on participant feedback questionnaires (Level-1 evaluation)? WBI's policy on results asks that WBI teams achieve and demonstrate results at the initiative level. Depending on the context, the best way to demonstrate an initiative's results may (or may not) involve using a participant feedback questionnaire (formerly known as Level-1 evaluation) at the end of every deliverable. Therefore, participant feedback questionnaires are not mandatory, but they are recommended if they can help a team achieve and demonstrate results.

Participant feedback questionnaires help a team achieve results by identifying strengths and weaknesses in their deliverables for the team to make needed adjustments to future activities. Taking corrective measures in response to participant feedback improves the likelihood of achieving results. For this purpose, customized participant feedback questionnaires (including optional questions from WBI's participant feedback toolkit) used as part of a decision-making process are encouraged. Among all of the questions, teams should ask participants to rate the "overall usefulness of this activity" using WBI's scale and instructions. This question helps a team compare a deliverable's perceived usefulness with WBI's long-established benchmark.

The new participant feedback questionnaire also helps teams demonstrate results. Table 2 helps teams choose the questions relevant to the targeted outcomes of the part of the initiative being assessed.

3. Which of the tools introduced in this guide are mandatory?
No tool is mandated. The mandate is to achieve and demonstrate results—not to use a given tool. WBI teams should select and customize the tools that will help them achieve and demonstrate the results of their initiatives. The tools are offered to facilitate the process. A particular situation may make the use of tools other than those associated with this guide more practical and/or useful. In such a case, the WBI team may use these other tools.

4. Are TTLs expected to report results per WBI's aggregate result indicators? No, in writing the ACS, the TTL of a capacity development initiative is not expected to categorize its results according to WBI's aggregate result indicators. The TTL only needs to describe—in her/his own words—the changes due in part to the initiative that s/he could observe, and support the description of these changes with evidence. WBICR will categorize the results and report them in the aggregate.

49 Reporting results in the AUS throughout the initiative as soon as results become known is a recommended good practice. In addition, WBI will request that AUSs be completed periodically on dates set by WBI's management.


B. Main concepts and rationale for measuring WBI results in this way

5. How are WBI's results related to the CDRF? The two indicators of WBI's aggregate results relate to the CDRF, since these indicators were built upon the CDRF, based on what WBI teams reported in FY10 in their Program Results Summaries (a template that articulates a program's logic according to the CDRF).

The CDRF guides teams on how to design and implement CD initiatives that produce changes in the capacity of clients and in their institutional capacities.

WBI results are expressed in terms of the people and institutions in which these capacity changes can be observed. The CDRF also guides the reporting of WBI's results by providing its underlying change process and results logic.

Specifically, the "number of direct and indirect stakeholders empowered through intermediate capacity outcomes (with descriptions of the changes)" is an indicator that observes on whom changes in capacity occur. In this case, the capacity changes occur in "stakeholders" (a.k.a. clients) who may produce further changes as individuals or through groups (e.g., coalitions). The outcomes mentioned in this indicator are the six intermediate capacity outcomes of the CDRF. As an intermediate step toward institutional capacity changes, a CD initiative must first improve the capacity of clients. Due in part to this improved capacity, the clients are expected to act as agents producing changes in institutional capacity areas.

The "number of development initiatives" that WBI contributed to help clients "advance" (with descriptions of the changes) "through increased institutional capacity" is an indicator that corresponds to changes observed in any of the 19 characteristics of the three institutional capacities of the CDRF. For definitions of the main terms of this indicator and how one can demonstrate changes in the CDRF's institutional capacities, see FAQ Section E on "Results as development initiatives advanced through increased institutional capacity." For examples of such results, see Table 4 and Table 5.

6. Why is supporting evidence for results required? WBI needs stronger evidence than mere self-reporting by WBI teams because by nature CD is hard to assess. Having the discipline to collect evidence of results helps teams to manage their initiative for results. Also, having evidence helps stakeholders better understand the value added of WBI.

7. Why do the examples of supporting evidence vary broadly? Information that is easy to collect as part of the normal initiative design and implementation should be collected. This evidence is generally simple, and while it does not have the robustness of a professional evaluation, its added cost is marginal, making the approach relatively cost-effective. On the other hand, when a team uses extra resources (beyond the administration of the simple data collection tools associated with this guide), specifically to measure the result of their initiatives, the findings from such dedicated evaluation should be credible enough to justify the cost of the evaluation. In such cases, a professional evaluation design is encouraged. WBICR can help teams with this design.

8. Why use a mix of self-reports and evidence of results from external sources? To balance credibility of the results with practicality, WBI teams self-report what they did and support the results with evidence from external sources (e.g., clients, observers, partners, etc.). The credibility of WBI results would be affected if WBI relied entirely on WBI teams' self-reports.


Questions would be raised as to how a team could assert that capacities were developed. However, as long as the evidence of change relies on data from an external source, the elements regarding what the team did—which can be most easily provided by the WBI team itself50—may be reported by the WBI team. The reason for having supporting evidence is not to question the veracity of WBI teams' reports. Instead, external evidence enables WBI's stakeholders to make their own judgment on the results achieved.

A WBI team may choose to use external sources where this guide mentions that a WBI statement is requested. As long as the required information is provided, an external source can replace or supplement the statement by a WBI team. The reverse, however, is not true: for a result to be counted among WBI's aggregate results, evidence that this guide asks to come from an external source cannot be replaced with a statement by a WBI team.

9. Why make some questionnaires anonymous and others by name? Anonymous questionnaires are better at collecting credible data when respondents assess the effectiveness of the initiative. Questionnaires asking respondents to give their names provide context for the data collected (and contact information for clarification or follow-up), as well as information on which the initiative team can act in a targeted fashion. To get the best of both features, WBI teams should:

Anonymously collect data reflecting on the level of effectiveness of the initiative, including the quantitative results data from the Participant Feedback Questionnaire and other process performance data typically found on a Level-1 evaluation.

Request respondents' names when asking for descriptive information on results achieved and for information that will help teams target their next activities or initiative based on client responses, notably using the Reflection and Action Form or the Follow-up Survey.

10. What is the difference between a milestone, deliverable, process performance indicator, and result?

A milestone is a major step marking progress toward the completion of an initiative, but milestones are not results. For example, knowing that an agreement has been signed with a partner, a new module has been completed, or an output has been delivered to a client does not tell whether results were achieved.

A deliverable is a single output delivered to clients. Deliverables can be considered milestones. However, not all milestones are deliverables, because some milestones only mark a step forward in a process without delivering anything to clients. Deliverables are needed to achieve results, but deliverables are not results. Delivering something to clients does not mean anything changed in the capacity of that client or in their institutional capacities. For example, the number of deliverables completed does not tell anything about results achieved.

A process performance indicator is an observable variable that helps a CD team monitor whether the conditions they consider necessary or helpful to produce a result are moving in a favorable direction. Monitoring process performance indicators helps a team adjust their plans to achieve results, but process performance indicators are not results. An indication that the conditions a team assumed to be favorable to producing a result are in place is not evidence that the result was achieved. For example, the percentage of members of a coalition who state in an anonymous survey that they find the composition of the coalition adequate is a useful indicator to help the team determine the level of effort needed around the coalition's composition and coalescing.

50 For example, the fact that a plan developed during an activity was not a training exercise/simulation, whether the initiative created a network, the link between the WBI initiative and an indirect activity whose results the team is reporting, etc.


However, by itself, the indicator does not say whether a result was achieved.

A result is a change in the capacity of clients and/or in their institutional capacities to which WBI contributed; results are observed and aggregated across WBI as stakeholders empowered through intermediate capacity outcomes and development initiatives advanced through increased institutional capacity.

11. Is a change in network structure a result or a process? As mentioned above, a result is a change in the capacity of clients and/or in their institutional capacities. Therefore, whose network is changed—the clients' or WBI's—matters in determining whether the change is counted as a result or as a process.

With regard to WBI results, a change in network structure (e.g., centrality51) will be counted differently according to who is operating the network. A structural change in a client-operated network is a result.52 A structural change in a WBI-operated network is a process.

In a client-operated network, when, for example, WBI helps network members understand weaknesses in their structure and facilitates their restructuring, WBI contributes to producing a change in a client organization by helping the client-operated network improve its operational efficiency.53 This change is counted among WBI results as an organization whose effectiveness WBI helped improve.

In a WBI-operated network, when, for example, WBI assesses weaknesses in its network's structure and restructures it, WBI improves the operational efficiency of its own network. Such an improvement is akin to identifying a weakness in a module's content and refining the content accordingly. Monitoring potential weaknesses in WBI's deliverables—including through process performance indicators—and adjusting WBI's operations/content accordingly is good practice, as these actions could lead to better results in the future. However, by themselves, improvements in the content/operations of a WBI initiative are processes and are not counted as WBI's results.
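
Where a team does document a structural change in a client-operated network (see the network analysis resources in footnote 51), a simple before/after comparison of a centrality measure could be computed roughly as sketched below. This is an illustration only; it assumes membership edge lists exported as CSV files with hypothetical column names and uses the open-source networkx library, which is not a WBI tool.

```python
# Minimal sketch (illustration only): compare degree centrality of a client-operated
# network before and after the initiative, from two edge-list CSV files with
# assumed "source,target" columns.
import csv
import networkx as nx

def degree_centrality_from_csv(path: str) -> dict:
    g = nx.Graph()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            g.add_edge(row["source"], row["target"])
    return nx.degree_centrality(g)

before = degree_centrality_from_csv("network_before.csv")
after = degree_centrality_from_csv("network_after.csv")

for member in sorted(set(before) | set(after)):
    print(member, round(before.get(member, 0.0), 2), "->", round(after.get(member, 0.0), 2))
```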

12. How is "client-operated" defined?"Client-operated" means that the client sustains its operations without financial help from WBI and without having WBI perform sustaining functions (e.g., maintain a server). WBI may provide useful capacity development services to the client-operated organization, but the client-operated organization would survive if WBI stopped its services. Typically client-operated organizations existed before their partnership with WBI.

However, conceptually, a client-operated organization may have been initially created by WBI. In such a case, the organization is considered client-operated if it now sustains itself without financial help from WBI and without having WBI perform sustaining functions.

13. How are the ICOs on altered processes and relationships counted in WBI results? The CDRF has three Intermediate Capacity Outcomes expressed in terms of "altered processes and relationships": "improved consensus and teamwork," "strengthened coalitions," and "enhanced networks." Change related to these ICOs can be observed on several levels: 1) process performance indicators; 2) benefits to clients; and 3) outcomes of the actions of the consensus/coalition/network.

51 For examples of network analyses, see: "Strengthening Networks: Using Organizational Network Analysis to Promote Network Effectiveness, Scale, and Accountability" and "Network Analysis/Mapping."
52 Assuming that WBI contributed to these changes.
53 Operational efficiency is a characteristic of effectiveness of organizational arrangements in the CDRF's institutional capacities.


Process performance indicators may predict results, but are not counted among WBI's results. Benefits to clients and outcomes of the actions of the consensus/coalition/network can be part of WBI's results.

1) Process performance indicators are observable variables which help a capacity development team monitor whether the conditions that they consider necessary or helpful to produce a result are moving in a favorable direction. For example, if a capacity development team posited that a large number of visitors on its network's web site would increase the likelihood of achieving the initiative's objectives, the team should monitor the number of visitors to decide on their level of effort toward expanding the site's reach. However, achieving a large growth in number of visitors would only mean that one of the underlying conditions, which the team assumed to be a good predictor of results, would be met. It may be a good sign for the potential performance of the network, but by itself, the growth in visitors would not be counted among WBI's results.

To be counted among WBI results, the changes in capacity to which the initiative contributed should be observable differences in terms of benefits to clients, and/or outcomes of the actions of the consensus/coalition/network.

2) Benefits to clients are improvements due in part to the initiative which clients experienced with regard to their awareness, skills, context for their actions (in terms of consensus, teamwork, coalitions and networks), and/or strategies/projects/plans. For the ICOs on altered processes and relationships, WBI results relate to the benefits to clients in terms of advanced processes. For more details, see Table 2 above.

3) Outcomes of the actions of the consensus/teamwork/coalition/network are the results in terms of development initiatives advanced through increased institutional capacity—either result being achieved through improved consensus and teamwork, strengthened coalitions, and/or enhanced networks.

14. Why use a process performance indicator if it won't end up in WBI's results?
In addition to result indicators, teams should use process performance indicators to assess progress toward what they consider to be important conditions for the initiative's ability to achieve results. Monitoring process performance indicators can help teams make adjustments likely to improve result achievement. This is also a reason why altered processes and relationships are part of the CDRF's intermediate capacity outcomes, as they help monitor the change process in progress.

C. Practical implementation considerations

15. When should results data be collected?
WBI teams should seek to collect results data before, during and after the initiative.

Before the initiative, as part of its needs assessment, the WBI team should identify the gaps in institutional capacities54 related to the development goal, notably by discussing with internal and external partners and other potential stakeholders and by reviewing strategy papers, data, or other information. The minutes of these meetings and the documents reviewed to diagnose the institutional capacities that the initiative aims to change should be used as baseline results data against which the team can monitor progress.

54 For a systematic diagnosis, review and assess the gaps using the CDRF list of institutional capacities and their characteristics.


Before a client participates in the initiative, WBI teams should also collect baseline results data more closely related to the participants in the initiative, notably by using a participant application or registration form. WBI's Participant Application Form can help collect baseline results data on intermediate capacity outcomes and possibly additional information on the institutional capacities targeted by the initiative.

During and after the initiative, WBI teams should collect results data on the Intermediate Capacity Outcomes and institutional capacity change. Typically, ICO data are useful to provide information on the early results of an initiative before any change in institutional capacities can be observed.

Intermediate capacity outcome data are most often based on participants' responses or products. These data can be collected at the start of a learning event as baseline information (if appropriate; see Table 2) and at the end of the event. (End-of-event participant surveys are best administered after all substantive sessions and before an appealing non-substantive activity such as the distribution of certificates or group pictures.) Data about one event collected at the end of that event would often yield results data in terms of improved participant readiness to take action for change.

To see if this improved readiness was translated into action, data should be collected after participants could be expected to have taken these actions and after these actions could have led to changes in institutional capacities. The Reflection and Action Forms completed by participants can help determine when to expect these actions.

Therefore, teams are encouraged to follow up with their participants, partners and other stakeholders to collect results data closer to institutional capacity changes. However, stand-alone follow-up data collection efforts can be costly. To limit the cost and get good response rates, teams should embed their follow-up data collection at times when they plan to interact with these stakeholders as part of the initiative's delivery or other work. Table 1 gives examples of such approaches.

16. When should WBI teams report results data?
Good practice is for a TTL to add any new result with its evidence to the description field of the Activity Update Summary (AUS) as soon as the result is known, at any point of the initiative. This will simplify the writing of the Activity Completion Summary (ACS), as the AUS acts as a live preparatory version of the ACS. On occasion, when WBI is to report on its results for a given event, or at regular points in the fiscal year, WBI Management will ask that TTLs update their AUSs or complete their ACSs, as appropriate. The ACS should typically be completed within six months after the end of the last deliverable. However, if a results data collection effort about the initiative is underway six months after the end date of the last deliverable, the ACS can be delayed so that the TTL can incorporate the findings of that effort in the results reporting sections of the ACS.

17. How can results of an initiative be reported after the ACS has been closed?
Teams sometimes learn of results long after an initiative was closed. Although one cannot re-open an ACS after its approval, WBI teams can report these late results directly to WBICR. The WBICR team in charge of aggregating WBI results will apply the same process and report these results in the same way as if they had been obtained via the AUS or the ACS.

18. Should a TTL report a result if the contribution of the WBI initiative was minimal?
The TTL decides whether or not to report a result in the AUS/ACS (or, if the ACS has closed, to WBICR). Once a result is reported, it may be used in a variety of documents, e.g., reports on WBI results, management presentations, brochures, etc. If the TTL feels uncomfortable reporting a result, because WBI's contribution was minimal (or for any other reason), the TTL may decide not to report it. Reported results may be highlighted by any internal or external stakeholders. If the TTL feels that claiming a given result would pose a reputational risk for WBI, s/he should refrain from reporting that particular result. The reputational risk to WBI would be much greater if a reported result were challenged by stakeholders than if a result ended up not being accounted for.

19. Will WBICR screen results before aggregating them to avoid reputational risks?
The WBICR team in charge of aggregating results will check whether the results reported by the task teams are supported by adequate evidence and whether the information is provided in a usable format for aggregation.55 However, WBICR will not make any judgment as to the importance or desirability of the results or whether WBI's contribution to the result was sufficient. Instead, WBICR will assume that if a TTL reported a result, s/he feels comfortable with potential questions related to its importance, desirability or level of WBI's contribution.

20. Can results be reported confidentially?
If a TTL has evidence of results, but considers that publishing it beyond the WBG would create sensitive situations, s/he should discuss the matter with WBICR.

21. What happens if an initiative has results, but no supporting evidence?
WBI teams should report the result anyway in the AUS/ACS and, if possible, collect evidence later. Table 1 provides examples of how to collect evidence retrospectively. However, as long as a result is not supported by evidence, WBICR will not include it in its aggregate reports on WBI's results.

22. Which media can be used for evidence of results?
Evidence of results can be in any medium: paper, electronic text, audio, video, etc. A link to the unedited raw materials should be supplied. The evidence should be filed in the Bank's official electronic archival system and then linked to the AUS/ACS.

23. Will survey data feed into WBI results if their questions and scales differ from WBI's?
WBI works in partnerships, so some client surveys won't follow the wording or scales of WBI's templates. In such cases, WBI teams should still report these data, preferably with the raw data on a questionnaire or in a matrix showing the responses of each respondent. WBICR will determine on a case-by-case basis how, if at all, the data could feed into WBI's aggregate results. WBICR will count as results in terms of "stakeholders empowered through intermediate capacity outcomes" those clients who rated relevant questions as follows: selected the highest rating on scales with two to four points; selected either of the highest two ratings on scales with five or six points; or selected any of the highest three ratings on scales with seven to ten points (see the sketch at the end of this answer).

To count among WBI's aggregate results, data from a partner's survey should meet the same criteria as data collected based on a WBI survey template. See Chapter II Section A for the list of required features. Although WBICR will make its best efforts to include data collected with a partner's survey, when WBI teams have an opportunity to use WBI's standard client survey questions, they should, because the responses would be comparable to other WBI initiatives and thereby provide more useful information than questions that might have no benchmarks.
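For illustration only, the following minimal sketch (in Python) shows how the rating thresholds above could be applied when tallying responses from a partner's survey. The function and variable names are hypothetical and are not part of any WBI template or system.

```python
# Minimal sketch (illustrative, not an official WBI tool): tally how many
# respondents would count as "empowered" under the thresholds in question 23.

def qualifying_threshold(scale_points: int) -> int:
    """Return the lowest rating that still counts on a scale of the given length."""
    if 2 <= scale_points <= 4:
        return scale_points        # highest rating only
    if 5 <= scale_points <= 6:
        return scale_points - 1    # highest two ratings
    if 7 <= scale_points <= 10:
        return scale_points - 2    # highest three ratings
    raise ValueError("Scale length not covered by the guidance")

def count_qualifying(ratings, scale_points: int) -> int:
    """Count respondents whose rating meets or exceeds the threshold."""
    threshold = qualifying_threshold(scale_points)
    return sum(1 for rating in ratings if rating >= threshold)

# Example: on a 5-point scale, ratings of 4 or 5 qualify.
print(count_qualifying([5, 4, 3, 5, 2, 4], 5))  # prints 4
```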

24. Should teams collect and report results in numbers, percentages or both?
Numbers and percentages are both useful, but for different purposes. WBI's aggregate results reported in the KPIs are expressed in numbers (rather than percentages) because WBI's strategy is about scaling up impact. So, the examples in Tables 2, 4 and 5 ask for numbers (and descriptions) as the basic piece of data required to feed into WBI's results. However, to the extent possible, WBI teams should collect quantitative data not only on the number of stakeholders or development initiatives for which the initiative contributed to a change, but also on how many of each were targeted. The two percentages (number changed divided by number targeted, and number changed divided by number for which data are available) give a sense of the effectiveness of the initiative in achieving its objectives (see the worked example below).

55 If needed, WBICR may also contact the initiative team to clarify the results reported, and potentially encourage the TTL to adjust specific elements of the results reported, so that external stakeholders would better understand the initiative's results.
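A hypothetical worked example of the two percentages follows; all counts are made up for illustration and do not come from a real initiative.

```python
# Hypothetical worked example of the two percentages described in question 24.
# All counts are illustrative and do not come from a real initiative.
targeted = 40     # stakeholders the initiative aimed to reach
with_data = 25    # stakeholders for whom follow-up data were collected
changed = 15     # stakeholders with evidenced change

share_of_targeted = changed / targeted    # 15 / 40 = 0.375, i.e., 37.5%
share_of_observed = changed / with_data   # 15 / 25 = 0.6, i.e., 60%

print(f"{share_of_targeted:.1%} of targeted stakeholders changed")
print(f"{share_of_observed:.1%} of stakeholders with data changed")
```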

D. Results as "stakeholders empowered through intermediate capacity outcomes"

25. What do "direct" and "indirect" clients (stakeholders) mean?
Direct clients are clients who receive CD services at least in part from WBI directly. Indirect clients are clients who receive services based on WBI's input but without WBI being involved in delivery. For example, the participants in a course that incorporated materials developed by WBI but was taught by a partner organization without WBI being involved in the delivery are "indirect" clients.

In their AUS and ACS, WBI teams should indicate whether the clients about whom they report intermediate capacity outcomes are direct or indirect clients.

26. Does WBI account for results from direct and indirect clients in the same way?
The number of stakeholders (clients) empowered through intermediate capacity outcomes (one of WBI's result indicators) will include both direct and indirect clients. At the aggregate level, both types will be summed up. However, to enable further analysis, WBI teams must indicate whether the clients about whom they report empowerment through intermediate capacity outcomes are direct or indirect clients.

27. Is any participant who took a course based on WBI materials counted in WBI results?
No. Whether with direct or indirect clients, WBI teams need to provide evidence of changes that occurred at least at the ICO level for them to be included in WBI results. For example, proof of participation is not a result in itself, as it does not provide evidence that a participant learned from WBI's materials.

28. Why not use client survey questions to see if participants enhanced their skills?
Asking people how much they learned (through one-time, pre-post, or post-then surveys) gives very different results from measuring their actual learning through a knowledge test. People largely over-report learning in self-assessment questions. A question like "extent to which you learned the content of the course" yields data more closely related to self-confidence in having acquired the new knowledge and to social desirability (wanting to be seen as having learned) than to actual knowledge or skill mastery.

29. Why have client survey questions relate to the client and not the group?
For results in terms of clients empowered through ICOs, WBICR will count only the clients who benefitted themselves, not those who merely observed that someone in the group benefitted from the initiative. A question asking "extent to which the activity helped participants increase their awareness" would not feed into WBI results because this wording makes it impossible to calculate the actual number of participants with raised awareness. For example, if only a couple of participants had their awareness increased at the activity, and if all participants observed this result, all participants could legitimately report that the activity helped increase participants' awareness.

In addition, to yield reliable data, surveys should ask questions of people who are in a good position to answer. Participants are in a position to assess an effect on themselves, but not on others.

E. Results as "development initiatives advanced through increased institutional capacity"

30. What is meant by "development initiatives"?
Development initiatives56 are real-life actions by local stakeholders. These actions can be expressed in terms of reforms, strategies, policies, laws, plans, programs, projects, measures, innovations, or other activities led by stakeholders to improve their context.

These development initiatives can be of diverse scopes, affecting people in one or several countries, regions, cities, or organizations (including universities, ministries, parliaments, start-up companies). In reporting WBI's results, WBICR will also provide data that disaggregate the development initiatives by their scope, e.g., number of development initiatives at country level, at city level, at organization level.

31. What is meant by "advance"?
The intent of the result indicator is "development initiatives that WBI contributed to help clients advance through increased institutional capacity."

Sometimes the situation on the ground is such that a full development initiative cannot be implemented within the timeline of a WBI initiative. However, WBI is concerned with the "how" of reform—advancing the process. The indicator is not necessarily saying that the development initiative was completed, only that it was advanced, by alleviating capacity constraints listed under the CDRF's institutional capacities.

32. How are changes in institutional capacities demonstrated?
When we observe evidence of alleviation of some institutional capacity constraint(s) that hinder(s) the advancement of a targeted development initiative, WBI considers this an advancement of a development initiative through increased institutional capacity.

The CDRF offers generic characteristics of institutional capacities to help teams diagnose and address the capacity constraints related to their initiative's development goal. For the list of institutional capacities and their characteristics, see Table 3. For examples of how to use these characteristics to demonstrate results in terms of development initiatives that WBI helped clients advance through increased institutional capacity, see Tables 4 and 5.

The evidence of institutional capacity changes is typically apparent in official documents, such as declarations, agreements, program and project plans, budgets, strategy papers, laws, policy guidelines, contracts, or loans (including from the World Bank). Such evidence can also be collected through follow-up surveys or interviews.

56 "Development initiatives" are different from the "capacity development initiatives" managed by WBI teams.
