
Results-Based Programming, Management and Monitoring (RBM) at UNESCO

Guiding Principles

Bureau of Strategic Planning

January 2008

INDEX
1) Preface
2) Brief historical background
3) What is RBM?
4) What is a result?
5) How to formulate an expected result?
6) What is the relationship between interventions, outputs and results?
7) RBM within UNESCO’s management framework
8) UNESCO’s Results Chain
9) Monitoring of Implementation
Annex: RBM Glossary


1) PREFACE

It is said that if you do not know where you are going, any road will take you there. This lack of direction is what results-based management (RBM) is meant to avoid. RBM is about choosing a direction and destination first, deciding on the route and the intermediary stops required to get there, checking progress against a map and making course adjustments as required in order to realise the desired objectives.

In the “Programme for Reform” presented to the UN General Assembly in 1997, the UN Secretary-General proposed that the UN “place greater emphasis on results in its planning, budgeting and reporting and that the General Assembly moves the budget of the United Nations from input accounting to accountability for results… shifting the focus of planning, budgeting, reporting and oversight from how things are done to what is accomplished”. Since then, the notion of results and “results-based management” has become a central aspect of work within the UN system and a global trend among international organizations, reinforced by new commitments.

For many years, the community of international organizations has been working to deliver services and activities and to achieve results in the most effective way. Traditionally, the emphasis was on managing inputs and activities, and it has not always been possible to demonstrate results in a credible way and to the full satisfaction of taxpayers, donors and other stakeholders. Their concerns are straightforward and legitimate: they want to know what use their resources are being put to and what difference these resources are making to the lives of people. In this vein, RBM was especially highlighted in the 2005 Paris Declaration on Aid Effectiveness as part of the efforts to work together in a participatory approach to strengthen country capacities and to promote the accountability of all major stakeholders in the pursuit of results.

It is often argued that complex processes such as development are about social transformation, processes which are inherently uncertain, difficult, not totally controllable and for which, therefore, one cannot be held responsible. Nonetheless, these difficult questions require appropriate responses from the professional community and, in particular, from multilateral organizations, which need to report properly to stakeholders, learn from experience, identify good practices and understand where improvements are needed.

The RBM system being put in place by UNESCO aims to respond to these concerns by setting out clear expected results for programme activities, by establishing performance indicators to monitor and assess progress towards achieving the expected results, and by enhancing the accountability of the Organization as a whole and of the persons in charge. It helps to answer the “so what” question, recognizing that successful implementation of programmes is not necessarily equivalent to actual improvements in the development situation.

This paper is intended to assist UNESCO staff in understanding and using the basic concepts and principles of results-based management.


2) BRIEF HISTORICAL BACKGROUND

As such, the concept of RBM is not really new. Its origins date back to the 1950s. In his book “The Practice of Management”, Peter Drucker introduced for the first time the concept of “Management by Objectives” (MBO) and its principles:

- Cascading of organizational goals and objectives;
- Specific objectives for each member of the Organization;
- Participative decision-making;
- Explicit time period;
- Performance evaluation and feedback.

As we will see further on, these principles are very much in line with the RBM approach. MBO was first adopted by the private sector and then evolved into the Logical Framework (logframe) for the public sector. Originally developed by the United States Department of Defense and adopted by the United States Agency for International Development (USAID) in the late 1960s, the logframe is an analytical tool used to plan, monitor and evaluate projects. It derives its name from the logical linkages set out by planners to connect a project’s means with its ends.

During the 1990s, the public sector was undergoing extensive reforms in response to economic, social and political pressures. Public deficits, structural problems, growing competitiveness and globalization, shrinking public confidence in government, and growing demands for better and more responsive services as well as for more accountability were all contributing factors. In the process, the logical framework approach was gradually introduced in the public sector of many countries (mainly member States of the Organization for Economic Co-operation and Development (OECD)). During the same decade this evolved into RBM as an aspect of New Public Management, a label used to describe a management culture that emphasizes the centrality of the citizen or customer as well as the need for accountability for results. This was followed by the establishment of RBM in international organizations: most United Nations system organizations were facing similar challenges and pressures from Member States to reform their management systems and to become more effective, transparent, accountable and results-oriented. A changeover to a results-based culture is, however, a lengthy and difficult process that calls for the introduction of new attitudes and practices as well as for sustainable capacity-building of staff.

UNESCO and RBM

Results-based programming, management and monitoring (RBM) has been on UNESCO’s agenda for several years and underpins the Organization’s reform processes, which aim to shift the focus from activities and programmes linked in broad terms to UNESCO’s mandate to clear expected results derived from that mandate.


Where in the past it had sufficed to break down a domain into sub-domains down to a number of initiatives and activities, it is now required to carefully determine what concrete results the Organization can achieve in an environment where many other and often competing actors operate, and to identify the most suitable means for that purpose. The introduction of RBM in UNESCO is marked by a number of milestones expressing the commitment to a progressive shift towards RBM:

Table A - Milestones of the introduction of RBM in UNESCO

April 1997: UNESCO’s Information Technology Master Plan is finalized and sets the stage for the design of SISTER (System of Information on Strategies, Tasks and Evaluation of Results)

May 1998: The Bureau of Planning and Evaluation (predecessor of the Bureau of Strategic Planning) starts developing SISTER to accompany the introduction of results-based programming, management, monitoring and reporting

Nov 1999: The Director-General, upon taking office, officially introduces SISTER and launches a comprehensive programme of reforms of which RBM is one important pillar

During 2000: UNESCO transfers all of its programming for the Programme and Budget 2000-2001 (30 C/5) to SISTER

2000-2001: Substantive training on logical framework and results formulation is provided to more than 300 professionals (provided inter alia by the University of Wolverhampton)

Jan-April 2001: UNESCO hires RTC Services and the Centre on Governance of the University of Ottawa to assess UNESCO within the context of RBM and to provide tools designed to improve internal capacity

2000-2002: SISTER training is provided to more than 2,000 staff members

Nov 2001-Mar 2002: SISTER is used systematically for the first time to prepare and approve the work plans for the Programme and Budget 2002-2003 (31 C/5) and to integrate extrabudgetary projects

Jun 2003: An RBM team is created within BSP to develop and implement a UNESCO-wide results formulation training programme as a precondition for a meaningful RBM practice

Sept 2003-2006: The team delivers training at Headquarters and in the field, responding to the needs of sectors and bureaux as well as field offices

Sept 2005-2007: Results formulation training is expanded to include UNESCO’s contribution to common country programming exercises


3) WHAT IS RBM?

Results-based management (RBM) can mean different things to different people and organizations. A simple explanation is that RBM is a broad management strategy aimed at changing the way institutions operate, by improving performance, programmatic focus and delivery. It reflects the way an organization applies processes and resources to achieve interventions targeted at commonly agreed results. Results-based management is a participatory and team-based approach to programme planning that focuses on achieving defined and measurable results and impact. It is designed to improve programme delivery and strengthen management effectiveness, efficiency and accountability. RBM helps move the focus of programming, managing and decision-making from inputs and processes to the objectives to be met. At the planning stage it ensures that the sum of the interventions is necessary and sufficient to achieve an expected result.

During the implementation stage, RBM helps to ensure and monitor that all available financial and human resources continue to support the intended results.


To maximize relevance, the RBM approach must be applied, without exception, to all organizational units and programmes. Each is expected to define anticipated results for its own work which, in an aggregate manner, contribute to the achievement of the overall or high-level expected outcomes for the organization as a whole, irrespective of the scale, volume or complexity involved.

RBM seeks to overcome what is commonly called the “activity trap”, i.e. getting so involved in the nitty-gritty of day-to-day activities that the ultimate purpose or objectives are forgotten. This problem is pervasive in many organizations: project/programme managers frequently describe the expected results of their project/programme as “We provide policy advice to the Ministries of Education”, “We train journalists for the promotion of freedom of expression”, “We do research in the field of fresh water management”, etc., focusing more on the type of activities undertaken than on the ultimate changes that these activities are supposed to induce, e.g. in relation to a certain group of beneficiaries.

An emphasis on results requires more than the adoption of new administrative and operational systems; above all, it needs a performance-oriented management culture that supports and encourages the use of new management approaches. While, from an institutional point of view, the primary purpose of the RBM approach is to generate and use performance information for accountability reporting to external stakeholders and for decision-making, the first beneficiaries are the managers themselves. They will have much more control over the activities they are responsible for, be in a better position to take well-informed decisions, and be able to learn from their successes or failures and to share this experience with their colleagues and all other stakeholders.


The processes or phases of RBM

The formulation of expected results is part of an iterative process along with the definition of a strategy for a particular challenge or task. The two concepts, strategy and expected results, are closely linked, and both have to be adjusted throughout a programming process so as to obtain the best possible solution. In general, organizational RBM practices can be cast in twelve processes or phases, of which the first seven relate to results-oriented planning:

1. Analyzing the problems to be addressed and determining their causes and effects;

2. Identifying key stakeholders and beneficiaries, involving them in identifying objectives and in designing interventions that meet their needs;

3. Formulating expected results, in clear, measurable terms;

4. Identifying performance indicators for each expected result, specifying exactly what is to be measured along a scale or dimension;

5. Setting targets and benchmarks for each indicator, specifying the expected or planned levels of result to be achieved by specific dates;

6. Developing a strategy by providing the conceptual framework for how expected results shall be realized, identifying main modalities of action reflective of constraints and opportunities and related implementation schedule;

7. Balancing expected results and the strategy foreseen with the resources available;

8. Managing and monitoring progress towards results with appropriate performance monitoring systems drawing on data of actual results achieved;

9. Reporting and self-evaluating, comparing actual results vis-à-vis the targets and reporting on results achieved, the resources involved and eventual discrepancies between the “expected” and the “achieved” results;

10. Integrating lessons learned and findings of self-evaluations, interpreting the information coming from the monitoring systems and finding possible explanations to eventual discrepancies between the “expected” and the “achieved”.

11. Disseminating and discussing results and lessons learned in a transparent and iterative way.

12. Using performance information coming from performance monitoring and evaluation sources for internal management learning and decision-making as well as for external reporting to stakeholders on results achieved.


4) WHAT IS A RESULT?

A result is the “raison d’être” of an intervention. A result can be defined as a describable and measurable change in state due to a cause-and-effect relationship induced by that intervention. Expected results are answers to problems identified and focus on the changes that an intervention is expected to bring about. A result is achieved when the outputs produced further the purpose of the intervention. It often relates to the use of outputs by intended beneficiaries and is therefore usually not under the full control of an implementation team.

5) HOW TO FORMULATE AN EXPECTED RESULT?

Formulate the expected results from the beneficiaries’ perspective

Formulating expected results from the beneficiaries’ perspective will facilitate focusing on the changes expected rather than on what is planned to be done or the outputs to be produced. This is particularly important at the country level, where UNESCO seeks to respond to the national development priorities of a country. Participation is key for improving the quality, effectiveness and sustainability of interventions. When defining an intervention and its related expected results one should therefore ask:

- Who participated in the definition of the expected results?
- Were key project stakeholders and beneficiaries involved in defining the scope of the project and key intervention strategies?
- Is there ownership and commitment from project stakeholders to work together to achieve the identified expected results?

Use “change” language instead of “action” language

The expected result statement should express a concrete, visible, measurable change in state or in a situation. It should focus on what is to be different rather than on what is to be done, and should express it as concretely as possible. Completed activities are not results; results are the actual benefits or effects of completed activities.

Action language:
- … expresses results from the provider’s perspective: “To promote literacy by providing schools and teaching material.”
- … can often be interpreted in many ways: “To promote the use of computers.”
- … focuses on completion of activities: “To train teachers in participatory teaching.”

Change language:
- … describes changes in the conditions of beneficiaries: “Young children have access to school facilities and learn to read and write.”
- … sets precise criteria for success: “People in undersupplied areas have increased knowledge of how to benefit from the use of a computer and have access to a computer.”
- … focuses on results, leaving options on how to achieve them (how this will be achieved will be clarified in the activity description): “Teachers know how to teach in a participatory way and use these techniques in their daily work.”


Make sure your expected results are SMART

Although the nature, scope and form of expected results differ considerably, an expected result should meet the following criteria (be “SMART”):

Specific: It has to be exact, distinct and clearly stated. Vague language or generalities are not results. It should identify the nature of the expected changes, the target, the region, etc. It should be as detailed as possible without being wordy.

Measurable: It has to be measurable in some way, involving qualitative and/or quantitative characteristics.

Achievable: It has to be achievable with the human and financial resources available (“realistic”).

Relevant: It has to respond to specific and recognized needs or challenges and to be within mandate.

Time-bound: It has to be achieved in a stated time-frame or planning period.

Once a draft expected result statement has been formulated, it is useful to test its formulation against the SMART criteria. This process enhances the understanding of what is pursued and helps refine an expected result in terms of its achievability and meaningfulness. Example: if we consider a work plan to be undertaken in a specific country that includes the expected result statement “Quality of primary education improved”, the application of the SMART questioning could be as follows:

1. Is it “Specific”?
What does “quality” actually mean in this context? What does an “improvement” of quality in primary education amount to concretely? Who are the relevant stakeholders involved? Are we working at a global level, or are we focusing on a particular region or country? In responding to the need to be specific, a possible expected result formulation could finally be: “Competent authorities in Country X adopted the new education plan reviewed on the basis of international best practices and teachers and school personnel implement it.”

2. Is it “Measurable”?
Can I find manageable performance indicators that can tell about the level of achievement? Possible performance indicators could be:
- % of teachers following the curriculum developed on the basis of the new education plan (baseline 0%, benchmark 60%)
- % of schools using quality teaching material (baseline 10%, benchmark 90%)


3. Is it “Achievable”? Do I have enough resources available to attain the expected result? I need to consider both financial and human resources. If the answer is negative, I have to either reconsider and adjust the scope of the project or mobilize additional resources.

4. Is it “Relevant”? Is the expected result coherent with the upstream programming element, based on the UNESCO results chain (see Chapter 8), and with country/regional needs (e.g. CCA/UNDAF, One Plan/One Programme documents, PRSP, regional strategies) or with the global remit of UNESCO in one of its domains? If the answer is negative, I should drop the activity.

5. Is it “Time-bound”?

The given timeframe for programming processes at UNESCO is normally a biennium (the duration of a biennial programme and budget as expressed in a C/5 document). The expected result should be achievable within this period.

[Diagram: Improving the results formulation: the SMART process. The draft statement “Quality of primary education improved” is refined into “Competent authorities in Country X adopted the new education plan reviewed on the basis of international best practices and teachers and school personnel implement it”, with performance indicators: % of teachers following the curriculum developed on the basis of the new education plan (baseline 0%, benchmark 60%); % of schools using quality teaching material (baseline 10%, benchmark 90%). Each SMART criterion is checked in turn: Specific (What does “quality” mean? Are we working at a global level or focusing on the region/community?), Measurable (Can I find manageable performance indicators that can tell about the level of achievement?), Achievable (Do I have enough resources to attain it?), Relevant (Is it contributing to the overall goal, is it answering to perceived needs?), Time-bound (Is it feasible within the given time-frame?).]
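To make the baseline/benchmark arithmetic of the example above concrete, the following is a minimal illustrative sketch in Python. It is not part of UNESCO’s guidance: the class, the observed values and the variable names are assumptions invented for this note, and only the two indicators and their baselines/benchmarks are taken from the worked example.

    from dataclasses import dataclass

    @dataclass
    class PerformanceIndicator:
        # Hypothetical structure: an indicator with a baseline (starting value)
        # and a benchmark (target value), both expressed as percentages.
        description: str
        baseline: float
        benchmark: float

        def progress(self, current: float) -> float:
            # Share of the distance from baseline to benchmark covered so far,
            # clamped to the range 0..1.
            span = self.benchmark - self.baseline
            if span == 0:
                return 1.0
            return max(0.0, min(1.0, (current - self.baseline) / span))

    # Indicators taken from the worked example above.
    indicators = [
        PerformanceIndicator("% of teachers following the curriculum developed "
                             "on the basis of the new education plan", 0, 60),
        PerformanceIndicator("% of schools using quality teaching material", 10, 90),
    ]

    # Invented mid-biennium observations, for illustration only.
    observed = {indicators[0].description: 30, indicators[1].description: 50}

    for indicator in indicators:
        current = observed[indicator.description]
        print(f"{indicator.description}: {current}% observed, "
              f"{indicator.progress(current):.0%} of the way from baseline "
              f"({indicator.baseline}%) to benchmark ({indicator.benchmark}%)")

In practice the observed values would come from the data collected through the monitoring system described in Chapter 9; the point here is only that a SMART result carries, for each indicator, an explicit baseline and target against which progress can be measured.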


Find a proper balance among the three Rs

Once an intervention is formulated, it can be useful to check and improve its design against yet another concept, namely the balance between three variables: Results (the describable and measurable change in state that is derived from a cause-and-effect relationship), Reach (the breadth and depth of influence over which the intervention aims at spreading its resources) and Resources (the human, organisational, intellectual and physical/material inputs that are directly or indirectly invested in the intervention).

Unrealistic project plans often suffer from a mismatch among these three key variables. It is generally useful to check the design of a project by moving back and forth along the project structure to verify the three Rs, ensuring that the logical links between the resources, the results and the reach are respected. It is rather difficult to construct a results-based design in one sitting: designs usually come together progressively, and assumptions and risks have to be checked carefully and constantly along the way.


6) WHAT IS THE RELATIONSHIP BETWEEN INTERVENTIONS, OUTPUTS AND RESULTS?

Interventions, outputs and results are often confused. Interventions describe what we do in order to produce the changes expected. The completion of interventions leads to the production of outputs. Results, finally, are the effects of the outputs on a group of beneficiaries. For example, the implementation of training workshops (activity) will lead to trainees with new skills or abilities (outputs). The expected result identifies the behavioural change among the people that were trained, leading to an improvement in the performance of, say, the institution the trainees are working in, which is the ultimate purpose of the activity. If we move our focus from what we do to what we want the beneficiaries to do after they have been reached by our intervention, we may realize that additional types of activities could be necessary to make sure we will be able to achieve the expected results. It is important that a project is driven by results and not by activities.

Defining expected results:
- is not an exact science;
- includes a solid understanding of the socio-economic, political and cultural context;
- is influenced by available resources, the degree of beneficiary reach and potential risk factors;
- requires the participation of key stakeholders.

The following examples may help to understand the relationship between interventions, outputs and results, but should not be seen as a generally applicable template, as every intervention differs from another.


TITLE OF THE ACTIVITY: Capacity building for sustainable development in the field of water management

INTERVENTIONS:
• Compilation of best practices in the field of water information management.
• Develop Internet-based information systems/web pages and databases and other tools (software, guidelines, databases) to transfer and share water information.
• Organization of Water Information Summits.
• Provision of technical assistance.

OUTPUTS:
• Best practices for management of water information collected and disseminated.
• Software, guidelines, databases, etc. available to the institutions concerned.
• Water Information Summits attended by policy makers and representatives of institutions concerned.
• Relevant institutions assisted for the implementation of best practices.

RESULTS:
• Relevant institutions have adapted and implemented best practices for water information management.

TITLE OF THE ACTIVITY: Contribution to a reconstruction programme for the education system in country X

INTERVENTIONS:
• Assessing the situation with regard to the national education system.
• Organization of workshops for decision-makers and experts to discuss and select a new education plan.
• Definition of an education plan ensuring an adequate learning environment in country X.
• Provision of technical assistance.
• Organization of training workshops for teachers and school personnel.
• Development of teaching and learning material.

OUTPUTS:
• Situation analysis report completed.
• Workshops attended by relevant decision-makers and experts.
• Education plan developed and available to local counterparts.
• Teachers and school personnel trained.
• Implementation capacities of local counterparts improved.
• Teaching and learning materials delivered.

RESULTS:
• Relevant authorities have adopted the new education plan and teachers and school personnel implement it.

TITLE OF THE ACTIVITY: UNESCO Prize for Tolerance

INTERVENTIONS:
• Selection and information of jury.
• Preparation of brochures, information material.
• Development and organization of an information campaign. Advertising the prize.
• Development of partnerships for the identification of a list of candidates.
• Organization of the award ceremony.
• Organization of press conferences.
• Follow-up and assessment of media coverage.

OUTPUTS:
• Jury nominated and agreeable to main stakeholders.
• Brochures, leaflets, videos produced and disseminated.
• Information campaign implemented.
• List of candidates completed and agreeable to main stakeholders.
• Prize-winner nominated.
• Press conference organized and attended by identified journalists.
• Media coverage of the event.

RESULTS:
• Concept of tolerance spread among the general public in a country/region/globally.

TITLE OF THE ACTIVITY: Promotion of the UNESCO Universal Declaration on Cultural Diversity

INTERVENTIONS:
• Develop guidelines on the application of the Declaration in different fields.
• Inform about the Declaration.
• Organization of workshops on the Declaration for policy-makers and key decision-makers.

OUTPUTS:
• Guidelines on the application of the Declaration in different fields produced.
• Workshops on the Declaration organized and guidelines on the Declaration distributed to persons attending the workshops.

RESULTS:
• Decision-makers put into practice the guidelines on the application of the Declaration in different fields.

TITLE OF THE ACTIVITY: Increasing access to quality basic education for children through community learning centres

INTERVENTIONS:
• Discussions with local authorities.
• Assessing the feasibility of a community-based learning centre in community X.
• Preliminary discussions with local stakeholders.
• Sensitization seminars for local leaders and community members.
• Curriculum development for community-based learning centres.
• Selection of training personnel among the local community.
• Adapting training material.
• Training of personnel.
• Production of information material for local authorities.
• Meetings with local authorities for the replication of such centres.
• Provision of technical assistance to local authorities for replicating such centres in other communities.

OUTPUTS:
• Principle agreement by local authorities.
• Feasibility study completed and gender analysis produced and disseminated.
• Principle agreement by community leaders.
• Community centre proposal completed and submitted to local authorities and community leaders.
• Personnel selected.
• Curriculum and training material for community-based learning centres developed.
• Managers and teachers have the necessary skills to implement their functions.
• Brochures and videos developed and disseminated.
• Local leaders and community members informed, sensitized and convinced.

RESULTS:
• The centre is operational and is an integral part of the community life.
• Steps are taken by local authorities for replicating this initiative in other communities.

7) RBM WITHIN UNESCO’S MANAGEMENT FRAMEWORK

This chapter focuses on how to apply the RBM approach within the specific UNESCO planning and management framework. UNESCO has three main sets of institutional planning documents: the Medium-Term Strategy (C/4 document), the biennial Programme and Budget (C/5 document) and the work plans (available in SISTER). Documents C/4 and C/5 together constitute the programmatic and conceptual framework for all of UNESCO’s action.

A) UNESCO’S MEDIUM-TERM STRATEGY – C/4 (6 YEARS)

The Medium-Term Strategy is the overarching planning document of UNESCO. It is a six-year rolling document determining the corporate strategy of the Organization, which can be revised by the General Conference if so required. The 34 C/4 Medium-Term Strategy is built around the following mission statement for UNESCO, focusing on themes and areas where UNESCO can make a difference through purposeful, strategic action in all its fields of competence: “As a specialized agency of the United Nations, UNESCO contributes to the building of peace, the eradication of poverty, sustainable development and intercultural dialogue through education, the sciences, culture, communication and information”.

Throughout the strategy, priority is accorded to Africa and to gender equality. Action in favour of Africa respects the priorities articulated by African countries, the African Union (AU), including through its New Partnership for Africa’s Development (NEPAD) programme, and other organizations. The emphasis on gender equality reflects the strong commitment made by world leaders at the 2005 World Summit as well as the subsequent proposals that have arisen throughout the United Nations system in the context of the United Nations reform process. The pursuit of gender equality through all of UNESCO’s fields of competence is supported by a two-pronged approach pursuing both women’s empowerment and gender mainstreaming in Member States and within the Organization.

During the 34 C/4 period, the Organization will focus on its core competencies to contribute to the attainment of internationally agreed development goals, including the Millennium Development Goals (MDGs). One of UNESCO’s comparative advantages within the United Nations system is its ability to respond to complex problems in a comprehensive and substantively appropriate manner through an intersectoral and interdisciplinary approach. The new Medium-Term Strategy is therefore structured around five programme-driven overarching objectives for the entire Organization, designed to respond to specific global challenges and representing core competencies of UNESCO in the multilateral system:

- Attaining quality Education for All and lifelong learning;
- Mobilizing science knowledge and policy for sustainable development;
- Addressing emerging ethical challenges;
- Promoting cultural diversity, intercultural dialogue and a culture of peace;
- Building inclusive knowledge societies through information and communication.


These overarching objectives respond to specific global challenges in UNESCO’s domains and delineate areas for which UNESCO has a unique profile and core competency in the UN system, indeed areas where, internationally, the Organization enjoys a comparative advantage. A limited number of strategic programme objectives – 14 for the entire Programme - then concretize the overarching objectives in programme-relevant and thematic terms. For each Overarching Objective and for each Strategic Programme Objective, a limited number of expected outcomes have been identified for the six-year period, further clarifying and concretizing the scope of the action of the Organization. The Medium-Term Strategy therefore indicates how UNESCO sets about the task of meeting the dual requirement of:

• concentrating the Organization’s efforts on a limited number of priority areas in order to ensure that its action has lasting effect and is coherent with its role in the reforming United Nations system;

• ensuring consistency of the global strategies and objectives pursued by the Organization.

B) UNESCO’S PROGRAMME AND BUDGET – C/5 (2 YEARS)

The roadmap laid out in the Medium-Term Strategy document must be translated into three consecutive biennial programme and budget documents (C/5). C/5 documents are designed to cast UNESCO’s action in response to the overarching and strategic programme objectives identified in the Medium-Term Strategy document, and form the basis for a limited set of biennial sectoral priorities for each major programme, thereby ensuring a seamless transition between UNESCO’s medium-term and biennial programme priorities and guaranteeing alignment between specific programme activities and medium-term objectives.

Biennial sectoral priorities contained in the 34 C/5 specify a sector’s contribution to the achievement of document C/4 objectives, identifying as appropriate areas of intersectoral/interdisciplinary engagement and commitment. These biennial sectoral priorities are developed into a limited number of main lines of action (MLAs), whose overall number in the 34 C/5 has been significantly reduced compared to previous C/5 documents in an effort to concentrate and focus on UNESCO's core strengths.

There are three programming levels at UNESCO. The first two levels are to be found in the C/5 document and in SISTER, whereas the last level is to be found only in SISTER:

Level 1: Major Programme
Level 2: Main Line of Action
Level 3: Activity

Intersectorality and interdisciplinarity are given special emphasis in the 34 C/5. A number of priority themes and challenges calling for a concerted and comprehensive response by the Organization across its programme sectors are accordingly identified and summarized in a separate chapter of document 34 C/5 on "intersectoral platforms". This chapter identifies the strategies and expected results as well as an indicative financial amount to be allocated during the biennium to each intersectoral platform. UNESCO's ability to combine the contributions of different sectors and disciplines in a strategic manner will increase the relevance and impact of its action.


The Major Programme texts include biennial sectoral priorities, with specific reference to agreed international development goals, including the MDGs, and the main strategic approaches to be pursued for the attainment of the strategic programme objectives, drawing among others on action plans adopted by relevant international conferences and relevant international decades. These strategic approaches provide the rationale and framework for action to be pursued through the various Main Lines of Action (MLAs), thereby avoiding the formulation of separate strategies for each MLA.

As requested by the General Conference, document 34 C/5 is built on the principle of RBM. For each major programme, information on the strategies to be followed during implementation is given at the biennial sectoral priority level. Main lines of action focus on the presentation of expected results and performance indicators and, where necessary, of benchmark targets. In its programme execution, UNESCO will continue to follow the SMART (specific, measurable, attainable, relevant and time-bound) approach.

01012 Main line of action 1: Global leadership in EFA, coordination of United Nations priorities in education and development of strong partnerships

Expected results:

Coordinated, harmonized and effective partnerships pursued within the framework of the EFA Global Action Plan for strengthened political commitment at the global, regional and national levels for the EFA agenda.

Performance indicators:

- Number of countries where national plans and policies reflect a tangible political commitment to EFA;
- Degree to which DESD, UNLD, HIV and AIDS education, peace and human rights education, teacher education and preparation for the world of work have prominence on the global, regional and national EFA agendas.

Example of Expected Results and Performance Indicators included in the 34 C/5


C) WORK PLANS

At UNESCO the term “work plan” refers to the planning document for an activity. Level 2 of the C/5 document (MLAs) is thus translated into work plans outlining the activities (level 3) to be undertaken. The complete information related to work plans is to be found in SISTER.

8) UNESCO’S RESULTS CHAIN

The results chain shows the linkages between the expected results at the different programming levels. Each level of programming has to be linked to the next one, thus forming a results chain. Therefore, each and every element should be designed in a way that makes it not only consistent in itself but also fit well within the overall structure. As seen previously, at all levels an expected result should express a change, not the process itself.

The relationship between two results at different levels should be causal, that is, the achievement of one result is necessary for and contributes to the achievement of the expected result at the level above. The causal connection between two results should be direct: it should not be necessary to infer additional intermediate results to understand the linkage between two results. Similarly, it should not be necessary to accept many or broad assumptions to move from a “lower” to a “higher” result. The relationship between results should not be categorical or definitional; that is, lower-level results should not merely describe component parts of a related “higher-level” result. For example, if we consider the expected result “increased biodiversity in critical ecosystems”, a categorical relationship would be the one between the two results statements “increased biodiversity in marine ecosystems” and “increased biodiversity in forest ecosystems”. A causally related results statement would be, for example, “reduced population pressure on critical ecosystems”.

The definition of expected results within the UNESCO results chain is in this respect a top-down process (with appropriate Member States and field office consultations during the preparation of the C/4 and C/5 documents). When conceiving a programming element, one has to start by considering the expected results defined at the level above and “plug in” adequately to fully contribute, through its own achievements, to the broader expected result.


UNESCO’s results chain

The chart shows UNESCO’s established results chain cascading from the C/4 document, through the C/5 Programme and Budget, to the work plans, as it is applied to regular and extrabudgetary resources alike. It also puts in perspective the relation to the pursuit of national development plans through United Nations common country programming tools.

Evaluations are a critical tool for accountable, transparent and effective management, and their findings will be built into the results chain in order to benefit from lessons learned. Using both quantitative and qualitative techniques, evaluations are an essential source of data and information for the assessment of organizational performance in managing for and achieving results. In building a culture of evaluation, UNESCO particularly encourages evaluations which contribute to organizational learning and support accountability.

In the spirit of the Paris Declaration on Aid Effectiveness and in order to partake fully in the One United Nations approach at the country level, UNESCO contributes to inter-agency reviews of its RBM and evaluation approaches, to the compatibility of its IT tools (SISTER, FABS and STEPS) with those of other agencies and, to the extent possible, to common evaluation approaches.


Defining your intervention at level 3 within the UNESCO results chain: where to start?

The mechanism through which a results chain is formalized in UNESCO relies on the relationship between the respective levels. The expected results of the upstream programming element represent the starting point for conceiving your intervention. Each intervention must “plug” adequately into the level above to fully contribute, through its own achievements, to the broader expected result. Therefore, a mutually agreeable arrangement has to be found between the responsible officers of two subsequent levels: the responsible officer of the level above, who relies on the lower-level results in order to achieve his/her expected results, will agree to fund downstream interventions once he/she is confident that the aggregation of their expected results will make it possible to achieve the expected results of the programming element he/she is responsible for.

The results of the elements attached to the same programming element combine to produce the results of the upstream element to which they relate. This mechanism works its way bottom-up throughout the programming tree and is conceived to ensure consistency among the programming levels. It is important to note that the result of an element must not be defined as the sum of the results of downstream elements: if this were the case, the results at the MLA level would be only a list of results at the activity level. The result of a programming element therefore relies on the results of downstream elements, but is not made up of them.

[Diagram: UNESCO’s results chain – MLA and below. Resources make it possible to implement activities (level 3); the expected results of the activities lead to, and contribute to, the expected results at MLA level (level 2).]
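As a purely illustrative sketch of the linkage described above (the classes and titles below are invented for this note and do not reflect SISTER’s actual data model), each programming element carries its own expected results and a reference to the element at the level above to which it contributes:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class ProgrammingElement:
        # Hypothetical sketch: level 1 = Major Programme, level 2 = Main Line
        # of Action (MLA), level 3 = Activity (work plan). Each element has its
        # own expected results; they contribute to, but are not the sum of,
        # the expected results of the element above.
        title: str
        level: int
        expected_results: List[str]
        contributes_to: Optional["ProgrammingElement"] = None

    mla = ProgrammingElement(
        title="Example Main Line of Action",
        level=2,
        expected_results=["Broader expected result defined at MLA level"],
    )

    activity = ProgrammingElement(
        title="Example activity / work plan",
        level=3,
        expected_results=["Concrete change achieved for a group of beneficiaries"],
        contributes_to=mla,
    )

    # Walking up the chain shows which broader expected result an activity
    # "plugs into".
    element: Optional[ProgrammingElement] = activity
    while element is not None:
        print(f"Level {element.level}: {element.title} -> {element.expected_results}")
        element = element.contributes_to

The design choice reflected in this sketch is the one stated in the text: the upstream expected result is a separate statement that relies on, but is not composed of, the downstream results.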


We can anticipate a few challenges in this process:

- The nature of expected results: it is obvious that the nature, magnitude and meaning of “expected results” cannot be the same across the different levels. Nevertheless, it is crucial that all these results build a chain of meaningful achievements, bridging the gap between the mandate and strategic objectives of UNESCO and what the Organization actually achieves in its daily operations.

- Reconciling global and local dimensions: RBM stresses results and greater focus; this should be done without sacrificing the Organization’s global mandate and its commitment to decentralization and responsiveness to country needs and priorities. A good balance has to be found between global and field-oriented approaches. UNESCO’s intellectual, ethical and normative functions cannot be divorced from implementation and operational action, so as to ensure an effective feedback loop between theory and practice.

- Responding to specific requests of local stakeholders: it often happens that staff in field offices receive official requests from Member States representatives concerning activities to be implemented in the country. It has to be recalled that UNESCO’s governing bodies decide the areas of intervention of the Organization, and it is important that ownership is reconciled with UNESCO objectives and priorities. A specific request at country level does not legitimize the use of resources in areas that did not receive the approval of the Organization’s governing bodies.

9) MONITORING OF IMPLEMENTATION

Monitoring can be described as “a continuing function that uses systematic collection of data on specified indicators to provide management and the main stakeholders of an ongoing (…) intervention with indications of the extent of progress and achievement of objectives and progress in the use of allocated funds” (Source: OECD RBM Glossary). The function of a monitoring system is to compare “the planned” with “the actual”. A complete monitoring system needs to provide information about the use of resources, the activities implemented, the outputs produced and the results achieved.

What we are focusing on here is a results-based monitoring system: at the planning stage, through the monitoring system, the officer in charge has to translate the objectives of the intervention into expected results and related performance indicators and to set the baseline and targets for each of them. During implementation, he or she needs to routinely collect data on these indicators, compare actual levels of the performance indicators with the targets, report progress and take corrective measures whenever required. As a general rule, no extra resources (neither human nor financial) will be allowed for monitoring tasks, so the responsible person has to ensure that these tasks can be undertaken within the budget foreseen (as a rule of thumb, about 5% of the resources should be set aside for this purpose).
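A minimal illustrative sketch of the “planned versus actual” comparison described above (the indicator names, targets, observed values and alert threshold are invented for this note, not UNESCO data): a results-based monitoring routine records, for each indicator, the target set at the planning stage and the actual level observed during implementation, and flags where corrective measures may be required.

    # Hypothetical monitoring table: target set at planning, actual observed
    # during implementation (both in percent).
    monitoring_data = {
        "% of teachers following the new curriculum": {"target": 60, "actual": 35},
        "% of schools using quality teaching material": {"target": 90, "actual": 85},
    }

    # Assumed rule for this sketch: flag indicators below 75% of their target.
    ALERT_THRESHOLD = 0.75

    for name, values in monitoring_data.items():
        achievement = values["actual"] / values["target"]
        status = ("on track" if achievement >= ALERT_THRESHOLD
                  else "corrective action to be considered")
        print(f"{name}: actual {values['actual']} vs target {values['target']} "
              f"({achievement:.0%} of target) -> {status}")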


Selection and formulation of performance indicators

Monitoring is done through the use of appropriate performance indicators. When conceiving an intervention, the responsible officer is also required to determine appropriate performance indicators that will make it possible to track progress and assess the effectiveness of the intervention, i.e. whether it was capable of producing the intended results. Indicators support effectiveness throughout the processes of planning, implementation, monitoring, reporting and evaluation.

Indicators may be used at any point along the chain of inputs, interventions, outputs and results, but a results-based monitoring system does not focus on compliance with the rate of expenditure or with the implementation plan (answering the question “have we done it?”); it focuses on the actual benefits that our interventions were able to bring to the targeted populations (answering the question “we have done it, so what?”). Performance indicators are aimed at giving indications of change caused or induced by the intervention. This core purpose does not require sophisticated statistical tools, but reliable signals that tell, directly or indirectly, about the real facts on which one undertakes to have leverage. A fair balance has to be sought between the cost, both in terms of time and money, of collecting the information required and its capacity to reflect the desired changes.

Even a carefully selected, clearly defined indicator is of little use unless it is actually put to use. A critical test of an indicator is how practical it is to monitor: thinking about an indicator is one thing, actually finding, recording and presenting the data is another. Indicators need to be approached as a practical tool, not merely as a conceptual exercise.

Performance indicators are signposts of change. They enable us to verify the changes that the interventions we are dealing with seek to achieve. The purpose of indicators is to support effective programme planning, management and reporting. Indicators not only make it possible to demonstrate results, but they can also help produce results by providing a reference point for monitoring, decision-making, stakeholder consultations and evaluation. We should bear in mind, however, that indicators are only intended to indicate, and not to provide scientific “proof” or detailed explanations about change. In addition, we should avoid the temptation to transform the measurement of change into a major exercise with a burdensome workload. Measuring change should not take precedence over the programme activities that generate the changes to be measured.

The critical issue in selecting good indicators is credibility, not the number of indicators, nor the volume of data or precision in measurement. The challenge is to meaningfully capture key changes by combining what is substantively relevant with what is practically feasible to monitor. At the end of the day, it is better to have indicators that provide approximate answers to some important questions than exact answers to many unimportant questions.

Selecting substantively valid and practically feasible performance indicators presupposes an in-depth understanding of the situation and of the mechanisms underlying change. Therefore, the use of pre-designed or standardized performance indicators is not recommended, as they often do not address the specificities of the situation in which the intervention is carried out. Performance indicators have to be designed on the basis of the ambition of an intervention, its scope and the environment in which it is implemented.


Failing to design good indicators often means that the results are not clearly defined or that they are too wide-ranging. The process of selecting indicators can help identify the core issues of the intervention and translate often intangible concepts into more concrete and observable elements. A result and its indicator should not be mixed up: the result is the achievement, and the indicators should tell about the achievement.

Indicators for “soft assistance” activities

A problem that programme officers often refer to is that while support in the so-called “soft” areas of capacity-building, policy advice, advocacy, etc. may well be the greatest comparative advantage of UNESCO, these areas may be the most difficult ones against which to assess results. The experience of a number of development cooperation agencies with the shift to a results-based approach has shown that, unless guarded against, there can be a tendency for country operations to focus more explicitly on quantifiable initiatives. It is therefore critical that UNESCO guard against the development of any disincentives that would prevent it from focusing on capacity-building and advocacy work, both of which are complex and long-term and against which it may be much more difficult to assess results than in other sectors.

The term “capacity” in this framework refers to the abilities, skills, understandings, attitudes, values, relationships, knowledge, conditions and behaviours that enable organizations, groups and individuals in a society to generate benefits and achieve their objectives over time. Capacity also reflects the abilities of these actors to meet the needs and demands of the stakeholders for whom they were established or to whom they are accountable. These attributes cover formal, technical and organizational abilities and structures as well as the more human, personal characteristics that allow people to make progress.

Quantitative vs. qualitative indicators

Indicators can comprise a variety of types of “signals” such as numbers, ranking systems or changes in the level of user approval. A signal also features a “scale” of observation. For example, the indicator “65 per cent of enrolled students graduate from primary school” features a percentage signal with a scale of 65 per cent. Signals and scales lend themselves to indicators that express qualitative and/or quantitative information. Quantitative indicators are numerical; qualitative indicators use categories of classification based on individual perceptions.

The concept of quantitative versus qualitative indicators has been a subject of frequent discussion over the past few years. The common belief is that quantitative indicators are measurements that stick to cold, hard facts and rigid numbers, with no question about their validity, truth and objectivity, while qualitative indicators are seen as subjective, unreliable and difficult to verify. In fact, no one type of indicator or observation is inherently better than another; its suitability depends on how it relates to the result it intends to describe. New UNESCO guidance indicates a shift away from the approach that indicators should be quantitative rather than qualitative. Programme specialists are expected to select the type of indicator that is most appropriate for the result being measured. If a qualitative indicator is determined to be most appropriate, one should clearly define each term used in the measure, make sure to document all definitions and find possible ways (such as using rating scales) to minimize subjectivity.


For example, if the result under consideration is in the area of improving the functioning of the government, in particular concerning its preparedness to respond to local needs, we could measure the degree of results achievement through indicators measuring the change in levels of end-user approval (or client satisfaction). Possible indicators could therefore be:

- Average rate of perceived responsiveness of the government to the needs of the population, on a scale from 1 to 10.

- Proportion of people who perceive local government management as “very participatory”. If this proportion increases from 40 per cent to 65 per cent over a certain period of time, this increase provides some measure of the degree of qualitative change. This kind of numerical expression of qualitative considerations may also be obtained through indicators that use rating systems that rank, order or score given categories of attributes.

- Proportion of people who rate the adequacy of central government to their own needs at 6 or more.
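As a minimal illustrative sketch (the survey responses and variable names below are invented for this note, not UNESCO data), such rating-based indicators could be computed from end-user survey data as follows:

    # Hypothetical end-user survey: perceived responsiveness and adequacy rated
    # on a scale from 1 to 10, perceived participation given as a label.
    responsiveness_ratings = [7, 4, 6, 9, 5, 8, 6, 3, 7, 10]
    adequacy_ratings = [6, 7, 5, 8, 4, 9, 6, 7]
    participation_labels = [
        "very participatory", "somewhat participatory", "very participatory",
        "not participatory", "very participatory",
    ]

    # Average rate of perceived responsiveness (scale 1 to 10).
    average_responsiveness = sum(responsiveness_ratings) / len(responsiveness_ratings)

    # Proportion of people rating adequacy to their own needs at 6 or more.
    share_six_or_more = (sum(1 for r in adequacy_ratings if r >= 6)
                         / len(adequacy_ratings))

    # Proportion of people who perceive management as "very participatory".
    share_very_participatory = (participation_labels.count("very participatory")
                                / len(participation_labels))

    print(f"Average perceived responsiveness: {average_responsiveness:.1f} / 10")
    print(f"Rating adequacy at 6 or more: {share_six_or_more:.0%}")
    print(f"Perceiving management as 'very participatory': {share_very_participatory:.0%}")

Tracking such proportions over time (for example from 40 per cent to 65 per cent, as in the example above) turns qualitative perceptions into a numerical signal of change without claiming scientific proof.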

Qualitative indicators are particularly helpful, for example, when the actions involve capacity development for service delivery. The perceptions of end-users regarding service delivery get straight to the issue of whether the services are wanted, useful and effectively delivered. The satisfaction of end-users (or clients) has the advantage of some comparability: results may be compared and data disaggregated by kind of service, location, time, etc. This approach is not without its problems, however. The only way of getting this information may be through a survey that may prove too costly, clients may not always be easy to identify, and their perceptions of satisfaction with services are subject to influences other than the service itself.

Types of Performance Indicators

Several types of performance indicators can be used to assess progress towards the achievement of results:

a) Direct Statistical Indicators

Direct statistical indicators show progress when results are cast in terms of readily quantifiable short-term changes. For example, if the result is “Nominations of cultural and natural properties from regions or categories of heritage currently under-represented or non-represented on the World Heritage List increased”, it should not be difficult to secure direct quantifiable data about the number of new nominations over the time period of a biennium (or less). Care must be taken to ensure that the time span of the result lends itself to the collection of such data for use in review.

b) Proxy Indicators

Proxy indicators are normally quantitative, but do not directly relate to the result. A proxy is used to show progress when getting the full data is too time-consuming, or when the timeliness of complete data would fall outside the need for review. However, there must be a prima facie connection between the proxy and the result. For example, if the result is “Public recognition improved of the importance of the mathematical, physical and chemical sciences for life and societal development”, a good proxy indicator might be the improvement of media coverage concerning these issues.

c) Narrative Indicators

When the results are not easily quantifiable (changing attitudes, building capacities, etc.) over the time period of the biennium, and the number of recipients is not too large, a non-statistical approach can be envisaged to develop an indication of “progress”. Narrative indicators largely focus on the “process of change”. This technique works especially well in instances where capacity-building, training, conferences, network development and workshops are the planned interventions. However, when dealing with stakeholders, care needs to be taken to avoid a focus simply on “satisfaction”; rather, the focus should be on what happened (or at least on what the recipients plan to do) as a result of the intervention or participation. For example, if the expected result is “National capacities in educational planning and management strengthened”, a valid narrative indicator might draw on a follow-up questionnaire circulated among those individuals who participated in training, conferences or other activities, asking them what they did (or what they plan to do) in their countries as a result of UNESCO’s actions. Narrative indicators enable an organization to begin to explore complex interrelationships among factors without recourse to extremely expensive statistical research. In this way, UNESCO could demonstrate “partial success” even if other factors may have prevented the overall “enhancement of the national capacity”. Given the extent to which many of UNESCO’s results may seem intangible, narrative indicators offer the prospect of great utility. However, they should not be seen as a widespread substitute for data that can be quantified to some degree.

Risks when identifying performance indicators

There are a number of risks when defining and using performance indicators. The most frequent are:

• Oversimplification and misunderstanding of how development results occur and confusion over accountability for results;

• Overemphasis on results that are easy to quantify at the expense of less tangible, but no less important results;

• Mechanical use of indicators for reporting purposes in ways that fail to feed into strategic thinking and organizational practices.

In the following table, possible performance indicators are added to the previously introduced examples, completing the RBM planning and monitoring cycle.

– Page 26 –

TITLE OF THE ACTIVITY: Capacity building for sustainable development in the field of water management

OUTPUTS:
• Best practices for management of water information collected and disseminated.
• Software, guidelines, database, etc. available to the institutions concerned.
• Water Information Summits attended by policy makers and representatives of institutions concerned.
• Relevant institutions assisted for the implementation of best practices.

RESULTS:
• Relevant institutions have adapted and implement best practices for water information management.

PERFORMANCE INDICATORS:
• Number and "importance" of institutions represented at the Water Information Summits.
• "Profile" of representatives participating at the Water Information Summits.
• Number of institutions where there is evidence that best practices are being implemented.
• Access (disaggregated per country) to Internet-based information systems/web pages and databases, etc.
• Number of institutions officially requesting technical assistance for the implementation of best practices.

TITLE OF THE ACTIVITY: Contribution to a reconstruction programme for the education system in country X

OUTPUTS:
• Situation analysis report completed.
• Workshops attended by relevant decision-makers and experts.
• Education plan developed and available to local counterparts.
• Teachers and school personnel trained.
• Implementation capacities of local counterparts improved.
• Teaching and learning materials delivered.

RESULTS:
• Relevant authorities have adopted the new education plan and teachers and school personnel implement it.

PERFORMANCE INDICATORS:
• New education plan adopted.
• Percentage of schools in country X where the new education plan is being implemented.
• Attendance rates of schools implementing the new education plan.

TITLE OF THE ACTIVITY: UNESCO Prize for Tolerance

OUTPUTS:
• Jury nominated and agreeable to main stakeholders.
• Brochures, leaflets, videos produced and disseminated.
• Information campaign implemented.
• List of candidates completed and agreeable to main stakeholders.
• Prize-winner nominated.
• Press Conference organized and attended by identified journalists.
• Media coverage of the event.

RESULTS:
• Concept of tolerance spread among the general public in a country/region/globally.

PERFORMANCE INDICATORS:
• Media coverage of the prize.

TITLE OF THE ACTIVITY: Promotion of the UNESCO Universal Declaration of Cultural Diversity

OUTPUTS:
• Guidelines on the application of the Declaration in different fields produced.
• Workshops on the Declaration organized and guidelines on the Declaration distributed to persons attending the workshops.

RESULTS:
• Decision-makers put to practice the guidelines on the application of the Declaration in different fields.

PERFORMANCE INDICATORS:
• "Profile" of persons attending the workshops on the Declaration.
• Number of institutions officially requesting technical assistance for the implementation of the guidelines.
• Countries where the guidelines have been put to practice (disaggregated by field).

TITLE OF THE ACTIVITY: Increasing access to quality basic education for children through community learning centres

OUTPUTS:
• Principle agreement by local authorities.
• Feasibility study completed and gender analysis produced and disseminated.
• Principle agreement by community leaders.
• Community Centre proposal completed and submitted to local authorities and community leaders.
• Personnel selected.
• Curriculum and training material for community-based learning centres developed.
• Managers and teachers have the necessary skills to implement their functions.
• Brochures & videos developed and disseminated.
• Local leaders and community members informed, sensitized and convinced.

RESULTS:
• The centre is operational and is an integral part of the community life.
• Steps are taken by local authorities for replicating this initiative in other communities.

PERFORMANCE INDICATORS:
• Pilot community learning centre operational on the basis of the curriculum developed.
• Number of community children benefiting from the centre.
• Attendance rates.
• Budget allocated by local authorities for replicating the project in other communities.
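To illustrate how one of the quantitative indicators in the table above, the percentage of schools in country X where the new education plan is being implemented, might be monitored against a baseline and an end-of-biennium target, the following sketch uses hypothetical figures; the numbers and the helper function are assumptions introduced only for illustration and do not come from any actual UNESCO programme.

```python
# Hypothetical figures for illustration only; not actual UNESCO programme data.

def percentage_of_schools_implementing(schools_implementing: int, total_schools: int) -> float:
    """Quantitative indicator: share of schools applying the new education plan."""
    return 100.0 * schools_implementing / total_schools

baseline = percentage_of_schools_implementing(0, 400)    # situation before the intervention
target = 60.0                                             # end-of-biennium target for the indicator
current = percentage_of_schools_implementing(180, 400)    # value observed at a monitoring point

# Progress towards the target, expressed against the baseline.
progress = (current - baseline) / (target - baseline)

print(f"Baseline: {baseline:.0f}%  Current: {current:.0f}%  Target: {target:.0f}%")
print(f"Progress towards target: {progress:.0%}")
```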

UNESCO RBM Glossary (alphabetical order)

Definitions / Définitions

□ ACHIEVEMENT: Result(s), or part of the result(s) accomplished at a given point in time

□ REALISATION: Résultat(s), ou partie de résultat(s) atteint(s) à un moment donné.

□ ACTIVITY: Intervention at level 3.

□ ACTIVITE: Intervention au niveau 3.

□ BACKGROUND: Context and history (local, national, sectoral, etc.) of the intervention and its specified area and beneficiaries (local, institutional, ...).

□ ANTECEDENTS: Contexte et histoire (locaux, nationaux, sectoriels, etc...) de l'intervention, de son domaine d’activités spécifique et de ses bénéficiaires (locaux, institutionnels,...).

□ BASELINE DATA: Data describing the situation before the implementation of the intervention, related to each result, at each level. It is the starting point from which progress towards expected results will be measured.

□ DONNEES DE BASE: Données décrivant la situation avant la mise en œuvre de l'intervention, en relation avec chaque résultat, à chaque niveau. C'est le point de départ à partir duquel les progrès vers les résultats attendus seront mesurés.

□ BENCHMARK(S): Reference points or Standards set in the recent past by an institution or by other comparable organizations against which performance or achievements can be assessed. In UNESCO we use the term “benchmark” as one achievable target for a performance indicator over a biennium.

□ NORMES DE REFERENCE: Standards et Normes établies dans le passé récent par une institution ou par d’autres organisations comparables permettant d’apprécier la performance ou les résultats obtenus. A l’UNESCO nous utilisons le terme « normes de référence » (benchmark) comme un niveau atteignable à la fin du biennium pour un indicateur de performance donné.

□ BENEFICIARIES: The individuals, groups, or organizations, whether targeted or not, that benefit, directly or indirectly, from the intervention.

□ BENEFICIAIRES: Individus, groupes ou organisations (ciblés ou pas) qui bénéficient de l’action, directement ou non, intentionnellement ou non.

□ CONSTRAINTS: Factors external to the intervention that could jeopardize its success and that need to be taken into account when defining the implementation strategy as well as when assessing the results achieved.

□ CONTRAINTES: Facteurs externes à l’intervention qui peuvent compromettre son succès et qui doivent être pris en considération pendant la définition de la stratégie de mise en œuvre ainsi que lorsqu’on évalue les résultats obtenus.

□ C/3: ‘Report of the Director General’ on the implementation of the (previous) Programme and Budget.

□ C/3: Rapport du Directeur général sur l’exécution du programme et budget (précédent).

□ C/4: UNESCO Medium-Term Strategy (six years).

□ C/4: Stratégie à moyen terme de l'UNESCO (six ans).

□ C/5: UNESCO Programme and Budget (two years).

□ C/5: Programme et Budget de l'UNESCO (deux ans).

□ EVALUATION: The systematic and objective assessment of an on-going or completed project, programme or policy, its design, implementation and results. The aim is to determine the relevance and fulfillment of objectives, development efficiency, effectiveness, impact and sustainability.

□ EVALUATION: Appréciation systématique et objective d’un projet, d’un programme ou d’une politique, en cours ou achevé, de sa conception, de sa mise en oeuvre et de ses résultats. Le but est de déterminer la pertinence et l’accomplissement des objectifs, l’efficience en matière de développement, l’efficacité, l’impact et la durabilité.

• Ex-ante evaluation: An evaluation that is performed before implementation of a development intervention.

• Évaluation ex-ante : Évaluation qui est conduite avant la mise en oeuvre d'une action de développement.

• Ex-post evaluation: Evaluation of a development intervention after it has been completed. Note: It may be undertaken directly after or long after completion. The intention is to identify the factors of success or failure, to assess the sustainability of results and impacts and to draw conclusions that may inform other interventions.

• Évaluation ex-post : Évaluation d'une intervention de développement une fois celle-ci terminée. Remarque : ce type d'évaluation peut être réalisé tout de suite après l'achèvement de l'intervention ou longtemps après. Le but est d'identifier les facteurs de succès ou d'échec, d'apprécier la durabilité des résultats et des impacts, et de tirer des conclusions qui pourront être utiles pour d'autres interventions.

• External evaluation: The evaluation of a development intervention conducted by entities and/or individuals outside the donor and implementing organizations.

• Évaluation externe : Évaluation d'une action de développement conduite par des services et/ou des personnes extérieures au bailleur de fonds et à l'organisation responsable de la mise en œuvre.

• Self-evaluation: An evaluation by those who are entrusted with the design and/or planning and/or delivery of a programme, project or activity.

• Autoévaluation : Évaluation réalisée par ceux qui ont la responsabilité de concevoir et de mettre en œuvre l'action de développement à évaluer.

□ FUNCTIONS (UNESCO’s functions in the C/4): The range of functions that UNESCO performs. These are: laboratory of ideas, standard setter, clearing house, capacity-builder in Member States, catalyst for international cooperation.

□ FONCTIONS (de l’UNESCO dans le C/4): Les différentes fonctions que l’UNESCO assume. Celles-ci sont: laboratoire d’idées, organisme normatif, centre d’échange d’information, organisme de développement des capacités dans les États membres, catalyseur pour la coopération internationale.

– Page 29 –

– Page 30 –

□ IMPACT(S): The positive and/or negative, long-term effects produced by an intervention, directly or indirectly, intended or unintended.

□ IMPACT(S): Effets à long terme, positifs et/ou négatifs, induits par une intervention, directement ou non, intentionnellement ou non.

□ IMPLEMENTATION STRATEGY: The plan, method and the series of interventions designed to achieve a specific expected result.

□ STRATEGIE DE MISE EN ŒUVRE: Le programme, la méthode et la série d’interventions conçues afin de réaliser un résultat attendu spécifique.

□ INDICATOR: See Performance Indicator.

□ INDICATEUR: Voir Indicateur de Performance.

□ MLA: Main Line of Action. Programming level 2.

□ AA: Axe d'action. Niveau 2 de la programmation.

□ MILESTONE: A significant stage or event in the progress or development of a society or a project.

□ ETAPE: Evénement significatif dans le déroulement ou développement d’une société ou d’un projet.

□ OPPORTUNITIES: The factors external to the intervention that can be utilized by the intervention in order to improve the effectiveness of the activities implemented. Need to be taken into account when defining the implementation strategy.

□ OPPORTUNITES: Les facteurs externes à l'intervention qui peuvent être utilisés par l'intervention de façon à améliorer l'efficacité des activités réalisées. Elles doivent être prises en considération lors de la définition de la stratégie de mise en œuvre.

□ OUTCOMES: The medium-term effects of UNESCO’s programmes (as identified in the C/4).

□ REALISATIONS/EFFETS: Les effets à moyen terme du programme de l'UNESCO (identifiés dans le C/4).

□ OUTPUTS: The products, capital goods and services produced by the intervention, which are necessary for the achievement of results.

□ EXTRANTS/ PRODUITS : Biens, équipements ou services produits par l’intervention, nécessaires à l’obtention des résultats.

□ PERFORMANCE INDICATORS: The objectively verifiable units of measurement that are used for assessing and measuring progress – or lack thereof - towards a Result. Using indicators is essential for managing, monitoring and reporting as well as for evaluating the implementation of programming elements.

□ INDICATEURS DE PERFORMANCE: Unités de mesure objectivement vérifiables, pouvant être utilisées pour évaluer et mesurer les progrès - ou l’absence de progrès – vers l’obtention d’un résultat. L’utilisation des indicateurs est essentielle aussi bien pour la gestion, le suivi et la préparation des rapports que pour l’évaluation des éléments de programmation.

□ RATIONALE: The justification for the intervention itself as well as for its specific characteristics (e.g. timing, resources allocated, delivery mechanisms chosen, defined expected results).

□ FONDEMENT: Justification pour la mise en œuvre de l’intervention ainsi que de ses caractéristiques spécifiques (ex. temps, ressources allouées, mécanismes de prestation choisis, résultats attendus).

□ REACH: The people, groups or organizations who will benefit directly or indirectly from, or who will be affected by the results of the intervention.

□ RAYONNEMENT: Les individus, groupes ou organisations bénéficiant de l’action de développement, directement ou non, ou affectés par les résultats de l’intervention.

□ RESOURCES: Organisational, intellectual, human and financial inputs necessary for the implementation of an intervention.

□ RESSOURCES: Intrants de nature organisationnelle, intellectuelle, humaine et matérielle, nécessaires à la mise en œuvre de l’intervention.

□ RESULT: The describable and measurable change in state that is derived from a cause and effect relationship.

□ RESULTAT : Le changement d’état pouvant être décrit et mesuré et qui est dû a une relation de cause à effet.

□ EXPECTED RESULTS: The expected results statement should refer to a result expected to be achieved by UNESCO interventions, within the biennium and with the available resources.

□ RESULTATS ATTENDUS: Les résultats attendus doivent être réalisables par les interventions de l’UNESCO, dans le laps de temps à disposition (le biennium) et avec les ressources disponibles.

□ RESULTS CHAIN: The formulation of an "expected result" is required for each programming element, at every programming level. These results should form part of a chain of results: those of downstream elements combine to produce the result of the element to which they relate, a mechanism that works bottom-up throughout the programming tree.

□ CHAINE DE RESULTATS: La formulation d'un "résultat attendu" est requise pour tout élément de programmation, à tout niveau de programmation. Ces résultats doivent faire part d'une chaîne de résultats, ceux des éléments aval s’agrégeant pour produire le résultat de l’élément dont ils dépendent, et ainsi de suite du bas vers le haut le long de l’arbre de programmation.

□ STAKEHOLDERS: Agencies, organisations, groups or individuals who have a direct or indirect interest in the development intervention or its evaluation.

□ PARTIES PRENANTES: Agences, organisations, groupes ou individus qui ont un intérêt direct ou indirect dans l’intervention de développement ou dans son évaluation.

□ TARGETS: Quantitative and qualitative levels of performance indicators that an intervention is meant to achieve at a given point in time (e.g. at the end of the biennium).

□ NIVEAUX CIBLES: Niveaux quantitatifs et qualitatifs des indicateurs de performance qu'une intervention est censée atteindre à un moment donné (par exemple à la fin du biennium).

□ WORK PLAN: Planning documents at level 3.

□ PLAN DE TRAVAIL: Document de Programmation au niveau 3.