TRANSCRIPT
A Walk through the World of Participatory Monitoring and Evaluation
UNITAR 2012 Fellowship Program for Afghanistan
Phil Cox (www.plannet.ca)
Monitoring and Evaluation (M&E) - What is it?
• A form of inquiry whose focus is some evaluand (program, process, organization, person, etc.) and which results in “merit” and/or “worth” judgments about it. (Guba and Lincoln, 2001)
  – Merit = intrinsic quality (sourced within itself, rather than because of associations)
  – Worth = extrinsic quality (sourced from its interactive effects with its context)
Evaluation - Critical Questions
• What gets evaluated?
• Who makes the judgment?
• On what criteria?
• For what purpose?
Why are these so important?
Participatory Monitoring and Evaluation (PME) - What is it?
• A set of principles and a process of engagement in the monitoring and evaluation endeavour
PME - Key Features
• Evaluation “process” is as important as the “content”
• Local people are “actors”, not “subjects”
• Evaluators can be insiders or outsiders, or both
  – Outside evaluators are facilitators or catalysts who share, rather than hold, power
  – Insider evaluators know the project well
• Evaluation inquiry must be locally relevant and as unhurried as possible
PME - Key Features
• Inquiry methodology must be respectful of different ways of knowing and cognizant of power dynamics
• Making sense of the data requires lots of dialogue/iteration
• Findings and conclusions need validation - those who participate in the inquiry should be included in the analysis
Conventional M&E vs PME - Contrasts

Why
  Participatory: Empowerment - to help people initiate, navigate, control
  Conventional: Accountability - to address donor requirements
When
  Participatory: More frequent, small-scale evaluations
  Conventional: Usually mid- and end-of-project
How
  Participatory: Self-evaluation; simple methods adapted to local conditions; open, immediate sharing of results through local involvement
  Conventional: Focus on scientific objectivity, evaluator distanced from other participants; delayed and limited access to results
What
  Participatory: People identify their own indicators of success
  Conventional: Predetermined indicators of success
Who
  Participatory: Community members, project staff, facilitator(s)
  Conventional: External experts

Adapted from Narayan-Parker, 1993: 12 (taken from Estrella and Gaventa, 1998)
One Point of View about PME
• Participatory Methods - a Qualified OK…
  – Advantages…
    • Low cost
    • Can be conducted quickly
    • Provides flexibility to explore new ideas
  – Disadvantages…
    • Findings usually relate to specific communities or localities, so it is difficult to generalize from them
    • Less valid, reliable, and credible than formal surveys (World Bank, 2004)
    • Labour intensive and costly
Another Point of View about PME
• If facilitated well, participatory methods of inquiry may yield more accurate and reliable information than more ‘conventional’ methods, such as (randomized) household surveys.
• Participatory methods (e.g. mapping, ranking, theatre improvisation) address limitations of:
  – Intimidation - questions “out of nowhere”
  – Inconvenience - “many questions”
  – Local relevance - questions often based on externally derived indicators (Mayoux & Chambers, 2006)
• Maybe more labour intensive and costly, but worth it to the extent that it fully engages stakeholders and propels the project forward
Evaluation – Conventional vs Participatory?
1. Have you experienced evaluation in one form or another?
2. Was it (were they) more classical or participatory? How so?
3. What do you think are the most important considerations when designing an evaluation?
PME Purposes
• Understand changes in organizations or communities
• Show how stakeholders view the project from their vantage point, and engage them in dialogue
• Generate insight for project management
• Address accountability relationships
PME - Process
1. Deciding to use PME - Considerations
2. Building a PME Team
3. Making a PME Plan
4. Collecting the Data
5. Synthesizing, Analyzing and Validating the Data
6. Planning Forward
1. Deciding to use PME…
• Under what circumstances might you not want to employ participatory approaches?
• The M&E approach should be consistent with the way in which the program/project has been designed and implemented
1. Deciding to use PME…
• Important to situate the actors and their interests - those:
  – funding
  – implementing
  – participating and benefiting
[Diagram: Donor/funder - Evaluator/facilitator - Implementor & community(ies)]
1. Deciding to use PME…
• When in the program or project cycle would it be best to start thinking about PME design?
• Up front, when the program/project is being designed
1. Deciding to use PME - Key Consideration
• The level of rapport and trust
  – within the implementing team
  – between the implementing team and main community stakeholders
• The greater the rapport/trust…
  – the easier it will be to evaluate together
  – the more that can be done - breadth, depth
2. Building a PME Team
• Insiders - What can they bring to the evaluation table?
• On the positive side…
  – Deep knowledge of project implementation
  – Local knowledge (community)
  – A forward perspective - a stake in the future
2. Building a PME Team
• Insiders - What can they bring to the evaluation table?
• On the LESS positive side…
  – A singular or narrow perspective
  – A vested interest
  – Their position of power over others
2. Building a PME Team
• Outsiders - What can they bring to the evaluation table?
• On the positive side…
  – Curiosity from an independent vantage point
  – A perspective uncluttered by project detail - “can see the forest, not just the trees”
  – An ability to facilitate an evaluative process - to encourage dialogue, to challenge, to mediate
  – Local knowledge/appropriate cultural perspectives independent of the project
2. Building a PME Team
• Outsiders - What can they bring to the evaluation table?
• On the LESS positive side…
  – They don’t (or can’t) really understand the project and its context
  – They push their outside ideas too strongly and make inappropriate assumptions
2. Building a PME Team
• A Joint Team - a balancing act…
  – Disinterest vs Vested Interest
  – Distance vs Closeness
  – General vs Particular
2. Building a PME Team
• Co-evaluators… How many is too many?
• Procedurally awkward with too many
• Best to engage stakeholder representatives in specific areas of inquiry - on matters most pertinent to them
3. Making a PME Plan
• Behind most projects lurks a logic model
  – Important to know
    • Who created it and how
    • How much people use it
  – Tension - how much to follow the logic model and its indicators?
    • Reconciling top-down and bottom-up perspectives
    • Prescriptive vs Emergent
Results Logic Models - Love them or Hate them?
Have you ever created a logic model, or had to work with one? How would you describe the experience?
3. Making a PME Plan
• Orientation
  – Clarify the purpose of the monitoring and evaluation
  – Get a sense of the overall project (logic model)
  – Generate stakeholder questions
  – Explore possible inquiry methods, sources, timeframes
3. Making a PME Plan
• Team Building/Training
  – Rare, but great if you can do it!
  – Builds trust, confidence and skills in some of the information-gathering methods
4. Collecting the Data
• Conventional Methods - surveys, focus groups, observation, document reviews, counts, etc.
• Participatory Reflection and Action Methods - modeling/mapping, ranking exercises, calendars, walks, historical profiles
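The ranking exercises listed above have a simple, countable mechanic that can be sketched in code. The following is an illustrative sketch only, not taken from the presentation: the item names, the sample votes, and the `pairwise_rank` function are all hypothetical. In a pairwise-ranking exercise, participants compare community priorities two at a time, and the tally of "wins" yields a shared ranking.

```python
from collections import Counter

# Illustrative sketch only: tallying a pairwise-ranking exercise, one
# common Participatory Reflection and Action method. The items and votes
# below are hypothetical examples.
items = ["clean water", "school repairs", "road access", "health clinic"]

def pairwise_rank(votes):
    """votes: (winner, loser) pairs chosen by participants comparing two
    options at a time. Returns the items ordered by how often each was
    preferred (ties keep the original listing order, since sort is stable)."""
    wins = Counter(winner for winner, _ in votes)
    return sorted(items, key=lambda item: wins[item], reverse=True)

# Each participant compares every pair and names the option they value more.
votes = [
    ("clean water", "school repairs"),
    ("clean water", "road access"),
    ("health clinic", "clean water"),
    ("health clinic", "school repairs"),
    ("road access", "school repairs"),
    ("health clinic", "road access"),
]
print(pairwise_rank(votes))
# prints ['health clinic', 'clean water', 'road access', 'school repairs']
```

The point of the exercise in PME is less the arithmetic than the discussion each comparison provokes, but a transparent tally like this lets participants verify the resulting ranking themselves.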
4. Collecting the Data - Considerations
• Important to guard against…
  – Being too extractive - asking too many questions framed by outsiders
  – Large, imposing groups moving through communities
  – Interruptions to people’s daily routines
  – Springing PRA exercises on unsuspecting groups
  – Overuse of the PRA “toolbox”, which can feel contrived
4. Collecting the Data - Images
• Mapping communities
  – Showing before-and-after differences
    • Land use
    • Housing
    • Community infrastructure
    • Boundaries and contested areas
4. Collecting the Data - Images
• Mapping bodies
  – Showing before-and-after differences
    • Anatomy
    • Reproductive cycles
    • Reproductive health issues
4. Collecting the Data - Images
• Encountering unanticipated indicators of success
  – Literacy outcome - a shift from thumb prints to signatures in the record book
4. Collecting the Data - Images
• Bringing stakeholders together
  – Using the evaluation as a means to build bridges and explore new possibilities
Data Collection - What has worked for you?
• Describe a moment when you felt like you were getting very good information
• What were you doing to make this so?
5. Synthesizing, Analyzing and Validating Data
• PME generates lots of data, simply because of the…
  – number of people acting as co-evaluators
  – and the variety of different encounters that might take place in the course of a day
5. Synthesizing, Analyzing and Validating Data
• Daily debriefs are essential
  – Informal and relaxed, but facilitated to ensure each co-evaluator has their turn
  – The project logic model and key evaluation questions are helpful organizing tools
5. Synthesizing, Analyzing and Validating Data
• Daily debriefs…
  – An opportunity to…
    • Identify information/learning gaps
      – Conflicting or contradictory information
      – Unanswered questions
      – Unheard perspectives
    • Allocate information-gathering tasks
      – Places to go
      – People to see
      – Areas of inquiry to pursue
5. Synthesizing, Analyzing and Validating Data
• Reporting back preliminary findings…
  – An obligation to share findings with those who engage with the evaluation
    • Community
    • Project Team
    • Project as a Whole
    • Organization
  – Important to place emphasis on findings that are relevant at each level
  – The idea is to validate and/or elaborate on insights generated to date
6. Planning Forward
• All being well, corrective or reinforcing decisions are being taken by project stakeholders even before the main evaluation report is written…
  – In PME the distance between evaluation and planning is (should be) short