TRANSCRIPT
Measuring policy influence: like measuring thin air?
John Young: [email protected]
Outline
• RAPID
• Policy processes
• Why M&E of research?
• Some approaches
• A systematic approach
RAPID
• Power, politics and evidence use
• Evidence production and communication
• Knowledge intermediaries and interactions
http://www.odi.org.uk/rapid
Policy processes and why it’s difficult to influence them
The linear logical model
• Identify the problem
• Commission research
• Analyse the results
• Choose the best option
• Establish the policy
• Implement the policy
• Evaluation
Policy-based evidence making
• Identify the policy solution
• Commission research
• Massage the results
• Media campaign
• Establish the policy
• Implement the policy
• Avoid evaluation
The policy cycle
• Agenda Setting
• Policy Formulation
• Decision Making
• Policy Implementation
• Monitoring and Evaluation
[Diagram: actors in the policy process – Civil Society, Donors, Cabinet, Parliament, Ministries, Private Sector]
The Cynefin Framework
[Diagram: health examples mapped to the framework – treating diseases; building hospitals; setting up health services; … in complex emergencies]
Deming Cycle
The “RAPID” Approach
• Academic research communications
• Develop a network or partnership
• Online communications
• Media strategy
• More research
• Policy advocacy coalition
What is policy change?
• Discursive: Client-focused services
• Attitudinal: Farmers have good ideas
• Procedural: Participatory approaches to service development
• Content: UU20, UU25. New guidelines and programmes
• Behavioural: Approach being applied in practice
It’s all about behaviour change
Outcome Mapping
[Diagram: results chain – Inputs → Activities → Outputs (by the project) → Outcomes → Impact (behaviour change by other actors), plotted against level of activity]
Outcome Mapping: Building Learning and Reflection into Development Programs – Sarah Earl, Fred Carden, and Terry Smutylo
http://www.idrc.ca/en/ev-9330-201-1-DO_TOPIC.html
Measuring the impact of research on policy
Why do it?
• To find out if you’re making a difference
• To learn about what works
• To manage better
• To account:
– to recipients
– to donors
Lots of methods
• Classical case studies (IDRC, IFPRI)
• Episode studies (ODI/RAPID)
• Stories of Change (Denning)
• Most Significant Change (Davies)
• Micro-Narratives / Sensemaker (Snowden)
• Outcome Mapping (IDRC)
• Impact matrices (Davies)
• Peer evaluations (CHSRF)
• HERG Payback Framework (Brunel)
• Systematic reviews (DFID)
• RCTs (IDS)
A systematic approach
1. Strategy and direction – are you doing the right thing?
2. Management – are you doing what you planned to do?
3. Outputs – are the outputs appropriate for the audience?
4. Uptake – are people aware of your work?
5. Outcomes and impacts – are you having any impact?
A systematic approach
1. Strategy and direction – Logframes; Social Network Analysis; Impact Pathways, etc.
2. Management – ‘Fit for Purpose’ Reviews; Quality Audits; Horizontal Evaluation
3. Outputs – Peer review; Evaluating websites; Evaluating networks; After Action Reviews
4. Uptake – Impact Logs; New Areas for Citation Analysis; User Surveys
5. Outcomes and impacts – Outcome Mapping; RAPID Outcome Assessment; Most Significant Change; Innovation Histories; Episode Studies
Logical frameworks
Goal     – Indicator – MOV
Purpose  – Indicator – MOV – Assumptions/Risks
Output 1 – Indicator – MOV – Assumptions/Risks
Output 2 – Indicator – MOV
Output 3 – Indicator – MOV
Output 4 – Indicator – MOV
Theories of change
• Causal chain – a succession of elements with logical links (e.g. the log-frame approach)
• Dimensions of influence – overlapping domains which interact, where it is possible to influence (e.g. the RAPID CEL Framework)
• Actor-centred theories – where the behaviour of actors can be influenced (e.g. Outcome Mapping)
After Action Review
• What was supposed to happen?
• What actually happened?
• Why was there a difference?
• What can we learn from it?
A 15-minute team debrief, conducted in a “rank-free” environment.
ODI CommStats
Stories of change
1. Essential elements:
• Situation before
• Context
• Situation after
• What changed and why
2. Most Significant Change (Davies)
• Stories of change from different stakeholders
• Systematic analysis of significance.
3. Micro-narratives (Snowden)
http://www.mande.co.uk/docs/MSCGuide.pdf
RAPID Outcome Assessment
www.odi.org.uk/RAPID/Publications/RAPID_WP_266.html
M&E in ODI
Think Tank Initiative evaluation
Theory of Change and Assumptions
[Diagram: theory of change and assumptions spanning Management, Uptake, and Impact]
• Review of ToC: testing the assumptions
– Literature review
– (Desk-based) political economy analysis
– Interviews
• Assessment of capacity change
– Quantitative analysis of M&E data
– Interviews and focus-groups
– Stories of change
• Policy impact
– (In country) political economy analysis
– Case studies
– Stories of change
– Interviews and focus-groups
Lessons
For researchers:
• It’s not unreasonable
• It’s not rocket science
• Be systematic
• Be proportionate
• Have a clear focus
• Have a theory of change
• Focus on behaviour
• Collect the easy stuff
• Look for the unexpected
• Write it down
• Stories / logs / episodes
For funders:
• Don’t be unreasonable
• Contribution vs. attribution
• Be proportionate
• Do not expect all research to have (immediate) impact
• Be systematic
• Provide guidance
• Provide “space”
• Impact of programmes vs. projects
• Learn together!
• Test novel approaches