TRANSCRIPT
Jonathan A. Morell, Ph.D.
Director of Evaluation – Fulcrum Corporation
[email protected]
http://evaluationuncertainty.com
(734) 646-8622
Presented to the United Nations Development Programme
February 20th, 2014
© 2012 Jonathan Morell
Strong Evaluation Designs for Programs with Unexpected Consequences
The Essence of the Problem
Complex system behavior drives unexpected outcomes:
· Network effects
· Power law distributions
· Ignoring bifurcation points
· State changes and phase shifts
· Uncertain and evolving environments
· Feedback loops with different latencies
· Self-organization and emergent behavior
· Ignoring the full range of stable and unstable conditions in a system
· Etc.
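The bifurcation points and state changes listed above can be made concrete with a toy model (my own sketch, not from the presentation): in the classic logistic map, a small change in one parameter flips the system from settling into a single stable state to oscillating between two, which is exactly the kind of qualitative shift that a fixed evaluation design can miss.

```python
# Toy illustration of a bifurcation point (illustrative only, not from the slides):
# the logistic map x -> r*x*(1-x) settles into one stable state for small r,
# but a modest increase in r makes it oscillate between two states.
def logistic_trajectory(r, x0=0.5, warmup=500, keep=8):
    """Iterate the logistic map and return the distinct states it settles into."""
    x = x0
    for _ in range(warmup):      # discard transient behavior
        x = r * x * (1 - x)
    states = set()
    for _ in range(keep):        # record the long-run states
        x = r * x * (1 - x)
        states.add(round(x, 4))
    return sorted(states)

print(logistic_trajectory(2.8))  # one stable state
print(logistic_trajectory(3.2))  # two states: the system now oscillates
```

A pre/post measurement plan built for the first regime would misread the second, which is the point of the slide: the behavior of the system, not the evaluator's cleverness, determines what the data can show.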
Guaranteed evaluation solution:
· Post-test only
· Treatment group only
· Unstructured data collection
But we lose many evaluation tools:
· Time series data
· Comparison groups
· Specially developed surveys and interview protocols
· Qualitative and quantitative data collection at specific times in a project's life cycle
· Etc.
Why the loss? Because establishing evaluation mechanisms requires:
· Time
· Effort
· Money
· Negotiations with program participants, stakeholders, and other parties
Some Examples of the Kinds of Problems We May Run Into

Program: Free and reduced fees for post-natal services
· What outcome evaluation is looking for: Survey/interview; health indicators for mother and child; child development indicators
· Possible unexpected outcomes: Drug and supply hoarding; new sets of informal fees; lower than expected use of service
· Evaluation design weakness: No interview or observation to estimate amount of fees; no way to correlate fees with attendance or client characteristics

Program: Improve agricultural yield
· What outcome evaluation is looking for: Records, interviews, observations; yield; new system cost; profit
· Possible unexpected outcomes: Perverse effects of increased wealth disparities
· Evaluation design weakness: No other communities to check on other reasons for disparity; no interviews to check on consequences of disparities

Program: Improve access to primary education
· What outcome evaluation is looking for: Records, surveys; attendance; graduation; life trajectory
· Possible unexpected outcomes: Interaction with other civil society development projects; networking effects of connections
· Evaluation design weakness: No census of other civil society projects; no data on interaction among projects; no data on consequences of interaction
Adding “Surprise” to Evaluation Planning
· Funding
· Deadlines
· Logic models
· Measurement
· Program theory
· Research design
· Information use plans
· Defining role of evaluator
· Logistics of implementation
· Planning to anticipate and respond to surprise
Overall Summary: Methods
Foreseeable:
· Get lucky
· Knowledge from stakeholders
· Good program theory
· Use research literature
· Use experts

Unforeseeable:
· Complex system behavior makes prediction impossible no matter how clever we are.

PS – do not assume that complex systems are always unpredictable!
· Forecasting & program monitoring
· System-based logic modeling
· Limiting time frames
· Exploiting past experience
· Theory
· Retooling program theory
· Agile methodology
· Data choices
Let’s look at this one.
These methods are most useful early in the evaluation life cycle:
· Limiting time frames
· Exploiting past experience
· Theory
Example: Improve Access to Primary Education
· Outcome evaluated for: Records, surveys; attendance; graduation; life trajectory
· Possible unexpected outcomes: Interaction with other civil society development projects; networking effects of connections
· Evaluation design weakness: No census of other civil society projects; no data on interaction among projects; no data on consequences of interaction
A Relevant Theory: We Know About Phase Shifts When Network Connections Increase
Evaluation Redesign:
· Identify other civil society programs
· Measure connections
· Ignore details of which programs are connected
· Collect data frequently to detect timing of change
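The phase shift the redesign is watching for has a well-known form in network theory: in a random network, once average connectivity passes a threshold, a single giant connected cluster appears abruptly rather than gradually. A minimal self-contained simulation (my own illustration; the function name and parameters are assumptions, not from the presentation) shows why frequent measurement of connections matters:

```python
import random

def giant_component_fraction(n, avg_degree, seed=0):
    """Build a random (Erdos-Renyi style) network with a given average
    degree and return the fraction of nodes in its largest cluster."""
    rng = random.Random(seed)
    p = avg_degree / (n - 1)   # edge probability yielding that average degree
    parent = list(range(n))    # union-find structure tracking clusters

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    # Randomly connect node pairs, merging their clusters.
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)

    # Count cluster sizes and report the largest as a fraction of all nodes.
    sizes = {}
    for i in range(n):
        root = find(i)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values()) / n

# Below an average degree of about 1 the network stays fragmented;
# just above it, a giant cluster abruptly emerges -- a phase shift.
for k in (0.5, 1.5, 3.0):
    print(k, round(giant_component_fraction(600, k), 2))
```

This is why the redesign can ignore the details of which programs are connected: the timing of the jump depends on how many connections exist, so frequent, coarse measurement of connection counts is enough to catch the shift.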
Let’s look at this one.
These methods are most useful for detecting leading indicators:
· Forecasting & program monitoring
· System-based logic modeling
The trick is to do a little better than the Delphic oracle.
Example: Agricultural Yield
· Outcome evaluated for: Records, interviews, observations; yield; new system cost; profit
· Possible unexpected outcomes: Perverse effects of increased wealth disparities
· Evaluation design weakness: No other communities to check on other reasons for disparity; no interviews to check on consequences of disparities
Evaluation Methodology: Expand Monitoring Outside Borders of Agriculture Program
Evaluation Redesign:
· Adopt a "whole community" perspective
· Identify a wide range of social indicators
· Identify a diverse set of key informants
· Conduct regular open-ended interviewing
© 2014 Jonathan Morell
How can an evaluation be designed to change?
Agile Evaluation
Let’s look at this one.
· Retooling program theory
· Agile methodology
· Data choices
Example: Free / Reduced Fees for Post-Natal Services
· Outcome evaluated for: Survey/interview; health indicators for mother and child; child development indicators
· Possible unexpected outcomes: Drug and supply hoarding; new sets of informal fees; lower than expected use of service
· Evaluation design weakness: No interview or observation to estimate amount of fees; no way to correlate fees with attendance or client characteristics
Add a process component to the evaluation design:
· Survey of mothers to assess total cost of service
· Open-ended interviews with clinic staff about consequences of the new system for their work lives
Nice to say, but agile evaluation can be expensive. Do we want both? Do we want only one of these tactics? These are the kinds of questions that have to be added to all the other decisions we make when designing an evaluation.