
MeE for Learning Organizations

Lant Pritchett and Salimah Samji
A Cutting Edge in Development Thinking
Harvard Executive Education, May 13, 2010

http://www.youtube.com/watch?v=ZrzC_KLI8KM

“Evaluation” as an innovation/movement/advocacy position to improve “development”

Successful Movements

Clearly articulated vision

Politically feasible coalition

“Career” trajectories

Patina of “normal science”

…but can be ineffective

Insularity: not open to questioning fundamental premises

Lock-in of movement-specific “human capital” makes the movement politically defensive

Takes too long to shift course if the movement proves ineffective

How does evaluation fit in “development”

“Development” is a coalition of narrower sub-movements both objective specific (e.g. education, health, gender, environment) and instrument specific (e.g. micro-credit, irrigation)

Help to make “successful” movements also effective

Eventually weed out the successful but ineffective sub-movements (but this is hard and unlikely to be the result of Big E evaluation)

Overview of session

Defining terms: What is “M” and “E”

Introducing “e”: The missing middle

“e” as a learning tool: The 7 step process

Aggregating up from organizational learning to system learning

Why you should care …

To identify whether there were any benefits from the investments made

Were objectives met? What factors explain the result? How can the program be improved?

Compare alternative models to get the biggest bang for your buck

To inform next generation projects

Evidence-based policy making – demonstration effect for government

What is “M” and “E”?

Monitoring (“M”):

Regular collection and reporting of information to show what progress has been made in the implementation of programs. Focuses on inputs and (sometimes) outputs.

Evaluation (“E”):

Measuring changes in outcomes and evaluating the impact of specific interventions on those outcomes. Focuses on “with and without” interventions (needs “control” group) and identifies causal impacts.

There is a difference between M and E!

Complementary roles for M and E

Monitoring

Routine collection of information

Tracking implementation progress

Focus on inputs and sometimes outputs

“Is the project doing things right?”

Evaluation

Ex-post assessment of effectiveness and impact

Confirming project expectations

Measuring impacts

“Is the project doing the right things?”

What do the poor say?

“Is this information you are gathering from us just to help you write your report or can you really be helpful to us?”

Woman in South Sudan

Introducing “e”: The missing middle

“e” = experiential learning

“e” lies in between M and E

Analyzing existing information (baseline data, monitoring data)

Drawing intermediate lessons

Serves as a feedback loop into project design

Don’t always have to do Impact Evaluation

Uses within-project design variations to identify differentials in the project's efficacy on inputs and outputs, giving real-time feedback into project/program implementation

The problem in pictures

[Timeline figure: from pre-appraisal (T-1), through project effectiveness (T), to project closure (around T+5)]

Lost opportunity: No timely “e” to help the project!!

Let's begin with the project timeline

Lots of “M” – passing data unto God for whatever use …

Findings of “E” come too late to be of much assistance to implementers

“e” as a learning tool: The 7 step process

Step 1: Reverse engineer from goals back to instruments

Step 2: Design a project

Step 3: Admit we do not know what will work

Step 4: Identify the design space and design two more project variants

Step 5: Strategically crawl your design space

Step 6: “e” feeds back into a pre-specified sequential design process

Step 7: Go back to authorizing environment

Step 1: Reverse engineer from goals back to instruments

a) Begin with a clear definition of the problem you are trying to solve. Then state the goal as well as the magnitude of the desired impact.

b) Reverse engineer your goal to program/policy/project instruments.

Clear objectives of the project (what is the problem?)

Clear idea of how you will achieve the objectives (causal chain or storyline)

Outcome focused: What visible changes in behavior can be expected among end users as a result of the project, thus validating the causal chain/ theory of change?

Magnitude Matters

Set an ex ante threshold of impact that justifies the cost.

If you're hunting for hippos, don't look under the grass.

Using a storyline to structure a design concept:

[Diagram: the present unsatisfactory situation is linked by results, across a “river of uncertainties”, to the future vision of success]

You need a complete, coherent causal chain from proposed action to desired outcome for “how” the “what” will happen.

A dysfunctional storyline fails to deliver results

[Diagram: here the results fail to carry you from the present unsatisfactory situation, across the “river of uncertainties”, to the future satisfactory situation]

Example: Storyline for education project

If you train teachers → teachers acquire usable skills → teachers use these skills in class → children's learning increases

Make your theory of change explicit

Step 2: Design a project

a) Design a project (P1) that will help you achieve your goals.

b) Specify the timing, magnitude and gain from the project for each link in the chain.

c) Determine the indicators (input, output and outcome) that you will collect to test whether your theory of change works.

Review: Log Frame, Results Framework, Theory of Change

[Diagram: results chain from Inputs (procurement & disbursements) and Activities, through Outputs (deliverables), to Outcomes and Impacts (longer-term benefits), with efficiency, effectiveness, and results marked along the chain]

Example: Indicators for education project

Input: money, materials, trainers

Output: teachers trained (Indicators: attendance, participation)

Output: teachers acquire usable skills (Indicators: written assessment, observation)

Output: teachers use skills in class (Indicators: observation)

Outcome: children's learning increases
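One way to make such an indicator plan concrete is to record each link of the causal chain together with its indicators in a small data structure that can later drive both “M” collection and “e” analysis. This is a minimal sketch, not from the slides; the field names, the input indicators, and the outcome indicator are illustrative assumptions.

```python
# Illustrative sketch only: the education project's causal chain with the
# indicators attached to each link (field names are assumptions).
causal_chain = [
    {"link": "Input",   "description": "Money, materials, trainers",
     "indicators": ["budget disbursed", "materials delivered", "trainer days"]},  # assumed
    {"link": "Output",  "description": "Teachers trained",
     "indicators": ["attendance", "participation"]},
    {"link": "Output",  "description": "Teachers acquire usable skills",
     "indicators": ["written assessment", "observation"]},
    {"link": "Output",  "description": "Teachers use skills in class",
     "indicators": ["observation"]},
    {"link": "Outcome", "description": "Children's learning increases",
     "indicators": ["learning assessment"]},  # illustrative; not specified on the slide
]

for step in causal_chain:
    print(f"{step['link']}: {step['description']} -> {', '.join(step['indicators'])}")
```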

Steps 1 and 2 are standard operating procedure

But not rigorous enough and no “E”valuation of outcomes.

In theory if not in practice (cost-benefit analysis is done for only 20% of Bank projects).

Haphazard/unstructured learning: ad hoc responses at mid-term review.

So what are the next 5 steps …

Step 3: Admit we do not know what will work … and we certainly do not know what will work best

Acknowledge that implicit choices were made in designing the project P1.

Admit that there might be differentials in magnitude that depend on the selection of the design elements/parameters.

The mythical “alternatives considered”

Step 4: Identify the design space and design two more project variants

a) Articulate your design space. Specify the key parameters/elements within the design space.

b) Specify the timing, potential magnitude and uncertainty of the gain for each of these possible project variants.

c) Select two (or more) new projects based on the highest uncertainty and upside potential.

d) Repeat step 2(c) for each of the new projects (i.e. determine indicators for P2 and P3).

4a. Articulate your design space

Design Elements      Design Space
                     D1  D2  D3  D4  D5  D6  D7  D8
Location (A,B)       A   A   A   A   B   B   B   B
Content (α,β)        α   α   β   β   α   α   β   β
Follow-up (I,II)     I   II  I   II  I   II  I   II

Using our education example of teacher training, assume 3 design parameters with 2 options each:

• Location: Centrally (A) or in School (B)
• Content: Subject matter (α) or Pedagogy (β)
• Follow-up: Semi-annually (I) or Annually (II)
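As a concrete illustration (not from the slides), the eight candidate designs D1–D8 are simply the Cartesian product of the three binary parameters. A minimal Python sketch of enumerating that design space:

```python
from itertools import product

# Minimal sketch: enumerate the teacher-training design space
# (3 binary parameters -> 2**3 = 8 candidate designs, D1..D8).
design_parameters = {
    "location":  ["A (central)", "B (in school)"],
    "content":   ["alpha (subject matter)", "beta (pedagogy)"],
    "follow_up": ["I (semi-annual)", "II (annual)"],
}

design_space = [dict(zip(design_parameters, combo))
                for combo in product(*design_parameters.values())]

for i, design in enumerate(design_space, start=1):
    print(f"D{i}:", design)
```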

4b. Specify potential magnitude and level of uncertainty of impact for all project variants

[Table: for each design variant D1–D8 (location A/B, content α/β, follow-up I/II), record the potential magnitude and the uncertainty of its impact; P1 marks the variant already chosen in Step 2]
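A minimal sketch of 4b and 4c taken together, using made-up priors: attach an expected gain and an uncertainty to each design, then pick the two variants with the greatest combined upside and uncertainty to pilot alongside P1. The numbers and the scoring rule are assumptions; the slides do not prescribe a formula.

```python
# Illustrative priors only; in practice these come from baseline data,
# sector experience, and the team's own judgment.
priors = {
    "D1": {"expected_gain": 0.10, "uncertainty": 0.05},
    "D2": {"expected_gain": 0.12, "uncertainty": 0.06},
    "D3": {"expected_gain": 0.08, "uncertainty": 0.15},
    "D4": {"expected_gain": 0.20, "uncertainty": 0.20},
    "D5": {"expected_gain": 0.15, "uncertainty": 0.10},
    "D6": {"expected_gain": 0.11, "uncertainty": 0.04},
    "D7": {"expected_gain": 0.25, "uncertainty": 0.25},
    "D8": {"expected_gain": 0.09, "uncertainty": 0.07},
}

P1 = "D1"  # the design already chosen in Step 2 (assumed here)

# One possible scoring rule: upside potential plus uncertainty.
candidates = sorted((d for d in priors if d != P1),
                    key=lambda d: priors[d]["expected_gain"] + priors[d]["uncertainty"],
                    reverse=True)
P2, P3 = candidates[:2]
print("Pilot", P2, "and", P3, "alongside", P1)
```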

Step 5: Strategically crawl your design space

Pilot the projects P1, P2 and P3 for the duration of time that you determined.

P2 and P3 could serve as an “internal counterfactual” for P1, if randomly assigned.

Collect all input and output indicators for all three projects.
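A minimal sketch of the pilot assignment, assuming a hypothetical list of schools: randomly assign implementation units across P1, P2 and P3 so that each variant can serve as an internal counterfactual for the others.

```python
import random

# Hypothetical implementation units (e.g. schools in the pilot area).
schools = [f"school_{i:02d}" for i in range(1, 31)]
variants = ["P1", "P2", "P3"]

random.seed(2010)        # fixed seed so the assignment is reproducible
random.shuffle(schools)

# Deal the shuffled schools out evenly across the three variants.
assignment = {school: variants[i % len(variants)]
              for i, school in enumerate(schools)}

for variant in variants:
    count = sum(1 for v in assignment.values() if v == variant)
    print(variant, "->", count, "schools")
```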

Step 6: “e” feeds back into a pre-specified sequential design process

Analyze the data you collected for P1, P2 and P3.

Based on the analysis, crawl to the next most promising component of the design space and repeat Step 4.
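A minimal sketch of that feedback step, under assumed pilot data: compare a monitored output indicator across the three variants, keep the best performer, and generate its neighbouring designs (one parameter changed at a time) as candidates for the next round. The indicator values and the one-parameter-flip crawl rule are assumptions, not a prescription from the slides.

```python
# Hypothetical indicator data from the three pilots (share of trained
# teachers observed using the new skills in class).
observed = {
    "P1": {"location": "A", "content": "alpha", "follow_up": "I",  "skill_use": 0.35},
    "P2": {"location": "B", "content": "beta",  "follow_up": "I",  "skill_use": 0.55},
    "P3": {"location": "A", "content": "beta",  "follow_up": "II", "skill_use": 0.42},
}

best = max(observed, key=lambda p: observed[p]["skill_use"])
best_design = {k: v for k, v in observed[best].items() if k != "skill_use"}
print("Most promising so far:", best, best_design)

# "Crawl": propose the designs that differ from the best one in exactly
# one parameter, as candidates for the next round of piloting (Step 4).
options = {"location": ["A", "B"],
           "content": ["alpha", "beta"],
           "follow_up": ["I", "II"]}
next_candidates = [{**best_design, param: value}
                   for param, values in options.items()
                   for value in values
                   if value != best_design[param]]
print("Candidates for the next round:", next_candidates)
```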

Example of a sequential design process: electricity provision to slums (Source: Anand & Garcia, 2010)

The problem in pictures - revisited

[Timeline figure, revisited: from pre-appraisal (T-1), through project effectiveness (T), to project closure (around T+5)]

“e” feeds back into design process helping implementers learn

Feedback loop between Step 6 and 4

Step 1: Reverse engineer from goals back to instruments

Step 2: Design a project

Step 3: Admit we do not know what will work

Step 4: Identify the design space and design two more project variants

Step 5: Strategically crawl your design space

Step 6: “e” feeds back into a pre-specified sequential design process
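A minimal skeleton (assumed structure, with hypothetical placeholder functions) showing how the feedback loop between Steps 6 and 4 turns the seven steps into an iterative crawl rather than a one-shot design:

```python
# Hypothetical skeleton of the MeE crawl; the three helpers are placeholders
# for the design, piloting, and "e" analysis work described in Steps 4-6.
def design_variants(best_so_far, design_space):
    """Step 4: propose new variants near the current best (placeholder)."""
    return []

def pilot(variants):
    """Step 5: implement the variants and collect M data (placeholder)."""
    return {}

def analyse(results):
    """Step 6: draw intermediate lessons; return the best variant (placeholder)."""
    return None

def crawl(design_space, initial_design, rounds=3):
    best = initial_design
    for _ in range(rounds):                  # feedback loop: Step 6 back to Step 4
        variants = design_variants(best, design_space)
        results = pilot([best] + variants)
        best = analyse(results) or best
    return best                              # Step 7: take the result back to the
                                             # authorizing environment
```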

Advantages of “e” over “E” evaluation

Project implementers feel part of the process, see the benefits, are bought in and knowledge is co-produced

No collection of data on a “no program” group required—the comparisons are “within program/project” variants

Can handle truly universal programs, where a control group is simply impossible

You can learn or generate hypotheses you did not anticipate

Ability to explore the interactions of the policy or policies with all kinds of background variables

Big E evaluation often cannot usefully distinguish causes of failure (many projects simply fail to be implemented)

Big E can explore only a tiny part of the design space (even with 5 design parameters and 2 options each there are 2^5 = 32 distinct designs, and with complementarities the dimensionality blows up)

Big E results do not readily generalize beyond places where the specific distribution of all the variables that can influence the outcome is precisely the same as in the original study location

Step 7: Go back to authorizing environment

How does evaluation fit into the Ministry of Finance, the Planning Ministry and/or the country's Chief Economist?

“e” helps sectors come back with the best possible project.

“e” creates legitimate space for organizational failure.

Organization portfolio of MeE

Projects           MeE           Portfolio share
Routine            Long on “M”   80%
Innovation         Long on “e”   10%
Large flagship     Long on “E”   10%

The achievements of the best of aid look like the conditions of the worst of aid

Every Hollywood movie has the same plot: a sympathetic character overcomes increasingly difficult obstacles to achieve their final objective. (William Goldman)

The problem wasn't that Rocky had the same plot as all other Hollywood movies, it was the inauthentic repetition.