Methods and good practices for influential evaluations
Uganda Evaluation Week 2014

What do we mean by influential?
• Direct effects on an intervention – a basic objective, but narrow to the intervention

Wider influence:
• Changing perceptions – core feature of the role of evidence
• Setting an agenda – by reframing the way an issue is debated and creating pressure for change
• Developing capacity – within organisations, to allow them to understand and respond to an issue
• Changing institutions – influencing policy, strategy and resource allocation within organisations (both government and non-government sectors)

What matters

[Diagram: influence rests on four factors – topic relevance, timing, credibility and communication]

Topic relevance: selecting programmes
• High stakeholder interest
• Strong evidence base
• Innovative or pilot programme
• Cross-cutting concerns, for example anti-corruption, absenteeism or value for money
• Level of contentiousness and risk
• Financial value
• Strategic importance to government objectives, or a particular policy priority
• Evaluability (whether it is realistically possible to evaluate a programme)

Timing
• To deliver findings and recommendations in time for key decisions:
• A new phase of activities
• Expansion to different locations
• Annual budget deliberations
• Financing decisions

The political context is critical …
• Programmes are political creatures:
• Identified, designed, debated, endorsed and funded through political processes
• Values, interests and policy horizons vary
• Survival is a potent political force
• Evidence-based policy vs. policy-based evidence:
• Selective evidence, data mining, seeking ‘good results’, etc.
• Evaluation is politics:
• Evaluation governance
• Decisions to evaluate or not evaluate
• Choice of evaluation methods

Credible design
1. Understand the intervention logic
2. Develop questions
3. Structure according to evaluation criteria
4. Decide on an evaluation theory or approach
5. Select appropriate methods
6. Use suitable tools for data collection

Improving the quality of evidence

Confidence about the evidence rests on:
• Detailed evaluation and statistical design
• Calculation of error terms and confidence intervals for statistics
• Details of purposive and randomised sampling
• Notes on triangulation of enquiries
• Openness about bias and limitations
• Availability of data for independent analysis/verification
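
As a rough illustration (not part of the original slides), the Python sketch below shows the kind of calculation the second bullet refers to: drawing a simple random sample and reporting its mean together with a standard error and a 95% confidence interval. The population, sample size and variable are hypothetical.

# A minimal sketch, assuming a simple random sample of a numeric
# indicator (e.g. household scores from a monitoring dataset).
import math
import random

def mean_with_ci(values, z=1.96):
    """Return (mean, standard error, 95% confidence interval)."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((x - mean) ** 2 for x in values) / (n - 1)
    se = math.sqrt(variance / n)          # error term for the mean
    return mean, se, (mean - z * se, mean + z * se)

random.seed(1)
# Hypothetical population of 5,000 indicator values
population = [random.gauss(60, 15) for _ in range(5000)]
# Simple random sample of 50 units
sample = random.sample(population, 50)

m, se, (lo, hi) = mean_with_ci(sample)
print(f"mean = {m:.1f}, SE = {se:.1f}, 95% CI = ({lo:.1f}, {hi:.1f})")

Reporting the interval alongside the point estimate, and documenting how the sample was drawn, lets readers judge how much confidence the evidence can bear.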

Governance and independence
• Independent or in-house team?
• Whether or not to appoint:
• Steering group
• Reference group
• Policy advisers
• Technical advisers

Elements of communication

The evaluation itself – the commissioning process, stakeholder consultation and the analysis (the core source of information) – feeds a range of communication channels:
• Full report
• Short summaries
• Topic briefs
• Statistical synopsis
• Press release
• Meetings & workshops
• Video
• On-line access
• Social media

