System Review of Evaluation in German Development Cooperation


DAC Network on Development Evaluation meeting, 18 November 2008
Michaela Zintl, Head of Evaluation and Audit Division, BMZ

Rationale

First review in 1998 / implementation of recommendations monitored in 2001

Significant changes since then:

International level: MDGs & Paris Declaration; new developments also in the field of evaluation

National level: changes in the budget code; "joined-up" German development cooperation

Purpose

...to assess whether the evaluation system of German development cooperation performs adequately with regard to strategic and conceptual demands and against organisational, managerial and methodological benchmarks (DAC norms and standards, international good practice)

...to develop recommendations for improving the system

Scope: evaluation systems of

BMZ (the Ministry)

Six implementing agencies (KfW, GTZ, ...) and the private sector arm (DEG)

12 CSOs receiving funds from BMZ, plus a further 30 CSOs (online questionnaire)

...covering 80% of bilateral aid funded by BMZ

Methodology

Guided self-assessment

Document analysis (including assessment of evaluation reports)

Face-to-face interviews with staff and management of all organisations involved, external experts and MoP: 170 interviews

Caveat:

Not included: partner countries' views; DC managed by other federal ministries and federal states; voluntary contributions to international organisations

Only a "light" review of evaluation reports was carried out

Findings: Independence and credibility

+ Significant improvements since last review

+ Most organisations acknowledge DAC norms and standards

- Evaluation underfunded almost everywhere

- Coverage appears wanting (insufficient data)

- Lack of ex post evaluations, impact assessments and comprehensive comparative studies

Findings: Quality

+ Has generally improved, due to: acceptance of standards & training; results orientation; improvement in process quality

- Insufficient methodological awareness: the mix of methods is more rhetoric than practice

- Almost no impact evaluations

- Evaluation knowledge not considered a priority for selection of staff and consultants

Findings: Participation

o No significant changes since last review

- Evaluations continue to be donor-centered, except for some NGOs

- Insufficient use of local independent consultants

- (Too) little evaluation capacity building

Findings: Utility

+ Learning is usually the primary objective

+ Evaluations are considered valuable by intended users

o Mostly used for management of individual programmes; little conceptual learning

- Little inter-institutional learning

- Focus on learning by donor agencies, not partners

Conclusions I: Improvements on many dimensions, but challenges remain

Institutional independence increased, but insufficient use of external consultants

Credibility increased because of better quality, but limited by lack of transparency

Quality improved, but limited range of methods used

Participation: no significant improvement

Utility: internal and instrumental learning good and improved; inter-agency and conceptual learning poor

Conclusions II: System development

Heterogeneity between agencies has, if anything, increased over time

System tilt between BMZ and implementing agencies

Steering capacity of BMZ too low

Most pressing problem: overcoming institutional fragmentation

Recommendations

General recommendations along the various dimensions are in the synthesis report; specific recommendations are in the case study reports

On the system level:

Massive strengthening of BMZ

Independent evaluation agency (SADEV model)

Independent evaluation advisory board (DFID model)

Lessons learned / follow-up: too early to say, but in any case:

Combination of external experts and peers is essential

Task of reviewing 20 organisations is overwhelming

Partly a very political process, partly benign neglect

Change was triggered already during the process in almost all organisations, although some prefer to say otherwise
