Setting the scene: The state of humanitarian evaluations in Canada
TRANSCRIPT
-
8/3/2019 Setting the scene: The state of humanitarian evaluations in Canada
Setting the scene: The state of humanitarian evaluations in Canada
François Audet
Canadian research institute for emergencies and aid (OCCAH)
Université de Montréal
December 13, 2011
-
Objectives
What do we mean by evaluation of humanitarian interventions in Canada?
What kind of Canadian expertise do we have, need and expect?
What are the challenges of evaluating the impact of humanitarian programs?
-
Methodology
Research and review of the scientific and institutional literature;
Interviews with 15 experts & Canadian humanitarian organizations;
-
What does the literature have to say about it?
No common definitions (OECD, DARA, OCHA, Harvey, Beck, etc.)
Evaluation of humanitarian action (EHA) is defined by ALNAP as a systematic and impartial examination of humanitarian action intended to draw lessons to improve policy and practice and enhance accountability;
-
What does the literature have to say about it? (cont.)
Basic assumptions about evaluation of humanitarian interventions:
It needs to be planned as early as possible in the project process;
It should address the eight recommended evaluation criteria: relevance, efficiency, effectiveness, impact, sustainability, connectedness, coherence, and coverage;
There are different kinds of evaluation, which depend on the objectives of the process: M&E, real-time evaluation, lessons learned, internal vs. external, etc.;
-
What does the literature have to say about it? (cont.)
Evidence of conflicts of interest between the evaluator, the evaluated, and the donor (House, 2004)
Key findings & major concerns are: 1) evaluations rarely target actually problematic projects, or real problems within a project; 2) they are often a technocratic exercise rather than an in-depth and objective assessment; 3) they are not systematized and rarely used as a lessons-learned process (Pérouse de Montclos, 2011);
-
Some key results from the interviews
-
What do we mean by evaluation of humanitarian interventions in Canada?
Inconsistency: internal and/or external; varied objectives & methodologies;
Dilemma with objectivity: internal assessments are more frequent than external ones;
Donor-driven versus lessons-learned or capacity-building driven;
Output-driven versus process-driven;
-
What kind of Canadian expertise do we have, need and expect?
Limited internal expertise: we often go through the other branches/federations;
It seems preferable to do the evaluation ourselves, as we have the impression of even weaker expertise outside;
Very limited capacity in Canada; not enough resources are assigned;
Consensus that there is very little institutional culture;
This context widens the existing gap between the marketing rhetoric and the field reality;
-
Key conclusions & challenges of evaluating the impact of humanitarian programs
First: the literature and the rhetoric tend to agree;
High turnover of staff working in humanitarian action, which affects organisational memory and reduces the need for a more systematic approach;
Reactive and quick implementation of humanitarian action, which affects planning and the identification of performance measures;
Limited funding available;
-
Key conclusions & challenges of evaluating the impact of humanitarian programs (cont.)
Lack of objectivity: most of the assessments are done internally; there is a need to move toward an impartial approach;
Canadian humanitarian organizations should encourage a cultural change:
Training & sharing lessons learned;
Expose and debate the results;
Work with academics to hold debates & ensure objectivity, develop methodologies, etc.;
-
References
Pérouse de Montclos, M.-A. (2011). L'aide humanitaire dans les pays en développement : qui évalue qui ? Mondes en développement, 2011/1, n°153, p. 111-120. DOI: 10.3917/med.153.0111.
Harvey, P., Stoddard, A., Harmer, A., & Taylor, G. (2009). L'état du système humanitaire : évaluer les performances et les progrès. Étude pilote. London: ALNAP.
Beck, T. (2006). Evaluating humanitarian action using the OECD-DAC criteria: An ALNAP guide for humanitarian agencies. London, UK: Overseas Development Institute.
House, E. (2004). The role of the evaluator in a political world. Canadian Journal of Program Evaluation, 19(2), 1-16.
OECD (1999). Guidance for Evaluating Humanitarian Assistance in Complex Emergencies. Paris.
Hallam, A. (1998). Evaluating Humanitarian Assistance Programmes in Complex Emergencies. Good Practice Review 7. London, UK: ODI.
Harvey, P. (1997). PRA and Participatory Approaches in Emergency Situations: A Review of the Literature and ACTIONAID's Experience. London: ACTIONAID.
Dabelstein, N. (1996). Evaluating the International Humanitarian System. Disasters, Vol. 20, No. 4. ODI, London.
-
Thank you
Observatoire canadien sur les crises et l'aide humanitaire, Université de Montréal
www.occah.org