
A process to develop operational bottom-up evaluation methods – from reference guidebooks to a practical culture of evaluation

Jean-Sébastien Broc, Ecole des Mines de Nantes (DSEE), [email protected]
Bernard Bourges, Ecole des Mines de Nantes (DSEE), [email protected]
Jérôme Adnot, Ecole des Mines de Paris (Centre of Energy and Processes), [email protected]

Published in the proceedings of the ECEEE 2007 Summer Study, Saving energy – just do it!, June 2007, La Colle sur Loup, France, pp. 659-670. hal-00505786.

Keywords
evaluation methodology, bottom-up method, impact evaluation, theory-based evaluation, Energy Services Directive, experience capitalisation, evaluation capacity building

Abstract
The need to evaluate energy efficiency (EE) activities is increasing, both to account for results and to understand successes and failures. Evaluation results should indeed be used both for reporting on past activities and for improving future operations. Local stakeholders point to the lack of easy-to-use methods as a major barrier to evaluation. Another issue is the frequent negative perception of evaluation, experienced as a control and/or a waste of time.

This paper presents a systematic process to develop bottom-up evaluation methods designed to fit stakeholders' needs: directly operational, easy to appropriate, and providing useful conclusions to improve operations and to communicate their results.

Our approach relies on the principle of experience capitalisation and on an organisation with two levels, central and on-field. It aims to create the conditions for continuous improvement. Moreover, it should ensure that the stakeholders involved actually take part in, and take advantage of, the evaluation process.

This methodology handles both impact and process evaluation. For the impacts, the focus is on the transparency of calculations, data quality and the reliability of results. Regarding the operation process, the main issues are analysing the causality between actions and results, and detecting success and failure factors.

This work was first developed for the evaluation of local operations in France1. The resulting methodology was tested on two case studies from the EcoEnergy Plan, a local EE programme implemented in the South-East of France.

Introduction
The usefulness of evaluation may be expressed through its two dimensions: summative (what are the results) and formative (how to improve and/or ensure the success and cost-effectiveness of operations) [European Commission 1999]. In the particular field of energy efficiency (EE) activities, reference guidebooks such as the European one [SRCI 2001 pp.8-10] or the IEA one [Vreuls 2005 pp.4,8-10] highlight the needs and reasons for evaluating EE programmes: quantifying and reporting results in a regulatory framework, improving cost-effectiveness, ensuring the best use of public funds, etc. They also point to the numerous frameworks which increase the need for evaluation, such as the Kyoto Protocol or the EU Directive on Energy End-use and Energy Services [ESD 2006].

Such evaluation needs also appear at a more local scale. Indeed, local EE activities are growing, as national governments require local authorities to involve themselves in local EE policies and as local authorities develop their own initiatives as well. This was confirmed by an inventory of local EE activities in France, which stressed a lack of evaluation practice for these activities [Broc 2005]. Contacts with stakeholders during this inventory brought out that one of the main reasons evaluations

1. Within a partnership between ARMINES, the Wuppertal Institute for Climate, Environment and Energy, and EDF R&D (Electricité de France).


were not performed was that they did not have operational evaluation methods they could easily apply.

This paper presents a methodology set up to develop such methods for ex-post evaluations. First, specifications were deduced from analysing the needs and expectations of stakeholders regarding evaluation. Methodology principles and structures were then drawn up to tackle these issues. In addition, a literature review and case studies were performed to provide the required methodological materials. Ways to address key issues are subsequently detailed concerning the quantification of impacts and the analysis of programme theory. The resulting systematic process to develop operational evaluation methods is described, and ways to involve stakeholders in the evaluation process are discussed. Finally, examples of application to concrete cases are briefly introduced, as well as perspectives for adapting this methodology to the evaluation requirements for the implementation of the ESD.

Defining specifications for evaluation methods at local level

What do local stakeholders expect/need about evaluation?
A previous paper [Broc 2005] presented the results from an inventory of local EE activities in France. During this inventory, many local stakeholders were contacted and asked about their

evaluation practice and needs. Their main expectations were then deduced from their answers [Broc 2006a pp.122-123]2, 3, 4.

Advantages and drawbacks of the local dimension
Implementing an operation at a local scale can bring both advantages and drawbacks specific to the local dimension. These are presented below.

Advantages of implementing operations at a local scale
The specific advantages of the local dimension may be grouped along three main lines:

nearness:
- local stakeholders are closer to the targeted publics, and particularly efficient at reaching scattered targets
- local stakeholders form networks with a significant ability to mobilise
- local stakeholders have a better knowledge of local specificities (e.g. priority needs, special barriers)

2. These conclusions are also confirmed by other studies about the involvement of local stakeholders in local energy policies in France [AGORA 2002, Godinot 2004, Bouvier 2005, Trouslot 1995].

3. Justifying the usefulness and cost-effectiveness of implemented operations: either for contractual requirements, such as the CPER (Contrat de Plan Etat-Région), a six-year agreement framework between the French State and the Regional Councils, or towards ratepayers.

4. Dissemination of the results: presenting evaluation work and results in a pedagogic way, in order to contribute to training local stakeholders in the new issues they have to address.


Table 1. Evaluation objectives and expectations.

Main evaluation objectives

Summative dimension of evaluation:
- quantifying final results (energy savings, avoided emissions or load reduction)
- feeding clear, synthetic and well-documented indicators (for decision making, for communication and for benchmarking)
- justifying the usefulness and cost-effectiveness of implemented operations, and pointing out local initiatives and/or contributions to national bodies

Formative dimension of evaluation:
- detecting success factors, to improve operations and support their easier reproduction
- providing experience feedback documented enough for experience sharing and capitalisation
- providing elements for a critical analysis of results, especially to better understand and interpret indicators
- dissemination of the results
- developing local skills to rebalance tests of strength between national and local bodies, and enabling better transparency for debates

Resultant expectations

Summative dimension of evaluation:
- quality of the data used, as it is the first factor affecting the reliability, and therefore the usefulness, of evaluation results
- transparency of the evaluation methods used: good understanding and interpretation of reported results, results discussion and checking, and comparisons with other operations
- credibility of reported results, in order to ensure recognition of local initiatives by national bodies

Formative dimension of evaluation:
- to show and explain successful experiences to decision-makers (and also to the final publics targeted), in order to convince them and familiarise them with these new approaches
- to involve stakeholders in the evaluation process: for decision making, and for both a stronger involvement of local stakeholders in EE activities (from passive to active strategies) and a change of scale (from pilot operations to widespread diffusion of best practices)


flexibility:
- local initiatives can adapt to local contexts and take advantage of good awareness of field conditions
- local frames leave more freedom to implement actions, and therefore favour new ideas and ways to act
- the local level is more operational: operations can be started faster (e.g. the decision-making process is shorter, smaller budgets are easier to allocate)

anchoring and integrating different policies on a given territory:
- when the target territory is well defined, it makes cross-sectoral and global5 approaches easier
- targeting a given territory may enable economies of scope

Moreover, interactions between stakeholders at local level6 may be prescriptive in either a positive or a negative way, resulting either in synergies or in additional barriers.

Drawbacks of implementing operations at a local scale
The main drawbacks of the local dimension are:
- unbalanced positions in negotiations between local and national (or international) bodies (e.g. negotiations between a local authority and supermarket distribution)
- smaller opportunities for economies of scale in comparison to national operations
- smaller financial means available (in comparison to national operations)

5. Global means here integrating different public policies: e.g. activities for a given neighbourhood considering both housing and transportation issues.

6. For France, these interactions concerning local energy policies have been well analysed by Bouvier [2005].

Specificities of evaluation at local level
Evaluating local EE activities has to take account of their local dimension as expressed above. But it also includes other specificities (see Table 2).

Methodology structure and principles
The specifications described in the above section enable us to draw up the structure and principles of our evaluation methodology.

General structure based on main evaluation objectives and fields
Analysing the inventory of French local EE activities [Broc 2005] highlighted concrete needs in terms of ex-post evaluation (see Table 1). These needs can be transcribed into concrete evaluation objectives7:

- understanding the operation process, to bring out success factors
- performing a critical analysis of the operation and its results, especially comparisons to the global objectives this operation contributes to and to other similar operations
- making results reliable and visible, especially towards partners and the final publics targeted
- quantifying final results in terms of load reductions, energy savings and avoided emissions
- defining the cost-effectiveness of final results (e.g. in terms of c€/saved kWh)

Our methodology is then designed to meet these objectives through a systematic approach structured in three main evaluation fields (see Table 3):

7. Another significant evaluation objective may be to assess the sum of the final results of several local operations and to compare it with the evolution of global indicators (crossing bottom-up and top-down approaches). This is out of the scope of this paper.


Table 2. Specificities of evaluation at local level.

Methodological specificities:
- evaluation may be a "counterpart to decentralisation and devolution of jurisdiction" [Trouslot 1995 p.535]
- local operations are very diverse, and evaluation methods have to be adjustable to these differences
- local operations often involve partnerships between various stakeholders: this complicates their monitoring and requires evaluation methods that take account of each point of view and fit different needs and skills
- evaluation often has to be managed on two levels:
  - the on-field level, providing synthetic results and feedback to operation managers and partners
  - the central level, reporting activities to supervision bodies and centralising experiences
- evaluation has to be integrated into the operation process itself:
  - to strengthen the mobilisation of the targeted final publics (e.g. communication about the gained results is a good means of raising awareness)
  - to strengthen the involvement of partners and other local stakeholders

Technical specificities:
- lack of a practical culture and skills of evaluation
- lack of reference data specific to a given territory (most reference data are only available at national level), and difficulties in assessing consistency between local and national data
- the financial and human means available for evaluation are often limited


Main principles of our methodology
To fit stakeholders' needs, our methodology is also based on three main principles (see Table 4).

So far, experience feedback has been used to form best practices guidebooks or databases8. However, the inventory of French local EE activities brought out that such guidebooks or databases covered only a very small portion of what is actually done [Broc 2005]. Indeed, this form of experience capitalisation is informal, mainly based on face-to-face transmission among insiders' circles. The descriptions made of best practices do not go into details, as they are made to raise the reader's interest in contacting the persons who managed the experiences presented.

This way of disseminating best practices is limited, especially because the corresponding contacts often have only limited availability, which restricts dissemination. Likewise, these contacts have skills in a particular field but not in all issues, which does not favour cross-cutting approaches.

8. Such best practices guidebooks and databases are for instance built by Energie-Cités, most often with the support of ADEME (French Agency for Energy Management and Environment) and the European Commission. Energie-Cités' database is available at: www.energie-cites.eu (section "Resources" then "Best Practices").

Moreover, the memory of experience feedback and know-how may be lost when the corresponding contacts leave.

Rakoto [Rakoto 2002, Rakoto 2004] also noticed these issues when studying how experience feedback was used in companies. He suggested rationalising this process using knowledge management systems. Our methodology aims to be a way to initiate such a systematic process, applied to the field of local EE activities.

Figure 1 describes how it proposes to initiate a process of experience capitalisation.


Table 3. Three main evaluation fields for a systematic approach.

1) Operation theory analysis:
- analysis of operation design and progress
- confrontation of final results with initial objectives
- assessment of intermediate outputs and outcomes
- involvement of partners
- review of the communication plan and its impacts
- evaluation of market transformation (when required)
- analysis of point-of-view differences among partners
- comparison with other similar operations

2) Evaluation of final results:
- energy savings
- load reductions
- avoided GHG emissions

3) Economic assessment:
- assessment of cost-effectiveness indicators (e.g. c€/saved kWh)
- costs/benefits analysis (according to stakeholders' points of view)
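To make the cost-effectiveness indicator of the economic assessment field concrete, here is a minimal sketch in Python; all figures are invented for illustration and are not taken from the case studies:

```python
# Minimal sketch (invented figures): a cost-effectiveness indicator in
# c€ per saved kWh, as listed under the economic assessment field.
programme_cost_eur = 120_000               # total programme cost (assumed)
net_savings_kwh_over_lifetime = 4_000_000  # total net savings (assumed)

cost_effectiveness = 100 * programme_cost_eur / net_savings_kwh_over_lifetime
print(f"{cost_effectiveness:.1f} c€/saved kWh")  # 3.0 c€/saved kWh
```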

Table 4. Main principles of our methodology.

Principle 1: evaluation methods have to be operational and easy to appropriate for all stakeholders involved, and to enable a progressive training to evaluation.
- a unique methodology is designed to develop several methods: general evaluation objectives are common to all local EE activities, and evaluations should be consistent with each other and easy to reproduce; but differences between operations require methods to be adjusted according to certain criteria, such as targeted end-uses or policy instruments
- evaluation methods have to be based on techniques for data collection and analysis which have already been tried in practice
- evaluation methods are designed similarly to software:
  - an interface gathers, in a constant structured way, all required information related to the operation and its results
  - two modules (one for analysing operation theory, the other for calculating results and cost-effectiveness indicators) provide calculation models and/or frameworks to perform the evaluation and obtain conclusions and results from input data in order to feed the interface
  - tutorials provide additional advice on using the evaluation modules and interface, from basic use (requiring as few input data and analysis as possible) to expert use (requiring additional input data and analysis)

Principle 2: evaluation methods have to ensure data quality and results transparency.
- a systematic monitoring of the evaluation provides the required information for quality indicators about the data and evaluation techniques used (this is based on the Quality Assurance Guidelines approach developed by Vine [Vine 1999a pp.50-52])
- advice is provided to perform uncertainty assessment, from qualitative (order of magnitude) to quantitative (statistical confidence intervals)

Principle 3: evaluation methods have to support a progressive process of experience capitalisation (see details below).

Figure 1. Progressive process for experience capitalisation. Starting point: analysis of the available existing experience feedback; then implementing operations, evaluating operations, and proposing improvements (for designing and managing operations, and for evaluation methods), which feed back into the implementation of further operations.


Setting up an evaluation system on two levels
One of the key factors for experience capitalisation is that operation managers are at the heart of the process. However, in the particular field of local EE activities, these operation managers may be numerous and scattered. There is therefore a need to centralise information. That is why our methodology is to set up an evaluation system on two levels (see Figure 2).

On-field evaluation is performed by (or under the supervision of) operation managers. They are not evaluation experts and, moreover, often have only little time to devote to this. Their evaluation tools shall therefore be easy to appropriate and to apply. But these tools shall also propose further guidelines to enable going deeper into evaluation when operation managers wish to, especially in an approach of continuous improvement.

The objectives of centralising evaluations are:

- to gather information in order to make it available to both decision-makers and operation managers
- to review information in order to ensure evaluations are reliable and can be compared with each other
- to update evaluation methods, reference data and best practices guidebooks (for implementing operations)

This centralisation may be performed either within a national body (e.g. a national agency, energy suppliers) or for a given territory (e.g. at regional scale, by a regional energy observatory/agency). The guidelines for centralisation are meant for evaluation experts, and are designed to register information in a systematic way, in order to provide structured and detailed experience feedback. These guidelines are also to be used to complete on-field evaluations, when the central evaluation service decides it is relevant.

In parallel, a standard form was designed to collect, in a systematic way, information about local operations and their evaluations. This form provides a frame to summarise this information. Each datum holding key information is marked with a given code (e.g. A-1). These marks are common to all the tools of an evaluation method (e.g. the standard form, calculation guidelines, etc.), so that each piece of information can be easily located in any tool. They could also be used afterwards to build an information system.
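As an illustration of how such coded information could be organised, here is a minimal sketch in Python; the codes (apart from A-1, which is quoted in the text) and the field values are hypothetical, not the paper's actual form:

```python
# Minimal sketch (hypothetical codes and values): a coded standard form,
# where each key datum carries a code shared by all evaluation tools so
# it can be located in any of them.
standard_form = {
    "A-1": ("operation name", "EcoEnergy Plan CFL campaign"),
    "A-2": ("targeted end-use", "residential lighting"),
    "B-1": ("number of participants", 4_000),
    "C-1": ("unitary gross annual savings (kWh)", 60.0),
}

def lookup(code: str) -> str:
    """Retrieve a key datum by its code, as any tool of the method would."""
    label, value = standard_form[code]
    return f"{code}: {label} = {value}"

print(lookup("C-1"))
```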

Setting up an evaluation methodology
This section presents the main components of our methodology: first its methodological background, and then the key issues addressed both for impact quantification and for theory-based evaluation.

Methodological materials used
The selection of the main bibliographic references was based on three criteria:

- recognition and/or use by international or national institutions and/or by energy efficiency professionals (utilities, ESCOs, regulatory or other public bodies)
- up-to-date contents (based on the current state of the art and/or recent experiences)
- wide methodological scope (covering most of the main evaluation issues)

Other bibliographic inputs have been used, especially for specific evaluation issues (e.g. [Vine 1992] for persistence of savings). But the references presented in Table 5 provided the basic material for our methodology.9

9. The IEA DSM Programme: International Energy Agency Demand-Side Management Programme, see http://dsm.iea.org

Figure 2. Evaluation system on two levels. The central level defines and keeps up to date the common evaluation methodology and the databases of reference values; it centralises and reviews the evaluation reports, and centralises and diffuses best practices and experience feedback. The on-field level applies the concrete methods, collects and provides the programme-specific required data, uses other available data as inputs, and completes and reports the evaluations.


Table 5. Main references used to build our methodology.

International Performance Measurement and Verification Protocol [IPMVP 2002]. Prepared for the US Department of Energy (DoE), the IPMVP stands as an international reference for M&V (Measurement and Verification) of energy savings. A summary of the IPMVP can be found in [Appendix B of SRCI 2001 pp.B50-B53]. Main inputs from the IPMVP are:
- 4 M&V options to quantify gross energy savings
- a description of measurement techniques and issues

Guidelines for the Monitoring, Evaluation, Reporting, Verification, and Certification of energy-efficiency projects for climate change mitigation [Vine 1999a]. Prepared for the US Environmental Protection Agency and the US DoE, the MERVC guidelines were developed to evaluate CDM (Clean Development Mechanism) and JI (Joint Implementation) projects within the Kyoto Protocol framework. A summary can be found in [Vine 1999b]. Main inputs from the MERVC guidelines are:
- a global approach from ex-ante registration to ex-post verification
- a methodological approach to evaluate net energy savings (estimating gross results, then a baseline, and finally net results)
- guidelines to estimate an initial baseline ex-ante (through performance benchmarks), and then to re-estimate it ex-post (through surveys, discrete choice or regression models), in order to better capture the free-rider effect
- quality assurance guidelines, to ensure the quality of performed evaluations and thus the reliability of results
- reporting forms

A European ex-post evaluation guidebook for DSM and EE service programmes [SRCI 2001]. Prepared for the European Commission within a SAVE project, the European guidebook stands as an international reference for evaluation preparation and planning. A summary of its approach can be found in [Birr-Pedersen 2001]. Main inputs from the European guidebook are:
- guidelines for evaluation planning
- a synthetic description of the basic concepts related to the evaluation of EE programmes (especially net-to-gross adjustment factors)
- a description of the main techniques for data collection and energy savings calculation

California Energy Efficiency Protocols and Evaluation Framework [TecMarket Works 2004, TecMarket Works 2006]. Prepared for the California Public Utilities Commission (CPUC), both manuals provide detailed guidelines for energy efficiency programme evaluation to the evaluation experts and programme managers in charge of it. The Californian Evaluation Framework covers all evaluation issues, forming a very complete state of the art of evaluation practices in the United States. The protocols present the official requirements and recommended process for the evaluation of EE programmes under CPUC regulation. Main inputs from the Californian manuals:
- a systematic approach to evaluation work, divided into main issues: M&V and impact evaluation, process evaluation, market transformation evaluation, uncertainty, sampling, cost-effectiveness
- a detailed review of all evaluation key issues (with a significant bibliography)
- detailed requirements and the organisation of an evaluation scheme in a given regulatory framework

Evaluation guidebook for evaluating energy efficiency policy measures & DSM programmes [Vreuls 2005]. Prepared within task I of the IEA DSM Programme, the objective of the IEA guidebook is "to provide practical assistance to administrators, researchers, and policy makers who need to plan assessments and to evaluators who carry out evaluations of energy efficiency programmes", especially programmes related to Kyoto greenhouse gas targets. Main inputs from the IEA guidebook:
- a methodological approach based on seven key elements: statement of policy measure theory, specification of indicators, development of baselines for indicators, assessment of output and outcome, assessment of energy savings, calculations of cost-effectiveness, and choice of level with regard to the evaluation effort
- application of this framework to four main types of EE programmes (regulation, information, economic incentives, voluntary agreements) and to combinations of these types
- a detailed and standardised description of the main evaluation experiences of eight countries (Belgium, Canada, Denmark, France, Italy, Republic of Korea, The Netherlands, Sweden)

Evaluation and comparison of utility's and governmental DSM-programmes for the promotion of condensing boilers [Haug 1998]. Contrary to the previous references, this SAVE project is not general but specific to a particular energy-efficient solution (condensing boilers). It was selected because it presents a well-detailed evaluation approach and because it represents an example of the specific material to be used when developing a specific method. Main inputs from this particular evaluation are:
- a concrete and detailed evaluation framework, especially for studying the factors influencing the success of a programme
- significant material about market transformation issues, which can be used for other cases of promotion of efficient appliances
- significant technical material useful for developing methods for other energy efficiency programmes related to heating


Key issues for impact quantification

Making it clear what gross and net results are
Gross results are the results gained from the point of view of final end-users. These results correspond to the changes between the situations before and after taking part in an EE programme. The 'before' situation may be either the energy consumption in the period preceding the programme, or the changes which would have occurred in the absence of the programme (e.g. buying a current standard equipment instead of a current efficient one). Defining such a 'before' situation means choosing a baseline. Besides, the before and after situations have to be compared under similar conditions (e.g. similar weather conditions). This requires the application of correction (or normalisation) factors (e.g. heating degree days).
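As an illustration of such a correction factor, here is a minimal sketch of a heating-degree-day normalisation in Python; the function and all figures are assumptions for illustration, not the paper's prescribed procedure:

```python
# Minimal sketch (invented figures): weather-normalising annual heating
# consumption with heating degree days (HDD), so that 'before' and 'after'
# situations are compared under similar conditions.
def normalise_to_reference(consumption_kwh: float,
                           hdd_observed: float,
                           hdd_reference: float,
                           heating_share: float = 1.0) -> float:
    """Scale the heating-dependent share of consumption from the observed
    heating degree days to a reference climate year."""
    heating_part = consumption_kwh * heating_share
    other_part = consumption_kwh - heating_part
    return other_part + heating_part * (hdd_reference / hdd_observed)

# Example: 12 000 kWh measured in a mild year (2 100 HDD), expressed for a
# reference year of 2 400 HDD, heating being 80 % of consumption.
print(normalise_to_reference(12_000, hdd_observed=2_100,
                             hdd_reference=2_400, heating_share=0.8))
```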

Net results are the results gained thanks to an EE programme, from the point of view of society. This means net results only account for the portion of the results which would not have been gained without the programme. The difference between gross and net results can be taken into account either by adjusting the baseline (e.g. using control groups or market modelling), or by applying adjustment factors (such as free-rider and spill-over effects).
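A minimal sketch of the adjustment-factor option, with invented rates (the paper does not prescribe these values):

```python
# Minimal sketch (invented rates): deriving net annual savings from gross
# savings with net-to-gross adjustment factors for free-riders and spill-over.
def net_annual_savings(gross_kwh: float,
                       free_rider_rate: float,
                       spillover_rate: float) -> float:
    """Net = gross minus the share free-riders would have saved anyway,
    plus the extra savings induced beyond participants (spill-over)."""
    return gross_kwh * (1.0 - free_rider_rate + spillover_rate)

# Example: 500 MWh gross, 15 % free-riders, 5 % spill-over (all assumed).
print(net_annual_savings(500_000, free_rider_rate=0.15, spillover_rate=0.05))
```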

For local operations, an interesting option is to consider what occurs at national level as a control group. For instance, for a local operation promoting CFLs (Compact Fluorescent Lightbulbs), the baseline can be the national evolution of CFL sales10. An advantage of this alternative is that it may enable the effects of national programmes to be taken into account. However, it has to be checked that the comparison between national and local evolutions is not biased (e.g. due to local specificities).

Defining a four-step calculation process
To make the calculation of the results easier, it is possible to perform it as a step-by-step process (see Table 6).11, 12

Setting up an operational evaluation method for calculating impacts then results in two main tasks: specifying a calculation method (for step 1) and an accounting method (for steps 2 and 3). Persistence studies (for step 4) are often to

10. If the local operation covers a significant territory, the national evolution has to be understood as the evolution for the rest of the country (not covered by the operation).

11. Unitary results means here either the results per participant or per equipment.

12. Persistence of results means the efficient solution is still in place and operational (retention), with a certain level of performance compared to the first year of implementation (performance degradation) [Vine 1992, Wolfe 1995].

be managed at the central level to be more cost-effective [Skumatz 2005].

Choosing a calculation method
A calculation method is composed of three main elements: a calculation model, data collection techniques to feed this model, and reference ex-ante values which complete the data collected ex-post. Choosing a calculation method is then an iterative process (see Figure 3). A particular calculation model will require a given set of data. Looking at what data are already available or not, it can be deduced what data should be collected. In parallel, the different possible collection techniques13 can be considered, taking into account their technical14 and practical15 feasibility, and their costs. After that, the available and possible-to-collect data are compared to the data needed. Finally, the choices of calculation model and data collection techniques are adjusted until the available and possible-to-collect data fit the data needed.

Two calculation approaches
Two main approaches may be used to calculate unitary energy savings, depending on whether energy consumption data are directly available or not:

1. When energy consumption data are directly available, at least for a sample of participants, the general calculation formula directly compares energy consumption before and after the implementation of the evaluated efficient solution.

13. A list of these techniques can be found at: http://www.tecmarket.net/data.htm

14. E.g. it may not be possible to install sub-meters, or to disaggregate data from existing metering.

15. E.g. some data may be difficult to access for privacy (e.g. commercial data) or confidentiality (personal data) reasons.


Table 6. A four-step calculation process.

Step 1 – calculating the unitary gross annual results: the main issues raised for this step are the baseline, the correction factors and the choice of a calculation method (see the following section).

Step 2 – calculating the total gross annual results: the main issue raised for this step is to define a method to account for the number of participants, especially when this accounting is not direct (e.g. when participants are not directly registered, their number may be assessed through sales data).

Step 3 – calculating the total net annual results: the main issue raised for this step is to adjust the baseline or to apply adjustment factors.

Step 4 – calculating total net results over time: the main issue raised for this step is to assess the lifetime of the results, taking account of their persistence.
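The four steps can be read as a simple calculation chain. A minimal sketch, with invented figures for a hypothetical operation (and full persistence assumed over the lifetime):

```python
# Minimal sketch (all figures invented): chaining the four calculation
# steps of Table 6 for a hypothetical operation.
unitary_gross_kwh = 60.0   # step 1: per-participant gross annual savings
participants = 4_000       # step 2: accounted e.g. through sales data
net_to_gross = 0.85        # step 3: baseline/free-rider adjustment
lifetime_years = 8         # step 4: lifetime, assuming full persistence

total_gross_annual = unitary_gross_kwh * participants
total_net_annual = total_gross_annual * net_to_gross
total_net_over_time = total_net_annual * lifetime_years

print(f"{total_net_over_time / 1e6:.2f} GWh saved over the measure lifetime")
```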

Figure 3. Iterative process for choosing a calculation method: the calculation model determines the data needed, which are compared with the available data (existing statistics, experience feedback, etc.) and with the possible-to-collect data (through data collection techniques), looking for the best compromise.


Correction or normalisation factors may be applied when necessary, as explained before. Energy consumption data can be collected from energy bills, meter readings or energy end-use measurements.

Equation 1. Direct formula:

unitary gross annual energy savings = ( [annual energy consumption]0 - [annual energy consumption]1 ) +/- normalisation factors

where 0 and 1 denote the situations respectively before and after implementation of an efficient solution.

2. When energy consumption data are not directly available, the formula can be broken down into intermediate parameters (for which data are easier to access/assess), from which energy consumption is calculated afterwards. For instance, energy consumption may be broken down into two terms, load and duration of use:

Equation 2. Broken-down formula:

unitary gross annual energy savings = ( [P * ALF * D]0 - [P * ALF * D]1 )

where 0 and 1 denote the situations respectively before and after implementation of an efficient solution, P is the average rated power, ALF the average load factor, and D the average annual duration of use.

Other decompositions may be used (e.g. average consumption per cycle and number of cycles for washing machines). And as for energy consumption, the intermediate parameters may be related to additional parameters (e.g. the number of persons per dwelling for the number of cycles).
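Both approaches are straightforward to express in code. A minimal sketch of Equations 1 and 2 in Python, with invented example values:

```python
# Minimal sketch (invented values): the two calculation approaches for
# unitary gross annual energy savings (Equations 1 and 2).
def direct_savings(before_kwh: float, after_kwh: float) -> float:
    """Equation 1: direct comparison of annual consumption, assumed to be
    already normalised (e.g. degree-day corrected)."""
    return before_kwh - after_kwh

def broken_down_savings(p0_kw: float, alf0: float, d0_h: float,
                        p1_kw: float, alf1: float, d1_h: float) -> float:
    """Equation 2: consumption rebuilt as rated power x average load factor
    x annual duration of use, before (0) and after (1)."""
    return p0_kw * alf0 * d0_h - p1_kw * alf1 * d1_h

# Example: billed consumption of 3 200 kWh before, 2 900 kWh after.
print(direct_savings(3_200, 2_900))  # 300 kWh
# Example: a 75 W bulb replaced by a 15 W CFL, 1 000 h/year, full load.
print(broken_down_savings(0.075, 1.0, 1_000, 0.015, 1.0, 1_000))  # 60 kWh
```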

Quality control and uncertainty analysis

Qualification of the input data
Assessing the quality of input data results in linking their values either with a range of variation (between a minimum and a maximum), or with a confidence interval (and its confidence level). In practice, the latter is difficult to apply. And sometimes even the former option may not be possible. In that case, a qualitative appreciation may be used (e.g. approximate, medium, good).

Qualification of the data sources
Data sources can be distinguished according to whether they were assessed ex-ante or defined ex-post. Ex-ante data are reference values deduced from other existing experiences, studies or statistics. Ex-post data are defined from characteristics, measurements, monitoring or surveys specific to the operation evaluated. Usually, an ex-post datum will be more accurate than an ex-ante one, as it takes into account the operation's specificities. However, the opposite is also possible, especially when ex-post data are defined from non-representative samples.
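A minimal sketch of how an input datum and its qualification could be recorded together; the structure and field names are hypothetical, not the paper's standard form:

```python
# Minimal sketch (hypothetical structure): qualifying an input datum with a
# variation range, or a qualitative appreciation when no range is available,
# and tagging its source as ex-ante (reference value) or ex-post.
from dataclasses import dataclass
from typing import Optional

@dataclass
class QualifiedDatum:
    value: float
    minimum: Optional[float] = None    # range of variation, when known
    maximum: Optional[float] = None
    appreciation: str = "approximate"  # e.g. approximate, medium, good
    source: str = "ex-ante"            # ex-ante reference or ex-post datum

cfl_hours = QualifiedDatum(1_000, minimum=800, maximum=1_200,
                           appreciation="medium", source="ex-post")
print(cfl_hours)
```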

Quality Assurance Guidelines (for the evaluation)
No calculation model can be assumed to be more accurate than the others16. Evaluation reliability depends more on the input data and the assumptions made than on the model itself. That is why Vine [Vine 1999a pp.50-52] proposed using quality assurance guidelines to assess the application of an evaluation method. Our methodology has included this approach, adjusted to the specificities of evaluating local operations.

Three progressive levels for addressing uncertainties
In the end, the aim is to present the results together with information about their uncertainties. So far, such information has never been found among all the experience feedback of local EE operations we could find. It would therefore be unrealistic to expect results with confidence intervals right away. So our methodology proposes a progressive approach, with three levels of information. The first level is at least an order of magnitude, based on qualitative appreciations and/or assessments of minimum and maximum values. The second level is a sensitivity analysis, using the variation ranges of each parameter to define pessimistic and optimistic scenarios, leading to more accurate minimum and maximum values. The third level is either a statistical approach leading to confidence intervals (at given confidence levels), or Monte Carlo simulations, which analyse the distribution of the results from random sensitivity analyses.

16. See for instance [Ridge 1994, Schiffman 1993] for a comparison between statistical and engineering models.
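A minimal sketch of levels 2 and 3 of this progressive approach, applied to the invented CFL example above; the parameter ranges are assumptions, and a uniform distribution is assumed for the Monte Carlo draws:

```python
# Minimal sketch (invented ranges): sensitivity analysis (level 2) and
# Monte Carlo simulation (level 3) for unitary gross annual savings.
import random

def savings(p_delta_kw: float, hours: float) -> float:
    return p_delta_kw * hours  # kWh/year, load factor assumed equal to 1

# Level 2: pessimistic/optimistic scenarios from parameter variation ranges.
pessimistic = savings(0.050, 800)    # low power delta, low use
optimistic = savings(0.070, 1_200)   # high power delta, high use
print(f"level 2: {pessimistic:.0f} to {optimistic:.0f} kWh/year")

# Level 3: Monte Carlo simulation over the same ranges (uniform assumption).
draws = sorted(savings(random.uniform(0.050, 0.070),
                       random.uniform(800, 1_200)) for _ in range(10_000))
print(f"level 3: 90 % of draws between {draws[500]:.0f} "
      f"and {draws[9_499]:.0f} kWh/year")
```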



Key issues for theory-based evaluation

Systematic description of the operation
The first step in analysing the operation theory is to describe it. The standard frame designed to report evaluations enables a systematic description, making this step easier. The frame is particularly used to make the objectives of the operation more explicit, and thus to deduce the priority objectives for the evaluation. It also brings out the main contextual factors and assumptions of the operation, especially the barriers the operation is assumed to address.

Expressing operation theory as concrete intermediate indicators and success factors
From the systematic description of the operation, the operation theory is then expressed as concrete intermediate indicators and/or success factors that are possible to monitor. Usually, such an approach is used to represent the operation theory as a linear (or step-by-step) process (see for instance [Joosen 2005]). However, this approach may either require a large amount of data (trying to be exhaustive), or miss some key success factors (trying to be straightforward). Our methodology suggests a compromise by representing the operation theory as a non-linear combination of success factors or intermediate steps, selecting the most relevant factors/steps from existing experience feedback. The aim is also to bring out possible interactions between success factors and/or intermediate steps. The final result of this evaluation work is the representation of the operation theory as a success pathway.
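To make the idea of a non-linear combination concrete, here is a minimal sketch of a success pathway represented as a small directed graph; the factors and links are hypothetical, not taken from the case studies:

```python
# Minimal sketch (hypothetical factors): a success pathway as a non-linear
# combination of success factors, i.e. a directed graph linking factors to
# the intermediate steps they influence.
success_pathway = {
    # factor/step: steps it feeds into
    "partners mobilised": ["campaign visibility"],
    "retailer involvement": ["campaign visibility", "participation rate"],
    "campaign visibility": ["participation rate"],
    "participation rate": ["energy savings"],
}

def downstream(factor: str) -> set[str]:
    """All steps ultimately influenced by a factor (depth-first traversal)."""
    seen: set[str] = set()
    stack = list(success_pathway.get(factor, []))
    while stack:
        step = stack.pop()
        if step not in seen:
            seen.add(step)
            stack.extend(success_pathway.get(step, []))
    return seen

print(downstream("retailer involvement"))
```

Unlike a strictly linear chain, such a graph lets one factor feed several intermediate steps at once, which is what makes the interactions between factors visible.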

Critical analysis of results
As evaluation results are always relative, our methodology provides guidelines for their critical analysis. The two main guidelines are to take into account the possible differences of points of view among stakeholders (e.g. between a public agency, a utility, the retailers and the participants), and to compare the operation evaluated with similar ones. This comparison is to focus not only on the results but also, and as a priority, on the choices made while designing the operation (e.g. what communication means are used, what partnerships were set up).

Critical analysis of the local dimension
As our methodology is meant to evaluate local operations, it is essential to provide guidelines to consider the local dimension of the operations analysed. These guidelines are mainly based on the specificities of the local dimension described in the section "Advantages and drawbacks of the local dimension". The approach is to review each specificity, asking whether it was an effective advantage or drawback for the particular operation.

Systematic process for developing operational methods

Description of the process
The final result of our methodology is a process for developing operational methods (see Figure 4), whose aim is to address in a systematic way the key evaluation issues described in the above sections, in order to complete the evaluation tools forming an evaluation method.


Figure 4. Systematic process for developing operational methods.

1) Analysing available experience feedback and literature:
- analysing the contextual factors around the efficient solution used
- searching for available experience feedback and other information
- structuring and analysing this information

2) Specifying guidelines for theory-based evaluation:
- guidelines for describing the operation theory, assessing intermediate indicators, and the critical analysis of results and local dimension
- list of data to collect ex-post (and how to)

3) Adjusting the standard frame presenting the operation and evaluation:
- analysing the main evaluation objectives
- selecting in detail the evaluation fields to be covered
- adjusting the standard representation of the success pathway (especially specifying the main success factors and indicators)

4) Specifying the calculation method:
- choosing the calculation model and the data collection techniques
- specifying the baseline, correction and adjustment factors, etc.
- looking for reference ex-ante values
- listing data to collect ex-post (and how to)

5) Summarising guidelines in a synthetic manual of use (synthesis of phases 2 to 4):
- presenting the different documents/evaluation tools
- general list of data to collect ex-post (and how to)

6) Testing the method on available experience feedback or a pilot operation:
- to detect practical issues (especially for data collection)
- to adjust the method when necessary

7) Initiating the process of experience capitalisation, using the new evaluations:
- to add new elements of comparison and to update ex-ante values
- to update guidelines for the implementation of future operations
- to update guidelines for future evaluations


How to involve the concerned stakeholders in the process
Involving stakeholders in the process of developing evaluation methods is a key factor in ensuring the methods are operational, that stakeholders will use them, and thus that they will be involved in the evaluation process. This was tried while applying our methodology to two concrete cases (local campaigns promoting CFLs, and awareness-raising actions in office buildings). The personal contacts were very good, and operation managers as well as central bodies were very interested in the evaluation process, even before our first contacts. So these applications were successful and provided interesting feedback and lessons learned. However, the issue of involving stakeholders should be examined further, especially for less favourable cases.

Conclusions and perspectives
As mentioned above, our methodology was tested on two concrete case studies of local operations implemented by EDF in the South-East of France. The first one was a local campaign to promote CFLs. A complete evaluation method was set up and applied, providing detailed results [Broc 2006a]. The second case study concerned awareness-raising actions in office buildings. The existing experience feedback was not rich enough to set up a complete evaluation method: the part on theory-based evaluation could be fulfilled, but not the part on impact evaluation. Still, this case study enabled us to obtain interesting results [Broc 2006b]. Presenting both the methodology and these case studies was not possible in a single paper, so the analysis of the case studies will be the subject of further publications.

In parallel, this methodology is currently being adjusted to evaluate larger EE activities, especially EE activities to be reported within the new European Directive on energy end-use efficiency [ESD 2006]. The change of scale and the fit to particular regulatory requirements raise additional evaluation issues, for instance:

- specifying harmonised accounting rules for Member States with different experiences of evaluation
- avoiding double counting of results between EE activities with similar targets

One of the main evolutions of our methodology is to consider three levels of evaluation effort17 (see Figure 5).

Figure 5. Levels of evaluation effort related to data collection techniques: level 1 (minimum evaluation effort) mainly uses European reference values and already available data; level 2 (intermediate evaluation effort) uses Member State-specific values and well-known data collection techniques; level 3 (enhanced evaluation effort) uses measure-specific values and specific data collection techniques. An evaluation method may combine several levels of effort, using different data collection techniques.

17. This approach is inspired by recent guidebooks, such as the Danish [SRCI 2003], the IEA [Vreuls 2005 pp.43-47] and the IPCC [IPCC 2006 pp.1.7-1.8] ones.

These new developments of our methodology are currently being studied and discussed within the Intelligent Energy Europe project entitled EMEEES (Evaluation and Monitoring for the EU Directive on Energy end-use Efficiency and Energy Services), coordinated by the Wuppertal Institute for Climate, Environment and Energy18.

18. For more details, see the paper presented by Stefan Thomas at this ECEEE 2007 Summer Study.

References

[AGORA 2002] AGORA. ALES - Autorités Locales et Effet de Serre. Contract for ADEME, December 2002.

[BIRR-PEDERSEN 2001] Preben Birr-Pedersen. European ex-post evaluation guidebook for DSM and EE services programmes. Proceedings of the ECEEE 2001 Summer Study, pp. 116-118, panel 1, paper 1.098.

[BOUVIER 2005] Guillaume Bouvier. Les Collectivités locales et l'électricité - Territoires, acteurs et enjeux autour du service public local de l'électricité en France. CIFRE doctoral thesis in geography (geopolitics), carried out at the Direction des Affaires Publiques of EDF, Institut Français de Géopolitique - Université Paris 8, June 2005.

[BROC 2006a] Jean-Sébastien Broc. L'évaluation ex-post des opérations locales de Maîtrise de la Demande en Energie - Etat de l'art, méthodes bottom-up, exemples appliqués et approche du développement d'une culture pratique de l'évaluation. Doctoral thesis in Energetics of the Ecole des Mines de Paris, carried out at the Ecole des Mines de Nantes, 8 December 2006.

[BROC 2006b] Jean-Sébastien Broc, Bertrand Combes, Bernard Bourges, Jérôme Adnot, Sandrine Hartmann,


Marie-Isabelle Fernandez. Raising awareness for energy efficiency in the service sector: learning from success stories to disseminate good practices. Proceedings of the IEECB 2006 conference, Improving Energy Efficiency in Commercial Buildings, Frankfurt am Main, Germany, 26-27 April 2006.

[BROC 2005] Jean-Sébastien Broc, Bernard Bourges, Jérôme Adnot, Sandrine Hartmann. Local energy efficiency and demand-side management activities in France. Proceedings of the ECEEE 2005 Summer Study, Energy savings: what works and who delivers?, volume I, panel 1, paper 1.202, pp. 183-194, Mandelieu La Napoule, France, 30 May - 4 June 2005.

[ESD 2006] ESD. Directive 2006/32/EC of the European Parliament and of the Council of 5 April 2006 on energy end-use efficiency and energy services and repealing Council Directive 93/76/EEC. European Union, 2006/32/EC, Brussels, 5 April 2006.

[EUROPEAN COMMISSION 1999] European Commission. Paper I - Defining criteria for evaluating the effectiveness of EU environmental measures. Towards a new EU framework for reporting on environmental policies and measures (Reporting on environmental measures - 'REM').

[GODINOT 2004] Sylvain Godinot, Jacques Ravaillault, Laurent Coméliau. Collectivités locales et effet de serre : étude sur la mise en oeuvre locale de la lutte contre le changement climatique. Research agreement for the MIES, ADEME, Direction de l'Action Régionale, February 2004.

[HAUG 1998] J. Haug, B. Gebhardt, C. Weber, M. Van Wees, et al. Evaluation and comparison of utility's and governmental DSM-Programmes for the promotion of condensing boilers. SAVE contract XVII/4.1031/Z/96-136, IER (Institut für Energiewirtschaft und Rationelle Energieanwendung), Stuttgart, Germany, October 1998.

[IPCC 2006] IPCC, Simon Eggleston, Leandro Buendia, Kyoko Miwa, Todd Ngara, Kiyoto Tanabe. 2006 IPCC Guidelines for National Greenhouse Gas Inventories - volume 2: Energy. Prepared by the National Greenhouse Gas Inventories Programme, published by IGES on behalf of the IPCC, Japan. (Available at: www.ipcc-nggip.iges.or.jp)

[IPMVP 2002] IPMVP. International Performance Measurement & Verification Protocol - Volume I: Concepts and Options for Determining Energy and Water Savings. Report DOE/GO-102002-1554 for the US Department of Energy, Efficiency Valuation Organization, revised edition of March 2002. (Available at: www.evo-world.org)

[JOOSEN 2005] Suzanne Joosen, Mirjam Harmelink. Guidelines for the ex-post evaluation of policy instruments on energy efficiency. Report EIE-2003-114, AID-EE project for the European Commission within the Intelligent Energy for Europe (EIE) programme, August 2005. (Available at: www.aid-ee.org)

[RAKOTO 2004] Holitiana Rakoto. Intégration du Retour d'Expérience dans les processus industriels - Application à Alsthom Transport. CIFRE doctoral thesis in Industrial Systems, carried out at the Quality Department of Alsthom Transport SA, Institut National Polytechnique de Toulouse, 15 October 2004.

[RAKOTO 2002] Holitiana Rakoto, Philippe Clermont, Laurent Geneste. Elaboration and Exploitation of Lessons Learned. Proceedings of the IFIP 17th World Computer Congress - TC12 Stream on Intelligent Information Processing, IFIP Conference Proceedings, Vol. 221.

[RIDGE 1994] Richard Ridge, Dan Violette, Don Dohrmann. An Evaluation of Statistical and Engineering Models for Estimating Gross Energy Impacts. Final report of Pacific Consulting Services for the California Demand-Side Management Advisory Committee, Berkeley, California, June 1994. (Available at: www.calmac.org)

[SCHIFFMAN 1993] Dean A. Schiffman, Robert F. Engle. Appendix Z Simulation Study: Comparison of Alternative Methods for Measuring the Gross and Net Energy Impacts of Demand-Side Management Programs (with Addendum). Report for San Diego Gas and Electric, August 1993. (Available at: www.calmac.org)

[SKUMATZ 2005] Lisa Skumatz, John Gardner. Best Practices in Measure Retention and Lifetime Studies: Standards for Reliable Measure Retention Methodology Derived from Extensive Review. Proceedings of the 2005 International Energy Program Evaluation Conference, Reducing Uncertainty Through Evaluation, session 4B, Persistence - the issue that never goes away, pp. 483-493, New York, August 2005.

[SRCI 2003] SRCI, AKF, Elkraftsystem. Handbog i evaluering af energispareaktiviteter (Handbook for evaluating energy saving projects). January 2003.

[SRCI 2001] SRCI, NOVEM, Electricity Association, MOTIVA, et al. A European Ex-Post Evaluation Guidebook for DSM and EE Service Programmes. SAVE Project No. XVII/4.1031/P/99-028, April 2001.

[TECMARKET WORKS 2006] TecMarket Works. California Energy Efficiency Evaluation Protocols: Technical, Methodological, and Reporting Requirements for Evaluation Professionals. Report prepared for the California Public Utilities Commission, April 2006. (Available at: www.calmac.org)

[TECMARKET WORKS 2004] TecMarket Works, Megdal & Associates, Architectural Energy Corporation, RLW Analytics, et al. The California Evaluation Framework. Report prepared for the Southern California Edison Company as mandated by the California Public Utilities Commission, K2033910, revised September 2004. (Available at: www.calmac.org)

[TROUSLOT 1995] Franck Trouslot. L'évaluation des actions publiques à l'échelon local : illustration et analyse critique à partir de l'exemple de la politique de maîtrise de l'énergie en Poitou-Charentes. Doctoral thesis in Economics, Université de Poitiers, Faculté de Sciences Economiques, December 1995.

[VINE 1999a] Edward Vine, Jayant Sathaye. Guidelines for the Monitoring, Evaluation, Reporting, Verification and Certification of Energy-Efficiency Projects for Climate Change Mitigation. Report prepared for the U.S. Environmental Protection Agency, LBNL-41543, March 1999.

[VINE 1999b] Edward L. Vine, Jayant A. Sathaye. An Overview of Guidelines and Issues for the Monitoring, Evaluation, Reporting, Verification and Certification of Energy-Efficiency Projects for Climate Change Mitigation. Proceedings of the ECEEE 1999 Summer Study.

[VINE 1992] Edward L. Vine. Persistence of energy savings: What do we know and how can it be ensured? Energy, 17 (11), pp. 1073-1084.

[VREULS 2005] Harry Vreuls, Wim De Groote, Peter Bach, Richard Schalburg, et al. Evaluating energy efficiency policy measures & DSM programmes - volume I: evaluation guidebook. Report for the IEA-DSM task IX, October 2005. (Available at: http://dsm.iea.org)

[WOLFE 1995] Amy K. Wolfe, Marilyn A. Brown, David Trumble. Measuring persistence: a literature review focusing on methodological issues. Report no. ORNL/CON-401 by the Oak Ridge National Laboratory for the US DOE, Washington DC, March 1995.

Acknowledgements
The work presented in this paper was done during a PhD thesis jointly supervised at the Ecole des Mines de Nantes and the Ecole des Mines de Paris. A part of this PhD consisted in a research contract, entitled EVADEM, for EDF R&D, performed in partnership with the Wuppertal Institute for Climate, Environment and Energy. Current work to go further is mainly led within the Intelligent Energy Europe project entitled EMEEES (Evaluation and Monitoring for the EU Directive on Energy end-use Efficiency and Energy Services), coordinated by the Wuppertal Institute for Climate, Environment and Energy.

The authors would like to thank Paul Baudry and Sandrine Hartmann (EDF R&D), Marie-Isabelle Fernandez and Bertrand Combes (Regional Delegation of EDF PACA), and Stefan Thomas and Wolfgang Irrek (Wuppertal Institute) for the successful collaboration within EVADEM. We would also like to thank Stefan Thomas (again), Harry Vreuls (SenterNovem) and all the EMEEES partners for their comments within EMEEES.
