
AFTERSCHOOL EVALUATION 101: How to Evaluate an Expanded Learning Program

Erin Harris
Harvard Family Research Project


    Version 1.0

    December 2011


Afterschool Evaluation 101

Table of Contents

© 2011 President and Fellows of Harvard College. All rights reserved. May not be reproduced in whole or in part without written permission from Harvard Family Research Project.

Harvard Family Research Project
Harvard Graduate School of Education
3 Garden Street
Cambridge, MA 02138
www.hfrp.org
Tel: 617-495-9108
Fax: 617-495-8594
Twitter: @HFRP

For questions or comments about this publication, please email [email protected]

INTRODUCTION
  The Importance of Evaluation and Developing an Evaluation Strategy
  How to Use This Toolkit
  Other Evaluation Resources
  Acknowledgments

STEP 1: DETERMINING THE EVALUATION'S PURPOSE
  Why Should We Evaluate Our Program?
  What is our program trying to do?
  What information do our funders expect?
  What questions do we want to answer?

STEP 2: DEVELOPING A LOGIC MODEL
  What is a logic model and why should we create one?
  What does a logic model look like?
  How do we define our goals?
  How do we define our inputs and outputs?
  How do we identify our outcomes?
  How do we develop performance measures?

STEP 3: ASSESSING YOUR PROGRAM'S CAPACITY FOR EVALUATION
  Who should be involved in the process?
  What resources must be in place?
  Who will conduct our evaluation?
  How do we find an external evaluator?

STEP 4: CHOOSING THE FOCUS OF YOUR EVALUATION
  What is the Five-Tier approach and why use it?
  How do we conduct a needs assessment (Tier 1)?
  How do we document program services (Tier 2)?
  How do we clarify our program (Tier 3)?
  How do we make program modifications (Tier 4)?
  How do we assess program impact (Tier 5)?

STEP 5: SELECTING THE EVALUATION DESIGN
  What type of evaluation should we conduct?
  What type of data should we collect?
  What evaluation design should we use?
  What is our time frame for collecting data?

STEP 6: COLLECTING DATA
  How do we select our program sample?
  What data collection methods should we use?
  How do we choose evaluation tools and instruments?
  What ethical issues do we need to consider in our data collection?
  How do we collect participation data?
  How can we manage our evaluation data?

STEP 7: ANALYZING DATA
  How do we prepare our data for analysis?
  How do we analyze quantitative data?
  How do we analyze qualitative data?
  How do we make sense of the data we have analyzed?

STEP 8: PRESENTING EVALUATION RESULTS
  How do we communicate results to stakeholders?
  What level of detail do we report to stakeholders?
  What do we include in an evaluation report?
  How can we make our evaluation findings interesting and accessible?

STEP 9: USING EVALUATION DATA
  How do we use evaluation to make program improvements?
  How do we use evaluation to inform our future evaluation activities?
  How do we use evaluation for marketing?
  Who needs to be involved in decisions about how to use the evaluation data?

APPENDIX A: KEY RESOURCES
APPENDIX B: DISCUSSION QUESTIONS
APPENDIX C: GLOSSARY


Introduction

This toolkit is intended to help program directors and staff of afterschool, summer, and other out-of-school time (OST) programs develop an evaluation strategy.

    THE IMPORTANCE OF EVALUATION AND DEVELOPING AN EVALUATION STRATEGY

    What is an evaluation?

Evaluation helps your OST program measure how successfully it has been implemented and how well it is achieving its goals. You can do this by comparing:

The activities you intended to implement vs. the activities you actually implemented

The outcomes you intended to accomplish vs. the outcomes you actually achieved

For example, if you have created an OST program that highlights recreation and nutrition education, with the goal of improving children's abilities to make healthy choices, an evaluation can help you determine if your program has been implemented in a way that will lead to positive outcomes related to your goals (e.g., whether activities offer opportunities for physical activity or learning about better nutrition). In addition, an evaluation can help determine whether participants are actually making healthier choices, such as choosing recess games that involve physical activity, or selecting fruit as a snack rather than cookies.
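A minimal sketch of the intended-versus-actual comparison described above, written in Python with hypothetical activity names (the comparison itself can just as easily be done on paper or in a spreadsheet):

```python
# A minimal sketch: comparing intended activities with those actually delivered.
# Activity names are hypothetical examples, not prescribed categories.
intended_activities = {"recess games", "nutrition lessons", "cooking club"}
actual_activities = {"recess games", "nutrition lessons", "homework help"}

delivered_as_planned = intended_activities & actual_activities   # in both lists
planned_not_delivered = intended_activities - actual_activities  # gaps to explain
delivered_not_planned = actual_activities - intended_activities  # additions to document

print("Delivered as planned:", sorted(delivered_as_planned))
print("Planned but not delivered:", sorted(planned_not_delivered))
print("Delivered but not planned:", sorted(delivered_not_planned))
```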

    Who should conduct an evaluation?

Every OST program can and should collect at least some basic information to help evaluate its success. Even a new program that is just getting up and running can begin to evaluate who is participating in the program, and why. A more established program, meanwhile, can evaluate outcomes for participants related to program goals. And a program that is going through a transition in its goals, activities, and focus can use evaluation to test out new strategies.

    Why conduct an evaluation?

Information gathered during an evaluation helps demonstrate your program's effectiveness and provides valuable insight into how the program can better serve its population. Many programs use evaluations as an opportunity to identify strengths, as well as areas that need improvement, so they can learn how to improve services. In addition, grantmakers often require evaluations of the programs they invest in to ensure that funds are well spent. Positive outcomes for program participants (such as improved academic performance or fewer discipline problems) found during an evaluation can also help sell your program to new funders, families, communities, and others who may benefit from, or provide benefits to, your program.

    What is an evaluation strategy?

An evaluation strategy involves developing a well-thought-out plan for evaluating your program, with the goal of incorporating the lessons learned from the evaluation into program activities. As part of this larger strategy, evaluation is not viewed merely as a one-time event to demonstrate results, but instead as an important part of an ongoing process of learning and continuous improvement.

    Why create an evaluation strategy?

Creating an evaluation strategy can help you to create a plan for evaluation that can both serve the funders' requirements and also inform efforts to improve your program. An evaluation strategy can help your staff recognize the evaluation as a beneficial process, rather than as an added burden imposed by funders. Even if the evaluation results suggest room for improvement, the fact that the program is collecting such data indicates a commitment to learning and continuous improvement and gives a positive impression of the program's potential.

    How do we conduct a program evaluation?

There is no single recipe for conducting a program evaluation. OST programs use a variety of evaluation approaches, methods, and measures, both to collect data for program improvement and to demonstrate the effectiveness of their programs.

It is up to leaders at individual program sites to determine the best approach: one suited to the program's developmental stage, the needs of program participants and the community, and funder expectations. One common challenge is developing an evaluation approach that satisfies the different interests and needs of the various stakeholders: Funders may want one thing, while parents and program staff want another. Regardless of the approach, your evaluation strategy should be designed within a larger question of how the data you collect can be used to shape and improve your program's activities.

    HOW TO USE THIS TOOLKIT

This toolkit stresses the need to create a larger evaluation strategy to guide your evaluation plans. It will walk you through the steps necessary to plan and implement an evaluation strategy for your OST program.

    This toolkit is structured in a series of nine steps:

Step 1 helps you to determine the overall purpose of your evaluation.

Step 2 outlines how to create a logic model, which is a visual representation of your program strategy that can guide your evaluation.

Step 3 describes how to think through what resources you have available (staffing, etc.) to actually conduct an evaluation.

Step 4 discusses how best to focus your evaluation, based on your program's needs, resources, and developmental stage.

Steps 5 and 6 cover selecting the evaluation design and data collection methods that are best suited to your program.

Steps 7, 8, and 9 contain information about what to do with the data once you have it, including how to conduct and write up the analysis and, perhaps most importantly, how to use the data that you have analyzed.

Note that each step is designed to build on the last; however, you may prefer to skip ahead to a specific topic of interest. This toolkit also includes a set of discussion questions related to each section that you and others involved in the evaluation may want to consider. In addition, there is a Glossary of evaluation terms (words are denoted in bold blue text) included in the Appendix at the end of this document.

OTHER EVALUATION RESOURCES

This toolkit offers the basics in thinking about and beginning to plan an evaluation strategy for your OST program; you may also want to consult additional resources when it is time to implement your evaluation. Beyond this toolkit, Harvard Family Research Project has several other resources and publications that you may find helpful:

The Out-of-School Time Research and Evaluation Database allows you to search through profiles written about evaluations and research studies conducted of OST programs and initiatives. You can search the database by program type to find programs similar to your own that have conducted evaluations, or by methodology to see examples of various types of evaluation methods in practice.

Measurement Tools for Evaluating OST Programs describes instruments used by OST programs to evaluate their implementation and outcomes. This resource can provide ideas for possible data collection instruments to use or adapt for your program.

Detangling Data Collection: Methods for Gathering Data provides an overview of the most commonly used data collection methods and how they are used in evaluation.

Performance Measures in Out-of-School Time Evaluation provides information about the performance measures that OST programs have used to document progress and measure results of academic, youth development, and prevention outcomes.

Learning From Logic Models in Out-of-School Time offers an in-depth review of logic models and how to construct them. A logic model provides a point of reference against which progress towards achievement of desired outcomes can be measured on an ongoing basis through both performance measurement and evaluation.

Navigation Tip: The blue glossary terms, red resource titles, and white italic publication titles in the related resources sidebars are all clickable links that will take you to the relevant part of this document or to the resources page on www.hfrp.org. The menu on the first page of each section is also clickable to help you navigate between sections.


These and other related HFRP resources are referenced within the relevant sections of this toolkit for those who want additional information on that topic. This toolkit also contains a list of key resources for evaluation tools developed by others that we have found especially helpful. You will find a link to this resource list on each page.

    ACKNOWLEDGMENTS

Preparation of this toolkit was made possible through the support of the Charles Stewart Mott Foundation. We are also grateful for the review and feedback from Priscilla Little, Suzanne Le Menestrel, and Lisa St. Clair. Marcella Franck was also instrumental in conceptualizing this tool, while Carly Bourne provided editorial support and guidance in framing this tool for a practitioner audience. Thank you also to Naomi Stephen for her assistance with editing. Parts of this tool were adapted from our 2002 evaluation guide, Documenting Progress and Demonstrating Results: Evaluating Local Out-of-School Time Programs.


STEP 1: Determining the Evaluation's Purpose

Programs conduct evaluations for a variety of reasons. It is important for your program to determine (and be in agreement about) the various reasons that you want or need an evaluation before beginning to plan it. There are four questions that are essential to consider in determining the purpose of your evaluation:

Why should we evaluate our program?

What is our program trying to do?

What information do our funders expect?

What questions do we want to answer?

    1. WHY SHOULD WE EVALUATE OUR PROGRAM?

In planning for an evaluation, start by thinking about why you want to conduct an evaluation. What are the benefits and how will it be used? An evaluation should be conducted for a specific purpose, and all parties involved should have a clear understanding of this purpose as they start the evaluation planning.

OST programs usually conduct an evaluation for one, or both, of two primary reasons:

To aid learning and continuous improvement. Rather than being merely a static process where information is collected at a single point in time, an evaluation can become a practical tool for making ongoing program improvements. Evaluation data can help program management make decisions about what is (and isn't) working, where improvement is needed, and how to best allocate available resources.

To demonstrate accountability. Evaluation data can be used to demonstrate to current funders that their investments are paying off.

There are also two secondary purposes that may drive evaluations:

To market your program. Evaluation results, particularly regarding positive outcomes for youth participants, can be used in marketing tools such as brochures or published reports to recruit new participants and to promote the program to the media as well as to potential funders and community partners.

To build a case for sustainability. Evaluation results can illustrate your program's impact on participants, families, schools, and the community, which can help to secure funding and other resources that will allow your program to continue to operate.

Be clear from the beginning about why you are conducting the evaluation and how you plan to use the results.

    2. WHAT IS OUR PROGRAM TRYING TO DO?

One of the initial steps in any evaluation is to define program goals and how services aim to meet these goals. If you are creating an evaluation for an already-established program, chances are that the program's goals, inputs, and outputs are already defined (see more detailed descriptions of goals, inputs, and outputs in Step 2). If not, or if you are starting a new program, these elements will need to be determined. Many established programs can also benefit from revisiting their existing goals, inputs, and outputs, and tweaking them as necessary to ensure that the program is as effective as possible.

As you will learn in Step 2, your inputs, outputs, and goals should all have a logical connection to one another. So, for example, if your program aims to improve academic outcomes, your activities should include a focus on academic instruction or support, such as homework help or a program curriculum that is designed to complement in-school learning.

A useful approach to goal-setting is the development of a logic model. Step 2 of this toolkit defines this term, and explores how to develop a logic model and also how to use it for your evaluation.


RELATED RESOURCES FROM HFRP.ORG

A Conversation With Gary Henry

Why Evaluate Afterschool Programs?

Why, When, and How to Use Evaluation: Experts Speak Out

Evaluation Tip: Having a discussion with everyone involved in your program to clarify goals together can help everybody, including staff, create a shared understanding of your program's purpose.

Evaluation Tip: If you are conducting an evaluation for accountability to funders, consider incorporating it into a strategy that also allows you to use the data you collect for learning and continuous improvement, rather than just for the purpose of fulfilling a funder mandate.


Whether or not you choose to develop a logic model, it is crucial that your program have clearly articulated goals and objectives in order to determine the evaluation's purpose and focus.

    3. WHAT INFORMATION DO OUR FUNDERS EXPECT?

Many programs enter the evaluation process with specific evaluation requirements. Funders may require programs to collect information about participants, their families, and the services they receive. Some funders also impose specific time frames, formats, and dissemination procedures for reporting results. For example, a funder may tell you that you need to produce a 10-to-15-page evaluation report covering the first year of funding that describes program implementation successes and challenges, and that the report should be made publicly available in some way (such as posting it on your own or the funder's website).

The following strategies can help programs negotiate with funders about evaluation requirements:

Work with funders to clarify what information they expect and when they expect it. Maintain a continuing dialogue with funders about the kinds of information they are interested in and would find useful. Ask funders how evaluation results will be used; for example, an evaluation of the first year of a program's operation might be used only to establish a baseline. Find out if the evaluation is intended to be formative (providing information that will strengthen or improve your program) or summative (judging your program's outcomes and overall effectiveness).

Allow for startup time for your program before investing in an evaluation. A program must be established and running smoothly before it is ready for a formal evaluation. In most cases, a program will not be ready for a full-blown evaluation in its first year of implementation, since the first year tends to involve a lot of trial and error, and refinements to program strategies and activities. These refinements may even extend to a second or third year, depending on the program. Startup time can be used for other evaluation-related tasks, however, such as conducting a needs assessment (see Step 4) and collecting background data on the population targeted by your program.

Think collaboratively and creatively about effective evaluation strategies. Funders can often provide valuable insights into how to evaluate your program, including suggestions on how to collect data to help with program improvement.

Work with the funder to set realistic expectations for evaluation based on how long your program has been in operation, that is, the program's developmental stage. Be explicit about reasonable time frames; for instance, it is unrealistic to expect progress on long-term participant outcomes after only one year of program participation.

Try to negotiate separate funds for the evaluation component of your program. Be sure to note any additional staffing and resources needed for evaluation.

Work with funders to create an evaluation strategy that satisfies their needs, as well as your program's needs. Evaluation should be seen as part of a larger learning strategy, rather than just a one-time activity demonstrating accountability to funders. If the funders are truly supportive of your program, they should welcome an evaluation plan that includes collecting data that can be used for program improvements.

If your program has multiple funders that require you to conduct an evaluation, negotiating an evaluation strategy that meets all of their needs is likely to be an additional challenge. Addressing the issues outlined above when planning your evaluation strategy can help you to better navigate diverse funder requirements.

    4. WHAT QUESTIONS DO WE WANT TO ANSWER?

Before beginning an evaluation, you must decide which aspects of your program you want to focus on in order to ensure that the results will be useful. Answering the questions below can help you to form evaluation questions that are realistic and reasonable, given your program's mission and goals. Ask yourself the following:

Evaluation Tip: Set ambitious but realistic expectations for the evaluation. Do not give in to funder pressure to provide results that you think you are unlikely to be able to deliver just to appease the funder.


What is our motivation for conducting an evaluation? That is, are we primarily motivated by a desire for learning and continuous improvement, funder accountability, marketing, building a case for sustainability, or some combination of these? (As outlined in Why should we evaluate our program?)

What data can we collect that will assist learning and continuous improvement within our program?

Are we required by a funder to follow a specific reporting format?

What is our time frame for completing the evaluation?

How can we gain support for the evaluation from program staff, schools, and others with an interest in our program, and how do we involve them in the process?

How can we provide information that is useful to our stakeholders?

All programs should at least examine program participation data to determine the demographics of those who participate, how often they participate, and whether they remain in the program. Step 6 provides guidance on how (and why) to start collecting participation data.
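As a minimal illustration, the basic participation questions above (who participates, how often, and whether they stay) can be summarized from a simple attendance roster. The Python sketch below uses made-up records and hypothetical field names, not a required format:

```python
# A minimal sketch: summarizing participation data from an attendance roster.
# Records and field names are hypothetical, for illustration only.
from collections import Counter

participants = [
    {"name": "A", "grade": 4, "sessions_attended": 38, "still_enrolled": True},
    {"name": "B", "grade": 5, "sessions_attended": 12, "still_enrolled": False},
    {"name": "C", "grade": 4, "sessions_attended": 41, "still_enrolled": True},
]

enrollment_by_grade = Counter(p["grade"] for p in participants)
average_sessions = sum(p["sessions_attended"] for p in participants) / len(participants)
retention_rate = sum(p["still_enrolled"] for p in participants) / len(participants)

print("Enrollment by grade:", dict(enrollment_by_grade))
print(f"Average sessions attended: {average_sessions:.1f}")
print(f"Retention rate: {retention_rate:.0%}")
```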

Depending on your responses to the above questions, evaluation questions you may want to consider include:

Does our program respond to participant needs?

What are the costs of our program?

Who staffs our program? What training do they need?

What services does our program provide? How can we improve these services?

What is our program's impact on youth participants' academic, social, and physical well-being?

What roles do families and the larger community play in the program, and how does the program benefit them?

RELATED RESOURCES FROM HFRP.ORG

KIDS COUNT Self-Assessment: Bridging Evaluation With Strategic Communication of Data on Children & Families

Supporting Effective After School Programs: The Contribution of Developmental Research

Evaluation and the Sacred Bundle

Evaluation Tip: Use what you already know to help shape your evaluation questions. If you have done previous evaluations or needs assessments, build on what you learned to craft new evaluation questions and approaches. You can also use existing data sets, such as those from Kids Count and the U.S. Census Bureau, as well as other programs' evaluations and existing youth development research. See Key Resources for more information about existing data sets.


STEP 2: Developing a Logic Model

In designing an evaluation, it is important to have a clear understanding of the goals of the program to be evaluated, and to be realistic about expectations. A logic model is a useful tool that will assist you in defining program goals and figuring out the focus of the evaluation. This section addresses the following questions:

What is a logic model and why should we create one?

What does a logic model look like?

How do we define our goals?

How do we define our inputs and outputs?

How do we identify our outcomes?

How do we develop performance measures?

    1. WHAT IS A LOGIC MODEL AND WHY SHOULD WE CREATE ONE?

A logic model is a concise way to show how a program is structured and how it can make a difference for a program's participants and community. It is a one-page visual presentation, often using graphical elements such as charts, tables, and arrows to show relationships, that displays:

The key elements of a program (i.e., its activities and resources).

The rationale behind the program's service delivery approach (i.e., its goals).

The intended results of the program and how they can be measured (i.e., the program's outcomes).

The cause-and-effect relationships between the program and its intended results.

A logic model also can help identify the core elements of an evaluation strategy. Like an architect's scale model of a building, a logic model is not supposed to be a detailed blueprint of what needs to happen. Instead, the logic model lays out the major strategies to illustrate how they fit together and whether they can be expected to add up to the changes that program stakeholders want to see.

Creating a logic model at the beginning of the evaluation process not only helps program leaders think about how to conduct an evaluation, but also helps programs choose what parts of their program (e.g., which activities and goals) they want to evaluate; it is one of the key methods used to assist organizations in tracking program progress towards implementing activities and achieving goals. Since creating a logic model requires a step-by-step articulation of the goals of your program and the proposed activities that will be conducted to carry out those goals, a logic model can also be helpful in charting the resources your program needs to carry out the evaluation process.

    2. WHAT DOES A LOGIC MODEL LOOK LIKE?

Although logic models can be put together in a number of different ways, the following components are important to consider when constructing your logic model:

Goals: what your program ultimately hopes to achieve. Sometimes organizations choose to put their goals at the end of the logic model to show a logical progression. However, your goals should drive the rest of your logic model, and for that reason, you may want to consider putting the goals right at the beginning.

Inputs: the resources at your program's disposal to use to work toward program goals. These resources include such supports as your program's staff, funding resources, and community partners.

Outputs: the services that your program provides to reach its goals. These services will primarily consist of the program activities offered to youth participants, although you may also offer other services, such as activities aimed at families and communities, that also will be part of your outputs. As part of this step, it is important to have a clear picture of the specific target population for your activities (e.g., girls, low-income youth, a specific age range, youth living in a specific community).

RELATED RESOURCES FROM HFRP.ORG

Eight Outcome Models

An Introduction to Theory of Change

Learning From Logic Models in Out-of-School Time

Evaluating Complicated and Complex Programs Using Theory of Change

Using Narrative Methods to Link Program Evaluation and Organization Development

Logic Models in Out-of-School Time

Logic Model Basics

Project HOPE: Working Across Multiple Contexts to Support At-Risk Students

Theory of Action in Practice

Logic Models in Real Life: After School at the YWCA of Asheville

Outcomes: your program's desired short-term, intermediate, and long-term results. Generally, short-term outcomes focus on changes in knowledge and attitudes, intermediate outcomes focus on changes in behaviors, and long-term outcomes tend to focus on the larger impact of your program on the community.

Performance measures: the data that your program collects to assess the progress your program has made toward its goals. These data include:

Measures of effort, which describe whether and to what extent outputs were implemented as intended (e.g., the number of youth served in your program, the level of youth and parent satisfaction with the program).

Measures of effect, which convey whether you are meeting your outcomes (e.g., improvements in youth participants' skills, knowledge, and behaviors).

The table below provides a sample logic model for an OST program that focuses on providing academic support to youth.

For examples of specific programs' logic models, see:

Project HOPE: Working Across Multiple Contexts to Support At-Risk Students

Theory of Action in Practice

Logic Models in Real Life: After School at the YWCA of Asheville

    3. HOW DO WE DEFINE OUR GOALS?

While the goals are the results that your program aims to achieve, they must be considered from the beginning, since all of the other pieces of the logic model should be informed by those goals. To determine your goals, ask yourself: What are we ultimately trying to achieve with our program? For established programs, the goals may already be in place, but it is important to ensure that the goals are clear and realistic, and that there is a common understanding and agreement about these goals across program stakeholders. Revisiting existing goals can allow time for reflection, as well as possible refinement of the goals to better fit the program's current focus and stakeholders' current interests.

TABLE 1: Example of a logic model for an academically focused OST program

Goals: Improve the academic development and performance of at-risk students.

Inputs: Program staff; funding; school and community partners.

Outputs:
Activities: Academic enrichment; homework help/tutoring.
Target population: Children in the local community classified as at risk for academic failure based on family income and poor school performance.

Outcomes:
Short-term: Greater interest in school.
Intermediate: Improved academic grades and test scores.
Long-term: Higher graduation and college attendance rates.

Performance measures:
Measures of effort: Number of youth served in the program; number of sessions held; level of youth and parent satisfaction with the program.
Measures of effect: Changes in participants' academic grades, test scores, graduation rates, and college attendance rates.
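To make the relationships in Table 1 concrete, the sketch below captures the same sample logic model as a simple structured record in Python. The field names are illustrative only; a one-page diagram or a spreadsheet serves the same purpose.

```python
# A minimal sketch: the Table 1 logic model captured as a structured record.
# Field names are illustrative, not a prescribed format.
logic_model = {
    "goals": ["Improve the academic development and performance of at-risk students"],
    "inputs": ["Program staff", "Funding", "School and community partners"],
    "outputs": {
        "activities": ["Academic enrichment", "Homework help/tutoring"],
        "target_population": "At-risk children in the local community",
    },
    "outcomes": {
        "short_term": ["Greater interest in school"],
        "intermediate": ["Improved academic grades and test scores"],
        "long_term": ["Higher graduation and college attendance rates"],
    },
    "performance_measures": {
        "effort": ["Number of youth served", "Number of sessions held",
                   "Youth and parent satisfaction"],
        "effect": ["Changes in academic grades", "Test scores",
                   "Graduation rates", "College attendance rates"],
    },
}

# Each outcome tier should have at least one measure that can track it.
for tier, outcome_list in logic_model["outcomes"].items():
    print(f"{tier}: {len(outcome_list)} outcome(s) defined")
print("Measures of effect available:",
      len(logic_model["performance_measures"]["effect"]))
```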

Evaluation Tip: There are many ways to depict a logic model; there is no one right way to do it. Choose the approach that works best for your program.

Evaluation Tip: Consult a variety of program stakeholders to identify what they hope to get out of the program. Their feedback can help inform the program's goals. These stakeholders could include program staff, families, community partners, evaluators, principals, teachers and other school staff, and the participants themselves.


    4. HOW DO WE DEFINE OUR INPUTS AND OUTPUTS?

Once you have a clear set of goals for your program, you can start to think about the elements of your program, that is, the inputs and outputs that help your program to achieve its goals. These inputs and outputs should have a direct link to your goals. For example, if your primary goal relates to academic achievement, your program should include outputs or activities that are directly working to improve academic achievement (e.g., tutoring or enrichment).

In defining your inputs (resources), consider the following:

What resources are available to our program, both program infrastructure, such as staffing and funding, and existing community resources, such as social service supports for families, that we can use to work toward our goals?

Are there additional inputs that we need to have in place in order to implement our program?

In defining your outputs (activities and target population), consider the following:

What activities should our program offer (tutoring, sports, etc.) to best meet our goals?

Can we implement these activities with the inputs (resources) available?

Whom do we hope to serve in our program? What ages, demographics, neighborhoods, etc., do we target?

Should we serve families and/or the larger community in addition to youth? If so, how?

Does our target participant population align with the demographics of the local community?

    5. HOW DO WE IDENTIFY OUR OUTCOMES?

While goals express the big-picture vision for what your program aims to accomplish, outcomes are the on-the-ground impacts that your program hopes to achieve. For example, if your goal is to improve the academic development and performance of at-risk students, your short-term intended outcome might be to increase students' interest in school, with the long-term anticipated outcome of higher graduation and college attendance rates. As such, you should select outcomes that are SMART: Specific, Measurable, Action-oriented, Realistic, and Timed. The smarter your outcomes are, the easier it will be to manage performance and assess progress along the way.

Specific. To be useful and meaningful, outcomes should be as specific as possible. For example, an outcome of improved academic achievement is somewhat vague: what does this mean? Clearer outcomes for academic achievement could include improved test scores, grades, or graduation rates.

Measurable. Without the ability to measure your outcomes, you have no way of really knowing if, or how much, your program has an impact on outcomes. Outcomes that are specific are also more likely to be measurable. For instance, the outcomes in the example above (improved test scores, grades, and graduation rates) all have data collected by schools that can be used to track progress toward these outcomes.

Action-oriented. Outcomes are not passive by-products of program activities; they require ongoing effort to achieve. If your program is not pursuing activities aimed at producing a specific outcome, it is not reasonable to expect that outcome to result from your program. So, for example, an outcome of increased family involvement in children's learning should be accompanied by a program component in which program staff actively seek out parent support and engagement.

Realistic. Outcomes should be something that your program can reasonably expect to accomplish, or at least contribute to, with other community supports. The primary consideration in identifying realistic outcomes is how well the outcomes are aligned with and linked to your activities, so that there is a logical connection between your efforts and what you expect to change as a result of your work. For example, a program that does not have a goal of increasing academic achievement should not include outcomes related to participant test scores or grades.

Timed. Logic models allow the designation of short-term, intermediate, and/or long-term outcomes. Short-term and intermediate outcomes for OST programs tend to focus on the changes that you can expect to see in program participants after a year (or more) of participation, such as improved grades or changes in knowledge or attitudes. Long-term goals tend to involve larger changes in the overall community that is being served, or changes in participants that may not be apparent right away, such as increased high school graduation rates or changes in behavior.

Evaluation Tip: Consider the intensity and duration of your activities in determining whether outcomes are realistic. For example, a once-a-week physical activity component is unlikely to show dramatic decreases in youth obesity.

    6. HOW DO WE DEVELOP PERFORMANCE MEASURES?

The final stage of logic model development is to define your program's performance measures. These measures assess your program's progress on the implementation of inputs and outputs. While outcomes lay out what your program hopes to accomplish as a whole, performance measures should be narrower in scope to provide measurable data for evaluation.

There are two types of performance measures (a brief illustration follows this list):

Measures of effort assess the effectiveness of your outputs. They assess how much you did, but not how well you did it, and are the easiest type of evaluation measure to identify and track. These measures address questions such as: What activities does my program provide? Whom does my program serve? Are program participants satisfied?

Measures of effect are changes that your program, acting alone or in conjunction with partners (e.g., schools or other community organizations), expects to produce in knowledge, skills, attitudes, or behaviors. These measures address questions such as: How will we know that the children or families that we serve are better off? What changes do we expect to result from our program's inputs and activities?
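The brief Python sketch below, built on made-up records with hypothetical field names, shows the distinction in practice: measures of effort count what the program did, while a measure of effect looks at what changed for participants.

```python
# A minimal sketch: measures of effort versus a measure of effect,
# computed from made-up records with hypothetical field names.
records = [
    {"youth": "A", "satisfaction": 4, "gpa_fall": 2.4, "gpa_spring": 2.9},
    {"youth": "B", "satisfaction": 5, "gpa_fall": 3.1, "gpa_spring": 3.0},
    {"youth": "C", "satisfaction": 3, "gpa_fall": 2.0, "gpa_spring": 2.6},
]

# Measures of effort: how much was done and how satisfied participants were.
youth_served = len(records)
average_satisfaction = sum(r["satisfaction"] for r in records) / youth_served

# Measure of effect: the average change in grades from fall to spring.
average_gpa_change = sum(r["gpa_spring"] - r["gpa_fall"] for r in records) / youth_served

print(f"Youth served: {youth_served}")
print(f"Average satisfaction (1-5 scale): {average_satisfaction:.1f}")
print(f"Average GPA change: {average_gpa_change:+.2f}")
```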

Strong performance measures should:

Have strong ties to program goals, inputs, and outputs. There should be a direct and logical connection between your performance measures and the other pieces of your logic model. Ask yourself: What do we hope to directly affect through our program? What results are we willing to be directly accountable for producing? What can our program realistically accomplish?

Be compatible with the age and stage of your program. Performance measures should be selected based on your program's current level of maturity and development. For example, a program in its first year should focus more on measures of effort than on measures of effect, to ensure that the program is implemented as intended before trying to assess outcomes for participants.

Consider if the data you need are available/accessible. Performance measures should never be selected solely because the data are readily available. For example, if your program does not seek to impact academic outcomes, it does not make sense to examine participants' grades. That said, you should think twice before selecting program performance measures for which data collection will be prohibitively difficult and/or expensive. For example, do not choose performance measures that require access to school records if the school will not provide access to these data.

Yield useful information to the program. Consider the question: Will the information collected be useful to our program and its stakeholders? The answer should always be a resounding "Yes." To determine whether the data will be useful, consider the purpose of your evaluation and what you hope to get out of it, as outlined in Step 1.

For more information on performance measures used by afterschool programs, see our OST Evaluation Snapshot: Performance Measures in Out-of-School Time Evaluation.

RELATED RESOURCES FROM HFRP.ORG

Performance Measures in OST Evaluation


STEP 3: Assessing Your Program's Capacity for Evaluation

Once you have determined why you want to conduct an evaluation and have developed a logic model for your program, you should consider your program's capacity for evaluation.

Questions addressed in this section include:

Who should be involved in the process?

What resources must be in place?

Who will conduct our evaluation?

How do we find an external evaluator?

    1. WHO SHOULD BE INVOLVED IN THE PROCESS?

The input of all program stakeholders is critical in planning an evaluation strategy. Stakeholders are those who hold a vested interest in your program. They include anyone who is interested in or will benefit from knowing about your program's progress, such as board members, funders, collaborators, program participants, families, school staff (e.g., teachers, principals, and superintendents), college or university partners, external evaluators, someone from the next school level (e.g., middle school staff for an elementary school-age program), and community partners.

Recognizing stakeholders' importance to the evaluation process right from the start is important for three reasons. First, such recognition can increase stakeholders' willingness to participate in the evaluation and help address concerns as they arise. Second, it can make stakeholders feel that they are a part of the project, that what they do or say matters. Lastly, it can make the final product richer and more useful to your program.

    2. WHAT RESOURCES MUST BE IN PLACE?

Allocating resources to and funding an evaluation are critical to making evaluation a reality. Considering evaluation options involves assessing tradeoffs between what your program needs to know and the resources available to find the answers. Resources include money, time, training commitments, external expertise, stakeholder support, and staff allocation, as well as technological aids such as computers and management information systems (MIS).

Resources for evaluation should be incorporated into all program funding proposals. Evaluation costs can be influenced by the evaluation's design, data collection methods, number of sites included, length of evaluation, use of an outside evaluator, availability of existing data, and type of reports generated. In requesting evaluation funding, you should have a clear idea of what your program plans to measure, why you chose particular data collection methods, and how progress will be monitored. The process of requesting funding for evaluation also helps programs determine what resources are needed for evaluation.

Many organizations can cover evaluation costs with resources from their program budgets. When program resources cannot support evaluations, organizations must find creative ways to obtain resources for evaluation. These resources can come from a range of interested community entities, such as local businesses, school districts, and private foundations, and can include cash, professional expertise, and staff time. Universities interested in research can also make great partners and can provide expertise around design issues and assist with data collection and analysis.

    3. WHO WILL CONDUCT OUR EVALUATION?

Sometimes program staff and other stakeholders have the skills and experience necessary to design and implement an evaluation. At times, however, programs need the design and analysis expertise of an outside consultant. Further, some funders strongly recommend or require that the evaluation be completed by an objective outsider.


RELATED RESOURCES FROM HFRP.ORG

Pizza, Transportation, and Transformation: Youth Involvement in Evaluation and Research

Issues and Opportunities in OST Evaluation: Youth Involvement in Evaluation & Research

Evaluation Tip: Pay particular attention to the needs of youth participants and their families in planning for evaluation. As the program's end users, they provide an especially valuable perspective on the program.

Evaluation Tip: You can always start small and tailor your evaluation to available resources. A small evaluation often allows for quick feedback, so you can use the results in a timely fashion. As programs expand and resources increase, you can also expand your evaluation activities.


There are pros and cons to consider when hiring an outside consultant. Issues to consider include the following:

Costs. Using an external evaluator is likely to be more costly than using in-house staff. However, an external evaluator can also save money if an in-house evaluator is inefficient or lacks time and expertise.

Loyalty. As an unbiased third party, an external evaluator is likely to be loyal to the evaluation process itself rather than particular people in the organization. Yet this lack of program loyalty can sometimes create conflict with program stakeholders, who may see the evaluator as an outsider who does not understand the program or its needs.

Perspective. An external evaluator may provide a fresh perspective with new ideas. However, this fresh perspective can also result in the external evaluator inadvertently focusing on issues that are not essential to your program.

Time. External evaluators have the time to focus attention solely on the evaluation. Because they are not connected to the program, however, external evaluators may take a long time to get to know your program and its people.

Relationships. The external evaluator's outsider perspective can be beneficial in managing conflict resolution, but may also result in a lack of the trust necessary to keep lines of communication open and effective.

4. HOW DO WE FIND AN EXTERNAL EVALUATOR?

The use of an external evaluator should be considered in terms of your available resources: Do you have access to knowledgeable evaluators? How much time can program staff allocate to evaluation responsibilities? Does your program have the staff resources to develop in-house forms for evaluation purposes? What type of expertise will the evaluation really need, and how much can your program do itself?

If you do decide to hire an external evaluator, start by researching those recommended by people you know. The American Evaluation Association is also a good resource for identifying reputable evaluators. From these recommendations, identify those who give free consultations, have a reputation for the type of evaluation you want to do, and are able to work within your time frame and your budget. Finally, always ask for a proposal and a budget to help you make your decision and to be clear about what the evaluator will do. The following are some questions to ask evaluator candidates:

How long have you been doing this work?

What types of evaluations do you specialize in?

What other organizations have you worked with?

Can you show me samples of final evaluation reports?

What role, if any, do you expect the program staff and administration to play in helping you to shape the evaluation process, including the evaluation questions, methods, and design?

Have you ever worked with youth or families?

Will you be the one working with us, or do you have partners?

How do you communicate challenges, progress, and procedures?

Do you conduct phone consultations or follow-up visits? If so, are extra costs involved?

Can we review and respond to reports before they are finalized?

A good external evaluator can be expected to observe your program's day-to-day activities at length; be sensitive to the needs of participants, families, and staff; communicate readily and effectively with your program; inspire change and assess processes to implement change; identify program needs; and promote program ownership of the evaluation.

Evaluation Tip: You may be able to seek help for your evaluation from within your program or from the larger community, including staff who studied evaluation before joining your program, volunteers from local businesses, or students at local colleges and universities.

Evaluation Tip: HFRP's OST Database includes contact information for the evaluators and researchers responsible for each of the studies in the database. You may want to use the database as a source to identify potential external evaluators for your program.


STEP 4: Choosing the Focus of Your Evaluation

Once you have determined why you want to do an evaluation, have assessed your capacity to do it, and have developed a logic model to identify what to evaluate, the next step is to determine the evaluation's focus.

We recommend using an evaluation approach called the five tier approach. This section addresses the following issues related to choosing a starting point for evaluation:

What is the Five Tier approach and why use it?

How do we conduct a needs assessment (Tier 1)?

How do we document program services (Tier 2)?

How do we clarify our program (Tier 3)?

How do we make program modifications (Tier 4)?

How do we assess program impact (Tier 5)?

1. WHAT IS THE FIVE TIER APPROACH AND WHY USE IT?

As its name suggests, the Five Tier approach describes the evaluation process as a series of five stages, outlined in the table below.

TABLE 2: The Five Tier Approach to Evaluation

Tier 1: Conduct a needs assessment
    Purpose: To address how the program can best meet the needs of the local community.
    Methods: Determine the community's need for an OST program.

Tier 2: Document program services
    Purpose: To understand how program services are being implemented and to justify expenditures.
    Methods: Describe program participants, services provided, and costs.

Tier 3: Clarify your program
    Purpose: To see if the program is being implemented as intended.
    Methods: Examine whether the program is meeting its benchmarks, and whether it matches the logic model developed.

Tier 4: Make program modifications
    Purpose: To improve the program.
    Methods: Discuss with key stakeholders how to use the evaluation data for program improvement.

Tier 5: Assess program impact
    Purpose: To demonstrate program effectiveness.
    Methods: Assess outcomes with an experimental or quasi-experimental evaluation design.

The tier on which you begin your evaluation is largely determined by your program's developmental stage. For example, a new program should start with a Tier 1 evaluation, while an older, more established program might be ready to tackle Tier 5. Regardless of what tier you begin your evaluation on, you should first create a logic model (see Step 2) to assist you in defining program goals and figuring out the focus of the evaluation.

While there are many approaches to program evaluation, the Five Tier approach has several important benefits. First, all programs are able to do at least some evaluation using one of the five tiers; for instance, Tier 1 is something that every program can and should do to help ensure that your program is positioned to meet an identified need in the community. Second, the type of data that a program needs can change over time; therefore, the evaluation approach must be flexible enough to allow for this. Third, evaluation is an ongoing, cyclical process: feedback from one tier of the evaluation can be used to shape the next phase.


RELATED RESOURCES FROM HFRP.ORG

Evaluation Theory, or What Are Evaluation Methods For?

Mindset Matters


2. HOW DO WE CONDUCT A NEEDS ASSESSMENT (TIER 1)?

The main task of Tier 1 is to do a needs assessment, which is an attempt to better understand how your program is meeting, or can meet, the needs of the local community. For new programs, a needs assessment can help in developing the program in ways that best fit the community's needs, and can also protect against the temptation to just provide services that are easy to implement rather than those services that children and their families actually need. For older programs, a needs assessment can serve as a check to be sure that your program is adequately addressing the community's needs, and can help build a case for program and service expansion.

A needs assessment can help answer the following questions:

What services are already offered to the children and families in the community? Where are the gaps?

What does our community want from an OST program?

Does our target population match the local demographics?

What are the potential barriers to implementing our program in the local community?

Community needs can be identified through conducting surveys of, or interviews with, OST stakeholders (community organizations and leaders, families, businesses, school leaders, etc.). You can also make use of existing statistics, research, and other data (e.g., census data or department of education information, evaluations or research studies of similar programs) to identify needs.

3. HOW DO WE DOCUMENT PROGRAM SERVICES (TIER 2)?

Tier 2 evaluation involves documenting the services your program provides in a systematic way, also called program monitoring. Program monitoring has two basic functions. One is to be able to track what your program funding is being spent on; many funders require this type of data. The other function of program monitoring is to describe the details of your program activities, including information about their frequency, content, participation rates, staffing patterns, staff training provided, and transportation usage.

Programs can use their program monitoring data to see if they are reaching their intended target population, to justify continued funding, and to build the capacity needed for program improvements. The data can also be the foundation on which later evaluations are built.

Documenting program services with Tier 2 evaluation helps to answer critical questions:

What services or activities does our program offer?

Is our program serving the intended population of children and their families? Are services tailored for different populations?

Who staffs our program, and in what capacity? What additional staffing needs do we have?

How are our funds spent?

Program services can be documented through intake forms that describe participant characteristics, forms that record program activities and participation rates, and records that track staff and their training.
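
If your program keeps its sign-in sheets electronically, even a very small script can turn them into the participation counts that Tier 2 monitoring asks for. The sketch below is illustrative only and is not part of the toolkit: the record layout, activity names, and figures are hypothetical, and a spreadsheet can accomplish the same tally.

    # Illustrative sketch only: tallying hypothetical sign-in records into
    # attendance rates and activity counts for Tier 2 program monitoring.
    from collections import defaultdict

    # Each record is (participant_id, activity, date), copied from a daily sign-in sheet.
    sign_ins = [
        ("S001", "homework help", "2011-10-03"),
        ("S001", "arts club", "2011-10-04"),
        ("S002", "homework help", "2011-10-03"),
    ]
    days_offered = 2  # program days in the reporting period (hypothetical)

    dates_attended = defaultdict(set)   # participant -> set of dates attended
    activity_totals = defaultdict(int)  # activity -> total sign-ins

    for participant, activity, date in sign_ins:
        dates_attended[participant].add(date)
        activity_totals[activity] += 1

    for participant, dates in sorted(dates_attended.items()):
        rate = len(dates) / days_offered
        print(f"{participant}: attended {len(dates)} of {days_offered} days ({rate:.0%})")

    print("Sign-ins by activity:", dict(activity_totals))

The same counts, broken out by grade level or site, are often exactly what funder reports request, so it is worth deciding early which fields your intake and sign-in forms will capture.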

4. HOW DO WE CLARIFY OUR PROGRAM (TIER 3)?

This tier of evaluation determines if your program is doing what it set out to do, as outlined in your logic model. Specifically, this tier involves examining what your program looks like in real life and how it operates on a day-to-day basis, and whether or not that matches up with how your program was envisioned.

These real life data can then be used to make adjustments to your program goals and activities to best meet the community's needs.

Evaluation Tip: Other organizations may have already completed some of the research that you need about your community's strengths and capacities. Consider these sources of information when assessing your community's existing services: local colleges and universities, public schools, hospitals, social service agencies, police and fire departments, libraries, school foundations, philanthropic organizations, and recreational resources.

RELATED RESOURCES FROM HFRP.ORG

Demographic Differences in Youth OST Participation: A Research Summary

Demographic Differences in Patterns of Youth OST Activity Participation

What Are Kids Getting Into These Days? Demographic Differences in Youth OST Participation

Findings From HFRP's Study of Predictors of Participation in OST Activities: Fact Sheet

    http://www.hfrp.org/publications-resources/browse-our-publications/demographic-differences-in-youth-out-of-school-time-participation-a-research-summaryhttp://www.hfrp.org/publications-resources/browse-our-publications/demographic-differences-in-youth-out-of-school-time-participation-a-research-summaryhttp://www.hfrp.org/publications-resources/browse-our-publications/demographic-differences-in-youth-out-of-school-time-participation-a-research-summaryhttp://www.hfrp.org/out-of-school-time/publications-resources/demographic-differences-in-patterns-of-youth-out-of-school-time-activity-participationhttp://www.hfrp.org/out-of-school-time/publications-resources/demographic-differences-in-patterns-of-youth-out-of-school-time-activity-participationhttp://www.hfrp.org/out-of-school-time/publications-resources/demographic-differences-in-patterns-of-youth-out-of-school-time-activity-participationhttp://www.hfrp.org/publications-resources/browse-our-publications/what-are-kids-getting-into-these-days-demographic-differences-in-youth-out-of-school-time-participationhttp://www.hfrp.org/publications-resources/browse-our-publications/what-are-kids-getting-into-these-days-demographic-differences-in-youth-out-of-school-time-participationhttp://www.hfrp.org/publications-resources/browse-our-publications/what-are-kids-getting-into-these-days-demographic-differences-in-youth-out-of-school-time-participationhttp://www.hfrp.org/publications-resources/browse-our-publications/findings-from-hfrp-s-study-of-predictors-of-participation-in-out-of-school-time-activities-fact-sheethttp://www.hfrp.org/publications-resources/browse-our-publications/findings-from-hfrp-s-study-of-predictors-of-participation-in-out-of-school-time-activities-fact-sheethttp://www.hfrp.org/publications-resources/browse-our-publications/findings-from-hfrp-s-study-of-predictors-of-participation-in-out-of-school-time-activities-fact-sheethttp://www.hfrp.org/publications-resources/browse-our-publications/findings-from-hfrp-s-study-of-predictors-of-participation-in-out-of-school-time-activities-fact-sheethttp://www.hfrp.org/publications-resources/browse-our-publications/findings-from-hfrp-s-study-of-predictors-of-participation-in-out-of-school-time-activities-fact-sheethttp://www.hfrp.org/publications-resources/browse-our-publications/findings-from-hfrp-s-study-of-predictors-of-participation-in-out-of-school-time-activities-fact-sheethttp://www.hfrp.org/publications-resources/browse-our-publications/what-are-kids-getting-into-these-days-demographic-differences-in-youth-out-of-school-time-participationhttp://www.hfrp.org/publications-resources/browse-our-publications/what-are-kids-getting-into-these-days-demographic-differences-in-youth-out-of-school-time-participationhttp://www.hfrp.org/publications-resources/browse-our-publications/what-are-kids-getting-into-these-days-demographic-differences-in-youth-out-of-school-time-participationhttp://www.hfrp.org/out-of-school-time/publications-resources/demographic-differences-in-patterns-of-youth-out-of-school-time-activity-participationhttp://www.hfrp.org/out-of-school-time/publications-resources/demographic-differences-in-patterns-of-youth-out-of-school-time-activity-participationhttp://www.hfrp.org/out-of-school-time/publications-resources/demographic-differences-in-patterns-of-youth-out-of-school-time-activity-participationhttp://www.hfrp.org/publications-resources/browse-our-publications/demographic-differences-in-youth-out-of-school-time-participation-a-research-summaryhttp://www.hfrp.org/publication
s-resources/browse-our-publications/demographic-differences-in-youth-out-of-school-time-participation-a-research-summaryhttp://www.hfrp.org/publications-resources/browse-our-publications/demographic-differences-in-youth-out-of-school-time-participation-a-research-summary
  • 7/29/2019 after school programes

    17/44

    How to Evaluate an Expanded Learning Program

    _______________________________________________________________________________________Follow us on twitter @HFRP

For example, you may find that the target population of children that your program aimed to serve are not the ones who can most benefit from the program, so you may want to rethink the activities that you are providing to better reach your target population (or rethink whom you should be targeting). The data can also provide feedback to program staff members on what they are doing well, and what needs improvement. As you look at the data, you may want to revise your logic model based on what you are learning.

In Tier 3 evaluation, the following questions are addressed:

What were our program's intended activities? Were all activities implemented?

Are the services offered appropriate to our program's targeted youth participants and their families? Are some youth excluded?

What do participants think about program offerings? How will their feedback be used?

How can our program do a better job of serving children and their families?

There are many ways to compare what a program intended to provide with what it actually provides. These methods include:

Comparing program operations with logic model goals and objectives.

Using self-assessment or an external evaluator to observe and rate program activities.

Asking staff members and participants to keep a journal of their experiences with the program.

Enlisting participating families to give feedback in small group sessions.

Developing a participant/parent satisfaction survey (a simple sketch for tallying survey responses follows this list).
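
A satisfaction survey is only useful once the responses are counted, so here is a minimal sketch of that step. It assumes a single question rated 1 (poor) to 5 (excellent); the ratings shown are made up, and the same tally is easy to do by hand or in a spreadsheet.

    # Illustrative sketch only: tallying hypothetical satisfaction ratings
    # (1 = poor, 5 = excellent) from returned participant/parent surveys.
    from collections import Counter

    responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]  # one rating per returned survey
    counts = Counter(responses)
    total = len(responses)

    for rating in range(1, 6):
        n = counts.get(rating, 0)
        print(f"Rating {rating}: {n} of {total} responses ({n / total:.0%})")

    print(f"Average rating: {sum(responses) / total:.1f}")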

5. HOW DO WE MAKE PROGRAM MODIFICATIONS (TIER 4)?

Having examined whether or not program services match intended program goals (Tier 3), you can begin to fill in the gaps revealed in your program and fine-tune program offerings.

Tier 4 evaluation can help address the following questions:

Are our short-term goals realistic? If so, how can we measure progress toward these goals?

Do our program inputs and activities have a direct connection to our intended outcomes?

What is the community's response to our program?

What have we accomplished so far?

At this stage, you can discuss your program's evaluation data with staff and other key stakeholders and brainstorm with them for ideas about how to use the data to make program improvements.

6. HOW DO WE ASSESS PROGRAM IMPACT (TIER 5)?

After conducting evaluation Tiers 1–4, some programs are ready to tackle the complex task of determining program effectiveness (Tier 5). Formal evidence of program effectiveness can be collected for the program overall or for specific program components. Programs that can provide convincing evidence that they benefit their youth participants are more likely to get continuing financial and public support to help sustain and even scale up program activities.

In this tier, the following questions are addressed:

Does our program produce the results we hoped it would?

Does it work better for some participants than others?

In what ways has the community benefited from our program?

How can our findings influence policy decisions?

Evaluation Tip: A necessary part of evaluation is determining who actually uses the program's services. Consider the following questions: Are we reaching the group(s) that our program targets? Are some groups over- or under-represented? Is the program serving groups it did not expect to attract? How are resources allocated among different types of participants?

RELATED RESOURCES FROM HFRP.ORG

After School Programs in the 21st Century: Their Potential and What it Takes to Achieve It

Ask the Expert: Karen Walker, Public/Private Ventures

Does Youth Participation in OST Activities Make a Difference?

Investigating Quality: The Study of Promising After-School Programs


In this tier, you should use an experimental or quasi-experimental design for your evaluation if possible (see Step 5 for details on different types of evaluation design) to be able to build a credible case that the outcomes observed are the result of program participation. For most programs, this type of evaluation requires hiring external experts who know how to use the specific methods and conduct statistical analysis to demonstrate program impacts.
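
To give a flavor of what that statistical analysis involves, the sketch below compares hypothetical pretest-to-posttest gains for program participants against a comparison group of nonparticipants, using an independent-samples t-test from the SciPy library. The scores are invented, and a real Tier 5 evaluation would involve a carefully selected comparison group, larger samples, and an evaluator's judgment about the appropriate analysis.

    # Illustrative sketch only: comparing hypothetical assessment-score gains for
    # program participants with gains for a comparison group of nonparticipants.
    from scipy import stats

    participant_gains = [8, 5, 7, 10, 6, 9, 4, 7]  # posttest minus pretest, participants
    comparison_gains = [3, 4, 2, 6, 5, 1, 4, 3]    # posttest minus pretest, comparison group

    t_stat, p_value = stats.ttest_ind(participant_gains, comparison_gains)

    print(f"Mean gain, participants: {sum(participant_gains) / len(participant_gains):.1f}")
    print(f"Mean gain, comparison: {sum(comparison_gains) / len(comparison_gains):.1f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # a small p-value supports a program effect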

Evaluation Tip: Prior to starting Tier 5, consider whether your program is sufficiently developed to begin this process. Many small-scale programs do not have the time, resources, and expertise needed to design an evaluation to verify program effectiveness. And, even with adequate resources, a new program should set realistic expectations about its ability to demonstrate program effectiveness in its first year of operation.


STEP 5: Selecting the Evaluation Design

Different types of evaluation design are appropriate for different evaluation purposes. This section addresses the following questions related to selecting your evaluation methods and design:

What type of evaluation should we conduct?

What type of data should we collect?

What evaluation design should we use?

What is our time frame for collecting data?

The table below can serve as a reference as you go through this step and the next to help inform your decisions in selecting the evaluation design and the data to be collected.

TABLE 3: What are the major issues we need to consider in conducting an evaluation?

What is our purpose for evaluation? (see Step 1)
    To aid learning and continuous improvement, or to demonstrate accountability. The first answer in each pair below fits a learning and improvement purpose; the second fits an accountability purpose.

What type of performance measures should we focus on? (see Step 2)
    Learning and improvement: Measures of effort (focused on outputs)
    Accountability: Measures of effect (focused on outcomes)

What evaluation Tier should we start on? (see Step 4)
    Learning and improvement: Needs assessment (Tier 1); document program services (Tier 2); clarify the program (Tier 3); make program modifications (Tier 4)
    Accountability: Assess program effectiveness (Tier 5)

What type of evaluation should we conduct? (see Step 5)
    Learning and improvement: Formative/process
    Accountability: Summative/outcome

What type of data should we collect? (see Step 5)
    Learning and improvement: Qualitative data; limited quantitative data for descriptive purposes, especially numerical data related to participation
    Accountability: Quantitative data for statistical analysis

What evaluation design should we use? (see Step 5)
    Learning and improvement: Descriptive; pre-experimental
    Accountability: Quasi-experimental; experimental

What time frame should we consider for data collection? (see Step 5)
    Learning and improvement: Collect data throughout the program year or as available/needed
    Accountability: Collect data at the beginning (pretest), middle (midtest), and end (posttest) of the year

Whom should we collect data on? (see Step 6)
    Learning and improvement: Program participants
    Accountability: Program participants and a comparison/control group of nonparticipants

What data collection methods should we consider? (see Step 6)
    Learning and improvement: Case study; document review; observation; secondary source/data review; interviews or focus groups; surveys
    Accountability: Secondary source/data review; surveys; tests or assessments


1. WHAT TYPE OF EVALUATION SHOULD WE CONDUCT?

There are two main types of evaluations:

Formative/process evaluations are conducted during program implementation to provide information that will strengthen or improve the program being studied. Findings typically point to aspects of the program's implementation that can be improved for better participant outcomes, such as how services are provided, how staff are trained, or how leadership decisions are made. As discussed later in this step, formative/process evaluations generally use a non-experimental evaluation design.

Summative/outcome evaluations are conducted to determine whether a program's intended outcomes have been achieved. Findings typically judge the program's overall effectiveness, or worth, based on its success in achieving its outcomes, and are particularly important in deciding whether a program should be continued. As discussed later in this step, summative/outcome evaluations generally use an experimental or a quasi-experimental evaluation design.

Selection of the evaluation type depends on the tier being evaluated (refer to Step 4). In general, programs evaluating Tiers 1–4 should conduct a formative/process evaluation, while programs evaluating Tier 5 should conduct a summative/outcome evaluation.

2. WHAT TYPE OF DATA SHOULD WE COLLECT?

There are two main types of data.

Qualitative data are descriptive rather than numerical, and can help to paint a picture of the program. This type of data is subjective and shows more nuanced outcomes than can be measured with numbers. For example, qualitative data can be used to provide details about program activities that are offered, or feedback from participants about what they like (and dislike) about the program. This type of data is generally used for formative/process evaluations, but can also help to flesh out and explain summative/outcome evaluation findings; for example, to provide specific details about how participants' behavior has changed as a result of the program. Qualitative data can be collected