Serve DC Training 9/28/11
Theories of Change and Logic Models
Evidence
Performance Measurement 101
Reviewing Performance Measures
eGrants Tips
Learning Objectives:
Know the definition of “theory of change”
Understand relationship between theory of change and program design
Understand how logic models articulate a theory of change
A theory of change:
Looks at cause and effect relationships
Identifies specific interventions to achieve the desired result
Uses evidence to articulate assumptions
PROBLEM: The identified community need
INTERVENTION: The activities of members and community volunteers supported by AmeriCorps members
OUTCOME: The change that occurs because of the intervention
EVIDENCE: Why you believe a certain set of actions (the intervention) will lead to the intended outcome
If the intervention (X) is delivered, at a certain dosage, then the expected outcome (Y) will happen.
X → Y
I have strep throat (PROBLEM).
If I take antibiotics (INTERVENTION), then I will get better (OUTCOME).
Antibiotics → I get better.
X → Y
If I take penicillin, I will get better.
If I take a different antibiotic, will I get better?
Some interventions (antibiotics) work better than others. Some don’t work at all.
How do I know which antibiotic is best? I look at the evidence. There is research that shows which antibiotic is likely to get the best result.
I consider constraints that may preclude the ideal intervention. (Penicillin may be too expensive.)
If I can’t have the most promising intervention, I need to understand the tradeoffs.
Two types of evidence are required:
Data that documents the community need
Data that documents why you think your intervention (using AmeriCorps members and community volunteers) will achieve the intended outcome
Data that demonstrates that the proposed intervention is likely to solve the identified problem
For example: Evidence says that x hours of tutoring leads to academic outcomes…so the intervention features X hours of AmeriCorps members tutoring
The evidence basis for an intervention may include:
Past performance measurement data
Results from a program evaluation
Research studies that document the outcomes of similar programs
Evaluations that document outcomes of similar programs
Preliminary → Moderate → Strong
Variance in executing the ideal program intervention
Little evidence to support your intervention
PROBLEM: Children at risk of failing third grade reading exam
INTERVENTION: Individualized tutoring on five “building block” literacy skills
OUTCOMES: Students master skills, pass state reading exam
What is your theory of change?
Logic models are a visual way of expressing the cause and effect reasoning behind a theory of change. They move from left to right in a linear fashion.
X → Y
PROBLEM: I have strep throat.
INTERVENTION: I take penicillin.
OUTCOME: I get better.
PROBLEM INTERVENTION OUTCOME
Children at risk of failing third grade reading exam
Evidence: Statistics on the number of students at risk of failing in program’s service area; Research on why reading proficiency by 3rd grade is important.
Individualized tutoring on five “building block” literacy skills
Evidence: Research on how children learn to read supporting theory that mastering building block skills will lead to proficiency. Research on design, frequency, duration of tutoring sessions.
Constraints?
Students master five building block skills.
Students pass state reading exam.
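As a minimal sketch, the tutoring logic model above can be recorded as structured data and rendered as the left-to-right X → Y chain. The field names here are illustrative only, not CNCS or eGrants terminology.

```python
# Illustrative only: one way to record a logic model's
# problem -> intervention -> outcomes chain as structured data.
logic_model = {
    "problem": "Children at risk of failing third grade reading exam",
    "intervention": "Individualized tutoring on five building-block literacy skills",
    "outcomes": [
        "Students master five building block skills",
        "Students pass state reading exam",
    ],
    "evidence": {
        "need": "Statistics on students at risk of failing in the service area",
        "intervention": "Research on how children learn to read",
    },
}

def describe(model):
    """Render the logic model as a single left-to-right chain."""
    return (f"{model['problem']} -> {model['intervention']} -> "
            f"{'; '.join(model['outcomes'])}")

print(describe(logic_model))
```

Writing the chain out this way makes a missing link (no intervention, no outcome, no evidence) immediately visible.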
Logic Model Practice
Learning Objectives:
Review the two types of evidence that support a theory of change
Practice evaluating evidence
Data that documents the community need
Data that documents why the intervention is likely to lead to the outcome(s)
Relevant
Compelling
Up-to-date
Reliable Source
Evidence Continuum (Preliminary, Moderate, Strong)
Where do we find evidence to document the community need?
Past performance measurement data
Results from impact evaluation of your program
Research studies that document the outcomes of similar programs
Evaluations that document outcomes of similar programs
University or research organizations
Names of known professionals/thought leaders
Similar sounding programs/descriptions
Meta-articles that review multiple studies
Recommended design and dosage (frequency, intensity, duration)
Do we need to alter the design of our intervention?
Do we need to choose a new intervention?
Learning Objectives:
Know the definition of performance measurement
Understand differences between performance measurement and evaluation
Know performance measurement requirements for AmeriCorps grants
Performance measurement is the process of regularly measuring the amount of work done by your program and the outcomes of this work on your program beneficiaries.
Performance Measurement: Captures near-term changes.
Evaluation: Captures lasting changes; attempts to demonstrate cause and effect between the intervention and the outcome.

Performance measurement is systematic data collection and information about:
• What took place
• What outputs were generated
• What near-term outcomes were generated

Performance Measurement:
• Tracks outputs and outcomes on a regular, ongoing basis
• Does not show causality

Evaluation:
• Seeks to show causality
• Longer-term focus
• Uses the most rigorous methodology that is right for the program (often quasi-experimental design)
The most important difference: Evaluation seeks to “prove” the theory of change (X→Y). Performance measurement does not.
Performance measurement can show that the outcome (change) occurred, but not causality (that the change occurred because of the intervention)
Performance measurement does not seek to “prove” a theory of change but can provide evidence that informs your theory
Performance measurement data can inform evaluation efforts
Performance Measurement:
• Individual benchmark assessments on Dynamic Indicators of Basic Early Literacy Skills (DIBELS) three times/year
• State Reading Exam: number of students who graduate from the Minnesota Reading Corps who pass the state reading exam

Evaluation:
• Matched-sample research project in the Minneapolis School District: Reading Corps pre-school participants scored significantly higher in phonemic awareness, alphabetic principle, and total literacy than children in a matched comparison group entering kindergarten
If performance measurement doesn’t prove that my intervention worked, then why do it?
If the evidence for an intervention is strong, PM helps show the program is on track
If the evidence basis is weak or not well-defined, PM can provide evidence that a change occurred
Improve performance
Inform decision making
Demonstrate accountability (internally and externally)
Justify continued funding
Enhance customer service
Improve quality of services
Set targets for future performance
Measuring prevention or long-term outcomes
Time
Cost
Establishing reasonable targets
Brief service interventions
Attributing impact to the intervention
Counts of the amount of service that members or volunteers have completed.
They do not provide information on benefits to or other changes in the lives of members and/or beneficiaries.
Number of students who complete participation in an AmeriCorps education program
Number of veterans engaged in service opportunities
Number of individuals receiving support, services, education and/or referrals to alleviate long-term hunger
Outcomes specify changes that have occurred in the lives of members and/or beneficiaries. They should be:
Realistic
Measurable during the grant period
Relevant to the theory of change
Outcomes measure changes in:
Attitude
Behavior
Condition
Most programs should aim to measure a quantifiable change in behavior or condition.
Applicants are required to create at least one aligned performance measure to capture the output and outcome of their primary service activity.
Note: Applicants may create additional performance measures provided that they capture significant program outcomes.
An aligned performance measure has two components:
Output
Outcome
Alignment refers to whether:
The outcome is logical and reasonable given your intervention and output(s)
The output and outcome measure the same beneficiary
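The two alignment conditions can be sketched as a simple check. The field names, populations, and target numbers below are hypothetical illustrations, not actual eGrants or CNCS fields.

```python
# Hypothetical sketch of the two alignment conditions for an
# aligned output/outcome pair (field names are invented).
def is_aligned(measure):
    """Aligned means: the output and outcome track the same beneficiary
    population, and the outcome count does not exceed the output count
    (you cannot have more students improving than students served)."""
    same_population = measure["output_population"] == measure["outcome_population"]
    plausible_counts = measure["outcome_target"] <= measure["output_target"]
    return same_population and plausible_counts

measure = {
    "output_population": "K-3 students tutored",
    "outcome_population": "K-3 students tutored",
    "output_target": 150,   # students who complete tutoring
    "outcome_target": 100,  # students who improve reading scores
}
print(is_aligned(measure))  # True
```

A measure whose outcome counted a different population (say, parents instead of the tutored students) would fail the first condition.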
Learning Objectives:
Learn how CNCS assesses performance measures
Practice using assessment checklist
Applicants must describe the following theory of change elements:
The problem(s) identified (Need)
The actions that will be carried out by AmeriCorps members and community volunteers (Evidence-Based Intervention)
The ways in which AmeriCorps members are particularly well-suited to deliver the solution (Value Added by AmeriCorps)
The anticipated outcomes (Outcomes)
Measures align with the need, activities and outcomes (theory of change) described in the narrative
Outputs and outcomes are correctly aligned
Measures utilize rigorous methodologies to demonstrate significant outcomes
Choose an intervention that will lead to the specific desired outcomes.
Choose outcomes that measure the effect of the intervention.
Example: Improving academic performance
Intervention: After school enrichment program
Outcome: Improved academic performance in reading
Intervention: Tutoring program focused on helping kindergarten students master the most critical emergent literacy skills
Outcome: Improved academic performance in reading
Intervention: Homework help program focusing on multiple subjects
Outcome: Improved academic performance in reading
Need a clear link between:
The intervention (design, frequency, duration)
The specific change (outcome) that is likely to occur as a result of the intervention
Intervention: AmeriCorps members lead classes to educate smokers about the health risks associated with smoking.
Outcomes: Individuals increase their knowledge of the health risks of smoking. Individuals stop smoking.
Alignment Issue: Simply telling people that smoking is bad for them may not help them to quit.
Intervention: Members provide financial literacy trainings to economically disadvantaged adults.
Outcome: Economically disadvantaged adults will open savings accounts after receiving financial literacy training.
Alignment Issue: If beneficiaries do not have enough money to meet their basic needs, a savings account may not be realistic.
National Measures must be aligned as directed in CNCS guidance
Aligned measure includes output and outcome for primary service activity
Outcomes likely to result from outputs
Outputs and outcomes measure the same population
Output H4: Clients participating in health education programs
Outcome: Community members will decrease costly emergency room visits
Output H5: Youth engaged in activities to reduce childhood obesity
Outcome: Children experience at least an 8% increase in aerobic fitness
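The 8% threshold is simple arithmetic on pre/post fitness scores. The scores below (e.g., PACER-style lap counts) are invented for illustration.

```python
# Hypothetical pre/post aerobic fitness scores for one child;
# the numbers are invented for illustration only.
pre, post = 25, 28

percent_increase = (post - pre) / pre * 100
print(f"{percent_increase:.1f}% increase")  # prints: 12.0% increase
meets_target = percent_increase >= 8  # the 8% outcome threshold
```

In practice the program would apply this calculation per child and then report the number of children meeting the threshold, which keeps the outcome count comparable to the output count.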
Output EN4: Acres of parks improved
Outcome: Acres of park certified as satisfactorily restored by land manager partners
Output EN4: Acres of parks improved
Outcome: Public parks will be cleaner as the result of removing 140,000 pounds of trash and debris
Data collection methods are rigorous
Outcomes capture a significant change. It is helpful to consider:
So what?
Is this change worth measuring?
Do outcomes capture the change you want to accomplish?
Will proposed methods/instruments capture these outcomes?
Are methods rigorous but realistic?
Is there a clear plan/timeline for developing instruments and collecting data?
Proposed methods are not realistic because:
Too ambitious
Can’t get data
Unable to obtain a representative sample
A grantee plans to use a standardized pre/post test but has difficulty administering the test and aggregating the data within the grant period. Would like to measure improvement in grades instead.
A grantee is unable to create a sampling frame that defines the population from which they will sample.
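Part of why aggregating pre/post data within a grant period is hard is that only beneficiaries with both a pre and a post score can count toward the outcome. A minimal sketch, with invented student IDs and scores:

```python
# Hypothetical test scores: only students with BOTH a pre and a post
# score can be counted toward an improvement outcome.
pre_scores = {"s1": 40, "s2": 55, "s3": 38}
post_scores = {"s1": 52, "s3": 37, "s4": 60}  # s2 lacks a post; s4 lacks a pre

matched = set(pre_scores) & set(post_scores)  # students with both scores
improved = [s for s in matched if post_scores[s] > pre_scores[s]]
print(f"{len(improved)} of {len(matched)} matched students improved")
```

Mid-year mobility shrinks the matched set, which is one reason a standardized pre/post design can prove unworkable and tempt a grantee toward a weaker measure like grades.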
Objective vs. Subjective
Not tested ahead of time
Don’t measure what they are supposed to measure (Validity)
Biased
What is a valid way to measure my driving proficiency?
Survey that asks how I feel about driving?
Survey that asks if I think I’m a good driver?
Written Test?
Driving Test?
A survey scale that only measures improvement
A survey that is only returned by individuals who feel strongly
The AmeriCorps State and National Performance Measurement Assessment Checklist:
Alignment with theory of change
Alignment of outputs and outcomes
Quality (Rigorous, Worth Measuring)
Practice using Performance Measurement Checklist
Learning Objectives
Learn tips for entering PMs in eGrants
Understand how the eGrants language is sometimes different from other CNCS language for performance measures
Strategy = Intervention
Result = Output, Intermediate Outcome, or End Outcome
Indicator = A description of the measurable change that will occur (Number of beneficiaries who...)
Target Statement = The indicator plus the expected number (100 beneficiaries will…)
Target – The number in the target statement (100)
Instrument – The specific tool that will be used to collect data (AIMSweb Letter Sounds and Letter Names pre/post test)
Data Collection Methodology – How data will be collected (survey, pre/post test, etc.)
The strategy (intervention) will be the same for all components of the measure (output, intermediate outcome, end outcome) because all of these should result from the same intervention
Within each output or outcome, the result statement, indicator, target statement and target number will seem repetitive:
Result Statement: Students will demonstrate improved academic performance
Indicator: Number of students with improved academic performance
Target Statement: 100 students will demonstrate improved academic performance
Target: 100
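That repetition is easier to see if you notice that all four fields derive from one indicator phrase and one target number. The helper below is a hypothetical sketch of that pattern, not actual eGrants behavior.

```python
# Hypothetical helper showing how the four eGrants fields repeat the
# same indicator phrase and target number (not actual eGrants code).
def egrants_fields(indicator_phrase, target):
    return {
        "result_statement": f"Students will {indicator_phrase}",
        "indicator": f"Number of students who {indicator_phrase}",
        "target_statement": f"{target} students will {indicator_phrase}",
        "target": target,
    }

fields = egrants_fields("demonstrate improved academic performance", 100)
print(fields["target_statement"])
# prints: 100 students will demonstrate improved academic performance
```

Keeping the phrase and number consistent across the four fields is what reviewers check; drafting them from one source, as here, prevents them from drifting apart.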
The data collection methodology is how you will collect the data. For example, administering a standardized test is a method of collecting data.
The instrument is the actual tool that will be used. For example, AIMSweb Letter Sounds and Letter Names Pre/Post test is one standardized test that might be an acceptable instrument.
Resource Center
2012 AmeriCorps NOFO and Performance Measures Instructions
Performance Measurement by Harry Hatry
Stanford Social Innovation Review