TRANSCRIPT
EVIDENCE-BASED PUBLIC HEALTH MODULE 9: Evaluating the Program or Policy
Jennifer Leeman May 15, 2013
Partners and Acknowledgements
Learning Objectives
1. Understand the basic components of program evaluation.
2. Describe the differences and unique contributions of quantitative and qualitative evaluation.
3. Understand the various types of study designs useful in program evaluation.
4. Understand the concepts of measurement validity and reliability.
5. Understand some of the advantages and disadvantages of various types of data.
6. Understand some of the steps involved in conducting qualitative evaluations.
QUESTION
In intervention planning, when should you begin planning an evaluation?
A. When you begin planning the intervention
B. Before implementing the intervention
C. After the intervention has been implemented
D. When the intervention is complete
Should you…
• Do your own evaluation?
OR
• Hire an external evaluator?
Look for opportunities to evaluate and improve your work - Ask yourselves:
• What do we want to accomplish?
• What will we need to do to get there?
• Are we doing what we planned to do? (process evaluation)
• Are we accomplishing what we hoped to accomplish? (outcome evaluation)
Look for opportunities to evaluate and improve your work
• Engage in Plan, Do, Study, Act (PDSA) improvement projects (Study = Evaluation)
• Create a plan to evaluate a program
[PDSA cycle diagram: Plan → Do → Study → Act]
One of the most significant challenges you face in evaluation
• Comprehensive evaluation versus nothing at all
• Don't make the perfect the enemy of the good
All evaluations are a balance
• What findings are most important to stakeholders? (Utility)
• What evaluation approaches are feasible and accurate? (Feasibility, Accuracy)
CDC FRAMEWORK FOR EVALUATION

CDC Framework for Evaluation
Steps:
1. Engage stakeholders
2. Describe the program
3. Focus the evaluation design
4. Gather credible evidence
5. Justify conclusions
6. Ensure use and share lessons learned
Standards: Utility, Feasibility, Propriety, Accuracy
Step 1 – Engage stakeholders
[CDC Framework diagram, highlighting Step 1: Engage stakeholders]

ENGAGING STAKEHOLDERS
Stakeholders: Who? and Why?
• People who
▫ see the benefits
▫ oppose
▫ would implement
▫ will decide
Who are possible stakeholders in an ordinance to limit second-hand smoke exposure in public parks?
• Who might benefit?
• Who would implement?
• Who would decide whether to support the intervention?
• Who might oppose?
Who are possible stakeholders in a clinic intervention to include tobacco use as a vital sign?
• Who might benefit?
• Who would implement?
• Who would decide whether to continue the intervention?
• Who might oppose?
Engage stakeholders
• Because they will help you:
▫ identify questions that need answers
▫ access data to implement your evaluation
▫ demonstrate credibility and transparency
▫ build a market for your findings
DESCRIBE THE PROGRAM

Step 2 – Create a logic model to describe the program
[CDC Framework diagram, highlighting Step 2: Describe the program]
Example Logic Model: Ordinance to limit exposure to second-hand smoke in public recreational areas
(The Center for Tobacco Policy & Organizing)

Inputs:
• Police department
• Elected officials
• Stakeholder partners

Activities:
• Engage stakeholders
• Communicate the ordinance
• Educate on the dangers of second-hand smoke
• Implement the ordinance (no-smoking signs, SOP for violations)
• Work with police and others to enforce
• Work with local officials to support

Outputs:
• Stakeholders engaged
• Public awareness of the ordinance
• PSA on hazards of second-hand smoke
• Parks with signs
• SOPs written
• Enforcement per SOP
• Support from local officials

Outcomes:
• Short term: reduced second-hand smoke in parks; increased knowledge of second-hand smoke dangers
• Intermediate term: changed norms on smoking in recreational areas; fewer smokers; fewer asthma attacks
• Long term: reduced lung and cardiovascular disease and cancer; return on investment (ROI)

(Diagram columns: Inputs → Activities → Outputs → Outcomes)
FOCUS THE EVALUATION
Focus the evaluation: questions and indicators
[CDC Framework diagram, highlighting Step 3: Focus the evaluation design]
Types of Evaluation
[Diagram: Inputs → Activities → Outputs → Outcomes; process evaluation covers inputs through outputs, outcome evaluation covers outcomes]
Process Evaluation
• Asks:
▫ Are all activities being implemented as planned?
▫ Is the intervention feasible and acceptable?
▫ Are you reaching intended recipients?
• Provides shorter-term feedback on intervention implementation, content, methods, participant response, and practitioner response
• Shows what is working and what is not working
Process evaluation helps to unravel the “Black Box”
Process Evaluation: Evidence-based programs and fidelity
• Fidelity: faithfulness to the core elements of the program, in the way it was intended to be delivered

Fidelity: Core Elements
• Core elements*:
▫ required components that represent the theory and internal logic of the intervention and most likely produce the intervention's effectiveness

*Eke, Neumann, Wilkes, Jones. Preparing effective behavioral interventions to be used by prevention providers: the role of researchers during HIV prevention research trials. AIDS Education & Prevention. 2006;18(4 Suppl A):44-58.
Body and Soul Program Core Elements
• Project Committee
• Kick-off event
• At least 3 church-wide nutrition events
• At least 1 additional event involving the pastor
• At least 1 church food habit change (policy change)
• 2 motivational counseling calls by volunteers to each participant
Outcome Evaluation
• Asks:
▫ Has the intervention been successful in achieving intended outcomes?
• Outcome evaluation is also called summative evaluation
• Evaluation of longer-term outcomes is also called impact evaluation
Activity: Starting the Evaluation Work Plan
• Look at the smoke-free ordinance logic model. Write at least two process evaluation questions.
– Process questions ask: what is working or not working? How? Why?
• Write at least one outcome evaluation question.
– Outcome questions ask how successful you were in achieving outcomes.
Example Evaluation Questions: Smoke-free Ordinance

Process Evaluation:
• Are our key stakeholders engaged?
• Are the people who use the parks aware of the new policy?
• Are no-smoking signs posted and visible?
• Are police enforcing the policy?

Short-term Outcome Evaluation:
• Are people more aware of the dangers of second-hand smoke?
• Is there less exposure to second-hand smoke in parks?

Intermediate Outcome Evaluation:
• Is the proportion of people who smoke decreasing?
• Have people's attitudes towards smoking in parks changed?
• Is there a lower incidence of asthma attacks?

Long-term Outcome Evaluation:
• Is there a lower prevalence of cardiac and pulmonary disease?
Focus the evaluation: questions and indicators
[CDC Framework diagram, highlighting Step 3: Focus the evaluation design]
Evaluation Designs
• Exploratory vs. explanatory (cause-effect)
• Design types
▫ Descriptive
▫ Case study
▫ One group with pre-test / post-test
▫ Pre-test / post-test with a nonrandomized control group
▫ Time series
▫ True experimental design (RCT)
See Evaluation Design Handout
Evaluation methods
True experimental designs are not common in practice, but there are lots of other options:
▫ Example: Recruit participants, collect baseline data, implement the intervention, and then collect follow-up data.
▫ Example: Select two neighborhoods that are similar on several indicators. Implement the intervention in only one neighborhood, but collect data at baseline and follow-up in both (a minimal analysis sketch follows below).
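The second example lends itself to a simple difference-in-differences comparison: subtract the comparison neighborhood's change from the intervention neighborhood's change. Here is a minimal Python sketch with entirely hypothetical counts; the metric and numbers are illustrative assumptions, not data from this module.

# Hypothetical pre/post counts for a nonrandomized two-neighborhood design.
# Numbers are invented (e.g., smokers observed per 1,000 park visits).
intervention = {"baseline": 412, "follow_up": 298}
comparison = {"baseline": 405, "follow_up": 397}

change_intervention = intervention["follow_up"] - intervention["baseline"]
change_comparison = comparison["follow_up"] - comparison["baseline"]

# Difference-in-differences: the change in the intervention neighborhood
# beyond the background change seen in the comparison neighborhood.
did = change_intervention - change_comparison
print(f"Intervention change: {change_intervention}")   # -114
print(f"Comparison change: {change_comparison}")       # -8
print(f"Difference-in-differences estimate: {did}")    # -106

The comparison neighborhood absorbs background trends (for example, a secular decline in smoking), which is what makes this design stronger than a one-group pre-test/post-test.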
Evaluation versus research

Evaluation:
• Controlled by stakeholders
• Flexible design
• Ongoing
• Used to improve programs

Research:
• Controlled by investigator
• Tightly controlled design
• Specific timeframe
• Used to further knowledge
GATHER CREDIBLE EVIDENCE

Gather credible evidence
[CDC Framework diagram, highlighting Step 4: Gather credible evidence]
Gather Credible Evidence
[Diagram: Evaluation questions → Indicators → Methods and Measures (not the other way around)]
• Work with stakeholders to identify the most useful and feasible indicators of success, drawing on the planning logic model, SMART objectives, and implementation plan
• Create an evaluation plan to assess those indicators
Indicators
• Operationalize your evaluation questions' key concepts into measurable indicators
• Are police enforcing the policy? (process)
• Has the ordinance reduced exposure to second-hand smoke in parks? (outcome)

Are police enforcing the policy? (process)
▫ How do we define and operationalize "enforcing"?
▫ Over what time period?
Example indicators:
• Number of warnings and citations police issued to smokers in parks in the first three months following passage of the ordinance
• Police knowledge of and agreement with the procedures for warning and citing smokers three months following passage of the ordinance
Has the ordinance reduced exposure to second-hand smoke in parks? (outcome)
• Changes in the number of individuals smoking on park grounds at three months following the ordinance as compared to before
• Changes in the number and characteristics of individuals exposed to second-hand smoke three months following the ordinance as compared to before
Gather Credible Evidence
[Diagram: Evaluation questions → Indicators → Methods and Measures]
VALIDITY AND RELIABILITY (EVALUATION “THREATS”)
What is validity?
Measurement Issues
Evaluation "threats"
▫ Validity: Is the instrument or design measuring exactly what was intended?
Measurement Issues
• Validity: best available approximation to the "truth"
• Internal validity: the measured effects are really attributable to the intervention
Measurement Issues
Example
▫ Validity: Self-reported rate of smoking among pregnant women compared with cotinine validation
What is reliability?
Measurement Issues
Reliability
▫ Is the measurement being conducted consistently?

Measurement Issues
Example
▫ Reliability: Test-retest data on self-reported smoking rates among women; audits by two different staff members report the same findings
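Test-retest reliability is often summarized with a simple correlation between the two administrations. A minimal sketch, assuming hypothetical self-reported cigarettes-per-day values (invented for illustration, not data from this module):

# Test-retest reliability: correlate two administrations of the same
# self-report question to the same respondents (illustrative numbers).
from statistics import correlation  # available in Python 3.10+

test1 = [10, 0, 5, 20, 3, 8, 0, 12]   # first administration
test2 = [9, 0, 6, 18, 3, 10, 1, 12]   # same respondents, two weeks later

r = correlation(test1, test2)
print(f"Test-retest correlation: {r:.2f}")  # values near 1.0 suggest consistent measurement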
Data methods and sources - Process Evaluation
• Number of warnings and citations police issued to smokers in parks in the first three months following passage of the ordinance → analyze existing data (police department records; a tallying sketch follows below)
• Police knowledge of and agreement with the procedures for warning and citing smokers three months following passage of the ordinance → qualitative methods (interviews with police officers)

Data methods and sources - Outcome Evaluation
• Changes in the number of individuals smoking on park grounds at three months following the ordinance → observation (count cigarette butts on grounds)
• Changes in the number and characteristics of individuals exposed to second-hand smoke three months following the ordinance → existing data (park records on usage); observation (people in parks)
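For the first process indicator above, "analyze existing data" can be as simple as filtering a records export to the three-month window and counting actions. A minimal sketch, assuming a hypothetical CSV of police department records; the file name, column names, and ordinance date are all invented for illustration.

# Tally warnings and citations from a hypothetical police records export.
import pandas as pd

records = pd.read_csv("police_park_enforcement.csv", parse_dates=["date"])

# Indicator window: first three months following passage of the ordinance.
ordinance_date = pd.Timestamp("2013-01-01")  # hypothetical passage date
window = records[
    (records["date"] >= ordinance_date)
    & (records["date"] < ordinance_date + pd.DateOffset(months=3))
]

# Counts of warnings vs. citations issued in the window.
print(window["action"].value_counts())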
Identifying methods/sources for Credible Evidence
• Link each indicator to a data source
• Look for existing data
▫ If there are no data available – collect your own
• Look for existing data collection tools and adapt
▫ If there are no existing tools – create your own
• Pilot test all tools
• Create a data collection plan
Gather Credible Evidence - Use Existing Data
Sources of existing data
• Patient or client records
• Surveillance data
• Other survey data
• Other??
Sources of Existing Data - Environments
• USDA Food Environment Atlas
• USDA Food Desert Locator
• USDA SNAP Data System maps
• Community Commons & Childhood Obesity GIS
Challenges with Using Existing Data
• May not cover your specific population/location
• Might not ask the "right" questions
• Hard to attribute changes in data to your initiative
• Long time required to see changes
Gather Credible Evidence - Methods
• Existing databases (e.g., surveillance data)
• Tracking logs and inventories
• Print, online, phone, or in-person surveys
• Qualitative interviews or conversations
• Observations or audits
• GIS
Gather Credible Evidence - Use Tracking Logs or Inventories
• Ongoing or retrospective review of records
• Useful for Process Evaluation
▫ Attendance at events
▫ Completion of a series of steps (reminder to get colorectal cancer screening, completion of screening, schedule diagnostic test, etc.)
▫ Distribution of promotional materials (dates, number distributed, to whom)
Strengths and Challenges with Tracking Logs and Inventories
• Strengths: Low cost and straightforward
• Challenges: Missing and incomplete data
▫ Combine with qualitative data
– Note what did and did not work, in addition to tracking attendance
– Note reasons why an activity did not occur as planned
Gather Credible Evidence - Use Survey Methods
• Closed-ended questions
▫ Mailed, handed, emailed, or administered by phone or in person
▫ To the priority population, stakeholders, or those implementing the intervention
Survey Examples
• Handed to participants at end of group training to assess satisfaction and changes to attitudes and knowledge
• Mailed to parents to assess changes in students’ eating behaviors
• Emailed to nutrition directors to assess if and how new standards were implemented at school
Challenges - Surveys
• Collecting accurate data. Respondents may not:
▫ interpret questions as intended
▫ accurately self-report
• Strengthen your methods by:
▫ using existing surveys that have been assessed
▫ drawing questions from existing surveys (e.g., BRFSS)
▫ pilot testing in the target population
▫ assessing accuracy (compare self-reported and actual implementation)
▫ assessing interpretation of survey questions
▫ partnering with survey experts
Challenges - Surveys
• Sampling errors: Participants may not be:
▫ representative of the priority population, stakeholders, or implementers
▫ similar in the pre- and post-test samples
• Strengthen your methods:
▫ Assess and report the % who responded (a quick sketch follows below)
▫ Assess and report differences between those who did and did not respond
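Both checks address worry about nonresponse bias. A minimal sketch with invented numbers; the counts and the age comparison are illustrative assumptions, not data from this module.

# Report the response rate and compare respondents to nonrespondents
# on a characteristic known for everyone on the mailing list.
surveys_sent = 500
surveys_returned = 210

response_rate = surveys_returned / surveys_sent
print(f"Response rate: {response_rate:.0%}")  # 42%; report this with your findings

# Hypothetical mean ages from the mailing list: a large gap flags
# possible nonresponse bias worth reporting alongside results.
mean_age_respondents = 46.2
mean_age_nonrespondents = 38.7
print(f"Respondent vs. nonrespondent age gap: "
      f"{mean_age_respondents - mean_age_nonrespondents:.1f} years")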
Gather Credible Evidence - Use Qualitative Methods
• Interviews or conversations with key informants
▫ Questions are open-ended
▫ Conducted individually or in groups of similar people
▫ Can include photovoice and concept mapping
• Ask participants in your intervention for feedback on barriers and facilitators to participation
• Ask community members for input on an intervention you are planning to implement
• Ask staff about their experience delivering an intervention
Strengths and Challenges - Qualitative Methods
• Strengths
▫ Reveal the unexpected
▫ Offer insight into individual/group experiences
▫ Explore more deeply
• Challenges
▫ Findings are exploratory and preliminary
▫ Challenging to schedule
▫ Analysis can be daunting and time consuming
▫ Findings may be viewed with skepticism
Qualitative Methods - Strengthen Your Data
• Report the number of individual interviews or focus groups that raised an issue (a tallying sketch follows below)
• Have two independent individuals identify themes in interviews and then meet to compare findings
• Identify and organize themes by question
• Use quotations from interviews
• Link to quantitative findings
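Reporting how many interviews raised each issue is a simple tally once themes are coded. A minimal sketch, with hypothetical theme codes invented for illustration:

# Count how many interviews raised each coded theme.
from collections import Counter

# Each inner list holds the themes coded in one interview (hypothetical codes).
interviews = [
    ["signs_not_visible", "supports_policy"],
    ["supports_policy"],
    ["enforcement_unclear", "signs_not_visible"],
    ["enforcement_unclear", "supports_policy"],
]

# set() ensures a theme counts once per interview, even if coded repeatedly.
counts = Counter(theme for themes in interviews for theme in set(themes))
for theme, n in counts.most_common():
    print(f"{theme}: raised in {n} of {len(interviews)} interviews")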
Gather Credible Evidence - Observation Data and Audits
• Onsite visits to document activities and environments
▫ Often uses a checklist or audit form
▫ Data can be collected at a random sample of sites to increase feasibility
▫ May use GIS technology
Observation and Audit Methods - Strengthen Your Data
• Do formative work to develop audit tools
• Engage stakeholders to review and pilot
• Assess inter-rater reliability
▫ Have 2 auditors do independent audits and compare their findings
▫ Update instructions and protocols to reduce discrepancies
▫ Aim for 80% agreement or better (a quick way to compute this is sketched below)
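Percent agreement is straightforward to compute from the two auditors' item-level results; Cohen's kappa additionally corrects for chance agreement. A minimal sketch with invented yes/no checklist data; the 80% target comes from the slide, everything else is an illustrative assumption.

# Inter-rater reliability for two auditors' yes/no checklist items.
rater1 = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
rater2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes"]

matches = sum(a == b for a, b in zip(rater1, rater2))
agreement = matches / len(rater1)
print(f"Percent agreement: {agreement:.0%}")  # aim for 80% or better

# Cohen's kappa corrects for the agreement expected by chance alone.
p_yes1 = rater1.count("yes") / len(rater1)
p_yes2 = rater2.count("yes") / len(rater2)
p_chance = p_yes1 * p_yes2 + (1 - p_yes1) * (1 - p_yes2)
kappa = (agreement - p_chance) / (1 - p_chance)
print(f"Cohen's kappa: {kappa:.2f}")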
Gathering Credible Evidence - Tools and Measures
• Look for evaluation tools used by others for similar interventions
▫ Center TRT
▫ RTIPs
▫ Research publications
• Pull questions from surveillance data collection tools (e.g., BRFSS)
• Look at databases of tools and measures (e.g., NCCOR)
• Look at CDC resources
Gathering Credible Evidence - Tools and Measures
• Contact State Center for Health Statistics
• Contact your state program staff
• Talk to your colleagues
• Contact NC Institute of Public Health (GIS)
Completing the Evaluation Work Plan
• List potential data collection methods you could use to answer your evaluation questions.
• Consider the strengths and weaknesses of your data collection methods.
• Look at your evaluation questions.
• Identify the types of data you can review to answer your question. You may need to review existing data or develop new data collection tools.
JUSTIFY CONCLUSIONS

Justify conclusions
[CDC Framework diagram, highlighting Step 5: Justify conclusions]
Program Standards
• Compare evaluation results to a benchmark, such as
▫ Needs of participants
▫ Community values, expectations or norms
▫ Program objective
▫ Program protocol or procedure
▫ Performance by similar interventions or policies
▫ Performance by a control or comparison group
Judgments
• Statements about the merit of the intervention
▫ Based on comparing evaluation findings against one or more selected standards
• Intervention is successful if
▫ Objectives are achieved
▫ Results are comparable with or better than those of similar programs
▫ Multiple methods show similar results
ENSURE USE & SHARE LESSONS LEARNED
Ensure use and share lessons learned
Standards Utility
Feasibility Propriety Accuracy
Ensure use and share
lessons learned
Justify conclusions
Gather credible evidence
Focus the evaluation design
Describe the program
Engage stakeholders
Steps Step 6
Using/Sharing Findings
• Identify audience
• Tailor message and materials to that audience
• Time dissemination to maximize impact
Plan message content & format
• Determine the purpose of communication
• Analyze audience
• Develop a preliminary message
NCI, NIH Making Data Talk
Evaluation Purpose → Audience
• Improve existing program → Implementation team
• Sustain momentum → Implementation team
• Promote your organization → General public
• Garner support → Decision makers/funders
• Demonstrate accountability → Decision makers/funders
• Contribute to evidence base for practice → Peers
Tailor to audience to ensure:
• Relevance (timely, important, and actionable)
• Legitimacy (credibility)
• Accessibility (formatting and availability)
Tailoring requires knowing your audience
Get feedback on message
• Content (relevance)
• Format (credibility and accessibility)
• Distribution channels (credibility and accessibility)
• More piloting!
Audience → Formats
• Implementation team → Bulletin board display; internal newsletter; brief report with 1-page highlights at staff meeting
• General public → Media coverage; external newsletter
• Decision makers/funders → 2-page issue brief
• Peers → Professional conference; publication
Message: Scientists vs. Lay Audiences
• Sources & definitions of evidence: Scientist - narrow; Lay - broad
• Belief in rational decision making: Scientist - strong; Lay - variable
• Acceptance of uncertainty: Scientist - high; Lay - low
• Quantitative and science literacy: Scientist - high; Lay - low
• Ability/interest to review extensive data: Scientist - high; Lay - low
NCI, NIH Making Data Talk
Tips for Presenting Audience-Friendly Data
• Avoid technical terms (e.g., cohort)
• Avoid difficult math concepts (e.g., relative risk)
• Focus on the main message (instead of detailed arguments)
• Explain the impact of data (make the data's relevance clear)
• Present in a way that gains attention
NCI, NIH Making Data Talk
Avoid overly complex graphics
Incorporate the following:
• Simple graphics
• Stories
• Photos (with permissions)
• Quotations
• Summary statistics
• Maps
Priority Area – Reproductive Health and Pregnancy Outcomes
What's Great?
• Vance teen pregnancy rates are down by 13% since 2010.
• Vance ranks eighth in the state, its lowest rank in over 20 years.
• Granville teen pregnancy rates are down by 24% since 2010.
What's Not So Good?
North Carolina Prevention Partners
THANK YOU!!!