TRANSCRIPT
Methods and Implications of Using Methods
Ellen Taylor-Powell
University of Wisconsin-Cooperative Extension
Our time today
• Overview: sources and methods
• Program examples
• Cultural considerations
• Attribution vs. contribution
• Application of evaluation standards
as we think about methods and implications
Process
• Ask questions
• Interactivity
• Share examples
Methods = CHOICES
So many choices, so many decisions
“Developing an evaluation is an exercise of dramatic imagination” (Cronbach, 1982: 239)
Let’s get started by checking ourselves! (Answer each statement with either true or false.)
a. There is one best way to collect data
b. Quantitative methods that collect numbers provide more useful information
c. Evaluation data collection involves any established social science research method
d. We often collect data from program participants
e. We should always collect data from as many participants as possible
Myths
• The choice of method is primarily a technical decision
• There is one best method
• There are established and known standards of what constitutes methodological quality and excellence
• More data is always better
• “Hard” data is better than “soft” data
Where do methods fall in the process of planning an evaluation?
http://www.uwex.edu/ces/pdande/evaluation/
Logic model
INPUTS (program investments) → OUTPUTS (activities, participation) → OUTCOMES (short-, medium-, long-term)
Evaluation questions: What questions do you want to answer?
Evaluation methods: How will you collect the information to answer your questions?
Match evaluation questions and methods to your PROGRAM
Example: What do you (and others) want to know about the program?

INPUTS: Staff, money, partners, research
OUTPUTS – Activities: Assess parent ed programs; design and deliver evidence-based program of 8 sessions; facilitate support groups
OUTPUTS – Participation: Parents of 3-10 year olds attend
OUTCOMES – Short-term: Parents increase knowledge of child development; better understand their own parenting style; gain skills in new ways to parent; identify appropriate actions to take; gain confidence in their abilities
OUTCOMES – Medium-term: Parents use effective parenting practices
OUTCOMES – Long-term: Improved child-parent relations; reduced stress; strong families
Possible evaluation questions, mapped onto the logic model (inputs, process, outcomes, impact):

INPUTS (staff, money, partners, research): What amount of money and time were invested?
ACTIVITIES (assess parent ed programs; design and deliver evidence-based program of 8 sessions; facilitate support groups): Were all sessions delivered? How well? Do support groups meet?
PARTICIPATION (parents of 3-10 year olds attend): Did all parents participate as intended? Who did/did not? Did they attend all sessions? Support groups? Level of satisfaction?
SHORT-TERM OUTCOMES (parents increase knowledge of child development; better understand their own parenting style; gain skills in effective parenting practices; identify appropriate actions to take): To what extent did knowledge and skills increase? For whom? Why? What else happened?
MEDIUM-TERM OUTCOMES (parents use effective parenting practices): To what extent did behaviors change? For whom? Why? What else happened?
LONG-TERM OUTCOMES (improved child-parent relations; reduced stress; strong families): To what extent is stress reduced? Relations improved?
Sources of data
Sources of evaluation information:
• People: youth participants, parents, teachers, volunteers, leaders, judges…
• Pictorial records and observations: before-after photos; observations at events; artwork…
• Existing information: record books, plans of work, logs, journals, meeting minutes…
Data collection methods
• Survey
• Interview
• Focus group
• Observation
• Expert or peer reviews
• Portfolio reviews
• Testimonials
• Tests
• Photographs, videotape, slides
• Diaries, journals, logs
• Document review and analysis
Polling slide… How many use/have used these methods?
Creative methods…
• Creative expression: drawing, drama, role-playing
• Photography, videotape, slides
• Diaries, journals, logs
• Personal stories
• Expert review
• Buzz session
• Affinity diagramming
• ???
“There can be no definitive list of creative evaluation approaches. Such a list would be a contradiction in terms.” (Patton: 346)
Pros and cons of different methods
• Insert slides OR connect to a pdf?
• Example re. choices
Quantitative: numbers, breadth, generalizability
Qualitative: words, depth, specificity
“Not everything that counts can be counted, and not everything that can be counted counts.” (Albert Einstein)
Quantitative information – Qualitative information
Often, it is better to use more than one data collection method…
TRIANGULATION
Why?
Examples
How might you mix sources of information in your evaluation?
How might you mix data collection methods to evaluate your program?
Polling or quiz
EXAMPLE
1. Focus: Whole-farm phosphorus management – 11 Western counties
2. Questions  3. Indicators  4. Timing  5. Data collection (sources, methods, sample, instruments)

Question 1: What did the phosphorus management program actually consist of? Who did what?
• Indicators: #, type of activities implemented (course developed, workshops conducted, on-farm work); #, who, role of partners
• Timing: At time of activity; at time of involvement
• Sources: Staff; staff and partners
• Methods: Recording log/database; log and annual interview
• Sample: All activities; all partners
• Instruments: Need form and system for ongoing recording; need recording form and interview questions

Question 2: Did the expected number of farmers attend the various activities? Who participated in what?
• Indicators: #, key characteristics of participating farmers per activity
• Timing: At time of activity (workshop, field day, on-farm visit)
• Sources: Attendance logs
• Methods: Record review
• Sample: All participants
• Instruments: Need recording form and system for collecting data

Questions 3 and 4: What resulted? To what extent did participating farmers a) increase their knowledge? b) increase skills in tracking P levels? c) adopt recommendations? d) reduce P levels? e) save money? What else happened?
• Indicators: #, % of participants who a) report increased knowledge, b) demonstrate skill, c) report changes in feeding levels, d) record P reductions, e) report $ savings and amount of savings
• Timing: End of each workshop; ongoing; annually – 4th quarter
• Sources: Participants; participants; farmers, staff, partners, and other stakeholders
• Methods: Post-session survey; observations, record review, informal interviews; focus groups
• Sample: All participants; all participants; 5-7 selected in each grouping
• Instruments: Questionnaire TBD; recording logs and questions TBD; develop focus group protocol for each group
DESIGN: Baseline? Comparison group? External contingencies? Other outcomes?
Contribution vs. attribution
We need to accept the fact that what we are doing is measuring with the aim of reducing the uncertainty about the contribution made, not proving the contribution made.
Mayne, 1999:10
Culturally appropriate evaluation methods
• How appropriate is the method given the culture of the respondent/the setting?
• Cultural differences: nationality, ethnicity, religion, region, gender, age, abilities, class, economic status, language, sexual orientation, physical characteristics, organizational affiliation
Is a written questionnaire culturally appropriate?
Things to consider:
• Literacy level
• Tradition of reading, writing
• Setting
• Not the best choice for people with an oral tradition
• Translation (more than just literal translation)
• How cultural traits affect response – response sets
• How to sequence the questions
• Pretest questionnaire may be viewed as intrusive
Are interviews culturally appropriate?
Things to consider:
• Preferred by people with an oral culture
• Language proficiency; verbal skill proficiency
• Politeness – responding to authority (thinking it’s unacceptable to say “no”), nodding, smiling, agreeing
• Need to have someone present
• Relationship/position of interviewer
• May be seen as interrogation
• Direct questioning may be seen as impolite, threatening, or confrontational
Are focus groups culturally appropriate?
Things to consider:
• Issues of gender, age, class, and clan differences
• Issues of pride, privacy, self-sufficiency, and traditions
• Relationship to facilitator as prerequisite to rapport
• Same considerations as for interviews
Is observation culturally appropriate?
Things to consider:
• Discomfort, threat of being observed
• Issue of being an “outsider”
• Observer effect
• Possibilities for misinterpretation
CHALLENGES
• Hard-to-reach populations
• Young children
• When to do follow-up
• Sensitive subject matter
• Reactivity
• Evaluation as an add-on
Insert – polling, quiz…some type of interactivity
Apply the evaluation standards to your methods decisions
• Utility
• Feasibility
• Propriety
• Accuracy
UTILITY
Will the data sources and collection methods serve the information needs of your primary users?

FEASIBILITY
Are your sources and methods practical and efficient? Do you have the capacity, time, and resources? Are your methods non-intrusive and non-disruptive?

PROPRIETY
Are your methods respectful, legal, ethical, and appropriate? Does your approach protect and respect the welfare of all those involved or affected?

ACCURACY
Are your methods technically adequate to:
• answer your questions?
• measure what you intend to measure?
• reveal credible and trustworthy information?
• convey important information?
When choosing methods, consider…
• The purpose of your evaluation – what do you want to know?
• Your use/users – what kind of data will your stakeholders find most credible and useful? (Percents, comparisons, stories, statistical analysis)
• Your respondents – how can they best be reached? How might they best respond?
• Your comfort level
• Level of burden to program or participants
• Pros and cons of each method
• RESOURCES
http://www.uwex.edu/ces/pdande/evaluation/index.html
http://www.uwex.edu/ces/pdande/
http://www.uwex.edu/ces/lmcourse/
Resources
• Ohio State University
• Penn State