A presentation on the different types of program evaluation.
Program Evaluation
Alex, Ann, Mark, Lisa, Nick, Ethan
Evaluation
• Alternative Views of Evaluation
– Diverse conceptions: definition? purpose?
• Philosophical & Ideological Differences
– Objectivism & Subjectivism
– Utilitarian & Intuitionist-Pluralist Evaluation
Evaluation Methodologies
• Methodological Preferences
• Qualitative & Quantitative Methods
Metaphors for Evaluation
• Investigative journalism
• Photography
• Literary criticism
• Industrial production
• Sports
Objective-Oriented Evaluation
• Tyler, Metfessel & Michael, Provus, Hammond
• Use: achievement of objectives determines success or failure
• Taxonomy of objectives and measurement instruments
• Assessments: objectives-/criterion-referenced testing (NAEP, accountability systems, NCLB)
• Strengths & Limitations: simplicity (arguably both its chief strength and its chief limitation)
• Goal-Free Evaluation
Management-Oriented Evaluation
• Context, Input, Process, Product Model (CIPP)
• UCLA Model
• Strengths & Limitations
– Rational & orderly
– Fuels high-level decision makers
– Costly & complex
– Stability vs. need for adjustment
Consumer-Oriented Approach
• Typically a summative evaluation approach
• This approach advocates consumer education and independent reviews of products
• Scriven’s contributions, based on the groundswell of federally funded educational programs in the 1960s
– Differentiation between formative and summative evaluation
What is a consumer-oriented evaluation approach?
• When independent agencies, governmental agencies, and individuals compile information on education or other human services products for the consumer.
• Goal: To help consumers become more knowledgeable about products
For what purposes is it applied?
• Typically applied to educational products and programs
– Governmental agencies
– Independent consumer groups
• Educational Products Information Exchange
• To represent the voice and concerns of consumers
How is it generally applied?
• Creating and using stringent checklists and criteria
– Michael Scriven
– Educational Products Information Exchange
– U.S. Dept. of Education
• Program Effectiveness Panel
– Processes
– Content
– Transportability
– Effectiveness
Consumer Oriented Checklist
• Need
• Market
• Performance
– True field trials [tests in a “real” setting]
– True consumer tests [tests with real users]
– Critical comparisons [comparative data]
– Long term [effects over the long term]
– Side effects [unintended outcomes]
– Process [product use fits its descriptions]
– Causation [experimental study]
– Statistical significance [supports product effectiveness]
– Educational significance
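The checklist above can be sketched as a small scoring tool. This is a minimal illustration only: the criterion keys, the 0-2 rating scale, and the function name are assumptions for the sketch, not part of Scriven's checklist itself.

```python
# Hypothetical sketch: capture the performance criteria from the
# consumer-oriented checklist as keys and tally simple ratings.
CRITERIA = [
    "need", "market", "field_trials", "consumer_tests",
    "critical_comparisons", "long_term", "side_effects",
    "process", "causation", "statistical_significance",
    "educational_significance",
]

def evaluate_product(ratings: dict) -> dict:
    """Ratings run 0-2 per criterion (0 = fails, 1 = partial, 2 = meets).
    Returns the total score, the maximum possible, and unrated criteria."""
    missing = [c for c in CRITERIA if c not in ratings]
    score = sum(ratings.get(c, 0) for c in CRITERIA)
    return {"score": score, "max": 2 * len(CRITERIA), "missing": missing}

# Example: a product rated on only three criteria so far.
result = evaluate_product({"need": 2, "market": 1, "field_trials": 2})
print(result)  # unrated criteria are surfaced rather than silently scored 0
```

Listing the unrated criteria explicitly mirrors the spirit of a stringent checklist: a product is not "passed" simply because some criteria were never examined.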
Strengths…
• Has made evaluations of products and programs available to consumers who may not have had the time or resources to carry out the evaluation process themselves
• Increases consumers’ knowledge about using criteria and standards to objectively and effectively evaluate educational and human services products
• Consumers have become more aware of marketing strategies
…and Weaknesses
• Increases product costs to the consumer: product tests involve time and money, typically passed on to the consumer
• Stringent criteria and standards may curb creativity in product creation
• Concern about growing dependency on outside products and consumer services rather than development of local initiative
Consumer Oriented Promotions
• Goals: Encourage repurchases by rewarding current users, boost sales of complementary products, and increase impulse purchases.
• Coupons—most widely used form of sales promotion.
• Refunds or rebates—help packaged-goods companies increase purchase rates, promote multiple purchases, and reward product users.
More Consumer Oriented Promotions
• Samples, bonus packs, and premiums—a “try it, you’ll like it” approach.
• Contests—require entrants to complete a task such as solving a puzzle or answering questions in a trivia quiz.
• Sweepstakes—choose winners by chance; no product purchase is necessary.
• Specialty Advertising—sales promotion technique that places the advertiser’s name, address, and advertising message on useful articles that are then distributed to target consumers.
Consumer Oriented Type Questions
• What educational products do you use?
• How are purchasing decisions made?
• What criteria seem to be most important in the selection process?
• What other criteria for selection does this approach suggest to you?
Expertise Oriented Evaluation
• Expertise-oriented approaches: the direct application of professional expertise to judge the quality of educational endeavors, especially their resources and processes.
Approaches
• Formal Professional Review System
• Informal Professional Review System
• Ad Hoc Panel Review
• Ad Hoc Individual Review
• Educational Connoisseurship and Criticism
Formal Professional Review System
• Structure or organization established to conduct periodic reviews of educational endeavors
• Published standards
• Pre-specified schedule
• Opinions of several experts
• Impact on status of that which is reviewed
Informal Professional Review System
• State review of district funding programs
• Review of professors for determining rank advancement or tenure status
• Graduate student’s supervisory committee
• Existing structure, no standards, infrequent schedule, experts, status usually affected
Other Approaches
• Ad Hoc Panel Reviews (e.g., journal reviews)
– Funding agency review panels
– Blue-ribbon panels
– Multiple opinions, status sometimes affected
• Ad Hoc Individual Reviews (e.g., a consultant)
– Status sometimes affected
• Educational Connoisseurship and Criticism
– Theater, art, and literary critics
Uses
• Institutional accreditation
• Specialized or program accreditation
• Doctoral exams, board reviews, reappointment/tenure reviews, etc.
Strengths and Weaknesses
• Strengths: those well-versed make decisions, standards are set, encourage improvement through self-study
• Weaknesses: whose standards? (personal bias), expertise credentials, can this approach be used with issues of classroom life, texts, and other evaluation objects or only with the bigger institutional questions?
Expertise Type Questions
• What outsiders review your program or organization?
• How expert are they in your program’s context, process, and outcomes?
• What are characteristics of the most/least helpful reviewers?
Participant Oriented Evaluation
• Heretofore, the human element was missing from program evaluation
• This approach involves all relevant interests in the evaluation
• This approach encourages support for representation of marginalized, oppressed and/or powerless parties
Participant Oriented Characteristics
• Depend on inductive reasoning [observe, discover, understand]
• Use multiple data sources [subjective, objective, quant, qual]
• Do not follow a standard plan [process evolves as participants gain experience in the activity]
• Record multiple rather than single realities [e.g., focus groups]
Participant Oriented Examples
• Stake’s Countenance Framework
– Description and judgment
• Responsive Evaluation
– Addressing stakeholders’ concerns/issues
– Case studies describe participants’ behaviors
• Naturalistic Evaluation
– Extensive observations, interviews, documents, and unobtrusive measures serve as both data and reporting techniques
– Credibility vs. internal validity (cross-checking, triangulation)
– Applicability vs. external validity (thick descriptions)
– Auditability vs. reliability (consistency of results)
– Confirmability vs. objectivity (neutrality of evaluation)
Participant Oriented Examples
• Participatory Evaluation
– Collaboration between evaluators & key organizational personnel for practical problem solving
• Utilization-Focused Evaluation
– Base all decisions on how everything will affect use
• Empowerment Evaluation
– Advocates for society’s disenfranchised, voiceless minorities
– Advantages: training, facilitation, advocacy, illumination, liberation
– Unclear how this approach is a unique participant-oriented approach
– Some in the evaluation field argue it is not even ‘evaluation’
Strengths and Weaknesses
• Strengths: emphasizes the human element; yields new insights and theories; flexibility; attention to contextual variables; encourages multiple data collection methods; provides rich, persuasive information; establishes dialogue with, and empowers, quiet or powerless stakeholders
• Weaknesses: too complex for practitioners (more for theorists); political element; subjective, “loose” evaluations; labor intensive, which limits the number of cases studied; cost; potential for evaluators to lose objectivity
Participant Oriented Questions
• What current program are you involved in that could benefit from this type of evaluation?
• Who are the stakeholders?
What's going on in the field?
• Educational Preparation: http://www.duq.edu/program-evaluation/
• TEA: http://www.tea.state.tx.us/index2.aspx?id=2934&menu_id=949
What's going on in the field?
• Rockwood School District
• Clear Creek ISD
• Educational link posted by Austin ISD
• Houston ISD
• Austin ISD
• http://www.uncg.edu/
• http://ceee.gwu.edu/
District Initiatives
• Houston ISD “Real Men Read”
http://www.houstonisd.org/portal/site/ResearchAccountability/menuitem.b977c784200de597c2dd5010e041f76a/?vgnextoid=159920bb4375a210VgnVCM10000028147fa6RCRD&vgnextchannel=297a1d3c1f9ef010VgnVCM10000028147fa6RCRD
• Alvin ISD “MHS (Manvel HS) Reading Initiative Program”
http://www.alvinisd.net/education/staff/staff.php?sectionid=245
What does the research say?
• “Rossman and Salzman (1995) have proposed a classification system for organizing and comparing evaluations of inclusive school programs. They suggest that evaluations be described according to their program features (purpose, complexity, scope, target population, and duration) and features of the evaluation (design, methods, instrumentation, and sample).”
– Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.
What does the research say?
• “Twenty-eight school counselors from a large Southwestern school district participated in a program evaluation training workshop designed to help them develop evaluation skills necessary for demonstrating program accountability. The majority of participants expressed high levels of interest in evaluating their programs but believed they needed more training in evaluation procedures.”
What does the research say?
• Group Interview Questions
“Graduate research assistants conducted group interviews in Grades 2-5 during the final weeks of the school year. We obtained parent permission by asking teachers to distribute informed consent forms to students in their classes, which invited the students to participate in the group interviews. We received informed consent forms from at least 3 students-the criterion number for a group interview at the school-for 21 schools (66% participation rate). If participation rates were high enough, the research assistants conducted separate interviews for Grades 2-3 and 4-5; the assistants conducted 23 interviews. The research assistants tape recorded all interviews, which averaged about 25 min, for data analysis. The interviewer encouraged responses from all group members. Four questions guided the group interviews.”
– Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
What does the research say?
• Survey Collection
“A survey collected teachers' self-reports of the frequency with which they implemented selected literacy activities and the amount of time in minutes that they used the literacy activities. Teachers also reported their level of satisfaction with the literacy resources available to them.”
– Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
Tips on making a survey:
• Make the survey response time around 20 minutes
• Make the survey easy to answer
– “What changes should the school board make in its policies regarding the placement of computers in elementary school?” Is this question effective or vague?
• Survey questions should clarify the time period
– “During the past year, has computer use by the average child in your classroom increased, decreased, or stayed the same?”
• Avoid double- (or triple-, or quadruple-) barreled questions and responses
– “My classroom aide performed his/her tasks carefully, impartially, thoroughly, and on time.”
• Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe.
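The double-barreled rule above lends itself to a quick automated check. The sketch below is a hypothetical illustration, not a tool from the Langbein text: it crudely counts how many separate things a question asks about by counting coordinating conjunctions and comma-separated descriptors.

```python
# Hypothetical sketch: flag survey questions that likely ask about
# several things at once (double- or multi-barreled questions).
def barrel_count(question: str) -> int:
    """Rough count of distinct things a question asks about,
    using 'and'/'or' and comma-separated descriptors as a crude proxy."""
    q = question.lower().rstrip("?.")
    return 1 + q.count(" and ") + q.count(" or ") + q.count(", ")

single = "Has computer use in your classroom increased during the past year?"
multi = ("My classroom aide performed his/her tasks carefully, "
         "impartially, thoroughly, and on time.")

print(barrel_count(single))  # → 1
print(barrel_count(multi))   # → 5
```

A heuristic like this produces false positives (legitimate compound phrases) and is no substitute for a human review pass, but it can cheaply surface candidates for rewording in a long instrument.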
Examples of Evaluation Methods
• One study uses a mixed-methods combination of objective-oriented, expertise-oriented, and participant-oriented approaches. Evaluations were based on the models provided in “Program Evaluation: Alternative Approaches and Practical Guidelines.”
Cont.
• The purpose of this report is to illustrate the procedures necessary to complete an evaluation of the Naval Aviation Survival Training Program (NASTP). It was written by Anthony R. Artino Jr., a program manager and instructor within the NASTP for eight years.
“In very few instances have we adhered to any particular ‘model’ of evaluation. Rather, we find we can ensure a better fit by snipping and sewing bits and pieces off the more traditional ready-made approaches and even weaving a bit of homespun, if necessary, rather than by pulling any existing approach off the shelf. Tailoring works” (Worthen, Sanders, & Fitzpatrick, p. 183).
Cont.
• Objective-oriented evaluation - This approach was used because (a) the NASTP has a number of well-written objectives; (b) it would be relatively easy to measure student attainment of those objectives using pre- and post-assessments; and (c) the program sponsor, the CNO, would be very interested to know whether the objectives he approves are in fact being met.
Cont.
• Expertise-oriented evaluation - An outside opinion was obtained from an aviation survival training subject matter expert: someone very familiar with the topics being taught and with the current research literature in survival training.
Cont.
• Participant-oriented evaluation - It was important for evaluators and the SME to be totally immersed in the training environment. This included a focus on audience concerns and issues (i.e., managers, instructors, and students) and an examination of the program “in situ” without any attempt to manipulate or control it (Worthen, Sanders, & Fitzpatrick, 1997).
References
• Astramovich, R., Coker, J., & Hoskins, W. (2005). Professional School Counseling, 9, 49-54.
• Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.
• Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
• Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe.
References
• http://ceee.gwu.edu/
• http://www.alvinisd.net/education/staff/staff.php?sectionid=245
• http://www.austinisd.org/inside/accountability/evaluation/
• http://www.duq.edu/program-evaluation/
• http://www.eval.org/Publications/GuidingPrinciples.asp
• http://www.houstonisd.org/portal/site/ResearchAccountability
• http://www.houstonisd.org/portal/site/ResearchAccountability/menuitem.b977c784200de597c2dd5010e041f76a/?vgnextoid=159920bb4375a210VgnVCM10000028147fa6RCRD&vgnextchannel=297a1d3c1f9ef010VgnVCM10000028147fa6RCRD
• http://www.mcrel.org/topics/Assessment/services/231/
• http://www.rockwood.k12.mo.us/dataquality/Pages/ProgramEvaluations.aspx
• http://www.tea.state.tx.us/index2.aspx?id=2934&menu_id=949
• http://www.uncg.edu/
• http://www2.ccisd.net/Departments/ResearchAccountability/ProgramEvaluation.aspx