
DESCRIPTION

A presentation on the different types of program evaluation

TRANSCRIPT

Page 1: Program evaluation

Program Evaluation

Alex, Ann, Mark, Lisa, Nick, Ethan

Page 2: Program evaluation

Evaluation Video

http://vimeo.com/6765206

Page 3: Program evaluation

Evaluation

• Alternative Views of Evaluation
  – Diverse conceptions: definition? purpose?
• Philosophical & Ideological Differences
  – Objectivism & Subjectivism
  – Utilitarian & Intuitionist-Pluralist Evaluation

Page 5: Program evaluation

Metaphors for Evaluation

• Investigative journalism • Photography • Literary criticism • Industrial production • Sports

Page 6: Program evaluation

Objective-Oriented Evaluation

• Tyler, Metfessel & Michael, Provus, Hammond
• Use: achievement of objectives determines success or failure
• Taxonomy of objectives and measurement instruments
• Assessments: objectives-/criterion-referenced testing (NAEP, accountability systems, NCLB)
• Strengths & limitations: simplicity is both its chief strength and its chief limitation
• Goal-Free Evaluation

Page 8: Program evaluation

Consumer-Oriented Approach

• Typically a summative evaluation approach

• This approach advocates consumer education and independent reviews of products

• Scriven’s contributions, based on the groundswell of federally funded educational programs in the 1960s
  – Differentiation between formative and summative evaluation

Page 9: Program evaluation

What is a consumer-oriented evaluation approach?

• When independent agencies, governmental agencies, and individuals compile information on education or other human services products for the consumer.

• Goal: To help consumers become more knowledgeable about products

Page 10: Program evaluation

For what purposes is it applied?

• Typically applied to educational products and programs
  – Governmental agencies
  – Independent consumer groups
• Educational Products Information Exchange
• To represent the voice and concerns of consumers

Page 11: Program evaluation

How is it generally applied?

• Creating and using stringent checklists and criteria (a minimal scoring sketch follows below)
  – Michael Scriven
  – Educational Products Information Exchange
  – U.S. Dept. of Education
• Program Effectiveness Panel
  – Processes
  – Content
  – Transportability
  – Effectiveness
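As an illustration of how such a checklist might be applied, here is a minimal sketch; all criteria, weights, and ratings are hypothetical assumptions, not drawn from Scriven, EPIE, or the Dept. of Education materials:

```python
# A minimal sketch of checklist-based product evaluation.
# Criteria, weights, and ratings are hypothetical.

criteria_weights = {
    "content accuracy": 0.40,    # assumed relative importance
    "ease of use": 0.25,
    "cost effectiveness": 0.20,
    "support materials": 0.15,
}

# Ratings on a 1-5 scale from an imaginary independent reviewer.
ratings = {
    "content accuracy": 4,
    "ease of use": 3,
    "cost effectiveness": 5,
    "support materials": 2,
}

# Weighted overall rating on the same 1-5 scale (weights sum to 1.0).
overall = sum(w * ratings[name] for name, w in criteria_weights.items())
print(f"Overall weighted rating: {overall:.2f} / 5")  # -> 3.65 / 5
```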

Page 13: Program evaluation

Strengths…

• Has made evaluations of products and programs available to consumers who may not have had the time or resources to do the evaluation themselves

• Increases consumers’ knowledge about using criteria and standards to objectively and effectively evaluate educational and human services products

• Consumers have become more aware of market strategies

Page 14: Program evaluation

…and Weaknesses

• Increases product costs for the consumer: product tests involve time and money, typically passed on to the consumer

• Stringent criteria and standards may curb creativity in product creation

• Concern about rising dependency on outside products and consumer services rather than on local initiative development

Page 15: Program evaluation

Consumer Oriented Promotions

• Goals: Encourage repurchases by rewarding current users, boost sales of complementary products, and increase impulse purchases.

• Coupons—most widely used form of sales promotion.

• Refunds or rebates—help packaged-goods companies increase purchase rates, promote multiple purchases, and reward product users.

Page 16: Program evaluation

More Consumer Oriented Promotions

•  Samples, bonus packs, and premiums—a “try it, you’ll like it” approach.

• Contests—require entrants to complete a task such as solving a puzzle or answering questions in a trivia quiz.

• Sweepstakes—choose winners by chance; no product purchase is necessary.

• Specialty Advertising—sales promotion technique that places the advertiser’s name, address, and advertising message on useful articles that are then distributed to target consumers.

Page 21: Program evaluation

Informal Professional Review System

• State review of district funding programs
• Review of professors for determining rank advancement or tenure status
• Graduate student’s supervisory committee
• Existing structure, no standards, infrequent schedule, experts, status usually affected

Page 22: Program evaluation

Other Approaches

• Ad Hoc Panel Reviews (journal reviews)
  – Funding agency review panels
  – Blue-ribbon panels
  – Multiple opinions, status sometimes affected
• Ad Hoc Individual Reviews (consultant)
  – Status sometimes affected
• Educational Connoisseurship and Criticism
  – Theater, art, and literary criticism

Page 24: Program evaluation

Strengths and Weaknesses

• Strengths: those well-versed make decisions, standards are set, encourage improvement through self-study

• Weaknesses: whose standards? (personal bias), expertise credentials, can this approach be used with issues of classroom life, texts, and other evaluation objects or only with the bigger institutional questions?

Page 25: Program evaluation

Expertise Type Questions

• Which outsiders review your program or organization?

• How expert are they in your program’s context, process, and outcomes?

• What are characteristics of the most/least helpful reviewers?

Page 26: Program evaluation

Participant Oriented Evaluation

• Heretofore, the human element was missing from program evaluation

• This approach involves all relevant interests in the evaluation

• This approach encourages support for representation of marginalized, oppressed and/or powerless parties

Page 27: Program evaluation

Participant Oriented Characteristics

• Depend on inductive reasoning [observe, discover, understand]

• Use multiple data sources [subjective, objective, quant, qual]

• Do not follow a standard plan [process evolves as participants gain experience in the activity]

• Record multiple rather than single realities [e.g., focus groups]

Page 28: Program evaluation

Participant Oriented Examples

• Stake’s Countenance Framework
  – Description and judgment
• Responsive Evaluation
  – Addressing stakeholders’ concerns/issues
  – Case studies describe participants’ behaviors
• Naturalistic Evaluation
  – Extensive observations, interviews, documents, and unobtrusive measures serve as both data and reporting techniques
  – Credibility vs. internal validity (cross-checking, triangulation)
  – Applicability vs. external validity (thick descriptions)
  – Auditability vs. reliability (consistency of results)
  – Confirmability vs. objectivity (neutrality of evaluation)

Page 29: Program evaluation

Participant Oriented Examples

• Participatory Evaluation
  – Collaboration between evaluators & key organizational personnel for practical problem solving
• Utilization-Focused Evaluation
  – Base all decisions on how everything will affect use
• Empowerment Evaluation
  – Advocates for society’s disenfranchised, voiceless minorities
  – Advantages: training, facilitation, advocacy, illumination, liberation
  – Unclear how this approach is a unique participant-oriented approach
  – Some in the evaluation field argue that it is not even “evaluation”

Page 30: Program evaluation

Strengths and Weaknesses

• Strengths: emphasizes human element, gain new insights and theories, flexibility, attention to contextual variables, encourages multiple data collection methods, provides rich, persuasive information, establishes dialogue with and empowers quiet, powerless stakeholders

• Weaknesses: too complex for practitioners (more for theorists), political element, subjective, “loose” evaluations, labor intensive which limits number of cases studied, cost, potential for evaluators to lose objectivity

Page 32: Program evaluation

What's going on in the field?

• Educational Preparation
  http://www.duq.edu/program-evaluation/
• TEA
  http://www.tea.state.tx.us/index2.aspx?id=2934&menu_id=949

Page 34: Program evaluation
Page 35: Program evaluation

http://uncg.edu/

Page 36: Program evaluation

http://ceee.gwu.edu/

Page 37: Program evaluation

District Initiatives

• Houston ISD “Real Men Read”
  http://www.houstonisd.org/portal/site/ResearchAccountability/menuitem.b977c784200de597c2dd5010e041f76a/?vgnextoid=159920bb4375a210VgnVCM10000028147fa6RCRD&vgnextchannel=297a1d3c1f9ef010VgnVCM10000028147fa6RCRD
• Alvin ISD “MHS (Manvel HS) Reading Initiative Program”
  http://www.alvinisd.net/education/staff/staff.php?sectionid=245

Page 38: Program evaluation

What does the research say?

• “Rossman and Salzman (1995) have proposed a classification system for organizing and comparing evaluations of inclusive school programs. They suggest that evaluations be described according to their program features (purpose, complexity, scope, target population, and duration) and features of the evaluation (design, methods, instrumentation, and sample).”

Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.

Page 39: Program evaluation

What does the research say?

• “Twenty-eight school counselors from a large Southwestern school district participated in a program evaluation training workshop designed to help them develop evaluation skills necessary for demonstrating program accountability. The majority of participants expressed high levels of interest in evaluating their programs but believed they needed more training in evaluation procedures.”

Astramovich, R., Coker, J., & Hoskins, W. (2005). Training school counselors in program evaluation. Professional School Counseling, 9, 49-54.

Page 40: Program evaluation

What does the research say?

• Group Interview Questions

“Graduate research assistants conducted group interviews in Grades 2-5 during the final weeks of the school year. We obtained parent permission by asking teachers to distribute informed consent forms to students in their classes, which invited the students to participate in the group interviews. We received informed consent forms from at least 3 students-the criterion number for a group interview at the school-for 21 schools (66% participation rate). If participation rates were high enough, the research assistants conducted separate interviews for Grades 2-3 and 4-5; the assistants conducted 23 interviews. The research assistants tape recorded all interviews, which averaged about 25 min, for data analysis. The interviewer encouraged responses from all group members. Four questions guided the group interviews.”

Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.

Page 41: Program evaluation

What does the research say?

• Survey Collection

“A survey collected teachers' self-reports of the frequency with which they implemented selected literacy activities and the amount of time in minutes that they used the literacy activities. Teachers also reported their level of satisfaction with the literacy resources available to them.”

Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.

Page 42: Program evaluation

Tips on making a survey:

• Make the survey response time around 20 minutes
• Make the survey easy to answer
• Avoid vague questions
  – “What changes should the school board make in its policies regarding the placement of computers in elementary school?” (Is this question effective or vague?)
• Survey questions should clarify the time period
  – “During the past year, has computer use by the average child in your classroom increased, decreased, or stayed the same?”
• Avoid double- (or triple-, or quadruple-) barreled questions and responses (a rough item-screening sketch follows below)
  – “My classroom aide performed his/her tasks carefully, impartially, thoroughly, and on time.”

Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe, Inc.
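As a rough illustration of screening draft items for the two problems above, here is a minimal sketch; the heuristics are simple assumptions of ours, not rules from Langbein (2006):

```python
# A rough, hypothetical screen for double-barreled wording and a
# missing reference period in draft survey items. Heuristics are
# illustrative only.

import re

def flag_item(question: str) -> list:
    """Return warnings for a draft survey question."""
    warnings = []
    # Several attributes joined by commas plus "and" suggests the item
    # asks about more than one thing at once (double-barreled).
    if " and " in question and question.count(",") >= 2:
        warnings.append("possibly double-barreled")
    # No explicit time period such as "during the past year".
    if not re.search(r"past (year|month|week)|during", question, re.I):
        warnings.append("no clear time period")
    return warnings

items = [
    "My classroom aide performed his/her tasks carefully, impartially, "
    "thoroughly, and on time.",
    "During the past year, has computer use by the average child in your "
    "classroom increased, decreased, or stayed the same?",
]
for q in items:
    print(flag_item(q) or ["looks OK"], "-", q[:50] + "...")
```

On this hypothetical data, the first (double-barreled, undated) item draws both warnings, while the second item passes.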

Page 43: Program evaluation

Examples of Evaluation Methods

• One study uses a mixed-methods approach combining objective-oriented, expertise-oriented, and participant-oriented approaches. Evaluations were based on the models provided in “Program Evaluation: Alternative Approaches and Practical Guidelines.”

Page 44: Program evaluation
Page 45: Program evaluation

Cont.

• The purpose of this report is to illustrate the procedures necessary to complete an evaluation of the Naval Aviation Survival Training Program (NASTP). It was written by Anthony R. Artino Jr., a program manager and instructor within the NASTP for eight years.

“In very few instances have we adhered to any particular ‘model’ of evaluation. Rather, we find we can ensure a better fit by snipping and sewing bits and pieces off the more traditional ready-made approaches and even weaving a bit of homespun, if necessary, rather than by pulling any existing approach off the shelf. Tailoring works” (Worthen, Sanders, & Fitzpatrick, p. 183).

Page 46: Program evaluation

Cont.

• Objective-oriented evaluation – This approach was used because (a) the NASTP has a number of well-written objectives; (b) it would be relatively easy to measure student attainment of those objectives using pre- and post-assessments; and (c) the program sponsor, the CNO, would be very interested to know whether the objectives he approves are in fact being met. (A minimal pre/post sketch follows below.)
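As an illustration of that pre/post logic, here is a minimal sketch; the objectives, scores, and the 0.80 mastery cutoff are hypothetical assumptions, not figures from the NASTP report:

```python
# A minimal sketch of pre/post objective attainment: compare trainees'
# scores on each written objective before and after training.
# Objective names, scores, and the cutoff are hypothetical.

MASTERY_CUTOFF = 0.80  # assumed criterion for "objective met"

pre = {   # fraction-correct scores per trainee, keyed by objective
    "objective 1": [0.40, 0.55, 0.35],
    "objective 2": [0.60, 0.50, 0.45],
}
post = {
    "objective 1": [0.90, 0.85, 0.75],
    "objective 2": [0.95, 0.80, 0.85],
}

for obj in pre:
    n = len(pre[obj])
    mean_gain = (sum(post[obj]) - sum(pre[obj])) / n
    mastered = sum(score >= MASTERY_CUTOFF for score in post[obj])
    print(f"{obj}: mean gain {mean_gain:+.2f}, {mastered}/{n} at mastery")
```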

Page 47: Program evaluation

Cont.

• Expertise-oriented evaluation – An outside opinion was obtained from an aviation survival training subject matter expert: someone very familiar with the topics being taught and with the current research literature in survival training.

Page 48: Program evaluation

Cont.

• Participant-oriented evaluation – It was important for evaluators and the SME to be totally immersed in the training environment. This included a focus on audience concerns and issues (i.e., managers, instructors, and students) and an examination of the program “in situ” without any attempt to manipulate or control it (Worthen, Sanders, & Fitzpatrick, 1997).

Page 49: Program evaluation
Page 50: Program evaluation
Page 51: Program evaluation
Page 52: Program evaluation

References

• Astramovich, R., Coker, J., & Hoskins, W. (2005). Training school counselors in program evaluation. Professional School Counseling, 9, 49-54.

• Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.

• Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.

• Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe, Inc.

Page 53: Program evaluation

References

• http://ceee.gwu.edu/
• http://www.alvinisd.net/education/staff/staff.php?sectionid=245
• http://www.austinisd.org/inside/accountability/evaluation/
• http://www.duq.edu/program-evaluation/
• http://www.eval.org/Publications/GuidingPrinciples.asp
• http://www.houstonisd.org/portal/site/ResearchAccountability
• http://www.houstonisd.org/portal/site/ResearchAccountability/menuitem.b977c784200de597c2dd5010e041f76a/?vgnextoid=159920bb4375a210VgnVCM10000028147fa6RCRD&vgnextchannel=297a1d3c1f9ef010VgnVCM10000028147fa6RCRD
• http://www.mcrel.org/topics/Assessment/services/231/
• http://www.rockwood.k12.mo.us/dataquality/Pages/ProgramEvaluations.aspx
• http://www.tea.state.tx.us/index2.aspx?id=2934&menu_id=949
• http://www.uncg.edu/
• http://www2.ccisd.net/Departments/ResearchAccountability/ProgramEvaluation.aspx