Chapter 12

Designing and Conducting Summative Evaluation

The evaluation conducted after implementation

Involves _______ data

Collecting
Analyzing
Summarizing

For the purpose of

Giving decision makers information on the effectiveness and efficiency of instruction

Effectiveness of Content

Did instruction solve the problem?
Was the criterion created prior to evaluation?
Was the criterion established in conjunction with the needs assessment?

Specifically

Did learners achieve the objectives?
How did learners feel about the instruction?
What were the costs?
How much time did it take?
Was instruction implemented as designed?
What were the unexpected outcomes?

Alternative Approaches to Summative Evaluation

Objectivism
Subjectivism

Objectivism

Based on empiricism
Answers questions on the basis of observed data
Goal-based and replicable; uses the scientific method

Subjectivism

Employs expert judgment
Includes qualitative methods such as observation and interviews to evaluate content
In “goal-free” evaluation, the evaluators are not told the goals of the instruction

Objectivism (limitations)

Examines only a limited number of factors

May miss critical effects

Subjectivism (limitations)

Not replicable

May be biased by the idiosyncratic experiences and perspectives of the people who do the evaluation

May miss critical effects

Designer’s Role in Summative Evaluation?

Somewhat controversial

Timing of Summative Evaluation?

Not in the first cycle

Summary Diagram

Formative:
Design Reviews
Expert Reviews
One-to-one Eval.
Small Group Eval.
Field Trials
Ongoing Eval.

Summative:
Determine Goals of the Evaluation
Select Orientation
Select Design
Design or Select Evaluation Measures
Collect Data
Analyze Data
Report Results

Goals of the Evaluation

What decisions must be made?
What are the best questions?
How practical is it to gather data?
Who wants the answer to a question?
How much uncertainty?

Orientation of Evaluation

Goal-based or goal-free? A middle ground?
Is a quantitative or qualitative approach appropriate?
Experimental or naturalistic approach?

Select Design of Evaluation

Describes what data to collect, when the data will be collected, and under what conditions

Issues to consider:

How much confidence must we have that the instruction caused the learning? (internal validity)

How important is generalizability? (external validity)

How much control do we have over the instructional situation?

Design or Select Evaluation Measures: Payoff Outcomes

Is the problem solved?
Costs avoided
Increased outputs
Improved quality
Improved efficiency

Design or Select Evaluation Measures (2) Learning Outcomes

Use the instruments you have already developed for the summative evaluation
But measure the entire program
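
To make this measurement step concrete, here is a minimal sketch of how posttest results might be summarized per objective for a summative report; the objective names, scores, and the 80% mastery cutoff are hypothetical, not taken from the chapter.

```python
# Hypothetical posttest results: each learner's proportion correct per objective.
posttest_scores = {
    "Objective 1": [0.90, 0.75, 0.85, 0.60],
    "Objective 2": [0.95, 0.88, 0.70, 0.92],
}
MASTERY_CUTOFF = 0.80  # assumed criterion level

for objective, scores in posttest_scores.items():
    mean_score = sum(scores) / len(scores)
    mastery_rate = sum(s >= MASTERY_CUTOFF for s in scores) / len(scores)
    print(f"{objective}: mean = {mean_score:.2f}, learners at mastery = {mastery_rate:.0%}")
```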

Design or Select Evaluation Measures (3) Attitudes

Rarely the primary payoff goals
Ask about learner attitudes toward the learning, the instructional materials, and the subject matter
Indices of appeal: attention, likeableness, interest, relevance, familiarity, credibility, acceptability, and excitement
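
As one way to report these appeal indices, the sketch below averages learner ratings per index; the 1-5 rating scale and the data are assumptions for illustration only.

```python
# Hypothetical 1-5 learner ratings grouped by index of appeal.
ratings = {
    "attention":   [4, 5, 3, 4],
    "relevance":   [5, 4, 4, 5],
    "credibility": [3, 4, 4, 3],
}

for index, values in ratings.items():
    mean = sum(values) / len(values)
    print(f"{index}: mean rating = {mean:.2f} (n = {len(values)})")
```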

Design or Select Evaluation Measures (4) Level of Implementation

The degree to which the instruction was implemented as designed

Costs
Cost-feasibility
Cost-effectiveness
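
A cost-effectiveness comparison is often expressed as cost per unit of outcome. The sketch below computes cost per point of learning gain for two alternatives; the program names, costs, and gains are hypothetical.

```python
# Hypothetical per-learner costs and mean learning gains for two alternatives.
programs = {
    "Current course":    {"cost_per_learner": 120.0, "mean_gain": 8.0},
    "Redesigned course": {"cost_per_learner": 150.0, "mean_gain": 12.0},
}

for name, p in programs.items():
    ratio = p["cost_per_learner"] / p["mean_gain"]  # cost per point of gain
    print(f"{name}: {ratio:.2f} per point of learning gain")

# At these assumed numbers, the lower ratio marks the more cost-effective option.
```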

Alternative Designs

Instruction, then posttest
Pretest, then instruction, then posttest
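
To illustrate what each design lets you report, the sketch below uses hypothetical percent-correct scores: the posttest-only design yields only a posttest mean, while the pretest-posttest design also supports a mean gain score.

```python
# Hypothetical percent-correct scores for the same group of learners.
pretest  = [45, 50, 55, 40]
posttest = [70, 85, 80, 65]

posttest_mean = sum(posttest) / len(posttest)            # all a posttest-only design gives
gains = [post - pre for pre, post in zip(pretest, posttest)]
mean_gain = sum(gains) / len(gains)                      # extra evidence from the pretest

print(f"Posttest-only evidence: mean posttest = {posttest_mean:.1f}")
print(f"Pretest-posttest evidence: mean gain = {mean_gain:.1f} points")
```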

The Report

Summary
Background: needs assessment, audience, context, program description
Description of evaluation study: purpose of the evaluation, evaluation design, outcomes measured, implementation measures, cost-effectiveness information, analysis of unintentional outcomes

The Report (continued)

Results: outcomes, implementation, cost-effectiveness information, unintentional outcomes

Discussion: causal relationship between program and results; limitations of the study

Conclusion & Recommendations

Summary

Summative evaluation occurs after implementation
Limitations of subjective and objective evaluation
What to include in the report
