Program Evaluation

Joy Anne R. Puazo & Marie Buena S. Bunsoy

Upload: yen-bunsoy

Post on 29-Nov-2014


DESCRIPTION

This presentation tackles the following information:
◦ Approaches to Program Evaluation
◦ Three Dimensions that Shape Point of View on Evaluation
◦ Doing Program Evaluation
◦ Program Components as Data Sources

Reference: The Elements of Language Curriculum (A Systematic Approach to Program Development) by James Dean Brown, University of Hawaii at Manoa
Reporters: Joy Anne R. Puazo & Marie Buena S. Bunsoy
Program: Bachelor in Secondary Education, Major in English
Year: 4th
Instructor: Mrs. Yolanda D. Reyes
Subject: Language Curriculum for Secondary Schools

TRANSCRIPT

Page 1: Program evaluation

Program Evaluation

Joy Anne R. Puazo

Marie Buena S. Bunsoy

Page 2: Program evaluation

Evaluation

“The systematic gathering of information for purposes of making decisions.”

- Richards et al. (1985, p. 98)

Page 3: Program evaluation

Evaluation

“Systematic educational evaluation consists of a formal assessment of the worth of educational phenomena.”

- Popham (1975, p. 8)

Page 4: Program evaluation

Evaluation

“Evaluation is the determination of the worth of a thing. It includes obtaining information for use in judging the worth of the program, product, procedure, or object, or the potential utility of alternative approaches designed to attain specified objectives.”

- Worthen and Sanders (1973, p. 19)

Page 5: Program evaluation

Evaluation

“Evaluation is the systematic collection and analysis of all relevant information necessary to promote the improvement of the curriculum and assess its effectiveness within the context of the particular institutions involved.”

- Brown

Page 6: Program evaluation

Evaluation

Testing
◦ Procedures that are based on tests, whether they be criterion-referenced or norm-referenced in nature

Measurement
◦ Includes testing, as well as other types of measurement that result in quantitative data, such as attendance records, questionnaires, and teacher-student ratings

Page 7: Program evaluation

Evaluation

Evaluation
◦ More qualitative in nature: case studies, classroom observations, meetings, diaries, and conversations

Page 8: Program evaluation

Approaches to Program Evaluation

Product-oriented approaches
◦ Focus: goals and instructional objectives
◦ Tyler, Hammond, and Metfessel and Michael

Page 9: Program evaluation

Product-oriented approaches

Programs should be built on explicitly defined goals, specified in terms of the society, the students, and the subject matter, as well as on measurable behavioral objectives.

Purpose: To determine whether the objectives have been achieved, and whether the goals have been met.

- Tyler

Page 10: Program evaluation

Product-oriented approaches

Five steps to be followed in performing a curriculum evaluation (Hammond, in Worthen and Sanders 1973, p. 168):

1. Identifying precisely what is to be evaluated
2. Defining descriptive variables
3. Stating objectives in behavioral terms
4. Assessing the behavior described in the objectives
5. Analyzing the results and determining the effectiveness of the program

Page 11: Program evaluation

Product-oriented approaches

Eight major steps in the evaluation process (Metfessel and Michael, 1967):

1. Direct and indirect involvement of the total school community

2. Formation of a cohesive model of broad goals and specific objectives

3. Transformation of specific objectives into communicable form

Page 12: Program evaluation

Product-oriented approaches

4. Instrumentation necessary for furnishing measures allowing inferences about program effectiveness
5. Periodic observation of behaviors
6. Analysis of data given by status and change measures
7. Interpretation of the data relative to specific objectives and broad goals
8. Recommendations culminating in further implementations, modifications, and revisions of broad goals and specific objectives

Page 13: Program evaluation

Approaches to Program Evaluation

Static-Characteristic Approaches
◦ Conducted by outside experts who inspect a program by examining various records

Accreditation
◦ The process whereby an association of institutions sets up criteria and evaluation procedures for the purposes of deciding whether individual institutions should be certified as members in good standing of that association

Page 14: Program evaluation

Static-Characteristic Approaches

“A major reason for the diminishing interest in accreditation conceptions of evaluation is the recognition of their almost total reliance on intrinsic rather than extrinsic factors. Although there is some intuitive support for the proposition that these process factors are associated with the final outcomes of an instructional sequence, the scarcity of empirical evidence to confirm the relationship has created growing dissatisfaction with the accreditation approach among educators.”

- Popham (1975, p. 25)

Page 15: Program evaluation

Approaches to Program Evaluation

Process-Oriented Approaches (Scriven and Stake)

Scriven’s Model / Goal-free evaluation
◦ No limits are set on the study: the evaluation is not restricted to the expected effects of the program vis-à-vis its goals

Page 16: Program evaluation

Process-Oriented Approaches

Countenance model (Stake, 1967)

1. Begin with a rationale
2. Fix on descriptive operations
3. End with judgmental operations at three levels: antecedents, transactions, and outcomes

Page 17: Program evaluation

Approaches to Program Evaluation

Decision-Facilitation Approaches
◦ Evaluators attempt to avoid making judgments
◦ Instead, they gather information that will help the administrators and faculty in the program make their own judgments and evaluations
◦ Examples: CIPP, CSE, Discrepancy model

Page 18: Program evaluation

Decision-Facilitation Approaches

CIPP (Context, Input, Process, Product)

Four key elements in performing program evaluation (Stufflebeam, 1974):

1. Evaluation is performed in the service of decision making, hence it should provide information that is useful to decision makers.

2. Evaluation is a cyclic, continuing process and therefore must be implemented through a systematic program.

3. The evaluation process includes three main steps: delineating, obtaining, and providing information (the methodology).

4. The delineating and providing steps in the evaluation process are interface activities requiring collaboration.

Page 19: Program evaluation

Decision-Facilitation Approaches

CSE (Center for the Study of Evaluation)

Five different categories of decisions (Alkin, 1969):

1. System assessment
2. Program planning
3. Program implementation
4. Program improvement
5. Program certification

Page 20: Program evaluation

Decision-Facilitation Approaches

Discrepancy Model (Provus, 1971)

“Program evaluation is the process of (1) defining program standards; (2) determining whether a discrepancy exists between some aspect of program performance and the standards governing that aspect of the program; and (3) using discrepancy information either to change performance or to change program standards.”

Page 21: Program evaluation

Decision-Facilitation Approaches

Five stages showing that the discrepancy model is a process-oriented approach:

1. Program description stage
2. Program installation stage
3. Treatment adjustment stage
4. Goal achievement analysis stage
5. Cost-benefit analysis

Page 22: Program evaluation

Three Dimensions that Shape Point of View on Evaluation:

1. Formative vs. Summative
2. Process vs. Product
3. Quantitative vs. Qualitative

Page 23: Program evaluation

Three Dimensions

Purpose of Information

Formative evaluation
◦ Occurs during the ongoing curriculum development process
◦ Purpose: to collect and analyze information that will help in improving the curriculum

Summative evaluation
◦ Occurs at the end of a program
◦ Purpose: to determine the degree to which the program is successful, efficient, and effective

Page 24: Program evaluation

Weakness of Summative Evaluation

Most language programs are continuing institutions that do not conveniently come to an end so that such an evaluation can be performed.

Page 25: Program evaluation

Benefits of Summative Evaluation

Identifies the successes and failures of the program

Provides an opportunity to stand back and consider what has been achieved in the longer view

Combining formative and summative evaluation:
◦ Can put the program and its staff in a strong position for responding to any crises that might be brought on by evaluation from outside the program

Page 29: Program evaluation

Three Dimensions

Types of Information

Process evaluation
◦ Focuses on the workings of a program

Product evaluation
◦ Focuses on whether the goals of the program are being achieved

Page 30: Program evaluation

Three Dimensions

Types of Data and Analyses

Quantitative data
◦ Countable bits of information, usually gathered using measures that produce results in the form of numbers

Qualitative data
◦ Consist of more holistic information based on observations

Page 31: Program evaluation

Doing Program Evaluation

Page 32: Program evaluation

Instruments and Procedures

                           Quantitative   Qualitative
Existing information
  Records analysis              X              X
  Systems analysis              X              X
  Literature review                            X
  Letter writing                               X

Page 33: Program evaluation

Instruments and Procedures

                           Quantitative   Qualitative
Tests
  Proficiency                   X
  Placement                     X
  Diagnostic                    X
  Achievement                   X

Page 34: Program evaluation

Instruments and Procedures

                           Quantitative   Qualitative
Observations
  Case studies                                 X
  Diary studies                                X
  Behavior observation          X
  Interactional analyses        X
  Inventories                   X

Page 35: Program evaluation

Instruments and Procedures

                           Quantitative   Qualitative
Interviews
  Individual                    X              X
  Group                         X              X

Page 36: Program evaluation

Instruments and Procedures

                           Quantitative   Qualitative
Meetings
  Delphi technique                             X
  Advisory                                     X
  Interest group                               X
  Review                                       X

Page 37: Program evaluation

Instruments and Procedures

                           Quantitative   Qualitative
Questionnaires
  Biodata survey                X
  Opinion survey                X              X
  Self-ratings                  X
  Judgmental ratings            X
  Q-sort                        X

Page 38: Program evaluation

Gathering Evaluation Data

◦ Quantitative Evaluation Studies
◦ Qualitative Evaluation Studies
◦ Using Both Quantitative and Qualitative Methods

Page 39: Program evaluation

Gathering Evaluation Data

Quantitative data are bits of information that are countable and are gathered using measures that produce results in the form of numbers.

Page 40: Program evaluation

Gathering Evaluation Data

The importance of using quantitative data is not so much in the collection of those data, but rather in the analysis of the data, which should be carried out in such a way that patterns emerge.
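As a minimal sketch of analyzing quantitative data so that patterns emerge, consider averaging questionnaire ratings per item. The items and 1-5 ratings below are invented for illustration, not real program data.

```python
from statistics import mean

# Invented 1-5 student ratings from a hypothetical end-of-course questionnaire
ratings = {
    "materials were clear":        [4, 5, 4, 3, 5, 4],
    "pacing was appropriate":      [2, 3, 2, 2, 3, 2],
    "objectives matched my needs": [4, 4, 5, 4, 3, 4],
}

# Sorting items by their mean rating makes the pattern visible:
# the weakest-rated aspect of the program rises to the top of the list.
for item, scores in sorted(ratings.items(), key=lambda kv: mean(kv[1])):
    print(f"{mean(scores):.2f}  {item}")
```

The point is not the collection of the six ratings per item but the analysis step: once aggregated and ordered, a weak component (here the invented “pacing” item) stands out for follow-up.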

Page 41: Program evaluation

Gathering Evaluation Data

A classic example of what many people think an evaluation study ought to be is a quantitative, statistics-based experimental study designed to investigate the effectiveness of a given program.

Page 42: Program evaluation

Gathering Evaluation Data

Key Terms:

Experimental group
◦ The group that receives the treatment

Control group
◦ The group that receives no treatment

Page 43: Program evaluation

Gathering Evaluation Data

A treatment is something that the experimenter does to the experimental group, or rather an experience through which they go (as in a learning experience).

Page 44: Program evaluation

Gathering Evaluation Data

The purpose of giving treatment to the experimental group and nothing to the control group is to determine whether the treatment has been effective.
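One common way to quantify whether a treatment has been effective is an effect size such as Cohen’s d, the difference between the two group means in pooled-standard-deviation units. This is a hedged sketch, not the slides’ own method: the test scores below are invented, and a real evaluation study would also check statistical significance (e.g., with a t-test).

```python
from statistics import mean, stdev

def cohens_d(experimental, control):
    """Effect size: difference between group means in pooled-SD units."""
    n1, n2 = len(experimental), len(control)
    s1, s2 = stdev(experimental), stdev(control)  # sample standard deviations
    # Pooled standard deviation across the two groups
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(experimental) - mean(control)) / pooled_sd

# Hypothetical end-of-course achievement test scores
experimental = [78, 85, 82, 90, 74, 88, 81, 86]  # received the treatment
control      = [70, 75, 72, 80, 68, 77, 71, 74]  # received no treatment

print(f"Cohen's d = {cohens_d(experimental, control):.2f}")
```

A large positive d would suggest the treatment group outperformed the control group, though, as noted below, even that only shows the treatment is better than nothing.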

Page 45: Program evaluation

Gathering Evaluation Data

Qualitative Evaluation Studies

Qualitative data consist of information that is more holistic than quantitative data.

Page 46: Program evaluation

Gathering Evaluation Data

Goal of Qualitative Research

To collect data and analyze them in such a way that patterns emerge, so that sense can be made of the results and the quality of the program can be evaluated.

Page 47: Program evaluation

Gathering Evaluation Data

Using Both Quantitative and Qualitative Methods

Both types of data can yield valuable information in any evaluation, and therefore ignoring either type of information would be pointless and self-defeating. Sound evaluation practices will be based on all available perspectives so that many types of information can be gathered to strengthen the evaluation process and ensure that the resulting decisions will be as informed, accurate, and useful as possible.

Page 48: Program evaluation

Program Components as Data Sources

The purpose of gathering all this information is, of course, to determine the effectiveness of the program, so as to improve each of the components and the ways that they work together.

Page 49: Program evaluation

Program Components as Data Sources

The overall purpose of evaluation is to determine the general effectiveness of the program, usually for purposes of improving it or defending its utility to outside administrators or agencies.

Naturally, the curriculum components under discussion are needs analysis, objectives, testing, materials, teaching, and the evaluation itself.

Page 50: Program evaluation

Program Components as Data Sources

Effective?

A quantitative study that demonstrates that students who received a language learning treatment significantly outperformed a control group who did not receive the treatment is really only showing that the treatment in question is better than nothing.

- Lynch (1986)

Page 51: Program evaluation

Questions and Primary Data Sources

NEEDS ANALYSIS
Question: Which of the needs that were originally identified turned out to be accurate (now that the program has more experience with students and their relationship to the program) in terms of what has been learned in testing, developing materials, teaching, and evaluation?
Primary data sources: All original needs analysis documents

OBJECTIVES
Question: Which of the original objectives reflect real student needs, in view of the changing perceptions of those needs and all of the other information gathered in testing, materials development, teaching, and evaluation?
Primary data sources: Criterion-referenced tests (diagnostic)

Page 52: Program evaluation

TESTING
Question: To what degree are the students achieving the objectives of the courses? Were the norm-referenced and criterion-referenced tests valid?
Primary data sources: Criterion-referenced tests (achievement) and test evaluation procedures

MATERIALS
Question: How effective are the materials (whether adopted, developed, or adapted) at meeting the needs of the students as expressed in the objectives?
Primary data sources: Materials evaluation procedures

TEACHING
Question: To what degree is instruction effective?
Primary data sources: Classroom observations and student evaluations

Page 53: Program evaluation

Program Components as Data Sources

More detailed questions:

◦ What were the original perceptions of the students’ needs?
◦ How accurate was this initial thinking (now that we have more experience with the students and their relationship to the program)?
◦ Which of the original needs, especially as reflected in the goals and objectives, are useful and which are not?
◦ What newly perceived needs must be addressed? How do these relate to those perceptions that were found to be accurate? How must the goals and objectives be adjusted accordingly?

Page 54: Program evaluation

Program Components as Data Sources

Efficient?

Evaluators could set up a study to investigate the degree to which the amount of time can be compressed to make the learning process more efficient.

Page 55: Program evaluation

Questions and Primary Data Sources

NEEDS ANALYSIS
Question: Which of the original student needs turned out to be the most efficiently learned? Which were superfluous?
Primary data sources: Original needs analysis documents and criterion-referenced tests (both diagnostic and achievement)

OBJECTIVES
Question: Which objectives turned out to be needed by the students, and which did they already know?
Primary data sources: Criterion-referenced tests (diagnostic)

Page 56: Program evaluation

TESTING
Question: Were the norm-referenced and criterion-referenced tests efficient and reliable?
Primary data sources: Test evaluation procedures

MATERIALS
Question: How can material resources be reorganized for more efficient use by teachers and students?
Primary data sources: Materials blueprint and scope-and-sequence charts

TEACHING
Question: What types of support are provided to help teachers and students?
Primary data sources: Orientation documentation and administrative support structure

Page 57: Program evaluation

Program Components as Data Sources

Attitudes

The third general area of concern in language program evaluation will usually center on the attitudes of the teachers, students, and administrators regarding the various components of the curriculum as they were implemented in the program.

Page 58: Program evaluation

Questions and Primary Data Sources

NEEDS ANALYSIS
Question: What are the students’, teachers’, and administrators’ attitudes or feelings about the situational and language needs of the students? Before the program? After?
Primary data sources: Needs analysis questionnaires and any resulting documents

OBJECTIVES
Question: What are the students’, teachers’, and administrators’ attitudes or feelings about the usefulness of the objectives as originally formulated? Before the program? After?
Primary data sources: Evaluation interviews and questionnaires

Page 59: Program evaluation

TESTING
Question: What are the students’, teachers’, and administrators’ attitudes or feelings about the usefulness of the tests as originally developed? Before? After?
Primary data sources: Evaluation interviews, meetings, and questionnaires

MATERIALS
Question: What are the students’, teachers’, and administrators’ attitudes or feelings about the usefulness of the materials as originally adopted, developed, and/or adapted? Before? After?
Primary data sources: Evaluation interviews, meetings, and questionnaires

TEACHING
Question: What are the students’, teachers’, and administrators’ attitudes or feelings about the usefulness of the teaching as originally delivered? Before? After?
Primary data sources: Evaluation interviews, meetings, and questionnaires
