Chapter 07 Slides 4e

CHAPTER 7: Evaluating HRD Programs (Werner & DeSimone)

Upload: ahsan-khan-niazi

Post on 27-Nov-2014


TRANSCRIPT

Page 1: Chapter 07 Slides 4e

CHAPTER 7

Werner & DeSimone

1

Evaluating HRD Programs

Page 2: Chapter 07 Slides 4e

Effectiveness

The degree to which a training program (or other HRD program) achieves its intended purpose

Measures are relative to some starting point

Measures how well the desired goal is achieved

Page 3: Chapter 07 Slides 4e

Evaluation


Page 4: Chapter 07 Slides 4e

HRD Evaluation

Textbook definition: "The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities."

Page 5: Chapter 07 Slides 4e

In Other Words…

Are we training:
  the right people
  the right "stuff"
  the right way
  with the right materials
  at the right time?

Page 6: Chapter 07 Slides 4e

Evaluation Needs

Descriptive and judgmental information needed
Objective and subjective data

Information gathered according to a plan and in a desired format

Gathered to provide decision-making information

Page 7: Chapter 07 Slides 4e

Purposes of Evaluation

Determine whether the program is meeting the intended objectives
Identify strengths and weaknesses
Determine the cost-benefit ratio
Identify who benefited most or least
Determine future participants
Provide information for improving HRD programs

Page 8: Chapter 07 Slides 4e

Purposes of Evaluation – 2

Reinforce major points to be made
Gather marketing information
Determine if the training program is appropriate
Establish a management database

Page 9: Chapter 07 Slides 4e

Evaluation Bottom Line


Is HRD a revenue contributor or a revenue user?

Is HRD credible to line and upper-level managers?

Are benefits of HRD readily evident to all?

Page 10: Chapter 07 Slides 4e

How Often are HRD Evaluations Conducted?

Not often enough!!!
Frequently, only end-of-course participant reactions are collected
Transfer to the workplace is evaluated less frequently

Page 11: Chapter 07 Slides 4e

Why HRD Evaluations are Rare

Reluctance to have HRD programs evaluated

Evaluation needs expertise and resources

Factors other than HRD cause performance improvements – e.g., the economy, equipment, policies, etc.

Page 12: Chapter 07 Slides 4e

Need for HRD Evaluation

Shows the value of HRD
Provides metrics for HRD efficiency
Demonstrates a value-added approach for HRD
Demonstrates accountability for HRD activities
Everyone else has it… why not HRD?

Page 13: Chapter 07 Slides 4e

Make or Buy Evaluation

"I bought it, therefore it is good."
"Since it's good, I don't need to post-test."
Who says it's:
  Appropriate?
  Effective?
  Timely?
  Transferable to the workplace?

Page 14: Chapter 07 Slides 4e

Evolution of Evaluation Efforts

1. Anecdotal approach – talk to other users
2. Try before buy – borrow and use samples
3. Analytical approach – match research data to training needs
4. Holistic approach – look at the overall HRD process, as well as individual training

Page 15: Chapter 07 Slides 4e

Models and Frameworks of Evaluation

Table 7-1 lists six frameworks for evaluation
The most popular is that of D. Kirkpatrick:
  Reaction
  Learning
  Job Behavior
  Results

Page 16: Chapter 07 Slides 4e

Kirkpatrick’s Four Levels

Reaction – Focus on trainees' reactions
Learning – Did they learn what they were supposed to?
Job Behavior – Was it used on the job?
Results – Did it improve the organization's effectiveness?

Page 17: Chapter 07 Slides 4e

Issues Concerning Kirkpatrick’s Framework

Most organizations don't evaluate at all four levels
Focuses only on post-training
Doesn't treat inter-stage improvements
WHAT ARE YOUR THOUGHTS?

Page 18: Chapter 07 Slides 4e

Other Frameworks/Models

CIPP: Context, Input, Process, Product (Galvin, 1983)

Brinkerhoff (1987):
  Goal setting
  Program design
  Program implementation
  Immediate outcomes
  Usage outcomes
  Impacts and worth

Page 19: Chapter 07 Slides 4e

Other Frameworks/Models – 2

Kraiger, Ford, & Salas (1993):
  Cognitive outcomes
  Skill-based outcomes
  Affective outcomes

Holton (1996) – five categories:
  Secondary influences
  Motivation elements
  Environmental elements
  Outcomes
  Ability/enabling elements

Page 20: Chapter 07 Slides 4e

Other Frameworks/Models – 3

Phillips (1996):
  Reaction and planned action
  Learning
  Applied learning on the job
  Business results
  ROI

Page 21: Chapter 07 Slides 4e

A Suggested Framework – 1

Reaction – Did trainees like the training? Did the training seem useful?
Learning – How much did they learn?
Behavior – What behavior change occurred?

Page 22: Chapter 07 Slides 4e

Suggested Framework – 2

Results – What were the tangible outcomes? What was the return on investment (ROI)? What was the contribution to the organization?

Page 23: Chapter 07 Slides 4e

Data Collection for HRD Evaluation

Possible methods:
  Interviews
  Questionnaires
  Direct observation
  Written tests
  Simulation/performance tests
  Archival performance information

Page 24: Chapter 07 Slides 4e

Interviews

Advantages:
  Flexible
  Opportunity for clarification
  Depth possible
  Personal contact

Limitations:
  High reactive effects
  High cost
  Face-to-face threat potential
  Labor intensive
  Trained observers needed

Page 25: Chapter 07 Slides 4e

Questionnaires

Advantages:
  Low cost to administer
  Honesty increased
  Anonymity possible
  Respondent sets the pace
  Variety of options

Limitations:
  Possible inaccurate data
  Response conditions not controlled
  Respondents set varying paces
  Uncontrolled return rate

Page 26: Chapter 07 Slides 4e

Direct Observation

Advantages:
  Nonthreatening
  Excellent way to measure behavior change

Limitations:
  Possibly disruptive
  Reactive effects are possible
  May be unreliable
  Need trained observers

Page 27: Chapter 07 Slides 4e

Written Tests

Advantages:
  Low purchase cost
  Readily scored
  Quickly processed
  Easily administered
  Wide sampling possible

Limitations:
  May be threatening
  Possibly no relation to job performance
  Measures only cognitive learning
  Relies on norms
  Concern for racial/ethnic bias

Page 28: Chapter 07 Slides 4e

Simulation/Performance Tests

Advantages:
  Reliable
  Objective
  Close relation to job performance
  Includes cognitive, psychomotor, and affective domains

Limitations:
  Time consuming
  Simulations often difficult to create
  High cost to develop and use

Page 29: Chapter 07 Slides 4e

Archival Performance Data

Advantages:
  Reliable
  Objective
  Job-based
  Easy to review
  Minimal reactive effects

Limitations:
  Criteria for keeping/discarding records
  Information system discrepancies
  Indirect
  Not always usable
  Records prepared for other purposes

Page 30: Chapter 07 Slides 4e

Choosing Data Collection Methods

Reliability
  Consistency of results, and freedom from collection-method bias and error

Validity
  Does the device measure what we want to measure?

Practicality
  Does it make sense in terms of the resources used to get the data?

Page 31: Chapter 07 Slides 4e

Type of Data Used/Needed

Individual performance
Systemwide performance
Economic

Page 32: Chapter 07 Slides 4e

Individual Performance Data

Individual knowledge
Individual behaviors
Examples:
  Test scores
  Performance quantity, quality, and timeliness
  Attendance records
  Attitudes

Page 33: Chapter 07 Slides 4e

Systemwide Performance Data

Productivity
Scrap/rework rates
Customer satisfaction levels
On-time performance levels
Quality rates and improvement rates

Page 34: Chapter 07 Slides 4e

Economic Data

Profits
Product liability claims
Avoidance of penalties
Market share
Competitive position
Return on investment (ROI)
Financial utility calculations

Page 35: Chapter 07 Slides 4e

Use of Self-Report Data

Most common method
Pre-training and post-training data

Problems:
  Mono-method bias
  Desire to be consistent between tests
  Socially desirable responses
  Response shift bias: trainees adjust expectations to training

Page 36: Chapter 07 Slides 4e

Research Design

Specifies in advance:
  the expected results of the study
  the methods of data collection to be used
  how the data will be analyzed

Page 37: Chapter 07 Slides 4e

Research Design Issues

Pretest and Posttest
  Shows the trainee what training has accomplished
  Helps eliminate pretest knowledge bias

Control Group
  Compares the performance of a group with training against the performance of a similar group without training

Page 38: Chapter 07 Slides 4e

Recommended Research Design

Pretest and posttest with control group
Whenever possible:
  Randomly assign individuals to the test group and the control group to minimize bias
  Use a "time-series" approach to data collection to verify that performance improvement is due to training
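The logic of the pretest/posttest control-group design can be shown numerically: the training effect is the trained group's pretest-to-posttest gain minus the control group's gain, which nets out improvement that would have happened anyway (economy, equipment, policies, etc.). A minimal Python sketch, using hypothetical test scores rather than data from the text:

```python
# Sketch of a pretest/posttest control-group comparison
# (difference-in-differences). All scores are hypothetical.

def training_effect(trained_pre, trained_post, control_pre, control_post):
    """Trained group's mean gain minus the control group's mean gain."""
    mean = lambda xs: sum(xs) / len(xs)
    trained_gain = mean(trained_post) - mean(trained_pre)
    control_gain = mean(control_post) - mean(control_pre)
    return trained_gain - control_gain

# Hypothetical scores for randomly assigned groups
trained_pre  = [60, 55, 65, 58]   # mean 59.5
trained_post = [80, 78, 84, 78]   # mean 80.0
control_pre  = [61, 57, 63, 59]   # mean 60.0
control_post = [66, 62, 70, 62]   # mean 65.0

effect = training_effect(trained_pre, trained_post, control_pre, control_post)
print(effect)  # 15.5: the gain attributable to training, not to other factors
```

Random assignment is what justifies treating the control group's gain as the improvement the trained group would have shown without training.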

Page 39: Chapter 07 Slides 4e

Ethical Issues Concerning Evaluation Research

Confidentiality
Informed consent
Withholding training from control groups
Use of deception
Pressure to produce positive results

Page 40: Chapter 07 Slides 4e

Assessing the Impact of HRD

Money is the language of business.
You MUST talk dollars, not HRD jargon.
No one (except maybe you) cares about "the effectiveness of training interventions as measured by an analysis of formal pretest, posttest control-group data."

Page 41: Chapter 07 Slides 4e

HRD Program Assessment

HRD programs and training are investments

Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers

You must prove your worth to the organization – or you'll have to find another organization…

Page 42: Chapter 07 Slides 4e

Two Basic Methods for Assessing Financial Impact

Evaluation of training costs
Utility analysis

Page 43: Chapter 07 Slides 4e

Evaluation of Training Costs

Cost-benefit analysis
  Compares the cost of training to benefits gained, such as attitudes, reduction in accidents, reduction in employee sick days, etc.

Cost-effectiveness analysis
  Focuses on increases in quality, reduction in scrap/rework, productivity, etc.

Page 44: Chapter 07 Slides 4e

Return on Investment


Return on investment = Results/Costs

Page 45: Chapter 07 Slides 4e

Calculating Training Return On Investment

Operational Results Area | How Measured | Results Before Training | Results After Training | Differences (+ or –) | Expressed in $
Quality of panels | % rejected | 2% rejected (1,440 panels per day) | 1.5% rejected (1,080 panels per day) | 0.5% (360 panels per day) | $720 per day; $172,800 per year
Housekeeping | Visual inspection using 20-item checklist | 10 defects (average) | 2 defects (average) | 8 defects | Not measurable in $
Preventable accidents | Number of accidents | 24 per year | 16 per year | 8 per year |
  | Direct cost of each accident | $144,000 per year | $96,000 per year | $48,000 | $48,000 per year

Total savings: $220,800.00

ROI = Return / Investment = Operational Results / Training Costs = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
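The ROI arithmetic in the table above reduces to one division. A minimal Python sketch, using the figures from the Robinson & Robinson example (panel savings, accident savings, and training costs as given on the slide):

```python
# Sketch of the slide's ROI calculation: operational results / training costs.

def roi(operational_results, training_costs):
    """Return on investment: dollar savings divided by what the training cost."""
    return operational_results / training_costs

panel_savings    = 172_800   # quality of panels, per year
accident_savings =  48_000   # preventable accidents, per year
total_savings = panel_savings + accident_savings   # $220,800

print(round(roi(total_savings, 32_564), 1))  # 6.8
```

Note that the housekeeping improvement is excluded: it was real but "not measurable in $", so it cannot enter a dollar-based ROI.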

Page 46: Chapter 07 Slides 4e

Types of Training Costs

Direct costs
Indirect costs
Development costs
Overhead costs
Compensation for participants

Page 47: Chapter 07 Slides 4e

Direct Costs

Instructor
  Base pay
  Fringe benefits
  Travel and per diem
Materials
Classroom and audiovisual equipment
Travel
Food and refreshments

Page 48: Chapter 07 Slides 4e

Indirect Costs

Training management
Clerical/administrative
Postal/shipping, telephone, computers, etc.
Pre- and post-learning materials
Other overhead costs

Page 49: Chapter 07 Slides 4e

Development Costs

Fee to purchase program
Costs to tailor program to organization
Instructor training costs

Page 50: Chapter 07 Slides 4e

Overhead Costs

General organization support
Top management participation
Utilities, facilities
General and administrative costs, such as HRM

Page 51: Chapter 07 Slides 4e

Compensation for Participants


Participants’ salary and benefits for time away from job

Travel, lodging, and per-diem costs

Page 52: Chapter 07 Slides 4e

Measuring Benefits

Change in quality per unit, measured in dollars
Reduction in scrap/rework, measured in dollar cost of labor and materials
Reduction in preventable accidents, measured in dollars
ROI = Benefits / Training costs

Page 53: Chapter 07 Slides 4e

Utility Analysis

Uses a statistical approach to support claims of training effectiveness:
  N = number of trainees
  T = length of time benefits are expected to last
  dt = true performance difference resulting from training
  SDy = dollar value of untrained job performance (in standard deviation units)
  C = cost of training

U = (N)(T)(dt)(SDy) – C
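The utility formula translates directly into code. A minimal Python sketch; the input values below are hypothetical illustrations, not figures from the text:

```python
# Sketch of the utility formula U = (N)(T)(dt)(SDy) - C.

def utility(n_trainees, years, d_t, sd_y, cost):
    """Dollar gain from training: number trained x years the benefit lasts
    x true performance difference (in SD units) x dollar value of one SD
    of job performance, minus the cost of training."""
    return n_trainees * years * d_t * sd_y - cost

# Hypothetical inputs: 50 trainees, benefits lasting 2 years, dt = 0.5 SD,
# one SD of performance worth $10,000, training costing $75,000 in total.
print(utility(50, 2, 0.5, 10_000, 75_000))  # 425000.0
```

A positive U says the program is expected to return more than it cost; the estimate is only as credible as the inputs, which is why the later slide urges conservative estimates.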

Page 54: Chapter 07 Slides 4e

Critical Information for Utility Analysis

dt = difference in units between trained and untrained, divided by the standard deviation in units produced by the trained

SDy = standard deviation in dollars of the organization's overall productivity

Page 55: Chapter 07 Slides 4e

Ways to Improve HRD Assessment

Walk the walk, talk the talk: MONEY
Involve HRD in strategic planning
Involve management in HRD planning and estimation efforts
  Gain mutual ownership
Use credible and conservative estimates
Share credit for successes and blame for failures

Page 56: Chapter 07 Slides 4e

HRD Evaluation Steps

1. Analyze needs.
2. Determine explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute evaluation strategy.

Page 57: Chapter 07 Slides 4e

Summary


Training results must be measured against costs

Training must contribute to the “bottom line”

HRD must justify itself repeatedly as a revenue enhancer