TRANSCRIPT
CHAPTER 7: Evaluating HRD Programs
Werner & DeSimone (2006)
Effectiveness
The degree to which a training program (or other HRD program) achieves its intended purpose
Measures are relative to some starting point
Measures how well the desired goal is achieved
Evaluation
HRD Evaluation
Textbook definition: “The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.”
In Other Words…
Are we training:
the right people
the right “stuff”
the right way
with the right materials
at the right time?
Evaluation Needs
Descriptive and judgmental information needed
Objective and subjective data
Information gathered according to a plan and in a desired format
Gathered to provide decision making information
Purposes of Evaluation
Determine whether the program is meeting the intended objectives
Identify strengths and weaknesses
Determine cost-benefit ratio
Identify who benefited most or least
Determine future participants
Provide information for improving HRD programs
Purposes of Evaluation – 2
Reinforce major points to be made
Gather marketing information
Determine if training program is appropriate
Establish management database
Evaluation Bottom Line
Is HRD a revenue contributor or a revenue user?
Is HRD credible to line and upper-level managers?
Are benefits of HRD readily evident to all?
How Often are HRD Evaluations Conducted?
Not often enough!!!
Frequently, only end-of-course participant reactions are collected
Transfer to the workplace is evaluated less frequently
Why HRD Evaluations are Rare
Reluctance to have HRD programs evaluated
Evaluation requires expertise and resources
Factors other than HRD also cause performance improvements – e.g., economy, equipment, policies, etc.
Need for HRD Evaluation
Shows the value of HRD
Provides metrics for HRD efficiency
Demonstrates value-added approach for HRD
Demonstrates accountability for HRD activities
Everyone else has it… why not HRD?
Make or Buy Evaluation
“I bought it, therefore it is good.”
“Since it’s good, I don’t need to post-test.”
Who says it’s:
Appropriate?
Effective?
Timely?
Transferable to the workplace?
Evolution of Evaluation Efforts
1. Anecdotal approach – talk to other users
2. Try before buy – borrow and use samples
3. Analytical approach – match research data to training needs
4. Holistic approach – look at overall HRD process, as well as individual training
Models and Frameworks of Evaluation
Table 7-1 lists six frameworks for evaluation
The most popular is that of D. Kirkpatrick:
Reaction
Learning
Job Behavior
Results
Kirkpatrick’s Four Levels
Reaction: Focus on trainees’ reactions
Learning: Did they learn what they were supposed to?
Job Behavior: Was it used on the job?
Results: Did it improve the organization’s effectiveness?
Issues Concerning Kirkpatrick’s Framework
Most organizations don’t evaluate at all four levels
Focuses only on post-training
Doesn’t treat inter-stage improvements
WHAT ARE YOUR THOUGHTS?
Other Frameworks/Models
CIPP: Context, Input, Process, Product (Galvin, 1983)
Brinkerhoff (1987):
Goal setting
Program design
Program implementation
Immediate outcomes
Usage outcomes
Impacts and worth
Other Frameworks/Models – 2
Kraiger, Ford, & Salas (1993):
Cognitive outcomes
Skill-based outcomes
Affective outcomes
Holton (1996): Five categories:
Secondary Influences
Motivation Elements
Environmental Elements
Outcomes
Ability/Enabling Elements
Other Frameworks/Models – 3
Phillips (1996):
Reaction and Planned Action
Learning
Applied Learning on the Job
Business Results
ROI
A Suggested Framework – 1
Reaction: Did trainees like the training? Did the training seem useful?
Learning: How much did they learn?
Behavior: What behavior change occurred?
A Suggested Framework – 2
Results: What were the tangible outcomes? What was the return on investment (ROI)? What was the contribution to the organization?
Data Collection for HRD Evaluation
Possible methods:
Interviews
Questionnaires
Direct observation
Written tests
Simulation/performance tests
Archival performance information
Interviews
Advantages:
Flexible
Opportunity for clarification
Depth possible
Personal contact
Limitations:
High reactive effects
High cost
Face-to-face threat potential
Labor intensive
Trained observers needed
Questionnaires
Advantages:
Low cost to administer
Honesty increased
Anonymity possible
Respondent sets the pace
Variety of options
Limitations:
Possible inaccurate data
Response conditions not controlled
Respondents set varying paces
Uncontrolled return rate
Direct Observation
Advantages:
Nonthreatening
Excellent way to measure behavior change
Limitations:
Possibly disruptive
Reactive effects are possible
May be unreliable
Need trained observers
Written Tests
Advantages:
Low purchase cost
Readily scored
Quickly processed
Easily administered
Wide sampling possible
Limitations:
May be threatening
Possibly no relation to job performance
Measures only cognitive learning
Relies on norms
Concern for racial/ethnic bias
Simulation/Performance Tests
Advantages:
Reliable
Objective
Close relation to job performance
Includes cognitive, psychomotor, and affective domains
Limitations:
Time consuming
Simulations often difficult to create
High costs of development and use
Archival Performance Data
Advantages:
Reliable
Objective
Job-based
Easy to review
Minimal reactive effects
Limitations:
Criteria for keeping/discarding records
Information system discrepancies
Indirect
Not always usable
Records prepared for other purposes
Choosing Data Collection Methods
Reliability: Consistency of results, and freedom from collection method bias and error
Validity: Does the device measure what we want to measure?
Practicality: Does it make sense in terms of the resources used to get the data?
Type of Data Used/Needed
Individual performance
Systemwide performance
Economic
Individual Performance Data
Individual knowledge
Individual behaviors
Examples:
Test scores
Performance quantity, quality, and timeliness
Attendance records
Attitudes
Systemwide Performance Data
Productivity
Scrap/rework rates
Customer satisfaction levels
On-time performance levels
Quality rates and improvement rates
Economic Data
Profits
Product liability claims
Avoidance of penalties
Market share
Competitive position
Return on investment (ROI)
Financial utility calculations
Use of Self-Report Data
Most common method
Pre-training and post-training data
Problems:
Mono-method bias
Desire to be consistent between tests
Socially desirable responses
Response Shift Bias:
Trainees adjust expectations to training
Research Design
Specifies in advance:
the expected results of the study
the methods of data collection to be used
how the data will be analyzed
Research Design Issues
Pretest and Posttest
Shows trainee what training has accomplished
Helps eliminate pretest knowledge bias
Control Group
Compares performance of the group with training against the performance of a similar group without training
Recommended Research Design
Pretest and posttest with control group
Whenever possible:
Randomly assign individuals to the test group and the control group to minimize bias
Use “time-series” approach to data collection to verify performance improvement is due to training
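A common way to summarize results under this design is to compare each group’s pre-to-post gain: the trained group’s gain minus the control group’s gain estimates the training effect, netting out pretest knowledge and factors that affected both groups. A minimal sketch, using entirely hypothetical test scores:

```python
# Pretest/posttest control-group design: estimate the training effect as
# the difference in gains between the trained and control groups.
# All scores below are hypothetical means, for illustration only.

trained_pre, trained_post = 62.0, 81.0   # trained group: before and after
control_pre, control_post = 63.0, 66.0   # control group: before and after

trained_gain = trained_post - trained_pre   # 19.0
control_gain = control_post - control_pre   # 3.0 (improvement not due to training)

training_effect = trained_gain - control_gain
print(f"Estimated training effect: {training_effect:.1f} points")  # 16.0 points
```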
Ethical Issues Concerning Evaluation Research
Confidentiality
Informed consent
Withholding training from control groups
Use of deception
Pressure to produce positive results
Assessing the Impact of HRD
Money is the language of business.
You MUST talk dollars, not HRD jargon.
No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest, posttest control group data.”
HRD Program Assessment
HRD programs and training are investments
Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers
You must prove your worth to the organization – or you’ll have to find another organization…
Two Basic Methods for Assessing Financial Impact
Evaluation of training costs
Utility analysis
Evaluation of Training Costs
Cost-benefit analysis
Compares cost of training to benefits gained, such as attitudes, reduction in accidents, reduction in employee sick-days, etc.
Cost-effectiveness analysis
Focuses on increases in quality, reduction in scrap/rework, productivity, etc.
Return on Investment
Return on investment = Results/Costs
Calculating Training Return On Investment
Operational Results Area | How Measured | Results Before Training | Results After Training | Differences (+ or –) | Expressed in $
Quality of panels | % rejected | 2% rejected (1,440 panels per day) | 1.5% rejected (1,080 panels per day) | 0.5% (360 panels per day) | $720 per day; $172,800 per year
Housekeeping | Visual inspection using 20-item checklist | 10 defects (average) | 2 defects (average) | 8 defects | Not measurable in $
Preventable accidents | Number of accidents | 24 per year | 16 per year | 8 per year |
Preventable accidents | Direct cost of the accidents | $144,000 per year | $96,000 per year | $48,000 | $48,000 per year

Total savings: $220,800

ROI = Return / Investment = Operational Results / Training Costs = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
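To make the arithmetic explicit, here is a minimal sketch (Python) of the ROI computation above. The benefit and cost figures come from the Robinson & Robinson table; the $6,000 per-accident cost is implied by the table ($144,000 / 24 accidents).

```python
# ROI arithmetic for the Robinson & Robinson example above.
# Housekeeping also improved, but was not measurable in dollars,
# so it is excluded from the savings total.

annual_benefits = {
    "quality_of_panels": 172_800,     # 360 fewer rejected panels/day at $720/day
    "preventable_accidents": 48_000,  # 8 fewer accidents/year at $6,000 each
}
training_costs = 32_564               # total training cost from the example

total_savings = sum(annual_benefits.values())
roi = total_savings / training_costs

print(f"Total savings: ${total_savings:,}")  # Total savings: $220,800
print(f"ROI = {roi:.1f}")                    # ROI = 6.8
```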
Types of Training Costs
Direct costs
Indirect costs
Development costs
Overhead costs
Compensation for participants
(These five categories are detailed on the next slides and totaled in the sketch that follows them.)
Direct Costs
Instructor:
Base pay
Fringe benefits
Travel and per diem
Materials
Classroom and audiovisual equipment
Travel
Food and refreshments
Indirect Costs
Training management
Clerical/Administrative
Postal/shipping, telephone, computers, etc.
Pre- and post-learning materials
Other overhead costs
Development Costs
Fee to purchase program
Costs to tailor program to organization
Instructor training costs
Overhead Costs
General organization support
Top management participation
Utilities, facilities
General and administrative costs, such as HRM
Compensation for Participants
Participants’ salary and benefits for time away from job
Travel, lodging, and per-diem costs
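Summing these five categories yields the total training cost used as the denominator in the ROI calculation. A minimal sketch, with purely hypothetical line items:

```python
# Totaling training costs across the five categories described above.
# Every dollar amount is a hypothetical placeholder.

training_costs = {
    "direct": 18_000,                   # instructor pay/benefits, materials, travel, food
    "indirect": 4_000,                  # training management, clerical, shipping, phones
    "development": 6_000,               # program purchase/tailoring, instructor training
    "overhead": 2_500,                  # facilities, utilities, general support
    "participant_compensation": 9_000,  # trainees' salary/benefits while off the job
}

total_cost = sum(training_costs.values())
print(f"Total training cost: ${total_cost:,}")  # Total training cost: $39,500
```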
Measuring Benefits
Change in quality per unit, measured in dollars
Reduction in scrap/rework, measured in dollar cost of labor and materials
Reduction in preventable accidents, measured in dollars
ROI = Benefits/Training costs
Utility Analysis
Uses a statistical approach to support claims of training effectiveness:
U = (N)(T)(dt)(SDy) – C
where:
N = Number of trainees
T = Length of time benefits are expected to last
dt = True performance difference resulting from training
SDy = Dollar value of untrained job performance (in standard deviation units)
C = Cost of training
Critical Information for Utility Analysis
dt = the difference in units produced between trained and untrained employees, divided by the standard deviation in units produced by the trained group
SDy = the standard deviation, in dollars, of the overall productivity of the organization
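A minimal sketch of the utility computation, following the formula above; every input value is hypothetical and chosen only to illustrate the arithmetic:

```python
# Utility analysis: U = (N)(T)(dt)(SDy) - C

def training_utility(n_trainees: int, years: float, d_t: float,
                     sd_y_dollars: float, cost: float) -> float:
    """Dollar utility of a training program, per the formula above."""
    return n_trainees * years * d_t * sd_y_dollars - cost

# Hypothetical inputs: 100 trainees, benefits lasting 2 years, a true
# performance difference of 0.5 SD, SDy of $10,000, and $150,000 total cost.
u = training_utility(n_trainees=100, years=2.0, d_t=0.5,
                     sd_y_dollars=10_000, cost=150_000)
print(f"U = ${u:,.0f}")  # U = $850,000
```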
Ways to Improve HRD Assessment
Walk the walk, talk the talk: MONEY
Involve HRD in strategic planning
Involve management in HRD planning and estimation efforts
Gain mutual ownership
Use credible and conservative estimates
Share credit for successes and blame for failures
HRD Evaluation Steps
1. Analyze needs.
2. Determine explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute evaluation strategy.
Summary
Training results must be measured against costs
Training must contribute to the “bottom line”
HRD must justify itself repeatedly as a revenue enhancer