TRANSCRIPT
Session 101 – Beyond Kirkpatrick:
Taking a Fresh Look at Analysis and Evaluation
www.eLearningGuild.com
Allison Rossett, San Diego State University
July 23 & 24, 2009, Boston, MA
Page 1 – Session 101 – Beyond Kirkpatrick: Taking a Fresh Look at Analysis and Evaluation – Allison Rossett, San Diego State University
BEYOND KIRKPATRICK: A FRESH LOOK AT ANALYSIS AND EVALUATION
Allison Rossett, [email protected]
Copyright © 2009 Allison Rossett
WHERE ARE WE TODAY?
Four trends affect analysis and evaluation:
1. Economic pain and upheaval
2. Technology
3. Evidence-based decision-making
4. Workplace learning and support
Economic upheaval propels the other trends.
Technology alters delivery
Forty-two percent of organizations anticipate decreasing classroom learning. Seventy-two percent intend to increase their asynchronous e-learning.
[Chief Learning Officer’s Business Intelligence study]
What CEOs Measure
Source: The Conference Board
Ranking | Description | “Of greatest concern”
1 | Sustained and steady top-line growth | 37.5%
2 | Profit growth | 36.1%
3 | Consistent execution of strategy by top management | 33.4%
4 | Speed, flexibility, adaptability to change | 33.1%
5 | Customer loyalty / retention | 29.4%
6 | Stimulating innovation / creativity / enabling entrepreneurship | 23.9%
7 | Corporate reputation | 22.9%
8 | Speed to market | 22.7%
9 | [Product] innovation | 20.8%
10 | Improving productivity | 20.3%
Source: “Restructuring: Results From the ASTD Benchmarking Forum”
What We Measure – WHEN We Measure
Percentage of courses evaluated at each of Kirkpatrick’s four levels:
Level 1 (Reaction): 94%
Level 2 (Learning): 34%
Level 3 (Behavior): 13%
Level 4 (Results): 3%
It does not have to be this way.
Committed to evidence
Evidence at the center of practice?
New targets or “evaluands”
Quest for triangulation
Preference for numbers AND stories
Pervasive metrics to enlighten decisions
◦ Convergence of analysis and evaluation
Let’s be inquisitive about that program
Outcome indicators? Lawsuits? Early warnings?
Which assets are used? Which not? Why?
Reactions to coaches and tools? Helpful?
Anonymous survey about barriers to ethical choices, unresolved concerns
Dashboard that rolls up results
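The idea of a dashboard that rolls up scattered program evidence can be sketched in a few lines of code. This is an illustrative sketch only, not anything presented in the session: the metric names, categories, and numbers below are all hypothetical, invented for an imagined ethics program.

```python
# Illustrative sketch: roll scattered program metrics up into one
# dashboard summary. All metric names and numbers are hypothetical.

def roll_up(metrics):
    """Group raw metric readings by category and average each group."""
    grouped = {}
    for category, name, value in metrics:
        grouped.setdefault(category, []).append((name, value))
    return {
        category: sum(v for _, v in readings) / len(readings)
        for category, readings in grouped.items()
    }

# Hypothetical readings for an ethics program, scored on a 0-100 scale.
readings = [
    ("outcomes", "fewer hotline complaints", 72.0),
    ("outcomes", "no new lawsuits", 100.0),
    ("usage", "policy job aid opened", 64.0),
    ("usage", "coach sessions booked", 41.0),
    ("reactions", "coaches rated helpful", 83.0),
]

for category, score in sorted(roll_up(readings).items()):
    print(f"{category:>10}: {score:.1f}")
```

The point of the rollup is the convergence the slide names: outcome indicators, usage data, and reactions land on one screen so decisions draw on all of them at once.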
Delivering support into workDelivering support into work
Coast Guard boarding officers must know about many vessels.
Unacceptable error rate; costly training.
Now they use a blend: a short course plus a PDA to inspect and report.
West Point plus … support in their dangerous workplace
CHANGE IS HAPPENING
Nothing is sacrosanct
Consider American football and basketball:
◦ NFL teams hit 85% of their field goals in 2008; in 1974, it was 60%.
◦ In college basketball, three-pointers fell at about the same rate as two-pointers. What did the NCAA do?
NEW METRICS
Kirkpatrick’s model is good. Is it sufficient?
Level 4: Does it matter? Does it advance strategy?
Level 3: Are they doing it (objectives) consistently and appropriately?
+++++++++++++++++++++++++
Level 2: Can they do it (objectives)? Do they show the skills and abilities?
Level 1: Did they like the experience? Satisfaction? Use? Repeat use?
More Purposes
Purposes – linked to planning
1. To match participants to what they really need
2. To examine alignment, querying the transfer and performance system
3. To find out if the right information and worked examples are delivered on demand
4. To identify emergent needs, problems and opportunities and plan responses
Purposes – linked to reporting
5. To fulfill promises to regulatory and government agencies
6. To determine contributions to business outcomes
7. To contribute to talent management via programs/results that attract and retain
8. To advance the careers of our people
9. To tally all that we do and how much it is worth to the organization
Purposes – linked to improving
10. To improve upon our efforts, programs and the work of instructors/facilitators
11. To determine if learning happened
12. To determine how engaged our people are with their development, including contributions to networks and communities
PLEASE look at the 12 purposes and rate each for an important initiative. Lean manufacturing? New supervisors? Selling higher in the organization? Ethics?
Top priority | Some priority | No priority
You can’t tackle all the purposes.
This was a project for a mutual fund company. Two years prior they had launched an initiative to convert 45 trainers to performance consultants. Now they want to know how it’s going. What are their purposes? No, they can’t pursue all 12. No time, few resources.
To match participants to what they really need
To examine the transfer and performance system
To determine contributions to business outcomes
How to find out “how it was going”
I limited my purposes. What methods then?
◦ To match participants to what they need
Anonymous online survey for 45 consultants seeking their assessment of skills and knowledge associated with performance consulting. Seek confidence to perform.
Interview consultants about lingering questions, barriers, and what they require to deliver on change in their roles.
Interview randomly selected customers. What did they handle well? What not so well?
◦ To determine contributions to business outcomes
Interview randomly selected customers. What were they seeking when they came to us? What business results? What indicators would signify success? Measure.
Repeat requests from customers? Better framed requirements?
◦ To examine the transfer and performance system
In the anonymous online survey for consultants, include a question about drivers/blockers. What would help them move forward to deliver on this new role?
Interview consultants’ supervisors regarding what drives/blocks performance.
Look at the performance management system. Has it changed with the new roles?
See Brinkerhoff’s Success Case Method (SCM).
Use purposes to plan methods

Purpose: To see if learning occurred
Sources: Participants, supervisors, customers, records
Questions: How would you handle this? (assessments, tests) Can you identify errors?
Indicators: Test scores, error rate, call backs, customer satisfaction, speedy completion

Purpose: To determine contributions to business outcomes
Sources: Executives, managers, strategy docs; see Spitzer LEM
Questions: What do you expect? What indicators would satisfy? Delight?
Indicators: Error rate, call backs, customer satisfaction, speedy completion, tailored indicators

Purpose: To see if we contribute to talent management
Sources: Employees, potential employees, HR, managers
Questions: Why did you join us? Why are you departing? How do you perceive the learning offering?
Indicators: Recruitment, retention rate, employee satisfaction, engagement

Purpose: To see if we advance the careers of our people
Sources: Participants; peers, supervisors, HR colleagues, performance management system
Questions: Where to from here for you? What more must you know and do? Do you know how to move forward? Resources?
Indicators: Are career paths specified? Increase in promotions from within? Retention up?
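A purposes-to-methods table like the one above lends itself to a simple data structure, so that choosing purposes automatically surfaces candidate sources, questions, and indicators. The following is a hedged sketch of that idea: the entries are abbreviated from the table, and the structure and function name are my own, not part of the session materials.

```python
# Illustrative sketch: encode a purposes-to-methods table so that
# selecting purposes yields a draft evaluation plan. Entries are
# abbreviated from the table; the code structure is invented.

PLANNER = {
    "learning occurred": {
        "sources": ["participants", "supervisors", "customers", "records"],
        "questions": ["How would you handle this?", "Can you identify errors?"],
        "indicators": ["test scores", "error rate", "call backs"],
    },
    "business outcomes": {
        "sources": ["executives", "managers", "strategy docs"],
        "questions": ["What do you expect?", "What indicators would satisfy?"],
        "indicators": ["customer sat", "speedy completion", "tailored indicators"],
    },
}

def draft_plan(purposes):
    """Merge the sources, questions, and indicators for the chosen purposes."""
    plan = {"sources": [], "questions": [], "indicators": []}
    for purpose in purposes:
        for key in plan:
            for item in PLANNER[purpose][key]:
                if item not in plan[key]:  # avoid duplicates across purposes
                    plan[key].append(item)
    return plan

plan = draft_plan(["learning occurred", "business outcomes"])
print(plan["sources"])
```

This mirrors the mutual fund example: pick a few purposes you can afford, and the table (here, the dictionary) hands you a starting set of methods to triangulate.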
My Marble Model
M – More data, more sources, more often
A – Actionable: PLAN, REPORT, IMPROVE
R – Repurposing methods, data
B – Baked in – as you build programs
L – Lean, based on smaller bites of data
E – Everywhere learning, support & info are
[800 956-7739]
Useful links, I hope
SDSU Encyclopedia of Educational Technology: http://coe.sdsu.edu/eet/
SDSU EDTEC graduate programs: http://edweb.sdsu.edu/Edtec/distance/
Rossett & Schafer’s book, Job Aids and Performance Support: Moving from Knowledge in the Classroom to Knowledge Everywhere: http://www.colletandschafer.com/perfsupp/index.html
Rossett’s First Things Fast book: http://www.jbp.com/rossett.html
Rossett’s Beyond the Podium book: http://www.pfeiffer.com/go/BTP
A social network devoted to non-training interventions: www.pinotnet.ning.com
Pithy video introductions to Web 2.0 strategies: http://www.commoncraft.com/show (scroll down)
[http://edweb.sdsu.edu/People/ARossett/TD_Feb08_Rossett_LEGEND.pdf]
References
Alvarez, K., Salas, E., & Garofano, C. M. (2004). An integrated model of training evaluation and effectiveness. Human Resource Development Review, 3, 385.
ASTD (2007). State of the industry report. ASTD.
Boudreau, J. W., & Ramstad, P. M. (2006). Talentship and HR measurement and analysis: From ROI to strategic organizational change. Human Resource Planning, 29(1), 25.
Brinkerhoff, R. O. (2005). The success case method: A strategic evaluation approach to increasing the value and effect of training. Advances in Developing Human Resources, 7(1), 86.
Holton III, E. F. (2005). Holton’s evaluation model: New evidence and construct elaborations. Advances in Developing Human Resources, 7(1), 37.
Kim, K., Bonk, C. J., & Oh, E. (2008, September). The present and future state of blended learning in workplace settings in the United States. Performance Improvement, 47(8), 5-16.
Kirkpatrick, D. (1959). Techniques for evaluating training programs. Journal of the American Society of Training Directors, 13(3-9), 21-26.
O’Driscoll, T., & Sugrue, B. (2006). Valuing human capital and HRD: A literature review. IBM Almaden Services Research.
Phillips, J. (2003). Return on investment in training and performance improvement (2nd ed.). Boston, MA: Butterworth-Heinemann.
Pulchino, J. (2006, August). Usage and value of the Kirkpatrick four levels of training evaluation research report. A report published by The eLearning Guild, www.elearningguild.com.
Rossett, A. (1999). First things fast: A handbook for performance analysis. San Francisco: Wiley/Pfeiffer. www.jbp.com/rossett.html
Rossett, A. (2007). Leveling the levels. Training and Development, 61(2), 48-53.
Rossett, A., & McDonald, J. (2006). Evaluating technology enhanced continuing medical education. Medical Education Online, 11. http://www.med-ed-online.org/pdf/t0000074.pdf
Russ-Eft, D., & Preskill, H. (2005). In search of the holy grail: Return on investment evaluation in human resource development. Advances in Developing Human Resources, 7(1), 71.
Spitzer, D. R. (2005). Learning effectiveness measurement: A new approach for measuring and managing learning to achieve business results. Advances in Developing Human Resources, 7(1), 55.
Twitchell, S., Holton, E. F. III, & Trott, J. (2000). Technical training evaluation practices in the United States. Performance Improvement Quarterly, 13(3), 84-110.
Job Aid – Purposes and Methods

Purposes: Focus on your project. What purposes are germane?
Sources: Triangulate. Go to many sources, including incumbents and work products.
Questions: Ask in several ways, in general, and then through specifics.
Indicators: What vivid outputs, numbers, deliverables would satisfy the client? You?

Example below: A key sales exec wanted to encourage global “sharing.” He went to a conference and encountered “2.0” and is now pushing for it. The learning team immediately put in place sales community blogs and a series of podcasts that capture ideas. How to improve? How to plan to achieve sales goals? This example starts by looking at engagement.

Purpose: To determine how engaged our people are with their development, including contributions to networks and communities
Sources: Sales people; sales supervisors; work products (presentations, proposals, blogs)
Questions: Use numbers. How much? How widespread? Do they return over time? Opinion data. How useful? Do they repurpose ideas and tailor to their needs?
Indicators: Is the blog up to date? Is it widely used? Do presentations and proposals reflect ideas found there? Are podcasts downloaded? Does discussion follow?
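The engagement questions in the job aid ("How much? How widespread? Do they return over time?") reduce to straightforward counting over usage logs. Here is a hypothetical sketch of that counting; the log format, user names, and numbers are all invented for illustration and come from no real system.

```python
# Illustrative sketch: answer "how much, how widespread, do they return?"
# from a community blog's visit log. All entries are hypothetical.
from collections import Counter

# (user, iso_week) pairs: who visited the sales blog, and in which week.
visits = [
    ("ana", "2009-W28"), ("ana", "2009-W29"), ("ana", "2009-W30"),
    ("ben", "2009-W28"), ("ben", "2009-W30"),
    ("caro", "2009-W29"),
]

def engagement(visits):
    """Summarize volume, reach, and repeat use from (user, week) pairs."""
    weeks_per_user = Counter(user for user, _ in set(visits))
    return {
        "total_visits": len(visits),          # how much?
        "unique_users": len(weeks_per_user),  # how widespread?
        "returning": sum(1 for n in weeks_per_user.values() if n > 1),  # do they return?
    }

print(engagement(visits))
```

Counts like these supply the "use numbers" half of the job aid; the opinion data ("How useful? Do they repurpose ideas?") still has to come from surveys and interviews.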