Little Book - Measuring Training Effectiveness

MEASURABLE OUTCOMES FROM LEARNING





CONTENTS

LEARNING WITH IMPACT

POPULAR LEARNING EVALUATION MODELS

HOW CITRIX'S TRAINING MADE AN IMPACT ACROSS 3 CONTINENTS

WHY 90% OF TRAINING DOESN'T HAVE IMPACT

ROI IS DEAD

LONG LIVE ROI

IS SALARY-BASED ROI THE WAY FORWARD?

HAPPY SHEETS

HOW (NOT) TO LEARN WITH IMPACT

HOW TO LEARN WITH IMPACT

THE NEW WORLD KIRKPATRICK MODEL

EVALUATION AT HERTFORDSHIRE COUNTY COUNCIL

TOP TIPS

LEARNING TRANSFER: HOW DO YOU DO IT?

FAIL TO PREPARE, PREPARE TO FAIL

FROM TRANSACTION TO TRANSFORMATION

THE FUTURE OF EVALUATION

THE EVOLUTION OF EVALUATION

Produced by Reed Learning

Lovingly designed by L-and-CO.com


LEARNING WITH IMPACT

Learning is an investment. And like any investment, it's crucial to measure the outcomes. But because learning is so intangible and individual, that measurement is often pretty tough. It's not like working out how much money you saved by swapping the office over to energy-saving lightbulbs – there are a whole host of factors to consider.

Results are at the heart of what we do, and we think it’s essential that learning is a journey with a destination, not a one-off event. That’s why we’ve created this book in partnership with some of the leading thinkers in the L&D industry – including Training Journal, The Kite Foundation, the ASTD, Kirkpatrick Partners and many, many more. Inside you’ll discover jargon demystified, what best practice looks like and some innovative ideas to guide your own learning programmes.

We hope you find it useful. In fact, we hope it has real impact!



POPULAR LEARNING EVALUATION MODELS

There are lots of different evaluation methods out there. Here's a quick guide to some of the most popular.

PHILLIPS' Evaluation Model
Based on Kirkpatrick's model. It adds a fifth step, ROI, which is calculated using this seven-stage process.

ROI = Net Programme Benefits ÷ Programme Costs

1. Collecting pre-programme data
2. Collecting post-programme data
3. Isolating the effects of the programme
4. Converting data to monetary value
5. Tabulating programme costs
6. Calculating return on investment
7. Identifying intangible benefits
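
To make the formula concrete, here is a minimal sketch of stages 4-6 in Python. The figures, and the convention of expressing ROI as a percentage by multiplying by 100, are our own illustrative assumptions rather than anything specified in this booklet.

    def phillips_roi(net_programme_benefits: float, programme_costs: float) -> float:
        """ROI expressed as a percentage: (net benefits / costs) * 100."""
        if programme_costs <= 0:
            raise ValueError("programme costs must be positive")
        return net_programme_benefits / programme_costs * 100

    # Hypothetical example: a programme costing 20,000 that produces 50,000
    # of monetised benefit. Net benefits = 50,000 - 20,000 = 30,000.
    costs, benefits = 20_000, 50_000
    print(f"ROI: {phillips_roi(benefits - costs, costs):.0f}%")  # ROI: 150%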

CIRO Evaluation

CONTEXT: Identifying training needs and objectives
INPUT: Designing and delivering training
REACTION: Quality of trainee experience
OUTCOME:
a) Immediate – individual changes before returning to work
b) Intermediate – individual transferring changes to work
c) Ultimate – departmental or organisational results

SCRIVEN’s Key Evaluation Checklist

PRELIMINARIES
I Executive summary; II Preface; III Methodology

FOUNDATIONS
1 Background and context; 2 Descriptions and definitions; 3 Consumers; 4 Resources; 5 Values

SUB-EVALUATIONS
6 Process evaluation; 7 Outcome evaluation; 8&9 Comparative cost-effectiveness; 10 Exportability; 11 Overall significance

CONCLUSIONS
12 Recommendations and explanations; 13 Responsibilities; 14 Reporting and follow-up; 15 Meta-evaluation

BRINKERHOFF's Success Case Method
Brinkerhoff's model focuses on narratives and stories, supported by evidence:
1 Identify the goals of the learning opportunity and connect them to business needs
2 Survey participants to identify best and worst cases
3 Obtain corroborating evidence
4 Analyse the data
5 Communicate findings

KIRKPATRICK's Model of Training Evaluation

LEVEL ONE – REACTION: To what degree participants react favourably to the learning event

LEVEL TWO – LEARNING: To what degree participants acquire the intended knowledge, skills, and attitudes based on their participation in the learning event

LEVEL THREE – BEHAVIOUR: To what degree participants apply what they learned during training when they are back on the job

LEVEL FOUR – RESULTS: To what degree targeted outcomes occur as a result of learning event(s) and subsequent reinforcement




HOW CITRIX'S TRAINING MADE AN IMPACT ACROSS 3 CONTINENTS

Citrix is a leading provider of cloud, networking and virtualisation technologies. Citrix products touch 75 percent of Internet users each day.

In 2010 Citrix announced X-5 – a goal to grow more than 50% by 2015. Following their 2010 annual employee engagement survey, they identified that employees needed more training and opportunities to grow in order to meet this growth objective. Citrix created a revitalised L&D strategy to address this. The focus was on creating consistent, scalable and accessible learning that could be tailored to local stakeholders in 19 countries.

LINKING IMPROVED SATISFACTION…
As well as measuring learner response, Citrix measured the overall impact of their training strategy in EMEA with two key metrics in their employee engagement survey, one of which was: "Do you feel you have access to sufficient training to improve your skills?"

…TO IMPROVED PERFORMANCE

These improvements correlated with a 20% growth in revenue.

20% growth from 2010-2012 puts Citrix on track to achieve their X-5 goal of 50% growth from 2010-2015.
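
As a back-of-envelope check on "on track" (our own arithmetic, not from the booklet), compare the two claims as compound annual growth rates:

    # 50% growth over 5 years vs 20% growth over the first 2 years,
    # compared as equivalent annual growth rates.
    required_annual = 1.50 ** (1 / 5) - 1   # ~8.4% per year needed for X-5
    actual_annual = 1.20 ** (1 / 2) - 1     # ~9.5% per year achieved 2010-2012
    print(f"needed {required_annual:.1%}/yr, achieved {actual_annual:.1%}/yr")

At roughly 9.5% a year against the 8.4% needed, Citrix was slightly ahead of the X-5 pace at that point.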



WHY 90% OF TRAINING DOESN'T HAVE IMPACT

It's essential to realise that effective training begins long before a learning event begins! Research suggests that only 10% of training is transferred into sustainable performance improvement!*

Richard Griffin

*Sugrue, B., & Rivera, R. (2005). ASTD State of the Industry Report

BEFORE TRAINING STARTS
• Carry out training needs analysis – but don't forget that training is not the only option
• Ensure employees are learning ready. For example, there's no point sending someone on an eLearning programme if they can't use a computer
• Have clear, measurable training objectives
• Consider how impact will be measured: who will be interested in reviewing the results?

TRAINING
• Ensure it's job relevant
• Make sure that the training programme's design, content and delivery style are appropriate to the audience

TRANSFER
• Managers should regularly ask for feedback about the training from employees
• Set training-related goals
• Ensure new learning can be applied quickly and frequently in employees' day-to-day work

IMPACT
• Evaluate! Not only does evaluation allow the impact of training to be assessed, it also reinforces employee learning
• Look for barriers, such as workload, that may have prevented employees from transferring their learning. Evaluation outcomes are as much about a supportive workplace as the quality of the training


ROI IS DEAD

There is a well-worn joke about a stranger in a car asking for directions in the country. The local replies: "If I were you, I wouldn't start from here." This seems to be an apt description of our profession's approach to evaluation.

Our inherited approach is firmly based in the concept of hierarchical evaluation. We should, we are told, proceed by capturing and analysing information on reaction (did they like the course?); learning (did they learn anything?); behaviour (did they do anything differently as a result?); and return (what were the bottom-line benefits?). In practice – as every survey ever conducted shows – we only collect information at the lowest level (reaction). This produces two effects. The first is that we beat ourselves up for not doing what we 'ought' to do. The other is that we look for increasingly complex ways of isolating the effects of a training intervention.

We must recognise that the skill set that drives value for the business has changed and is learned rather than trained.

Two of the essential attributes required in the knowledge-driven economy are technological acumen and influencing skills. We acquire both through trial and error, and peer-group support rather than formal training.

Learning in today’s organisations is a diffuse activity, not a single event. Our evaluation and ROI models are products of a time when skill sets and learning methods were quite different. The information we produce using the traditional approach is designed to justify our existence. It should be no surprise that senior managers are not interested.

There is still a need for trainers to ask ‘are we putting our efforts towards the right objectives?’ However, in today’s economy, we need to discover a better way of finding the answers than ROI.

Martyn Sloman
Former learning advisor for the CIPD and a Teaching Fellow in the Department of Management & Organisational Psychology at Birkbeck College.


LONG LIVE ROI

The key to this debate is that we have long confused evaluation and measurement. Like Martyn, I believe that L&D departments should not just produce figures to measure things for the sake of it. But I do think that figures have their place as a tool to critically evaluate the impact of learning.

ROI was introduced on the basis that it put a hard financial figure on the value of training, isolating its effects and justifying the training budget. The problem is, as Martyn points out, that nobody really cares about the ROI of a specific training intervention. What matters is the impact against defined objectives. The fault isn’t with calculating the return on investment. It’s that we don’t define the costs, problems and contexts clearly enough to make that calculation worthwhile.

All learning needs a clear purpose. Why should something be learned, and what benefit will that learning have? Once this has been established, the content, intended results and required resources can be defined.

Martyn is right – our current evaluation activities attempt to give us information no-one really needs (or wants) to know. But that doesn't mean we should throw out ROI altogether. We just need to measure the things that matter. Has the whole learning programme changed how people work? If so, what exactly has it changed, and how much are those changes saving us? That's where ROI becomes a useful tool, not just a number to be crunched after every intervention.

Neville Pritchard is a leading people development specialist and thinker with over 25 years of experience. He's a Fellow of the CIPD and former Learning Director for Barclays Bank.


IS SALARY-BASED ROI THE WAY FORWARD?

Whose responsibility is it to prove ROI on training? There are hundreds of articles and books on this subject, and maybe it's just gotten a little complicated. So, here's a view which puts responsibility for embedding the learning at the foot of the business. It's an approach that says the ROI should be calculated from the salary paid to the person doing the learning, not on the final bill of the learning itself. The cost of learning is an investment, but the return needs to be sought from those going on it.

Our responsibility as L&D professionals is to put forward the absolute best opportunities and resources to learn. Having the ROI focus on salaries puts the onus on the manager and individual to show the short- to long-term benefits. Otherwise we lose our chance to change performance and behaviours in the long term by getting bogged down in measuring expenditure on training rather than the impact it has.

We are part of making sure that the L&D function is not seen as the place where performance is changed, but simply as where the change starts. Directly equating training outcomes to job performance through salary-based ROI is a great way to do just that.
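
Purely as an illustration of the idea (the article gives no formula, so the denominator below – the pro-rata salary cost of the learner's time – is entirely our own assumption):

    # Hypothetical sketch of salary-based ROI: the return is measured against
    # the salary paid for the time spent learning, not the course fee.
    def salary_based_roi(benefits: float, annual_salary: float,
                         learning_days: float, working_days: int = 260) -> float:
        """ROI % against the salary cost of the learner's time."""
        time_cost = annual_salary * learning_days / working_days
        return (benefits - time_cost) / time_cost * 100

    # A learner on 39,000/yr spending 2 days learning invests 300 of salaried
    # time; 1,200 of resulting benefit would be a 300% return on that time.
    print(salary_based_roi(benefits=1_200, annual_salary=39_000, learning_days=2))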

Teresa Ewington
Learning & Development Manager, Thames Water


HAPPY SHEETS

The very mention of happy sheets will almost universally draw knowing looks and dismissive sneers from L&D professionals, despite the fact that we almost universally continue to use them.

WHAT'S GOING ON HERE?
For years now we've assumed the learner experience is key to measuring the effectiveness of training. Hence the inflated importance given to "enter-trainers".

But learning is not just for the learners. Most of those who take part in work-based learning are doing it for the benefit of their employers. And rightly so, since they usually foot the bill.

But surely a happy learner is a more motivated learner, and so more likely to transfer their learning? Perhaps, but not necessarily.

Meta-study research by Sitzmann on 68,245 trainees showed that variance in learner satisfaction accounted for:
• 2% of the variance in factual knowledge
• 5% of the variance in skill-based knowledge
• 0% of the variance in training transfer
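
To put those figures in perspective (our own arithmetic, assuming the percentages are variance-explained, i.e. R² values): the implied correlation is the square root of the variance explained, so 2% corresponds to a correlation of about √0.02 ≈ 0.14, and 5% to about √0.05 ≈ 0.22 – weak relationships between satisfaction and actual learning by any standard.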

Happy sheets are not even an accurate reflection of satisfaction. The majority of learners lie. Not because they are bad people, but because they don't want to offend the nice, fun person they've spent the day with. Happy sheet data can be manipulated – end the day with a fun exercise, give out chocolates and watch your results magically improve. And HR and L&D departments and trainers have always secretly had a fondness for happy sheets, because they reaffirm how great we are at what we do.

Happy sheets can provide valuable data if used thoughtfully as part of a more in-depth evaluation process. The problem is using them as a justification – wasting lots of time collecting and processing the happy sheet data as if it’s the most important thing, when what really matters is business impact and transfer.


Ben Waldman is one of Reed Learning’s L&D experts. For the last ten years he has worked as a facilitator, coach and consultant specialising in leadership development.


HOW (NOT) TO LEARN WITH IMPACT

Impact Man has been sent on a course, but he doesn't really know why. Still, he makes some friends... pays attention (sometimes)... and lets everyone know what a nice time he's had on his evaluation form. Impact Man goes back to work, taking on the evil villain, Major Disengagement. But because he didn't pay attention on his course, things don't go as well as they could! "How did that happen?"

But it could all be so different...

HOW TO LEARN WITH IMPACT

Impact Man meets with his boss and discusses going on the 'Managing Difficult Supervillains' course. He turns up with some pre-course work and objectives, and makes a note of specific actions he'll take. With a plan in place, he's feeling confident taking on Major Disengagement... and he's able to apply what he's learnt to get the right outcome. He's saved the day again!




THE NEW WORLD KIRKPATRICK MODEL
By Kirkpatrick Partners

LEVEL 1: REACTION
Engagement – the degree to which participants are actively involved in and contributing to the learning experience.
Relevance – the degree to which training participants will have the opportunity to use or apply what they learned in training on the job. Relevance is vital.
Customer satisfaction – the degree to which participants react favourably to the learning event.

LEVEL 2: LEARNING
The degree to which participants acquire the intended knowledge, skills and attitudes based on their participation in the learning event. Level 2 also includes:
Confidence – the degree to which training participants think they will be able to do what they learned during training on the job.
Commitment – the degree to which learners intend to apply the knowledge and skills learned during training to their jobs.

LEVEL 3: BEHAVIOUR (ON-THE-JOB LEARNING)
Required drivers – processes and systems that reinforce, monitor, support and reward performance of critical behaviours on the job.

LEVEL 4: RESULTS
Leading indicators – short-term observations and measurements suggesting that critical behaviours are on track to create a positive impact on desired results.
Desired outcomes – the measurable business objectives that were identified before the learning intervention.

© 2010-2012 Kirkpatrick Partners, LLC. All rights reserved. Used with permission. Visit Kirkpatrickpartners.com for more information.
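
For teams that want to capture these dimensions systematically, here is a minimal sketch of a record structure; the field names and illustrative 0-1 scales are our own, not Kirkpatrick Partners'.

    from dataclasses import dataclass, field

    @dataclass
    class NewWorldKirkpatrickRecord:
        programme: str
        # Level 1: Reaction (illustrative 0-1 scores)
        engagement: float
        relevance: float
        satisfaction: float
        # Level 2: Learning
        knowledge_gain: float            # e.g. post-test minus pre-test score
        confidence: float                # can they apply it on the job?
        commitment: float                # do they intend to apply it?
        # Level 3: Behaviour
        critical_behaviours: list = field(default_factory=list)
        required_drivers: list = field(default_factory=list)   # monitor/reinforce/reward
        # Level 4: Results
        leading_indicators: list = field(default_factory=list)
        desired_outcomes: list = field(default_factory=list)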


EVALUATION AT HERTFORDSHIRE COUNTY COUNCIL

Reed Learning and Hertfordshire County Council have been working in partnership since 2011 to deliver a large catalogue of Leadership, Management, ICT and Personal Development solutions to a complex, diverse workforce. Measuring the quality of those solutions is vital to the organisation's success in a challenging financial climate.

Before attending any Reed Learning event, Hertfordshire’s employees will complete a pre-course questionnaire asking them to rate their confidence against each of the key learning indicators and the importance of these to their role. This information is passed to the trainer who uses it to adapt the session to suit the needs of the group.

Eight weeks after the event, all delegates will then be asked to complete a post-course questionnaire asking them to rate their new level of confidence against each of the key learning indicators. Along with ‘happy sheet’ information, this is then used to measure whether the course was effective.

Using the Reed Learning Evaluation System, Hertfordshire County Council have been able to demonstrate a 17% improvement in learner confidence and ensure that the right people are attending the right course at the right time.
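
As an illustration of the pre-/post-course comparison described above (the rating scale and averaging method are our own assumptions; the booklet doesn't specify them):

    # Hypothetical scoring of the pre- and post-course confidence questionnaires.
    def confidence_improvement(pre: list, post: list) -> float:
        """Mean % improvement in confidence across the key learning indicators."""
        if len(pre) != len(post) or not pre:
            raise ValueError("need matching, non-empty rating lists")
        pre_mean = sum(pre) / len(pre)
        post_mean = sum(post) / len(post)
        return (post_mean - pre_mean) / pre_mean * 100

    # Ratings on a 1-5 scale, one per key learning indicator:
    print(confidence_improvement([3.0, 2.5, 3.5], [3.5, 3.0, 4.0]))  # ~16.7%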



TOP TIPS: USE SURVEYS TO MEASURE IMPACT
Chris Robinson, Boost Evaluation

Impact means making a difference to business performance. It could relate to personal performance, behaviour change, colleague performance or even impact on society. But it has to be tangible and measurable. Here are some top tips for making sure you use surveys to measure impact.

1. Ask the right questions
Before doing any kind of survey design, get it straight in your head why you want to do the survey. The best trick is to write down all the conclusions you would want to see. E.g. "The new induction course reduces speed to competence by X%."

Be ambitious! Don't restrict yourself only to what is easy to measure. Whatever you do though, choose things to measure that explore what happened, not just what people think.

2. Ask the right people
Think creatively about who the right witness to this impact is. Is it the learner? Probably not. When it comes to performance it's usually a manager. When it comes to behaviour it could be a colleague or direct report.

3. Ask at the right time
Plan in advance by asking a few of your witnesses up front how long it typically takes them to notice a change in behaviour or performance. Then schedule the survey to go out at this time to optimise your results.



Learning transfer: How do you do it?

Simon Chilley

Improving the amount of learning transferred is the goal of most learning professionals. The question is: ‘How do you do it?’

It comes down to three essential factors:*

Learner Characteristics: the ability, personality and motivation of the learner. The line manager is crucial here. Picking the right type of learning for each person, discussing how the new skills will be applied in the workplace and endorsing the quality of the learning before the event can all significantly increase the likelihood of learning transfer.

Learning Design: the quality of the learning itself. Relevant learning activities that closely resemble real-life, including the opportunity to make mistakes and incorporate credible feedback on practical activity, are all shown to lead to increased learning transfer.

Work Environment: the support and opportunity the learner gets to apply their new skills. Again, the line manager is crucial: first in creating a climate in which applying new skills and giving and receiving feedback are the norm, and second in actively creating opportunities for new skills to be applied.

Learning professionals are often only responsible for one of these three things – Learning Design. Unfortunately Learning Design is shown to be much less effective when the Learner Characteristics and Work Environment do not support learning transfer.

So, if your organisation is looking to maximise learning transfer you must engage your line managers, vary the types of learning to suit your learners and create a work climate where new skills can be practised.

Simon Chilley is a Programme Director at Reed Learning and a learning transfer specialist.

* Baldwin, T. T. and Ford, J. K. (1988), “Transfer of Training: A Review and Directions for Future Research”, Personnel Psychology, vol.41, no.1, pp. 63-103.


FAIL TO PREPARE, PREPARE TO FAIL

Getting results from training is down to the training company, right? You’ve paid good money, so you should just be able to sit back and relax. Well, not really.

Even the very best training providers will struggle to deliver results if you don’t do your bit. First, do the hard work of identifying what your business really needs. Get out and about in the organisation and ask the right questions. Once you’ve done your analysis, identify both learning and behavioural goals; managers are always more interested in the latter. Think business outcomes!

Ensure that the people you are nominating for the course are the right people. In these difficult times it’s not enough that someone needs training. They have to want it. Sell the course, don’t just inform people about it. Explain what’s in it for them – and be bold in detailing what you expect them to do to deliver a return on the investment.

Encourage learners to put their new capabilities to work after the learning event. It's rarely possible to offer financial incentives, so be imaginative. Try to link achievement after training to eligibility for promotion or other recognition.

Ensure that learners have immediate opportunities to use their new capabilities and create a ‘transfer community’ of other learners, trainers and colleagues to hold them to account.

Last, but very definitely not least, don’t forget the managers. They can make or break your efforts so make sure that they understand their role – before, during and after the course. There is no point in them agreeing to release their people to attend training, if they aren’t going to be actively involved in making that investment pay off.

Robert Terry is founder of The Kite Foundation, a not-for-profit think-tank that develops solutions to support the application of newly-learned skills in the workplace.

www.kite-foundation.com


FROM TRANSACTION TO TRANSFORMATION
Debbie Carter

Nothing is more guaranteed to pull an audience to an event than something focused on the Holy Grail of L&D – evaluation. So why do so many people find this area of expertise so difficult? Do mature L&D departments and professionals need to evaluate their work as much as they do?

I suspect that other departments in organisations don’t feel the need to prove their worth as we do – they see themselves as an integral part of the business and if that business is growing and doing well they are satisfied that they are doing their job. The mature L&D professional should be more focused on understanding the business.

In the past, training and learning was viewed as something separate from the business. This separation meant that we felt the need to have the statistical evidence to prove that our work impacted on the organisation we were serving. I'm glad to say that many L&D specialists have moved on from this purely transactional relationship to something that is much more transformational, and that move brings a change in emphasis for the L&D skill set.

Debbie Carter is Director of Research at Training Journal, one of the UK's leading resources for L&D professionals.


(Illustrated by three characters: Commander Courage, Felicity Flex and Captain Curiosity.)

HERE ARE THREE TRAITS YOU CAN DEVELOP TO HELP TRANSFORM YOUR BUSINESS:

CURIOSITY – find out what's happening inside and outside the business and talk to all your stakeholders.

COURAGE – challenge and ask questions to help the business understand itself better. Many problems are not solved by training or learning initiatives.

FLEXIBILITY – nothing stays the same. Support people in discarding old practices and in embracing new ideas.


THE FUTURE OF EVALUATION
By Tony Bingham

Evaluating the impact of learning must be a priority

As business becomes more competitive – and more global – a key differentiator for success will be talent. A study by ASTD and IBM found that senior executives and CEOs agree that learning is strategically valuable. Because the learning function is ultimately responsible for making sure the workforce is well-skilled and the talent pipeline is intact, leaders must measure the impact of what they achieve in ways that matter to the organisation.

Many organisations have a long way to go when it comes to ensuring that learning is truly and demonstrably aligned with the business. Fortunately, most are seeing the importance of this.

The future lies with learning professionals becoming as astute about business as they are about learning. ASTD found a strong correlation between goal alignment and market performance, and a strong correlation between alignment and a learning function that is effective at meeting goals – its own and the organisation's.

The future of evaluation will not be defined by a particular model or methodology. The future of evaluation is a matter of strategic relevance and impact. When training and development efforts are designed and aligned to meet business needs and goals, their impact on results will be determined in metrics that are most meaningful to the organisation.

Tony Bingham is the CEO of the American Society for Training and Development, the world’s largest organisation dedicated to training & development professionals.



THE EVOLUTION OF EVALUATION

"Evaluation is a very young discipline – even though it is a very old practice."
Michael Scriven

1792 – William Farish is the first to use the "quantitative mark" (or numerical score) to assess his students.

1845 – Printed formal tests are used for the first time to assess learners, in Boston, MA.

1897 – Joseph Rice uses testing to find no link between time spent learning to spell and competence.

1911 – Frederick Taylor publishes "The Principles of Scientific Management", focusing on constant quantitative testing and efficiency.

1940 – Ralph Tyler finishes an 8-year study and pioneers the idea of behavioural evaluation as an alternative to factual evaluation.

1959 – Donald Kirkpatrick first publishes his ideas on training evaluation. Along with some revisions, this framework is still widely used today.

1960s-1990s – A huge range of evaluation methodologies spring up. In 1997 Worthen et al classify them into 6 categories of "orientation": objectives, management, consumer, expertise, adversary and participant.

2004-Present – Fitzpatrick et al identify 12 future trends for evaluation, including greater use of technology and using qualitative and quantitative analysis together.


MEASURABLE OUTCOMES FROM LEARNING

Ever wondered whether your learning programmes are really making a difference? Then this book is for you.

In our latest Little Book, Reed Learning have partnered with Training Journal and some of L&D’s top thinkers to demystify training evaluation and discover how to deliver learning with real impact.

Visit us at www.reedlearning.com to find out more.
