Evaluating Intended Continuing Education Outcomes

Joshua D. Southwick, MRC, CRC; David Vandergoot, PhD



Page 1: Evaluating Intended Continuing Education Outcomes

Joshua D. Southwick, MRC, CRC; David Vandergoot, PhD

Page 2: Evaluating Intended Continuing Education Outcomes

Why Continuing Education?
Intended Outcomes of Continuing Education
Actual Results of Continuing Education
Evaluating Training
◦ Why Evaluate?
◦ How to Evaluate?
Approaches to Evaluation
Guiding Principles
Recommendations
Examples
Practice & Share (if time)

Page 3: Evaluating Intended Continuing Education Outcomes

Page 4: Evaluating Intended Continuing Education Outcomes

We Need Qualified Rehabilitation Counselors
◦ The Rehabilitation Act of 1973, as amended, requires “qualified vocational rehabilitation counselors” to provide services
◦ Ethically relevant:
CRCs “practice only within the boundaries of their competence” (CRCC, p. 11)
CRCs “recognize the need for continuing education . . . to maintain competence in the skills they use” (p. 11)

Page 5: Evaluating Intended Continuing Education Outcomes

Page 6: Evaluating Intended Continuing Education Outcomes

Continuing Professional Development (CPD) involves “the continuous acquisition of new knowledge, skills, and attitudes to enable competent practice.”1
◦ Well-trained employees may feel less frustration, more job satisfaction, and more job commitment2
◦ After pre-service, graduates should expect to learn new skills or hone existing skills3
Gaining specialty-specific expertise
Understanding ever-changing challenges arising within the field
Becoming familiar with promising and evidence-based practices emerging from new empirical research

1. Peck, McCall, McLaren, & Rotem, 2000, p. 432
2. Allen & van der Velden, 2001
3. Leahy et al., 2009

Page 7: Evaluating Intended Continuing Education Outcomes

Certification or Licensure Maintenance
Team Building
Networking
Increase Organizational Effectiveness and Efficiency
◦ Increased capacity to serve individuals
Better services for persons with disabilities
◦ Greater consumer satisfaction

Page 8: Evaluating Intended Continuing Education Outcomes

Training has often been less effective than expected:
◦ Managerial training – people are learning but not applying1
◦ Medical professionals2, 3
Training has been effective:
◦ Medium to large effect sizes for training outcome criteria related to learning (e.g., knowledge; d = 0.63), behavior (e.g., job-related behavior changes; d = 0.62), and results (e.g., productivity; d = 0.62)4

1. Powell & Yalcin (2010)
2. Davis, O’Brien, Freemantle, Wolf, Mazmanian, & Taylor-Vaisey (1999)
3. Green & Seifert (2005)
4. Arthur, Bennett, Edens, & Bell (2003)
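For readers unfamiliar with the effect sizes cited above, Cohen's d is the standardized mean difference between two groups (difference in means divided by the pooled standard deviation). A minimal sketch, using hypothetical knowledge scores rather than data from the cited meta-analysis:

```python
import statistics

def cohens_d(group_a, group_b):
    """Standardized mean difference using the pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    var_a = statistics.variance(group_a)  # sample variance (n-1 denominator)
    var_b = statistics.variance(group_b)
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Hypothetical post-training vs. untrained-group knowledge scores
trained = [78, 82, 75, 88, 90, 73, 85]
control = [70, 74, 68, 80, 77, 66, 72]
print(round(cohens_d(trained, control), 2))
```

By Cohen's conventional benchmarks, values around 0.2 are small, 0.5 medium, and 0.8 large, which is why the d ≈ 0.62–0.63 values above are described as medium to large.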

Page 9: Evaluating Intended Continuing Education Outcomes

New Zealand study, Flett, Biggs, & Alpass (1994)
◦ Finding: professional training decreased occupational stress, thereby increasing the rehabilitation practitioner’s ability to work effectively
Christensen, Boisse, Sanchez, & Friedmann (2004)
◦ Finding: a one-day training workshop impacted VR counselors’ knowledge and reported practice in substance abuse screening (*never to rarely)

Page 10: Evaluating Intended Continuing Education Outcomes

Despite the intended outcomes of continuing education, the return on this training investment remains, to a great extent, unmeasured and unknown.
Most training programs are evaluated only for the participants’ reactions (i.e., satisfaction; Alliger & Janak, 1989; Van Buren & Erskine, 2002)

Page 11: Evaluating Intended Continuing Education Outcomes

Page 12: Evaluating Intended Continuing Education Outcomes

It is important to evaluate continuing education in order to validate and improve such training efforts
When budgets are tight, it may be necessary to justify training expenses

Page 13: Evaluating Intended Continuing Education Outcomes

Comic showing two men at a chalkboard. The chalkboard has a complicated formula on it. In step 2 of the formula are the words “then a miracle occurs.” One man says: “I think you should be more explicit here in step two.”

[Diagram: Continuing Education Training → Better Outcomes for Persons with Disabilities]

Page 14: Evaluating Intended Continuing Education Outcomes

Kirkpatrick’s Four Levels
Logic Model

Page 15: Evaluating Intended Continuing Education Outcomes

The Four Levels
◦ Level 1: Reaction – To what degree participants react favorably to the learning event.
◦ Level 2: Learning – To what degree participants acquire the intended knowledge, skills, and attitudes based on their participation in the learning event.
◦ Level 3: Behavior – To what degree participants apply what they learned during training when they are back on the job.
◦ Level 4: Results – To what degree targeted outcomes occur, as a result of the learning event(s) and subsequent reinforcement.

*From Kirkpatrick & Kirkpatrick (2010)

Page 16: Evaluating Intended Continuing Education Outcomes

Reaction – 78%
Learning – 32%
Behavior – 19%
Results – 7%
*Reported across multiple disciplines

Morin, L., & Renaud, S. (2004). Participation in corporate university training: Its effect on individual job performance. Canadian Journal of Administrative Sciences, 21(4), 295-306.

Page 17: Evaluating Intended Continuing Education Outcomes

A logic model shows the rationale or program theory for how program planners believe that the resources and activities invested in a program will produce the expected outcomes.
Used to:
◦ Visually display the components of a program
◦ Identify measures that will be useful in evaluating the program outcomes

Page 18: Evaluating Intended Continuing Education Outcomes

Inputs – Resources (human, financial, organizational, community)
Activities – Implementation; how resources are used (projects, events, actions)
Outputs – Participation; direct products (deliverables)
Outcomes – Impact (expected changes or benefits)
◦ Short-term = learning
◦ Medium-term = action or behavior
◦ Long-term = conditions

*Adapted from University of Wisconsin-Extension-Cooperative Extension, 2003; W. K. Kellogg Foundation, 2004

Page 19: Evaluating Intended Continuing Education Outcomes

[Diagram: Inputs → Activities → Outputs → Outcomes]

Page 20: Evaluating Intended Continuing Education Outcomes

Inputs: program planners, instructor preparation, training materials, money, facilities, technology
Activities: using resources to implement a continuing education program, conference, workshop, or webinar
Outputs: verification of attendance at the training, CEU credits earned, satisfaction scores
Outcomes:
◦ Short-term: increases in participants’ knowledge, skills, and confidence
◦ Medium-term: participants implement new knowledge and skills
◦ Long-term: increases in agency-level performance, efficiency, and consumer satisfaction
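When planning an evaluation, the logic-model components above can be captured in a simple data structure so each component gets a documented measure. A sketch using this slide's entries; the `evaluation_checklist` helper and its wording are illustrative, not part of the source model:

```python
# Logic model for a continuing education training (entries from the slide)
logic_model = {
    "inputs": ["program planners", "instructor preparation", "training materials",
               "money", "facilities", "technology"],
    "activities": ["continuing education program", "conference", "workshop", "webinar"],
    "outputs": ["attendance verification", "CEU credits earned", "satisfaction scores"],
    "outcomes": {
        "short-term": "increases in knowledge, skills, and confidence",
        "medium-term": "participants implement new knowledge and skills",
        "long-term": "agency-level performance, efficiency, consumer satisfaction",
    },
}

def evaluation_checklist(model):
    """One evaluation prompt per component, in program-theory order."""
    order = ["inputs", "activities", "outputs", "outcomes"]
    return [f"Did we document the {component}?" for component in order if component in model]

for question in evaluation_checklist(logic_model):
    print(question)
```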

Page 21: Evaluating Intended Continuing Education Outcomes

Which criteria should be measured in order to most accurately assess the outcomes of continuing education?
What types of measures can act as indicators that professionals are developing?

Page 22: Evaluating Intended Continuing Education Outcomes

Knowledge Translation (KT)
Organization Development (OD)

Page 23: Evaluating Intended Continuing Education Outcomes

“A move beyond the simple dissemination of knowledge into actual use of knowledge”1
Barriers to KT / research utilization2:
◦ Environmental & organizational factors (culture, leadership)
◦ Individual factors (age, years of service)
◦ Difficulty accessing research (database access, time)
◦ Difficulty determining the relevance of research

1. Straus, S. E., Tetroe, J., & Graham, I. (2009). Defining knowledge translation. Canadian Medical Association Journal, 181(3-4), 165-168.
2. Johnson, K., Brown, P., Harniss, M., & Schomer, K. (2010). Knowledge translation in rehabilitation counseling. Rehabilitation Education, 24(3-4), 239-250.

Page 24: Evaluating Intended Continuing Education Outcomes

Page 25: Evaluating Intended Continuing Education Outcomes

Organization Development: a method for designing, implementing, and reinforcing intentional organizational changes1
Key characteristic of OD:
◦ The action taken is deliberately and consciously designed to bring about change over a specified time period, and there must be some way to demonstrate and/or measure the degree to which the change occurred2

1. Cummings & Worley, 2009
2. Worley & Feyerherm, 2003

Page 26: Evaluating Intended Continuing  Education  Outcomes

Notes:1. Kirkpatrick’s levels2. Knowledge Translation principles3. Organization Development principles 26

KT2

Page 27: Evaluating Intended Continuing Education Outcomes

Requirements for pre-approval of CRC continuing education credits:
◦ >60 minutes
◦ Focus is to increase knowledge of or skills in rehabilitation counseling
◦ Clearly defined learning objectives or expected outcomes
◦ Participants complete an evaluation of the program’s value (not an evaluation of learning)
◦ Accessible, barrier-free location
◦ For CE through written means, multiple-choice questions are required.

CRCC (2011)

Page 28: Evaluating Intended Continuing Education Outcomes

VR agency does an in-service training:
1. Identify learners’ needs/wants; involve learners in the training planning process (adult learning theory; OD)
2. Set objectives for learning (use these in the evaluation questions & indicators)
3. Hold training
4. Evaluate:
Reaction (satisfaction survey)
Learning (pre-post quiz)
Behavior (2-3 month follow-up survey on objectives) (Knowledge Translation)
Results (3-12 month follow-up on agency indicators specifically related to objectives) (Organization Development)
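The Level 2 pre-post quiz in step 4 is often summarized as a normalized gain, (post − pre) / (max − pre): the fraction of the possible improvement that participants actually achieved. A sketch with hypothetical quiz scores (the scores and the 100-point scale are illustrative):

```python
def normalized_gain(pre, post, max_score=100):
    """Fraction of the possible improvement actually achieved (Hake-style gain)."""
    if max_score == pre:
        return 0.0  # perfect pre-test score leaves no room to improve
    return (post - pre) / (max_score - pre)

# Hypothetical pre/post quiz scores for five trainees
pre_scores  = [40, 55, 60, 30, 70]
post_scores = [70, 80, 75, 65, 85]

gains = [normalized_gain(pre, post) for pre, post in zip(pre_scores, post_scores)]
average_gain = sum(gains) / len(gains)
print(round(average_gain, 2))
```

A gain near 0.5 would mean participants closed about half the distance between their pre-test scores and a perfect score, which is more informative than the raw point difference when pre-test scores vary.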

Page 29: Evaluating Intended Continuing Education Outcomes

Page 30: Evaluating Intended Continuing Education Outcomes

Timing
◦ 2-12 months post-training? Or the amount of time estimated for participants to implement new skills.
Measurements not too distal from training objectives (if too distal, you won’t see the impact)
May also want to assess organizational culture (did it support or hinder implementation?)

Page 31: Evaluating Intended Continuing Education Outcomes

Include planned assessments in the course outline
Build assessments so that they are fully integrated into the course
Assure participants that their satisfaction scores will remain anonymous (i.e., they will not be tracked by IP address)
Provide immediate feedback when feasible

Page 32: Evaluating Intended Continuing Education Outcomes

Page 33: Evaluating Intended Continuing Education Outcomes

Online training provided to 42 counselors
Trained to implement a case management model in 10 sites
Trainees were administered a knowledge check as a post-training assessment
Ongoing training provided using case reviews
Performance evaluated using benchmarks of key model indicators aggregated by site

Page 34: Evaluating Intended Continuing Education Outcomes

Conduct training and evaluate the extent of content learned using a knowledge check

Assess interim performance by conducting case reviews using a protocol reflective of model processes and providing one-on-one instruction as needed

Evaluate relationship of performance on case review protocol with model performance indicators (interim assessment reported here)

Eventually relate performance on model indicators with employment outcomes (will not be available for several years)

Page 35: Evaluating Intended Continuing Education Outcomes

[Diagram: Provide online training → Assess knowledge → Conduct quarterly case reviews → Provide TA as needed → Monitor site performance monthly → Analyze individual and site data → (back to) Conduct quarterly case reviews]

Page 36: Evaluating Intended Continuing Education Outcomes

Did those who completed all the online courses do better than those who did not complete the courses on their first Case Review?

Page 37: Evaluating Intended Continuing Education Outcomes

Score on first Case Review:
◦ 8 did not complete all the courses; 34 did complete them
◦ Completers averaged 76%
◦ Non-completers averaged 86%
◦ This is not a significant difference
Implication – more assistance needed
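Mean comparisons like this one are typically checked with a two-sample t test; with unequal group sizes and variances, Welch's version is the usual choice. A minimal sketch on hypothetical case review scores, not the study's actual data:

```python
import statistics

def welch_t(group_a, group_b):
    """Welch's t statistic for two independent samples with unequal variances."""
    n_a, n_b = len(group_a), len(group_b)
    mean_diff = statistics.mean(group_a) - statistics.mean(group_b)
    # Standard error of the mean difference without assuming equal variances
    se = (statistics.variance(group_a) / n_a + statistics.variance(group_b) / n_b) ** 0.5
    return mean_diff / se

# Hypothetical case review scores for two small groups
completers     = [76, 80, 72, 78, 74]
non_completers = [86, 82, 88, 84]
print(round(welch_t(completers, non_completers), 2))
```

The t statistic is then compared against a t distribution (with Welch-Satterthwaite degrees of freedom) to obtain a p-value; with groups as small as 8 and 34, even a sizable mean difference can fail to reach significance, consistent with the slide's finding.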

Page 38: Evaluating Intended Continuing Education Outcomes

Did those who completed all the online courses do better than those who did not complete the courses as averaged over all their Case Reviews?

Page 39: Evaluating Intended Continuing Education Outcomes

The average score of all case reviews:
◦ Completers averaged 80%
◦ Non-completers averaged 82%
◦ This is not a significant difference
Providing training in and of itself may not be sufficient to achieve desired performance

Page 40: Evaluating Intended Continuing Education Outcomes

What is the relationship between the average course grade and Case Review scores?
◦ First Case Review score: correlation = .09
◦ Averaged Case Review scores: correlation = .12
◦ These are both significant at the .01 level
Implication – although these results are in the desired direction, they are weak and reinforce the need for ongoing technical assistance

Page 41: Evaluating Intended Continuing Education Outcomes

What is the degree of improvement in Case Review ratings over time?

Quarter  Mean Case Review Rating
1        77.9
2        70.7
3        81.9
4        85.3
5        89.0

Implication – providing ongoing technical assistance appears to improve conformance to model expectations

Page 42: Evaluating Intended Continuing Education Outcomes

This analysis was based on aggregated data by site
How do Case Review scores, as measured by aggregating the most recent two reviews, relate to the most recent performance indicators?
Correlation with:
◦ Key indicators most reflective of course content = .59
◦ Total benchmark score = .65
◦ Ranking of site performance = .71

Page 43: Evaluating Intended Continuing Education Outcomes

Training with follow-up technical assistance leads to desired performance; simply providing training may not lead to success.
Looking at data using the individual as the unit of analysis, and also using aggregated site data as the unit of analysis, leads to an enhanced understanding of the impact of training and technical assistance

Page 44: Evaluating Intended Continuing Education Outcomes

Page 45: Evaluating Intended Continuing Education Outcomes

Professional conference: 2 follow-up surveys were sent
◦ 1st survey immediately following the conference
◦ 3-month follow-up survey after the conference
Sought to identify factors that facilitated or hindered Knowledge Translation
Sought to identify the types of collaboration that resulted from networking at the conference

Page 46: Evaluating Intended Continuing Education Outcomes

N = 98; Reported KT = 76
Facilitators of KT:
◦ Personal interest = 72% (55)
◦ Opportunity in current situation = 61% (46)
◦ Belief that applying the knowledge/skills will make a positive difference = 45% (34)
◦ High self-efficacy = 39% (30)
◦ Supportive policies and/or superiors = 34% (26)
◦ Peer interest = 33% (25)
New collaborations as a result of the conference:
◦ About 500 new collaborative projects (not unique) reported among 76 responders

Page 47: Evaluating Intended Continuing Education Outcomes

Reported no KT = 22
Barriers to KT:
◦ Lack of personal interest = 0
◦ Lack of peer interest = 1
◦ *Lack of opportunity = 8
◦ Lack of supportive policies and/or superiors = 0
◦ Low self-efficacy = 0
◦ Belief that applying the knowledge/skills will not make a difference = 1
◦ *Lack of time = 6

Page 48: Evaluating Intended Continuing Education Outcomes

Page 49: Evaluating Intended Continuing Education Outcomes

Page 50: Evaluating Intended Continuing Education Outcomes

Validation of the effectiveness of continuing education
Timely feedback to continuing education providers
Evaluate at levels beyond just Reaction: Learning, Behavior, Results
The very act of evaluating (and planning to evaluate) can positively impact:
◦ Organization of the continuing education activity
◦ Engagement of the participants during the continuing education activity
◦ Transfer of knowledge and skills to workplace behaviors
◦ Results for the entire organization

Page 51: Evaluating Intended Continuing Education Outcomes

Research:
◦ A potential framework for gathering data on the effectiveness of continuing education programs (learning, behavior, results, impact).
Practice:
◦ A tool in planning, evaluating, validating, and improving ongoing counselor training efforts.
◦ Better ensure that continuing education increases the competence of practitioners and thereby improves services and outcomes for service recipients.

Page 52: Evaluating Intended Continuing Education Outcomes