

Do your OSCE marks accurately portray your students’ competency?
A. de Villiers, E. Archer

Stellenbosch University

CONTEXT
Objective Structured Clinical Examination (OSCE) examiner training is widely utilised to address some of the reliability and validity concerns that accompany the use of this assessment tool. An OSCE skills course was developed and implemented at the Stellenbosch Faculty of Health Sciences, and its influence on participants (clinicians) was evaluated.

METHOD
Participants attended the OSCE skills course, which included theoretical sessions (on topics such as standard-setting, examiner influence and assessment instruments) as well as two staged OSCEs, one at the beginning and one at the end of the course. During each OSCE, every participant examined a student role-player performing a technical skill while being video recorded. Participants’ behaviour during, and assessment results from, the two OSCEs were evaluated, together with participants’ feedback on the course and group interviews with the student role-players.

RESULTS
There was a significant improvement in inter-rater reliability (see Figure I), as well as a slight decrease in inappropriate examiner behaviour, such as teaching and prompting, during the second OSCE. Furthermore, overall feedback from participants and the perceptions of the student role-players were positive.

CONCLUSION
In this study, examiner conduct and inter-rater reliability were positively influenced by the following interventions: examiner briefing; involving examiners in constructing the assessment instruments; and examiners viewing (on DVD) and reflecting on their own assessment behaviour. This study suggests that developing and implementing an OSCE skills course is a worthwhile endeavour for improving the validity and reliability of the OSCE as an assessment tool.

Figure I: Participants’ rating (A) against Standard (Rating)

[Scatterplot of A against Rating, Time=After: participants’ marks (A, y-axis, 40 to 85) plotted against the standard mark (x-axis, 30 to 90), with fitted line A = 19.446 + 0.6314x.]
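The fitted line reported in Figure I (A = 19.446 + 0.6314x) relates a participant’s marks to the pre-agreed standard. As a minimal sketch of how such a fit can be computed, the Python snippet below regresses hypothetical participant marks against hypothetical standard marks; it is not the authors’ analysis code, and all data in it are invented for illustration.

```python
# A minimal sketch (not the authors' analysis) of regressing a participant's
# OSCE marks against the pre-agreed standard mark, as in Figure I.
# All data below are hypothetical, invented purely for illustration.
import numpy as np
from scipy import stats

standard = np.array([35, 42, 50, 58, 63, 70, 78, 85], dtype=float)     # x: standard marks
participant = np.array([44, 46, 51, 57, 60, 66, 69, 74], dtype=float)  # y: one rater's marks

# Ordinary least-squares fit: A = intercept + slope * standard
fit = stats.linregress(standard, participant)
print(f"A = {fit.intercept:.3f} + {fit.slope:.4f}*x  (r = {fit.rvalue:.2f})")

# Perfect agreement with the standard would give slope 1 and intercept 0;
# a slope below 1 suggests the rater compresses the mark range.
```

A slope near the poster’s reported 0.63 would indicate the same compression pattern: lenient marking of weak performances and strict marking of strong ones.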

An interesting finding was that participants who had previously undergone Health Sciences Education training generally deviated less from the standard than those who had not (p = 0.01); see Figure II.

Figure II: The effect of Health Sciences Education on participants’ rating
[Box-and-whisker plot, Time=Before: Difference From Standard (y-axis, -30 to 40) for participants with (Yes) and without (No) Health Sciences Education training; markers show Mean, Mean±SE and Mean±1.96*SE.]
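For readers wanting to reproduce this kind of group comparison, the sketch below shows one plausible computation behind the Figure II contrast. The poster does not state which statistical test produced p = 0.01, so a Welch two-sample t-test on absolute deviations from the standard is assumed here, and the data are hypothetical.

```python
# A minimal sketch of the Figure II comparison: deviation from the standard
# mark for participants with vs without Health Sciences Education training.
# The test choice (Welch's t-test on absolute deviations) is an assumption,
# and the data are invented for illustration.
import numpy as np
from scipy import stats

diff_hse = np.array([-4, -2, 0, 1, 3, 5, -1, 2], dtype=float)          # HSE training: Yes
diff_no_hse = np.array([-18, -9, 6, 12, 20, -14, 9, 15], dtype=float)  # HSE training: No

# Compare how far each group's marks deviate from the standard, in magnitude
t, p = stats.ttest_ind(np.abs(diff_hse), np.abs(diff_no_hse), equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
```

A smaller mean absolute deviation in the trained group, with p below 0.05, would correspond to the pattern the poster reports.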

[email protected]