
Standard setting for clinical assessments

Katharine Boursicot, BSc, MBBS, MRCOG, MAHPE
Reader in Medical Education
Deputy Head of the Centre for Medical and Healthcare Education
St George’s, University of London

The Third International Conference on Medical Education in the Sudan

WHAT are we testing in clinical assessments?

• Clinical competence

• What is it?

A popular modern model: elements of competence

• Knowledge
o factual
o applied: clinical reasoning

• Skills
o communication
o clinical

• Attitudes
o professional behaviour

Tomorrow’s Doctors, GMC 2003

Another popular medical model of competence

[Slide figure: Miller’s pyramid – four levels, base to apex: Knows, Knows how, Shows how, Does. The lower levels correspond to cognition (knowledge), the upper levels to behaviour (skills/attitudes), and professional authenticity increases towards the apex.]

Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.

Assessment of competence

• A review of developments over the last 40 years

Knows

• 1960: National Board of Medical Examiners in the USA introduced the MCQ

• MCQs conquered the world

• Dissatisfaction due to the limitations of MCQs


Knows how

• 1965: Introduction of the PMP (Patient Management Problem)


Patient Management Problem

[Slide figure: a PMP flowchart – a clinical scenario branching into successive layers of possible actions.]

Knows how

• 1965: Introduction of the PMP (Patient Management Problem)

• Well-constructed SBA-format MCQs can test the application of knowledge very effectively


Shows how

• 1975: Introduction of Objective Structured Clinical Examination (OSCE)

OSCEs are conquering the world


Does

• > 2000: emerging new methods

• WBAs – Workplace-Based Assessments
o Mini Clinical Evaluation Exercise
o Direct Observation of Procedural Skills
o OSATS
o Masked standardized patients
o Video assessment
o Patient reports
o Peer reports
o Clinical work samples
o ………


Mini CEX (Norcini, 1995)

• Short observation (15-20 minutes) and evaluation of clinical performance in practice using generic evaluation forms completed by different examiners

(cf. http://www.abim.org/minicex/)

Example of mini-CEX form

DOPS – Direct Observation of Procedural Skills

OSATS – Objective Structured Assessment of Technical Skills

WBAs – Workplace-Based Assessments

• All based on the principle of an assessor observing a student/trainee in a workplace or practice setting

Past 40 years: climbing the pyramid.....


• Knows: factual tests – SBA-type MCQs…..

• Knows how: (clinical) context-based tests – SBA, EMQ, MEQ…..

• Shows how: performance assessment in vitro – OSCEs

• Does: performance assessment in vivo – mini-CEX, DOPS, OSATS, …..

Standard setting – why bother?

• To assure standards
o At graduation from medical school
o For licensing
o For a postgraduate (membership) degree
o For progression from one grade to the next
o For recertification

At graduation from medical school

• To award a medical degree to students who meet the University’s standards (University interest)

• To distinguish between the competent and the insufficiently competent (Public interest)

• To certify that graduates are suitable for provisional registration (Regulatory/licensing body interest)

• To ensure graduates are fit to undertake F1 posts (employer interest)

Definition of Standards

• A standard is a statement about whether an examination performance is good enough for a particular purpose
o a particular score that serves as the boundary between passing and failing
o the numerical answer to the question “How much is enough?”

Standard setting

All methods described in the literature are based on ways of translating expert (clinical) judgement into a score.

‘Classical’ standard setting methods

• For written test items:
o Angoff’s method
o Ebel’s method

• For OSCEs:
o Borderline group method
o Regression-based method
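To make the written-test methods concrete: in Angoff’s method, each judge estimates, item by item, the probability that a borderline candidate would answer correctly, and the passing score is the average of the judges’ summed estimates. A minimal sketch in Python (the function name and the data are illustrative, not from the slides):

```python
# Angoff's method (sketch): each judge gives, per item, the probability
# (0-1) that a borderline candidate answers correctly; the cut score is
# the mean across judges of each judge's summed item estimates.

def angoff_passing_score(judge_estimates):
    """judge_estimates: list of per-judge lists of item probabilities."""
    per_judge_totals = [sum(items) for items in judge_estimates]
    return sum(per_judge_totals) / len(per_judge_totals)

# Hypothetical panel: three judges rating a five-item test
judges = [
    [0.6, 0.7, 0.5, 0.8, 0.4],  # judge 1
    [0.5, 0.6, 0.6, 0.7, 0.5],  # judge 2
    [0.7, 0.8, 0.4, 0.9, 0.5],  # judge 3
]
cut = angoff_passing_score(judges)  # expected marks out of 5
```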

Performance-based standard setting methods

• Borderline group method

• Contrasting group method

• Regression-based standard method

Kramer A, Muijtjens A, Jansen K, Düsman H, Tan L, van der Vleuten C. Comparison of a rational and an empirical standard setting procedure for an OSCE. Med Educ 2003; 37(2):132.

Kaufman DM, Mann KV, Muijtjens AMM, van der Vleuten CPM. A comparison of standard-setting procedures for an OSCE in undergraduate medical education. Acad Med 2000; 75:267-271.

The examiner’s role in standard setting

• Uses the examiner’s clinical expertise to judge the candidate’s performance

• Examiner allocates a global judgement based on the candidate’s performance at that station

• Remember the level of the examination: Pass / Borderline / Fail

Borderline Group Method

[Slide figure: a station checklist (items 1–7 plus TOTAL) alongside a Pass/Borderline/Fail global rating; the test score distribution is shown with the borderline candidates’ score distribution marked, from which the passing score is derived.]
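The borderline group method lends itself to a simple computation: the passing score for a station is commonly taken as the mean (or median) of the checklist scores of candidates whom the examiner globally rated Borderline. A minimal sketch, with a hypothetical function name and data:

```python
# Borderline group method (sketch): collect the checklist scores of
# candidates globally rated "Borderline" and use their mean as the
# station's passing score.
from statistics import mean

def borderline_group_cut(results):
    """results: list of (checklist_score, global_rating) tuples,
    where global_rating is 'Pass', 'Borderline' or 'Fail'."""
    borderline_scores = [s for s, r in results if r == "Borderline"]
    return mean(borderline_scores)

# Hypothetical station results
station = [
    (18, "Pass"), (12, "Borderline"), (9, "Fail"),
    (14, "Borderline"), (20, "Pass"), (13, "Borderline"),
]
cut = borderline_group_cut(station)  # → 13
```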

Contrasting groups method

[Slide figure: a station checklist (items 1–7 plus TOTAL) and a Pass/Borderline/Fail (P/B/F) global rating; the test score distributions of the passing and failing groups are contrasted, and the passing score is placed where the two distributions meet.]
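In the contrasting groups method, the cut score sits where the score distributions of the globally judged pass and fail groups overlap least. One common operationalisation, sketched here with hypothetical data and function name, is to choose the threshold that minimises total misclassification:

```python
# Contrasting groups method (sketch): candidates are split into "pass"
# and "fail" groups by global judgement; the cut score is then chosen
# to minimise total misclassification (a score >= cut counts as a pass).

def contrasting_groups_cut(fail_scores, pass_scores):
    candidates = sorted(set(fail_scores) | set(pass_scores))
    best_cut, fewest_errors = None, None
    for cut in candidates:
        errors = sum(s >= cut for s in fail_scores)   # failers who would pass
        errors += sum(s < cut for s in pass_scores)   # passers who would fail
        if fewest_errors is None or errors < fewest_errors:
            best_cut, fewest_errors = cut, errors
    return best_cut

# Hypothetical overlapping score distributions
fail_group = [8, 9, 10, 11, 12]
pass_group = [11, 13, 14, 15, 16]
cut = contrasting_groups_cut(fail_group, pass_group)  # → 13
```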

Regression based standard

[Slide figure: a station checklist (items 1–7 plus TOTAL) and an overall rating on a 1–5 scale (1 = Clear fail, 2 = Borderline, 3 = Clear pass, 4 = Excellent, 5 = Outstanding); checklist score is plotted against overall rating, and X, the passing score, is read off the regression line at the Borderline rating.]
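The regression-based standard can be computed by regressing candidates’ checklist scores on the examiners’ overall ratings and reading off the predicted checklist score at the Borderline rating (2). A minimal sketch using ordinary least squares (function name and data are illustrative):

```python
# Regression-based standard (sketch): fit checklist score = a + b*rating
# by ordinary least squares, then evaluate the line at the Borderline
# rating (2) to obtain the passing score X.

def regression_cut(ratings, scores, borderline=2):
    n = len(ratings)
    mean_x = sum(ratings) / n
    mean_y = sum(scores) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(ratings, scores))
    sxx = sum((x - mean_x) ** 2 for x in ratings)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept + slope * borderline

# Hypothetical data: one (rating, score) pair per candidate,
# perfectly linear here for illustration
ratings = [1, 2, 3, 4, 5]
scores = [6, 10, 14, 18, 22]
cut = regression_cut(ratings, scores)  # → 10.0
```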

Work Based Assessment tools

• There is no gold-standard standard setting method!

Standard setting

• Standards are based on informed judgments about examinees’ performances against a social or educational construct

e.g.

• competent practitioner

• suitable level of specialist knowledge/skills

Standard setting for Work Based Assessment tools

o Based on descriptors for a particular level of training

o Information gathering relying on descriptive and qualitative judgemental information

o Descriptors agreed by consensus/panel of clinical experts

o Purpose of WBA tools: formative (feedback) rather than summative

Feedback

• Giving feedback to enhance learning involves some form of judgement by the feedback giver on the knowledge and performance of the recipient

• It is a very powerful tool!

WBAs and feedback

• The underlying principle of WBA tools is FEEDBACK from
o Teacher/supervisor
o Peers/team members
o Other professionals
o Patients

Conclusions

• It’s not easy to set standards for Work Based Assessments (in the ‘classic’ sense)

• Expert professional judgement is required

• Wide sampling from different sources: range of tools, contexts, cases and assessors

• Feedback to the trainee
