
Trainee Evaluation Method

Assoc. Prof. Dr. Abdul Sattar Khan
MBBS, MPH, MCPS, MRCGP (UK), FRIPH (UK), FHAE (UK)

Family & Community Medicine Department

College of Medicine

King Faisal University

Objectives

At the end of the session, participants are expected to know:

•What is an assessment process?
•Who should assess the students?
•Why assess the students?
•What should be assessed?
•How should the students be assessed?
•When should the students be assessed?
•Where should the students be assessed?


What is Learning?

A curricular definition

Learning - acquiring knowledge, skills, attitudes, values, and experiences.

A cognitive definition

Learning - a process of formulating new and more complex understandings of the world

Learning - revising and refining mental constructs, i.e., the understandings that guide how we think, speak and behave

http://dakota.fmpdata.net/PsychAI/PrintFiles/DefLrng.pdf


How do we know that learning has taken place?


Key Relationship in the Learning Process

Teaching – Learning – Assessment


What is an assessment process?


The Oxford Dictionary [1] defines assessment as ‘the action of assessing’. In the context of medical education, assessment could be defined as determining the competence of the product, that is, the health professional. It has three pillars [2]:
1. Cognitive
2. Affective
3. Psychomotor

1. Oxford Dictionary of English, 2nd Edition, Oxford University Press; 2012.
2. Newble D. Assessment. In: Jolly B, Rees L (Editors). Medical education in the millennium. Oxford: Oxford University Press; 1998. p. 131–42.

Who should assess the students?

Medical schools

International accrediting body

National accrediting body

Professional bodies

Licensure bodies

Department

Individual teacher

Patients / community

Students themselves



Why assess the students?

“Fit for purpose”

•Ranking of students
•Measurement of improvement in a student
•To diagnose student difficulties
•Evaluation of teaching methods
•Motivating students to study
•Provision of feedback for the teacher

What should be assessed?

Competencies / Outcomes

CanMEDS roles:
•Medical expert
•Communicator
•Collaborator
•Manager
•Health advocate
•Scholar
•Professional

ACGME (Accreditation Council for Graduate Medical Education) competencies:
•Medical knowledge
•Patient care
•Practice-based learning & improvement
•Interpersonal and communication skills
•Professionalism
•Systems-based practice

www.royalcollege.ca/common/documents/canmeds/
http://www.acgme.org/acgmeweb/

Miller’s pyramid of competence

Knows → Knows how → Shows how → Does

Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63–S67.


How should the students be assessed?


Assessing “knows”

•Essay-type questions: long & short, modified essay questions (MEQ)
•Objective-type questions: multiple-choice questions (MCQ), supply items, true-false items, extended matching items

[Miller’s pyramid: “Knows” level – essay type, MEQ, MCQ, etc.]


Relative characteristics of different types of examination

Applications               Essay         Short answer   MCQ        Extended matching   MEQ
Application of knowledge   Excellent     Good           Poor       Good                Excellent
Assessment                 Excellent     Good           Poor       Fair                Good
Coverage of topic          Poor          Good           Excellent  Excellent           Excellent
Reliability of score       Poor to Fair  Good           Excellent  Excellent           Excellent
Ease of scoring            Poor          Fair           Excellent  Excellent           Excellent
Preparation time           Min to Mod    Mod            Large      Mod                 Mod
Total cost                 Large         Mod            Low        Low                 Mod
Cheating                   Difficult     Difficult      Easy       Easy                Difficult

University of Dundee, 2004

How should the students be assessed?


Assessing “knows how”

Assessment of skills:
•Patient management problems (PMP)
•Observational assessment: OSCE, OSPE, Mini-CEX, DOPS, etc.

A new approach:
1. Response format (open-ended & MCQ)
2. Stimulus format (context-free & context-rich – EMQ & key feature)
3. Hybrid format (script concordance test, SCT)

[Miller’s pyramid: “Knows how” level – written complex simulations (PMPs)]

Assessing “shows how”

[Miller’s pyramid: “Shows how” level]

•Performance assessment in vitro (OSCE)

Assessing “does”

[Miller’s pyramid: “Does” level]

•Performance assessment in vivo by judging work samples (Mini-CEX, CBD, MSF, DOPS, portfolio)

Tomorrow’s Doctors

Assessments will be fit for purpose – that is:
•Valid
•Reliable
•Generalizable
•Feasible
•Fair

http://www.gmc-uk.org/education/undergraduate/tomorrows_doctors.asp

How can we be sure the process is appropriate?

[Figure: Maastricht Review Process – item writers from the disciplines (anatomy, physiology, internal medicine, surgery, psychology) feed an item pool; a review committee screens items (pre-test review); after test administration, item analyses and student comments are reviewed (post-test review); information goes to users and items return to the item bank.]
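The “item analyses” step in such a post-test review usually means computing, for each question, a difficulty index (proportion of candidates answering correctly) and a discrimination index (how well the item separates strong from weak candidates). A minimal Python sketch of that idea – the function name, the toy data, and the 27% upper/lower grouping are illustrative assumptions, not details of the Maastricht process as presented here:

```python
# Illustrative sketch of the "item analyses" step in a post-test review.
# Assumes a 0/1 score matrix (rows = students, columns = items); the 27%
# upper/lower grouping is a common convention, not taken from the slides.

def item_analysis(scores, group_fraction=0.27):
    """Return per-item difficulty (p) and discrimination (D) indices."""
    n_students = len(scores)
    n_items = len(scores[0])
    totals = [sum(row) for row in scores]
    order = sorted(range(n_students), key=lambda i: totals[i])
    k = max(1, int(round(group_fraction * n_students)))
    lower, upper = order[:k], order[-k:]

    results = []
    for j in range(n_items):
        p = sum(scores[i][j] for i in range(n_students)) / n_students
        p_upper = sum(scores[i][j] for i in upper) / k
        p_lower = sum(scores[i][j] for i in lower) / k
        results.append({"item": j + 1,
                        "difficulty": round(p, 2),
                        "discrimination": round(p_upper - p_lower, 2)})
    return results

if __name__ == "__main__":
    # Toy data: 6 students x 3 items (1 = correct, 0 = incorrect).
    scores = [
        [1, 1, 0],
        [1, 0, 0],
        [1, 1, 1],
        [0, 0, 1],
        [1, 1, 1],
        [0, 0, 0],
    ]
    for item in item_analysis(scores):
        print(item)
```

Items with extreme difficulty or near-zero (or negative) discrimination are typically the ones a review committee would flag before the item returns to the bank.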

Reliabilities across methods

Testing time (hours)             1      2      4      8
MCQ [1]                         0.62   0.76   0.93   0.93
Case-based short essay [2]      0.68   0.73   0.84   0.82
PMP [1]                         0.36   0.53   0.69   0.82
Oral exam [3]                   0.50   0.69   0.82   0.90
Long case [4]                   0.60   0.75   0.86   0.90
OSCE [5]                        0.47   0.64   0.78   0.88
Practice video assessment [7]   0.62   0.76   0.93   0.93
Incognito SPs [8]               0.61   0.76   0.92   0.93
Mini-CEX [6]                    0.73   0.84   0.92   0.96

1. Norcini et al., 1985   2. Stalenhoef-Halling et al., 1990   3. Swanson, 1987
4. Wass et al., 2001   5. Petrusa, 2002   6. Norcini et al., 1999
7. Ram et al., 1999   8. Gorter, 2002

Cees van der Vleuten, 2010
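The trend in the table – reliability rising as testing time (and therefore sampling) increases – is close to what the Spearman-Brown prophecy formula predicts: rho_k = k·rho / (1 + (k − 1)·rho). A minimal Python sketch, using the 1-hour values from the table as starting points (the projection is an illustration of the formula, not the analysis performed in the cited studies):

```python
# Illustrative sketch: projecting test reliability to longer testing times with
# the Spearman-Brown prophecy formula. The 1-hour reliabilities come from the
# table above; the projection itself is only an illustration.

def spearman_brown(rho_1h, k):
    """Projected reliability when testing time is increased k-fold."""
    return k * rho_1h / (1 + (k - 1) * rho_1h)

if __name__ == "__main__":
    one_hour = {"PMP": 0.36, "OSCE": 0.47}
    for method, rho in one_hour.items():
        projected = [round(spearman_brown(rho, k), 2) for k in (1, 2, 4, 8)]
        print(method, projected)
    # PMP  -> [0.36, 0.53, 0.69, 0.82]
    # OSCE -> [0.47, 0.64, 0.78, 0.88]
```

The practical message is the same as the slide’s: no single short observation is highly reliable; broad sampling over enough testing time is what pushes reliability up, whatever the format.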

Factors influencing reliability

•Environmental errors
•Processing errors
•Generalization errors
•Bias errors:
  a. Weighting
  b. Rater prejudice
  c. Halo effect
  d. Leniency and stringency
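Of the bias errors above, leniency and stringency are the easiest to check numerically: compare each rater’s mean mark with the pooled mean, and standardise scores if the gap is large. A minimal Python sketch – the rater names, scores, and the z-score adjustment are illustrative assumptions, not a procedure taken from this presentation:

```python
# Illustrative sketch: spotting rater leniency/stringency by comparing each
# rater's mean to the pooled mean, then z-standardising so a candidate's mark
# does not depend on which rater they happened to get. Data are made up.

from statistics import mean, pstdev

ratings = {
    "Rater A": [7, 8, 6, 9, 8],   # tends to score high (lenient)
    "Rater B": [4, 5, 3, 5, 4],   # tends to score low (stringent)
    "Rater C": [6, 6, 7, 5, 6],
}

pooled_mean = mean(s for scores in ratings.values() for s in scores)
print(f"Pooled mean: {pooled_mean:.2f}")

for rater, scores in ratings.items():
    m, sd = mean(scores), pstdev(scores)
    z = [(s - m) / sd for s in scores] if sd else [0.0] * len(scores)
    print(f"{rater}: mean={m:.2f} (bias {m - pooled_mean:+.2f}), "
          f"z-standardised={[round(v, 2) for v in z]}")
```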

Factors influencing validity

Unclear instructions

Complicated vocabulary

Too easy/difficult items

Unintentional clues

Not matching with outcome

Inadequate time

Too few items

Unsuitable item arrangement

Detectable answer pattern


Measuring the unmeasurable

[Miller’s pyramid with “domain-independent” and “domain-specific” skills]

Assessment (mostly in vivo) relying heavily on expert judgment and qualitative information.

Cees van der Vleuten, 2010

Measuring the unmeasurable

Self assessment

Peer assessment

Co-assessment (combined self, peer, teacher assessment)

Multisource feedback

Log book/diary

Learning process simulations/evaluations

Product-evaluations

Portfolio assessment

Cees van der Vleuten, 2010

When should the students be assessed?

•At the beginning of the course
•During the course (formative – diagnostic)
•At the end of the course (summative – certifying)

Where should the students be assessed?

•Examination hall

•Class room

•Hospital wards

•Outpatient department (OPD)
•Workplaces
•Community

The Myths of testing

Testing motivates by threatening

•If I threaten you will fail and then you will try harder

•Maximize anxiety to maximize learning

Testing helps teachers make important instructional decisions

•Students are not assessment users

(Stiggins, 2004)

The Myths of testing

Important assessment decisions can be made once a year

•Investment of time, effort, and money into large-scale testing supports this belief

Learning how to assess is not as important as learning how to teach
•Teachers teach and testing professionals test

(Stiggins, 2004)

Take Home Message

When assessing students, it is important to be familiar with as many assessment methods as possible.

So, use as many as possible…

