
Assessment – Why and How?

Stephanie B. Jones, MD

Associate Professor, Harvard Medical School

Vice Chair for Education

Department of Anesthesia, Critical Care and Pain Medicine

Beth Israel Deaconess Medical Center

Problems

• No one teaches us how to assess
  – We judge as we were judged

• Ideal vs. reality
  – Keeping up with a moving target (ACGME)
  – Faculty time and energy

• We work in a time-based system
  – Should residency be truly competency-based?

Assessment – Why and How?

• Definitions
  – Including difference between feedback and assessment

• Tools
  – Global, checklists, 360°, portfolios

• Limitations and Questions

Assessment – a definition

“…process of collecting, synthesizing, and interpreting information to aid decision-making.”
  – successful completion of a rotation
  – promotion
  – remediation

Airasian PW. Classroom assessment, 3rd ed. 1997

Feedback vs Assessment

• Feedback = formative evaluation
  – Provides information for improvement
  – Comes directly from the source

• Assessment = summative evaluation
  – How well a goal has been met
  – Judgment after the fact

• In reality, the two overlap

Ende J. JAMA 1983;250:777-781

Feedback and Assessment

• Can use same tools for both

• Allows the learner to practice and to know the goals

• End result shouldn’t be a surprise

Duffy et al. Acad Med 2004;79:495-507

Van der Vleuten & Schuwirth. Med Educ 2005;39:309-317

Feedback and Assessment

• Fundamentals of Laparoscopic Surgery (FLS)
  – MCQ exam
  – 5 skill stations

• BIDMC
  – PGY-4 surgery residents must pass to advance to the PGY-5 year

Barriers to feedback

• Faculty aren’t taught how to give feedback effectively
  – Rosenblatt & Schartel 1999
  – Only 20% of programs offered formal training
  – Needs reinforcement

• Residents often not directly observed

• Impact on faculty evaluations

• Time and money

Assessment

• ACGME
  – Core competencies
    • Patient care
    • Medical knowledge
    • Practice-based learning and improvement
    • Interpersonal and communication skills
    • Professionalism
    • Systems-based practice

• ABA
  – Certify that the graduate has “demonstrated sufficient professional ability to practice competently and independently in the field of Anesthesiology”

Competency

“…the ability to handle a complex professional task by integrating the relevant cognitive, psychomotor, and affective skills.”

Van der Vleuten & Schuwirth. Med Educ 2005;39:309-317

An analogy

• “Diagnosing” whether a resident should be promoted, graduated, etc.
• Just as with a complex patient, multiple tests and opinions may be needed to make the “diagnosis”

Joyce B. ACGME

What should happen

• Design an assessment system
  – Collection of assessment tools
  – Decide who does evaluations
  – Decide what will be evaluated
  – Evaluation schedule

What does happen

• Continue to use the existing system
• Open the ACGME toolbox
  – New tools
  – More data

• Are the tools reliable and valid?
• Do we ever use the extra data?

http://www.acgme.org/Outcome/assess/Toolbox.pdf

Definitions

• Validity – does the assessment measure what it intends to measure?

• Reliability – scores from the assessment are reproducible (consistency)

Choosing assessment tools

• Valid data
• Reliable data
• Feasible
• External validity
• Provide valuable information
• Some compromise will be involved

Lynch and Swing, ACGME. www.acgme.org/outcome

The assessment system

• Consistent with program objectives
• Objectives are representative
  – You can’t assess everything

• Multiple tools
• Multiple observations
  – Looking for patterns
  – Doesn’t mean you need to add more questions

Lynch and Swing, ACGME. www.acgme.org/outcome

The assessment system

• Multiple observers
  – Improves reliability

• Assessed according to pre-specified criteria
  – Goals and objectives
  – Faculty training

• Fair

Lynch and Swing, ACGME

www.acgme.org/outcome

Summing up…

“A good assessment programme will incorporate several competency elements and multiple sources of information to evaluate those competencies on multiple occasions using credible standards.”

Van der Vleuten & Schuwirth. Med Educ 2005;39:309-317

Some challenges

• Setting “passing” criteria for qualitative information
• Mixed messages
• Differentiating performance in a testing situation versus performing with real patients

Holmboe ES. Acad Med 2004;79:16-22

Global evaluations

• The “old” standard
• Usual end-of-rotation evaluation
• More useful with behavior-based descriptions or anchors

Example item – Professionalism and Honesty:
1) Residents must demonstrate a commitment to carrying out professional responsibilities and adherence to ethical principles; demonstrate respect, compassion, integrity, and a responsiveness to the needs of patients; and demonstrate a commitment to confidentiality of patient information, informed consent, and departmental policies and guidelines.
Rating scale: Unsatisfactory, Below Expectations, Good, Above Expectations, Excellent, N/A

Global evaluations

• Useful in context of summative assessment
• Williams et al, SIU – 3-item global evaluation
  – Clinical performance
  – Professional behavior
  – Overall performance in comparison to peers

Williams et al. Surgery 2005;137:141-7

Global evaluations

• Can be used for more specific feedback/assessment
  – Doyle et al, British Columbia
    • Technical skills in the OR (GRITS)
  – Vassiliou et al, McGill
    • Assessment of laparoscopic skills (GOALS)

Doyle et al. Am J Surg 2007; 193:551-5

Vassiliou et al. Am J Surg 2005;190:107-113

Global rating index for technical skills “GRITS”


Halo effect?

• Global evaluations are clearly subject to the “halo effect”
• Vogt et al, University of TN
  – Gyn surgical skills
  – Videotaped “hands only” and “waist up” views
  – Scores differed between the two views (in both directions)

Vogt et al. Am J Obstet Gynecol 2003; 189:688-91.

Checklists

• Simulation
  – Scavone et al, Northwestern
    • Simulated general anesthesia for cesarean section
    • CA-3 vs CA-1: 150 vs 128 points
  – Murray et al, Washington University
    • Series of studies on acute skills performance
    • Multiple scenarios tested
    • Senior residents scored best; performance varied with scenario

Scavone et al. Anesthesiology 2006;105:260-6
Murray et al. Anesth Analg 2005;101:1127-34

Checklists

• Standardized patients
  – OSCE

• Observations of skills
  – Preanesthesia consult
  – Machine checkout

De Oliveira Filho and Schonhorst. Anesth Analg 2004;99:62-9

360° evaluation

• Derived from business world

• Multisource evaluation
  – Different perspectives
  – Lends credibility
  – Can/should include self-evaluation

• Time-sensitive

360° evaluation

• Resident’s position in the hierarchy is not fixed
  – Change rotations
  – Change attendings
  – Change types of rotations

• But action plans can still be created based upon the results
  – PBLI and SBP

Massagli et al. Am J Phys Med Rehabil 2007;86:845-52

360° evaluation

• Opportunity to include patient feedback
• More “real” than standardized patients?
  – Overcomes the limitation of the observer not knowing how the patient really feels
  – Staff evaluations of interpersonal and communication skills are often based upon interactions with staff, not patients

Duffy et al. Acad Med 2004;79:495-507

360° evaluation

• Worth the trouble?
• Brinkman et al
  – Pediatrics
  – Added parents and nurses
  – Better feedback for communication skills and professionalism

• Weigelt et al
  – Surgery, trauma/critical care rotation
  – Added RNs, NPs, ICU fellows, the chief resident, and trauma nurse clinicians
  – No change in ratings with the added groups

Arch Pediatr Adolesc Med 2007;161:103-4; Curr Surg 2004;61:616-26

360° evaluation

• Self-evaluation
  – Adult learning theory
    • Curriculum should be “learner-centric”
    • Have to “know what you don’t know”
  – Often poor correlation between self-assessment and external measures
  – 360° allows an opportunity to reconcile the conflict
    • Not “how good am I?” but “how can I get better?”

Schneider et al. Am J Surg 2008;195:16-19.

Portfolios

• Requires reflection and self-assessment
• Skills needed for lifelong learning
• But…
  – Needs a mentor who can facilitate, or it just becomes a bunch of stuff in a folder
  – If used for summative assessment, the requirements must be very clear

Structured portfolio

Holmboe et al. Am J Med 2006;119:708-714


Remaining questions

• Are residents truly “adult learners”?
• What is the best way to assess the assessments?
• Does any of this really improve outcomes in a time-limited residency?
• How can we assess residents after graduation?
