
Using an Institutional Report Card to Support Evidence-Based GME Decision Making

Conference Session: SES46
2010 ACGME Annual Education Conference

Ann Dohn, MA, DIO; Alice Edler, MD, MPH, MA (Educ); Nancy Piro, PhD; and Bardia Behravesh, EdD, Program Managers/Ed Specialists

Department of Graduate Medical Education, Stanford Hospital & Clinics

AGENDA

• This workshop session will:
– Discuss the need for comparative programmatic evaluation
– Describe existing “report card/scorecard” models from industry that could be used in our institutions
– Present an example of the “Stanford Report Card” for comparative program evaluation
– Facilitate exercises that will support participants in developing “report cards” and their uses based on individual needs.

Session Objectives

• At the end of this session, participants will be able to:
1. Understand the basis of organizational performance assessment
2. Describe some different models for organizational report cards
3. Identify key areas to include in GME “Report Cards”
4. Understand key considerations for using and distributing programmatic evaluation data.

Stanford Background

Stanford University Medical Center currently sponsors 82 ACGME-accredited training programs with over 1000 enrolled residents and fellows.

Stanford University Medical Center Mission

• Dedication to pursuing the highest quality of patient care and graduate medical education, recognizing that one of its major responsibilities is the provision of organized educational programs.
– Support of quality graduate medical education programs and excellence in residency training and research.
– Guidance and supervision of the resident while facilitating the resident’s professional and personal development and ensuring safe and appropriate care for patients.

– Provision of adequate funding of graduate medical education to ensure support of its faculty, residents, ancillary staff, facilities, and educational resources to achieve this important mission.

– Ensuring that all of its graduate medical education programs meet or exceed the Institutional and Program Requirements promulgated by the Accreditation Council for Graduate Medical Education.

Why Do This?

• We know we’re great… our residents love us!
– Every Program Director will tell you so…

Can We Wait?

• Can we afford to be slow moving?
• Can we wait for ACGME site visits?
• Can we wait for Internal Reviews?

But….

• Our goal is a five-year ACGME cycle
• Internal reviews at the 2½-year mark…
• A lot can happen in 2½ years

We Think We Need This

• ACGME and Institutions are increasingly holding DIOs and GME Committees accountable for their utilization of institutional resources.

• Actions / decisions must be based on documented real-time analyses of needs.

DIOs need to be able to make evidence-based programmatic decisions based on comparative data.

Few Models Exist Today for GME

• Prior to the era of outcome competency, educational quality was perceived solely as test-score measurement and credentialing accomplishment.
– With the introduction of core-competency education, medical educators, learners, and patients are demanding a more holistic approach to quality medical training.

Few Models Exist Today for GME

• The concept of Institutional Accountability is relatively new.
– Until the ACGME Outcome Project, there was no centralized curriculum oversight in GME, unlike medical schools or UME.

The Report Card Vision

• In 2005, Stanford hired its first PhD in GME
– The vision was to develop tools to support evidence-based decision-making for Graduate Medical Education consistent with our mission
– “We needed a Report Card”…

SIGH….

• It wasn’t as easy as first thought!

Our First Attempt …

Background on Institutional Report Cards

• Government and Industry Models
– Multiple models exist and can be used per specific purpose:
• GPRA (Government Performance and Results Act)
• Organizational Report Cards
• Balanced Scorecard
• Benchmarking
• Program Evaluations
• Social Indicators
Report Card vs. Balanced Scorecard

• Report Card: Org Focus (+), Regular Data Collection (+), External Assessment (+), Data Transformation (+), External Audience (+), Aligned with Mission Statement (+)
• Balanced Scorecard: Org Focus (+), Regular Data Collection (+), External Assessment (−), Data Transformation (+), External Audience (−), Aligned with Mission Statement (+)

Which Model to Choose?

• We needed a model that:
– Was organizationally focused and managed
– Had a track record of effective use
– Fit our existing structure with multiple programs and organizations
– Was flexible enough to be adapted for use on an annual basis – not an accreditation cycle – with regular data collection
– Offered “easily digestible” internal and external measurement dimensions

Our Choice: Balanced Scorecard Framework in an Organization Report Card (Scorecard) Tool

Best of Both Worlds

Stanford Hospital & Clinics Report Card

• The SHC Report Card is built on the Balanced Scorecard conceptual framework for translating an organization’s vision into a set of performance indicators distributed among four perspectives adapted for GME:
1. Resident Perception Measurements
2. Program Processes
3. Learning Outcomes
4. Financial/Growth
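The four-perspective structure above can be sketched as a simple data model. This is purely illustrative: the class names, example indicators, and targets below are our assumptions for the sketch, not Stanford’s actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """One report-card measure with a target threshold."""
    name: str
    target: float
    higher_is_better: bool = True  # e.g. duty-hour violations invert this

    def meets_target(self, value: float) -> bool:
        # '+' on the scorecard when the program meets or beats the target
        if self.higher_is_better:
            return value >= self.target
        return value <= self.target

@dataclass
class Perspective:
    """One of the four GME-adapted Balanced Scorecard perspectives."""
    name: str
    indicators: list = field(default_factory=list)

# Illustrative report card skeleton (indicators are examples only)
report_card = [
    Perspective("Resident Perception Measurements",
                [Indicator("HS Survey: Overall Satisfaction", 5.0)]),
    Perspective("Program Processes",
                [Indicator("Duty Hour Violations", 0, higher_is_better=False)]),
    Perspective("Learning Outcomes",
                [Indicator("Specialty Board Scores vs. National Avg", 0.0)]),
    Perspective("Financial/Growth",
                [Indicator("# Residents vs. 10-Year Avg", 0.0)]),
]
```

A structure like this makes it straightforward to render the grid of programs versus indicators shown later in the session.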

Stanford Hospital & Clinics Report Card

1. Resident Perception Measurements
“Guidance and supervision of the resident while facilitating the resident’s professional and personal development and ensuring safe and appropriate care for patients.”

2. Program Processes
“Ensuring that all of its graduate medical education programs meet or exceed the Institutional and Program Requirements promulgated by the Accreditation Council for Graduate Medical Education.”

Stanford Hospital & Clinics Report Card

3. Learning Outcomes
“Support of quality graduate medical education programs and excellence in residency training and research.”

4. Financial/Growth
“Provision of adequate funding of graduate medical education to ensure support of its faculty, residents, ancillary staff, facilities, and educational resources to achieve this important mission.”

The Balanced Scorecard Approach

• The Balanced Scorecard is a performance measurement and performance management system developed by Robert Kaplan and David Norton (1992, 1996)
– adopted by a wide range of leading-edge organizations, both public and private.

(Kaplan, R. and Norton, D., “The Balanced Scorecard: Measures That Drive Performance,” Harvard Business Review, Jan–Feb 1992; and The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996)

Stanford Hospital & Clinics Report Card

• Some indicators are designed to measure SHC’s progress toward achieving its vision; others are designed to measure the long-term drivers of success.

• Through the balanced scorecard, SHC:
– Monitors its current performance (finances, resident satisfaction, learning outcomes, and program process results)
– Monitors its efforts to improve processes and educate residents
– Enhances its ability to grow, learn, and improve the quality of its fellowship and residency educational programs.

Balanced Scorecard Strategic Perspectives

(Diagram: Mission, Vision, and Strategy at the center, surrounded by four perspectives and their guiding questions:)
• Resident – How do our residents see us?
• Program Processes – Are our programs excelling?
• Learning – Do we continue to improve (outcomes)?
• Institutional/Financial Growth – Are we putting our resources in the right places?

Measurement Across The Continuum

• PRE: Measuring events that occur before the trainee arrives
– NRMP Results
• PERI: During residency
– ACGME Survey
• POST: After trainees have left training to start their careers
– Alumni Survey

Selection of Report Card Measures

RESIDENT PERCEPTIONS
• PRE: # Applicants/Open Positions; Match Depth; % Top Medical Schools
• PERI: GME Internal HS Survey (Overall Satisfaction, Recommendation of Program, Teaching Quality, Curriculum Quality, Educational Leadership, Wellness Index); ACGME Survey (Compliant Responses)
• POST: Alumni Survey

PROGRAM PROCESSES
• PERI: Faculty Eval of Program; Resident Eval of Program; Faculty Publications; Duty Hr Violations; ACGME Cycle Length; # ACGME Citations

LEARNING OUTCOMES
• PRE: Core Competency Self-Assessment
• PERI: ITSE Scores; Annual Resident Publications; Annual Resident Presentations; # Safety Incident Reports
• POST: Specialty Board Scores; Core Competency Post Assessment

FINANCIAL / GROWTH
• PERI: # Res in Program; Grants Awarded; Subspecialties/Program; Physical Space/Facilities

“Voice of the Residents” – RESIDENTS (PRE / PERI / POST)

Measures and target thresholds:
• # Applicants/Open Positions: > 20:1 (PRE)
• Match Depth: > 2 SD (PRE)
• % Top Medical Schools: > 90% (PRE)
• HS Survey – Overall Satisfaction: >= 5.0/6.0 (PERI)
• HS Survey – Recommend Program?: >= 5.0/6.0 (PERI)
• HS Survey – Teaching Quality: >= 5.0/6.0 (PERI)
• HS Survey – Curriculum Quality: >= 5.0/6.0 (PERI)
• HS Survey – Educational Leadership: >= 5.0/6.0 (PERI)
• HS Survey – Wellness Index: >= 5.0/6.0 (PERI)
• ACGME Survey – % Compliant Responses: > 80 (PERI)
• Alumni Survey – Overall Satisfaction: > 80% (POST)

(Grid rows: Programs A–R, each rated against these thresholds.)
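The threshold logic behind a grid like the one above can be sketched in a few lines. Everything here is an illustrative assumption (measure keys, the sample values, and the uniform use of >= comparisons), not the actual Stanford tooling.

```python
# Target thresholds mirroring the kinds of cut-offs shown on the scorecard
# (survey items out of 6.0, an applicants-per-position ratio, a percentage).
thresholds = {
    "applicants_per_position": 20.0,  # > 20:1
    "overall_satisfaction": 5.0,      # >= 5.0/6.0
    "recommend_program": 5.0,         # >= 5.0/6.0
    "teaching_quality": 5.0,          # >= 5.0/6.0
    "alumni_satisfaction_pct": 80.0,  # > 80%
}

def score_program(values: dict) -> dict:
    """Return '+' where the program meets a threshold, '-' where it does not."""
    return {m: "+" if values.get(m, 0) >= t else "-"
            for m, t in thresholds.items()}

# Hypothetical program data (values invented for illustration)
program_a = {
    "applicants_per_position": 25.0,
    "overall_satisfaction": 5.3,
    "recommend_program": 4.8,
    "teaching_quality": 5.1,
    "alumni_satisfaction_pct": 86.0,
}
flags = score_program(program_a)
# flags["recommend_program"] == "-" (4.8 falls below the 5.0 target)
```

The resulting +/- flags are exactly what a DIO would scan across the rows of the grid to spot where a program falls short.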

Program Processes – PROGRAM (PERI)

Measures and target thresholds:
• Faculty Program Evals: >= 5.0/6.0
• Resident Program Evals: >= 5.0/6.0
• # Faculty Publications (last 5 years): > Inst Avg
• Total # Duty Hr Violations: 0
• ACGME Cycle Length: >= Current Inst Avg
• ACGME Cycle Length (*new program): >= 2.0 yrs
• # ACGME Citations at Last RRC Review: 0

(Grid rows: Programs A–R, each rated against these thresholds.)

Program Matrix

Columns (ACGME core competencies): Medical Knowledge; Patient Care; Practice-Based Learning and Improvement; Professionalism; Interpersonal and Communication Skills; Systems-Based Practice

Rows (report card measures mapped to the competencies): Patient safety notes; Resident publications; ITE and board scores; Growth in # of subspecialty programs; Duty hours violations; Program evaluations; Teaching quality; Curriculum quality; Wellness score; ACGME survey compliance

Learning Outcomes – LEARNING (PRE / PERI / POST)

Measures and target thresholds:
• Core Competency Self-Assessment: Baseline (PRE)
• ITSE Scores: > Nat Avg (PERI)
• Annual Resident Publications: > Inst Avg (PERI)
• Annual Resident Presentations: > Inst Avg (PERI)
• # Valid & Serious Safety Incident Reports: > 0 (PERI)
• Specialty Board Scores: > Nat Avg (POST)
• Core Competency Post Assessment: > Pre Score (POST)

(Grid rows: Programs A–R, each rated against these thresholds.)

Financial / Growth – FINANCIAL (PERI)

Measures and target thresholds:
• # of Res in Program: > Prior 10-Yr Avg
• Grants Awarded: > Inst Avg
• # Subspecialties/Program: > Prior 10-Yr Avg
• Expansion in Clinical Programs: > Prior 5 Years

(Grid rows: Programs A–R, each rated against these thresholds.)

Case Study - Stanford

• How the DIO uses the Report Card

How Do We Use this Data?

• Look at indicators that are resident-driven – the “Voice of the Resident”
– Is there a discrepancy between the voice of the resident and the other indicators?
• Would the majority of the residents not choose the program again, yet the Board Scores are high?

How Do We Use this Data?

• How Do the Programs Compare Against Each Other?

• How do they compare against their ACGME Cycles?
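One illustrative way to compare programs against each other (a sketch, not the presenters’ actual method) is simply to count how many indicators each program met; the flag patterns below are invented.

```python
# Hypothetical '+'/'-' flags per program across five report-card indicators.
programs = {
    "Program A": ["+", "+", "+", "-", "+"],
    "Program B": ["+", "-", "-", "-", "+"],
    "Program C": ["+", "+", "+", "+", "+"],
}

# Rank programs by the number of indicators that met their targets, best first.
ranking = sorted(programs, key=lambda p: programs[p].count("+"), reverse=True)
print(ranking)  # ['Program C', 'Program A', 'Program B']
```

A ranking like this is a blunt instrument; the session’s later emphasis on “growth and change, not blame” suggests any such comparison should feed discussion, not league tables.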

What’s Next?

• GME Staff Brainstorming Session
• The Why’s – Why are programs where they are?
• Where do we need to focus our resources?

Presenting the Data

• Program Directors Monthly Forum
– Protect the Name of the Program

• Growth and Change not Blame

Presenting the Data

• Individual Meetings with Program Directors
– Share Complete Data

Action Planning

• GME Staff working with Program Directors
• Sharing Findings with GMEC and Administration

Political Fallout

• No Program Director wants to be at the bottom…

• Defensiveness
• Bragging Rights

And by the way… this will help you answer:

COMMON INSTITUTIONAL REVIEW DOCUMENT
Question 30b:

“Describe how the sponsoring institution monitors that each program provides effective educational experiences for residents that lead to measurable achievement of educational outcomes of the ACGME competencies.”

Questions
