
Introduction to Improving the Patient Experience Series

Measuring the Patient Experience

Tammy Fisher, MPH, Director, Quality & Performance Improvement

San Francisco Health Plan

Part 2 – March 9, 2011

Agenda

• Purposes of Measurement

• Measurement to identify areas for improvement – Tools, methodologies, frequency

• Measurement for testing & implementing changes – Data collection strategies, tools, and methodologies

• Measurement to spread and sustain improvements – Tools, methodologies, frequency

• Lessons Learned from the field – San Francisco Health Plan

2


Purposes of Measurement

Aim
– Improvement: Improvement of care
– Accountability: Comparison, choice, reassurance
– Research: New knowledge

Test observability
– Improvement: Test observable
– Accountability: Evaluate current performance; no test
– Research: Test blinded

Bias & sample size
– Improvement: Consistent bias; "just enough" data
– Accountability: Measure and adjust to reduce bias; 100% of data
– Research: Design to eliminate bias; "just in case" data

Flexibility of hypothesis
– Improvement: Hypothesis flexible; changes as learning takes place
– Accountability: No hypothesis
– Research: Fixed hypothesis

Testing strategy
– Improvement: Sequential tests
– Accountability: No tests
– Research: One test

Is change an improvement?
– Improvement: Run or control charts
– Accountability: No change focus
– Research: Hypothesis tests (F-test, t-test, chi-squared, p-value)

Confidentiality of data
– Improvement: Only used by those involved in improvement
– Accountability: Available for public consumption
– Research: Identities protected

3
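A minimal Python sketch (not from the deck; the weekly counts are invented) of how the same patient-experience data looks through the table's three lenses: small sequential samples viewed over time for improvement, one aggregate rate for accountability, and a single fixed hypothesis test for research.

```python
# Sketch with toy data: the three "faces" of measurement applied to the same numbers.
from scipy import stats

# Weekly counts of "Yes, definitely" answers out of surveys returned (invented numbers)
weeks    = ["W1", "W2", "W3", "W4", "W5", "W6", "W7", "W8"]
top_box  = [12, 13, 11, 12, 18, 19, 18, 20]   # a change was introduced after week 4
returned = [20, 21, 19, 20, 22, 23, 21, 24]

# Improvement: small sequential samples, reviewed over time (run chart view)
for w, k, n in zip(weeks, top_box, returned):
    print(f"{w}: {k/n:.0%} (n={n})")

# Accountability: one aggregate rate over all available data
print(f"Aggregate: {sum(top_box)/sum(returned):.0%}")

# Research: a fixed hypothesis and a single test (here, a 2x2 chi-squared test
# comparing the before-change and after-change periods)
before = [sum(top_box[:4]), sum(returned[:4]) - sum(top_box[:4])]
after  = [sum(top_box[4:]), sum(returned[4:]) - sum(top_box[4:])]
chi2, p, dof, _ = stats.chi2_contingency([before, after])
print(f"chi-squared = {chi2:.2f}, p = {p:.3f}")
```

The point is the contrast in how the data are used, not the particular test chosen.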

Applying it to Patient Experience

1. Research
• Source for changes to try
• Helps build "will" to try changes

2. Improvement
• Understand impact of changes quickly
• Provide rapid feedback – engagement strategy
• Convince others to try changes

3. Accountability
• Sustainability – public reporting, pay for performance

4

Measurement Continuum for Improvement

5

Identify Areas and People for Improvement

• Robust surveys
• Robust measurement methodologies
• Review trended results
• Data at the organization and individual provider level
• Look at composites strongly correlated with overall ratings of experience
• Align areas with strategic goals – "organizational or clinic energy"

6

Example of a Priority Matrix for CAHPS Health Plan Survey Results


7
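The priority matrix itself is not reproduced in this transcript, but a rough sketch of how one can be built follows: compute each composite's mean score and its correlation with the overall rating, then flag low-scoring, strongly correlated composites as improvement priorities. The column names, threshold, and respondent data below are illustrative assumptions, not SFHP's actual results.

```python
# Sketch: building a simple priority matrix from survey composites (toy data).
import pandas as pd

# Each row = one respondent; composite scores on a 0-100 scale, overall rating 0-10
df = pd.DataFrame({
    "access":         [60, 75, 80, 55, 70],
    "communication":  [85, 90, 70, 65, 95],
    "office_staff":   [70, 80, 75, 60, 85],
    "overall_rating": [7, 9, 8, 5, 10],
})

composites = ["access", "communication", "office_staff"]
summary = pd.DataFrame({
    "mean_score":   df[composites].mean(),
    "corr_overall": df[composites].corrwith(df["overall_rating"]),
})

# High correlation with the overall rating plus below-median performance = top priority
summary["priority"] = (summary["corr_overall"] > 0.5) & \
                      (summary["mean_score"] < summary["mean_score"].median())
print(summary)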

Surveys

• Clinician Group CAHPS Survey: https://www.cahps.ahrq.gov/content/products/CG/PROD_CG_CG40Products.asp?p=1021&s=213

• PBGH Short PAS Survey
  – PAS website: http://www.cchri.org/programs/programs_pas.html
  – Short PAS survey: http://www.calquality.org/programs/patientexp/documents/Short_Form_Survey_PCP_feb2010.doc

• Other surveys – Press Ganey and Avatar

8

Survey Options

MTC (ph 800-295-9681, ask for Guy Swenson)
– Method of administration: Telephonic
– Cost: $5-10 per completed survey
– Considerations: + can customize survey; development costs are low and turnaround is quick; + rapid feedback (usually within two weeks of survey completion); – reporting is limited, so internal resources are needed to manipulate data for reporting purposes
– Groups using it: MG, John Muir Physician Associates, Camino Medical Group, CQC doctors in the first Collaborative

Sullivan/Luallin (ph 619-283-8988, www.sullivan-luallin.com)
– Method of administration: Mailed survey
– Cost: Variable
– Considerations: + recognized by CAPG; + good reporting capabilities; + in wide use by multiple groups; + option for customization
– Groups using it: Many CA groups (…, Beaver, Sharp)

Press Ganey (www.pressganey.com)
– Method of administration: Mailed survey
– Cost: Call for a quote
– Considerations: + robust survey, good reputation; + excellent reporting capability; – especially good in hospitals/home care, less so in outpatient
– Groups using it: UCSF

PBGH doctor-level survey (Ted VonGlahn, ph 415-615-6318)
– Method of administration: Mailed survey, once a year
– Cost: $185 per doctor
– Considerations: + very robust reporting, including a detailed, actionable physician-level report; + robust algorithms for selecting random samples; – limited for QI purposes
– Groups using it: 40 groups in CA

AMGA (http://www.amga.org/QMR/PSAT/index_psat.asp)
– Method of administration: Point-of-service survey
– Cost: Check costs on their website; a little complicated
– Considerations: + in wide use; + provides feedback regularly; + analytic and reporting capabilities; + good benchmarks; + includes methodologies for assuring a random sample; – once data are forwarded, reports arrive 5-6 weeks later
– Groups using it: A large number of national and CA groups

Avatar (www.avatar-intl.com)
– Method of administration: Mailed survey
– Cost: Ask for a quote
– Considerations: + in wide use nationally; + provides feedback regularly; + includes methodologies for assuring a random sample; + good benchmarks; + analytic and reporting capabilities
– Groups using it: St. Joseph Heritage Medical Group

9

Robust Methodologies

• Mail administration – 3 waves of mailing (initial mail, postcard reminder, second mail)

• Telephone administration – at least 6 attempts across different days of the week and times of day

• Mixed mail and telephone administration – boost mail survey response by adding telephone administration (see the sketch below)

10
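A minimal sketch of a mixed-mode fielding schedule for the protocol above. The spacing between mail waves and the timing of the telephone window are assumptions for illustration; the slide only specifies three mail waves and at least six call attempts.

```python
# Sketch (assumed timings): a fielding schedule for mixed mail/telephone administration.
from datetime import date, timedelta

def fielding_schedule(start: date) -> list[tuple[str, date]]:
    """Three mail waves, then a telephone follow-up window for non-respondents."""
    events = [
        ("initial mail",      start),
        ("postcard reminder", start + timedelta(days=7)),    # wave spacing is an assumption
        ("second mail",       start + timedelta(days=21)),
    ]
    # At least 6 call attempts, spread so they fall on different days of the week
    for attempt in range(6):
        events.append((f"phone attempt {attempt + 1}",
                       start + timedelta(days=35 + attempt * 2)))
    return events

for label, when in fielding_schedule(date(2011, 3, 9)):
    print(f"{when:%Y-%m-%d}  {label}")
```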

Tips

• Survey
  – Include questions that matter most to consumers
  – Questions that ask about care experience
  – Applicability across heterogeneous populations
  – Demonstrates strong psychometric properties
  – Sufficient response categories (4-point to 6-point scales)
  – Reporting – includes internal and external benchmarks

• Methodology
  – Appropriate sampling (reduce bias, large samples)
  – Standardized protocols
  – Timeframe – in the last 12 months

• Frequency
  – Annually

11

MEASUREMENT FOR QUALITY IMPROVEMENT

12

Purposes of Measurement

1. For Leadership to know if changes have an impact and to build a compelling case to spread changes to others

2. For providers and staff to get rapid feedback on tests of change to understand their progress towards their own aims and to spread to others in the clinic

13

Three Key Questions

1. What are we trying to accomplish? (Aim)

2. How will we know that a change is an improvement? (Measure)

3. What changes can we make that will result in an improvement? (Change)

14

AIM Statement

15

Selected Changes

16

PDSA – Rapid Cycle Improvement

Plan
• Questions & predictions (why?)
• Plan to carry out the cycle

Do
• Carry out the plan
• Document problems and observations
• Begin data analysis

Check/Study
• Complete data analysis
• Compare data to predictions
• Summarize what was learned

Act
• What changes are to be made?
• Next cycle?

Adapted from the Institute for Healthcare Improvement Breakthrough Series College

17
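As a rough illustration (not part of the IHI materials), a PDSA cycle can be documented as a simple record whose fields mirror the four steps above. The field names and example content are hypothetical.

```python
# Sketch: a minimal record for documenting one PDSA cycle.
from dataclasses import dataclass, field

@dataclass
class PDSACycle:
    aim: str
    plan_questions: list[str]                              # Plan: questions & predictions (why?)
    prediction: str
    observations: list[str] = field(default_factory=list)  # Do: problems & observations
    result: str = ""                                        # Study: compare data to predictions
    next_step: str = ""                                     # Act: what changes? next cycle?

cycle = PDSACycle(
    aim="Warm greeting at check-in",
    plan_questions=["Will a scripted greeting raise 'Yes, definitely' responses?"],
    prediction="At least 8 of 10 surveyed patients answer 'Yes, definitely'",
)
cycle.observations.append("Front desk used the script for 10 of 12 patients")
cycle.result = "9 of 10 surveyed patients answered 'Yes, definitely' - prediction met"
cycle.next_step = "Test with the afternoon shift next week"
print(cycle)
```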

Repeated Uses of PDSA Cycle

[Diagram: repeated PDSA cycles build from hunches, theories, and ideas to changes that result in improvement – a very small-scale test, then follow-up tests, wide-scale tests of change, and implementation of the change – with data collected at every step.]

Adapted from the IHI Breakthrough Series College

18

Evaluate Impact of Changes

• Data collection strategies/tools specific to changes tested & implemented

• Methodologies that allow for sequential testing – small samples, less standardization

• Data given to individuals testing changes

• Enough data to know a change is an improvement and to convince others to try it

• Frequent feedback during testing – daily, weekly, collecting data over time

• Inexpensive methods

19

20

Monthly Telephonic Surveys

21

Data Collection Tools

• Point of service surveys

• Telephonic surveys

• Comment cards

• Patient exit surveys

• Focus groups

• Kiosks, via web

• Feedback from people doing the changes

• Observation

• Patient Advisory Boards

22

Point of Service

• Focus on meaningful measures tied to AIM statement

• Have 4-6 response choices

• Include enough measures to appropriately evaluate aspect of care

• Consistent methodology; train staff collecting information

• Collect “just enough” data

• Need 15 measurement points for a run chart (see the sketch after this slide)

• Data collection can be burdensome!

23
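Referenced from the run-chart bullet above: a sketch of the usual run-chart signals applied to 15 or more sequential points. The shift and trend thresholds used here (6 consecutive points on one side of the median, 5 points continuously rising or falling) are common conventions, not something specified on the slide.

```python
# Sketch (assumed rule definitions): shift and trend signals on a run chart.
from statistics import median

def run_chart_signals(points: list[float]) -> dict[str, bool]:
    """Flag a shift (6+ consecutive points on one side of the median)
    and a trend (5+ consecutive points each higher or each lower than the last)."""
    center = median(points)
    shift = trend = False

    # Shift: points on the median neither extend nor break the run
    run = side = 0
    for p in points:
        if p == center:
            continue
        s = 1 if p > center else -1
        run = run + 1 if s == side else 1
        side = s
        shift = shift or run >= 6

    # Trend: ties reset the run; a run of k increases spans k+1 points
    run = direction = 0
    for prev, curr in zip(points, points[1:]):
        if curr == prev:
            run, direction = 0, 0
            continue
        d = 1 if curr > prev else -1
        run = run + 1 if d == direction else 1
        direction = d
        trend = trend or run + 1 >= 5
    return {"shift": shift, "trend": trend}

# 15 weekly "Yes, definitely" percentages from point-of-service cards (toy data)
weekly = [55, 58, 52, 60, 57, 54, 59, 56, 63, 66, 68, 71, 70, 74, 76]
print(run_chart_signals(weekly))
```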

Telephonic Surveys

• More rapid feedback than mailed surveys

• Typically less expensive

• Outside vendors do it and provide reports

• Easy to manipulate data for reporting

• Less frequent – monthly data at best

• Literature suggests more bias than mailed surveys (not so important when testing)

24

Sample Comment Card

Comment Card

We would like to know what you think about your visit with Doctor X.

□ Yes, definitely   □ Yes, somewhat   □ No

Did Dr. X listen carefully to you?

Did Dr. X explain things in a way that was easy to understand?

Is there anything you would like to comment on further?

Thank you. We are committed to improving the care and services we provide our patients.

25

Patient Exit Interviews

• Rapid feedback on changes tested

• Not burdensome to collect data

• Uncover new issues which may go unreported in surveys

• Requires translation of information into actionable behaviors

• Providers “see” the feedback

• Include 3-5 questions, mix of specific measures and open ended questions

26

Patient Visit Walk-Through: Through the Eyes of Your Patients

Tips for making the "Walk Through" most productive:

1. Determine with your staff where the starting and ending points should be, taking into consideration making the appointment, the actual office visit process, follow-up, and other processes.

2. Two members of the staff should role-play, with each playing a role: patient and partner/family member.

3. Set aside a reasonable amount of time to experience the patient journey. Consider doing multiple experiences along the patient journey at different times.

4. Make it real. Note the part of the visit: time with registration, time in waiting room, time with MA/MEA, time with provider, discharge. Wear what the patient wears. Make a realistic paper trail including chart, lab reports and follow-up.

5. During the experience note both positive and negative experiences, as well as any surprises. What was frustrating? What was gratifying? What was confusing? Again, an audio or video tape can be helpful.

6. Debrief your staff on what you did and what you learned.

Date:   Staff Members:   Walk Through Begins When:   Ends When:

Example notes:

SIGNING IN / POINT-OF-SERVICE FEE
– Positives: None
– Negatives: Takes forever – made copy of driver's license; staff had no change for point-of-service fee
– Surprises: The number of steps involved to register a patient
– Frustrating/Confusing: Was not directed to waiting room; didn't know what to do next
– Gratifying: Finally sitting down in waiting room

TIME WITH PROVIDER
– Positives: Spent enough time; all questions were answered during the visit
– Negatives: None
– Surprises: I liked the Agenda-Setting Form the provider used
– Frustrating/Confusing: When the provider left, I didn't know what was going to happen next
– Gratifying: All my questions were answered by the provider

Spreading & Sustaining Improvements

• Survey
  – Include questions that matter most to consumers
  – Questions that ask about care experience
  – Applicability across heterogeneous populations
  – Demonstrates strong psychometric properties

• Reporting
  – Comparisons within peer group

• Methodology
  – Appropriate sampling (reduce bias, large samples)
  – Standardized protocols
  – Risk adjustment (see the sketch below)

• Frequency
  – Monthly, quarterly

28
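The risk-adjustment bullet above refers to this sketch: one simple observed-minus-expected adjustment that regresses ratings on patient characteristics and re-centers each provider's mean. The characteristics, column names, and data are illustrative assumptions; this is not the method SFHP or any vendor actually uses.

```python
# Sketch (illustrative only): simple case-mix adjustment of provider-level ratings.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "provider":          ["A", "A", "A", "B", "B", "B"],
    "age":               [72, 68, 75, 31, 28, 45],
    "self_rated_health": [2, 3, 2, 4, 5, 4],    # 1 = poor ... 5 = excellent
    "rating":            [9, 10, 9, 7, 8, 8],   # 0-10 overall rating
})

# Expected rating given patient mix, from a simple linear model
model = LinearRegression().fit(df[["age", "self_rated_health"]], df["rating"])
df["expected"] = model.predict(df[["age", "self_rated_health"]])

# Adjusted score = overall mean + (observed provider mean - expected provider mean)
overall_mean = df["rating"].mean()
adjusted = overall_mean + (df.groupby("provider")["rating"].mean()
                           - df.groupby("provider")["expected"].mean())
print(adjusted)
```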

Another Look at Data

• Medical Group in Los Angeles

29

LESSONS LEARNED: SAN FRANCISCO HEALTH PLAN

30

Areas for Improvement

• Provider-patient communication, office staff, & access to care
  – Performed in the lowest quartile
  – Provider-patient communication and access strongly correlated with overall ratings of care
  – Office staff support provider-patient communication
  – Team approach

31

Improvement Project

• AIM: To improve CAHPS scores by achieving the 50th percentile in the following composites by MY 2012:
  – Access to care
  – Provider-patient communication

• APPROACH:
  – Begin with 10 community clinics
  – Spread to most clinics by MY 2011

32

Purposes for Measurement

1. For Leadership to know if changes have an impact and to build a compelling case to spread changes to other clinics

2. For Clinics to get rapid feedback on tests of change to understand their progress towards their own aims and to spread to others in the clinic

33

Purpose 1 (for Spread)

Measures & Approach

Measure: Patients' ratings of their care (at provider level, with roll-up to clinic)
– Methodology: Point-of-care survey, about 30 questions, using a nationally recognized tool
– Frequency: Quarterly
– Reports: Risk-adjusted data, delineating statistical significance; showing data over time

Measure: Clinic site satisfaction
– Methodology: Online survey instrument
– Frequency: Quarterly
– Reports: Data over time; anonymous

34

CAHPS Survey Results

For this provider, there was an 89% “confidence of change” in the 13% improvement for the measure: “Doctor Spends Enough Time with the Patient”

35
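The slide does not say how the vendor computes "confidence of change"; one plausible way to produce a number like the 89% above is the posterior probability that the post-change top-box rate exceeds the baseline rate under a simple beta-binomial model. The counts below are made up for illustration.

```python
# Sketch (assumed method, invented counts): a Bayesian "confidence of change".
import numpy as np

rng = np.random.default_rng(0)

before_yes, before_n = 22, 40    # "Spends enough time" top-box answers / surveys, baseline
after_yes,  after_n  = 28, 42    # same measure after the change

# Posterior draws for each period's true rate, using uniform Beta(1, 1) priors
p_before = rng.beta(before_yes + 1, before_n - before_yes + 1, size=100_000)
p_after  = rng.beta(after_yes + 1,  after_n - after_yes + 1,  size=100_000)

confidence_of_change = (p_after > p_before).mean()
print(f"Observed improvement:  {after_yes/after_n - before_yes/before_n:+.0%}")
print(f"Confidence of change:  {confidence_of_change:.0%}")
```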

Patient Ratings of their Care

• Standardized survey instrument based on the Clinician-Group CAHPS visit survey, about 30 questions
• Administered at the point of care by the clinic
  – SFHP provides surveys in 3 languages (English, Spanish, Chinese) and picks up surveys on Friday of each week
• Defined methodology – all patients, given after the visit
• Three fielding periods: April 2010, Oct 2010, Jan 2011
• Each fielding period is 4 weeks
• Risk-adjusted results at the provider level with roll-up at the clinic level
• Patient incentives – two movie tickets per survey
• Extra incentives – up to $500 per clinic

36

Clinic/Practice Site Satisfaction

• Survey instrument based on Dartmouth and Tantau & Associates instruments, about 20 questions

• Administered online by SFHP

– SFHP sends a link to complete the survey online

– Anonymous, results can be aggregated by role

• Five fielding periods: March 2010, June 2010, Sept 2010, Dec 2010, March 2011

• Each fielding period is 2 weeks

• Results at the clinic level 2 weeks following the close of the measurement period

37

Purpose 2 (for Clinics) Measures & Approach

Measure: Patients' ratings of their care – select 5-7 measures based on AIM statement

– Methodology options:
  1. Point-of-service survey
  2. Telephonic survey
  3. Patient exit interviews
  4. Patient Advisory Boards

– Frequency: Weekly or monthly

– Reports: Clinics document experience and results in a narrative

38

Point of Care Survey

Run chart – "¿Fue Usted recibido de una manera amable?" ("Were you greeted in a friendly manner?")

Percentage responding "Sí, definitivamente" ("Yes, definitely"):

6/15/10: 42% (N=20)
6/30/10: 75% (N=18)
7/15/10: 82% (N=28)
7/30/10: 90% (N=19)
8/15/10: 100% (N=17)
9/15/10: 100% (N=15)
9/30/10: 100% (N=15)
10/15/10: 100% (N=20)
10/30/10: 100% (N=15)

We aim to make a statistically significant improvement in the number of patients who report "Yes, definitely" that they received a warm greeting.

39
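For reference, the run chart above can be recreated from the values shown on the slide (percentages and sample sizes as transcribed; labels translated from Spanish):

```python
# Sketch: re-plotting the slide's run chart from the transcribed values.
import matplotlib.pyplot as plt

dates = ["6/15/10", "6/30/10", "7/15/10", "7/30/10", "8/15/10",
         "9/15/10", "9/30/10", "10/15/10", "10/30/10"]
pct   = [42, 75, 82, 90, 100, 100, 100, 100, 100]
n     = [20, 18, 28, 19, 17, 15, 15, 20, 15]

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(dates, pct, marker="o")
for d, p, k in zip(dates, pct, n):
    ax.annotate(f"N={k}", (d, p), textcoords="offset points", xytext=(0, -15), ha="center")
ax.set_ylim(0, 105)
ax.set_ylabel('% responding "Yes, definitely"')
ax.set_title("Were you greeted in a friendly manner?")
plt.tight_layout()
plt.show()
```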

Staff & Patient Feedback

• “During today’s visits, my experience was excellent! Before today my appointments were not that great, but today, I noticed an improvement- A big change! Very Helpful, Thank you”

• “During today’s visit, I noticed the staff with a better attitude towards their work, especially in the front desk.”

• Our staff and patients are loving the electronic patient discharge summary. The patients are saying: "I now have something to reference back to about my visit. It makes it easy for me to remember what I need to do to take care of my health." "I feel that I am responsible for my health." "I have a contract with my doctor."

40

Challenges & Lessons Learned

• Adapted the CAHPS Visit-Based Survey – low reliabilities and less variation – few response categories
• Point-of-care methodology – introduced a lot of bias
• Incentives were extremely helpful
• Low-literacy patients needed help with the survey
• Very high scores on the survey – switched from mean to proportional scoring (see the sketch below)
• Providers trusted "just enough data" to implement change with their patients

41
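The mean-vs-proportional-scoring lesson referenced above comes down to simple arithmetic: with mostly top-box answers, a mean score looks near-perfect while the top-box proportion still shows visible room for improvement. The response counts below are invented for illustration.

```python
# Sketch: mean scoring vs. proportional ("top box") scoring on the same invented responses.
responses = ["Yes, definitely"] * 30 + ["Yes, somewhat"] * 9 + ["No"] * 1

values = {"Yes, definitely": 3, "Yes, somewhat": 2, "No": 1}
mean_score = sum(values[r] for r in responses) / len(responses)
top_box    = responses.count("Yes, definitely") / len(responses)

print(f"Mean score:             {mean_score:.2f} / 3")   # 2.72 - looks near-perfect
print(f"Proportional (top box): {top_box:.0%}")           # 75% - room to improve
```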
