Dr Bill Shearer (Monash Health): "Making Audit Ordinary" and Getting Extraordinary Results

Posted on 25 Dec 2014


DESCRIPTION

Dr Bill Shearer delivered the presentation at the 2014 Clinical Audit Improvement Conference. The Clinical Audit Improvement Conference explored the role of clinical audit in the new era of National Care Standards. For more information about the event, please visit: http://bit.ly/clinicalaudit14

TRANSCRIPT

Making Audit

“ordinary”

And achieving extraordinary results!

Clinical Audit

Clinical audit has been defined as "a quality improvement process that seeks to improve patient care and outcomes through systematic review of care against explicit criteria and the implementation of change".

(Wikipedia)

“Clinical audit is a quality improvement process that seeks to improve patient care and outcomes through systematic review of care against explicit criteria…Where indicated, changes are implemented…and further monitoring is used to confirm improvement in healthcare delivery.”

(Principles for Best Practice in Clinical Audit, 2002, NICE/CHI)

“Clinical audit is an independent review of the quality of clinical care against explicit criteria or recognised standards”

Audit means many things…

• Unit audit:

• A chat amongst friends

• Accreditation:

• A chat with new friends?

• Tax audit:

• Perhaps less friendly

The extremes of audit.

• Bedside audit

• Began in Queensland

• Now enshrined in the mechanisms of accreditation

• At its best it identifies opportunities for improvement

• Fits all the definitions that talk about a “process”

• “Chaps” audit

• Been around since Hippocrates

• Enshrined in the medical psyche

• At its best it is a fantastic forum for peer learning

• And at its worst…

Clinical audit: a comprehensive review of the literature

(The Centre for Clinical Governance Research in Health)

Limitations of clinical audit

•clarity and measurability of the criteria and standards chosen,

•quality of the data available,

•engagement of clinicians,

•involvement of consumers,

•skills and training of participants,

•time involved to undertake an audit,

•use of information technology,

•feedback provided,

•if and how the findings are translated into quality improvement strategies,

•evaluation of improvement strategies (closing the loop).

So what if audit was …

• Evidence based

• Targeted

• Designed & agreed by the clinicians responsible

• Automatic & IT-enabled

• Embedded in the organisation

• Unavoidable

• Designed by clinicians to drive improvement

• Designed to be self-sustaining

And here's one I prepared earlier…

Target Zero

• Essentially a targeted improvement program

• Risk based

• Clinician driven

• Clinical audit embedded

• Specific

• Improvement is mandatory

• And innovation is inevitable

• AND a TRICK

A Trick?

• Our performance

• Not benchmarked – just not good enough!

• Of course Zero harm is unachievable

• But a 50% reduction in harm each year for just 5 years can't be that hard, can it?

• After all, 96.875% isn't zero!

• And of course we won't get there, but imagine if we get 3 years of improvement!
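The arithmetic behind the trick is just compounding. A minimal sketch (the function name is mine, not from the talk):

```python
# Cumulative harm reduction from halving harm each year.
# After n years, remaining harm is 0.5 ** n of the baseline.
def cumulative_reduction(years: int, annual_reduction: float = 0.5) -> float:
    """Fraction of baseline harm eliminated after `years` of compounding."""
    remaining = (1 - annual_reduction) ** years
    return 1 - remaining

print(f"{cumulative_reduction(5):.5f}")  # 0.96875 -> the 96.875% on the slide
print(f"{cumulative_reduction(3):.4f}")  # 0.8750  -> three good years already cut harm by 87.5%
```

So even "failing" at year three still means harm has fallen by seven eighths, which is the trick within the trick.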

Target Zero

• Extreme clinical risks only

• Executive responsible for the risk

• Performance reported monthly & directly to

• Executive Management Team

• Chief Executive

• Board of Directors

• Plan to reduce harm by 50% each year for 5 years

• A Risk Management Committee formed

Target Zero Risk Management Committee.

• Executive chair

• Multidisciplinary & expert

• Evidence based:

• Performance measures – how we measured achievement

• Performance – what we should achieve

• Activities

Target Zero Risk Management Committee.

• Program logic mapping exercise

• Identify the performance gap

• Identify inputs

• Describe what the end point looks like

• Describe how the end point will be “measured”

• Describe the activities and the short term outcomes along the way
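The mapping steps above can be sketched as a simple container; the field names mirror the slide's steps, and everything here is illustrative rather than the committee's actual tool:

```python
# Illustrative structure for a program logic map (PLM), following the
# slide's steps: gap, inputs, end point, measures, activities, outcomes.
from dataclasses import dataclass, field

@dataclass
class ProgramLogicMap:
    risk: str                                         # e.g. "deteriorating patient"
    performance_gap: str                              # where we are vs. where we should be
    inputs: list[str] = field(default_factory=list)
    end_point: str = ""                               # what "done" looks like
    end_point_measures: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    short_term_outcomes: list[str] = field(default_factory=list)

# Hypothetical example, loosely based on the deteriorating-patient slides:
plm = ProgramLogicMap(
    risk="deteriorating patient",
    performance_gap="missed MET calls despite agreed call criteria",
    activities=["education of ward staff", "local responses"],
)
print(plm.risk)  # deteriorating patient
```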

PLM (program logic map) for the deteriorating patient

The Evidence.

• What should our performance be as a minimum

• What should we be measuring

• Sensitivity / Specificity

• Process / Outcome

• Easily collected

• Reproducible

• What should we be doing

• Best practice

How did we make it work?

• Cornerstone of organisational strategic plan

• Therefore talked about by:

• Board of Directors

• Chief Executive

• Executive Management Team

• Monash Innovation & Quality Team

• Medical Directors

• Nursing & Site Directors

Talks Cheap

– and it's not clinical audit

• Monthly reporting of performance measures

• Against a target for that year

• Either:

• Green = target achieved

• Amber = better than last year’s target

• Red = worse than last year’s target

• Reported to:

• Organisation

• Risk Management Committee

• Executive Management Committee

• Board Quality Committee

• Board of Directors
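The traffic-light rating above reduces to a pair of threshold comparisons. A minimal sketch, assuming lower values are better (as for harm rates); the names and example thresholds are illustrative, not Monash Health's actual system:

```python
# Traffic-light rating of a monthly performance measure, as on the slide:
# green = this year's target achieved, amber = better than last year's
# target, red = worse than last year's target. Lower values are better.
def traffic_light(value: float, this_year_target: float, last_year_target: float) -> str:
    if value <= this_year_target:
        return "green"
    if value <= last_year_target:
        return "amber"
    return "red"

# Hypothetical: last year's target 2.0, this year's target 1.0 (a 50% reduction).
print(traffic_light(0.8, 1.0, 2.0))  # green
print(traffic_light(1.5, 1.0, 2.0))  # amber
print(traffic_light(2.3, 1.0, 2.0))  # red
```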

How did we embed Target Zero

• Quarterly report by Risk Management Committee (RMC) to executive

• Risk rating reviewed

• Actions taken

• Barriers identified by RMC

• Strength of controls

• Each risk reports activities, achievements & barriers to the Board Quality Committee yearly

How did we embed Target Zero - downwards

• Critical part of Executive performance plan

• Therefore:

• Critical part of Medical & Nursing Directors performance plan

• Part of Unit Head & Nurse Unit Managers performance plan

• At least part of the conversation with clinicians

• Hawthorne effect

How did we embed Target Zero - downwards

Unit Head M&M report

•Listing of all deaths and adverse events:

• Deaths

• MET calls, delayed METs, unplanned ICU admissions

• ISR 1 & 2 incidents incl. falls and medication errors

• Staph. aureus bacteraemia

• DVT/PE

•Taken from RiskMan and clinical coding data

•Sent monthly via email direct to Unit Head

•Informs “Chaps” style of clinical audit

What did the reports look like?

Instant hits and perennial misses!

•DVT/ PE: an instant hit

•Deteriorating patients: an eventual hit

•Falls: perennial miss but actually a success story – but don't tell anyone

What did the reports look like?

[Chart: Rate of preventable hospital-acquired DVT/PE per 1000 separations, plotted monthly from July 2009 to June 2014 (financial years 2009-10 to 2013-14), y-axis 0.00 to 2.50]
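The measure in the chart's caption is a simple normalised rate; a one-line sketch, with the figures invented for illustration:

```python
# Rate per 1000 separations (discharges), the measure plotted for DVT/PE.
def rate_per_1000(events: int, separations: int) -> float:
    return events / separations * 1000

# Hypothetical month: 12 preventable DVT/PE events across 8000 separations.
print(rate_per_1000(12, 8000))  # 1.5
```

Normalising per 1000 separations lets months (and hospitals) of different sizes be compared on one axis.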

What did the reports look like?

[Chart: Rate of in-patient cardiac arrests, plotted monthly from January 2010 to December 2014, y-axis 0.00 to 0.90]

What did the reports look like?

[Chart: Rate of falls resulting in serious injury (e.g. death, fracture or head trauma) per 1000 bed days]

What did we learn?

• We are not different from other health care organisations:

• We have the same risks

• These risks have the same causes

• These risks have the same solutions

• We are different from other health care organisations:

• The way a risk develops in each organisation varies

• The way solutions develop in each organisation varies

• The barriers to improvement vary between and within organisations

Learning & learning organisations

and improvement

• Learning:

• the acquisition of knowledge or skills through study, experience, or being taught

• Learning organizations are skilled at five main activities:

• systematic problem solving,

• experimentation with new approaches,

• learning from their own experience and past history,

• learning from the experiences and best practices of others,

• transferring knowledge quickly and efficiently throughout the organization.

Triple Loop Learning: Chris Argyris

• What are we doing now

• Anything?

• Something?

• Can we do it better? – Improvement

• Are we actually doing what we think we are? – AUDIT

• How well are we doing what we do? – AUDIT

• What else can we do? – Innovation

• What haven’t we tried

• What hasn’t anyone tried

Another depiction attributed to Argyris

• http://www.thorsten.org/wiki/index.php?title=Triple_Loop_Learning

Have we got a problem?

Are we fixing it as well as possible?

Is this the best solution?

What haven't we thought of?

A learning view of Target Zero.

[Chart: a learning view of Target Zero across Years Zero to Four, annotated with Single Loop Learning ("Anything is better"), Double Loop Learning ("More is better", "Better is More"), Triple Loop Learning, and "No idea at all!"]

Consider DVT/PE (a single loop learning success story)


The Problem. [Chart annotations:]

1. Ongoing measurement
2. No new actions or interventions

1. Org-wide measurement of the rate of DVT/PE & audit of DVT prophylaxis
2. Standardised risk assessment & prophylaxis
3. Audits of compliance
4. Case reviews of all in-hospital DVT/PE

1. Continued measurement & auditing of compliance
2. National Medication Chart
3. SHIPP includes DVT/PE
4. Case reviews of all in-hospital DVT/PE

Now Consider Deteriorating Patients


The Problem. Performance always amber or green.

Double loop learning from the start:
• Missed MET calls
• Local responses
• Education of ward staff

Single Loop Learning:
• MET introduced
• Call criteria agreed

World Best Practice!

What's happening?

Now Consider Falls (a double loop learning success/failure)


The Problem. Steady decline in the rate of falls, but never of the planned magnitude. Significant fluctuation in the rate, but difficult to attribute to any action of ours. Focus on risk assessment, prevention strategies & assessment after a fall has occurred.

Introduction of best practice prevention strategies & post-falls Mx procedure.

Medical engagement, volunteer observers.

Approaching world best rates.

What did we do about falls?

• A process of constant revisiting:

• Why do people fall in hospital?

• What did we plan to do about that?

• Was this the best plan?

• How well was it implemented?

• How can we improve implementation?

• What else can we do?

Making audit ordinary.

Why Target Zero worked

• Important

• Comparisons are odious & not allowed

• Performance measures

• Evidence based

• Clinician agreed

• Minimal effort

• Improvement unavoidable

• And there’s a trick within the trick!

• As it got harder everyone tried harder

• Then they tried different

Making Audit Ordinary

• Purposeful & worthy

• Clinician designed evidence based measures

• Easy to collect and reproducible data

• Questioning the data required not resented

• Simple improvement methodology:

• begins with PLM

• Clinicians chose what to improve

• IT enabled & distributed

• Process & outcome measures

• Owned by everyone
