constructive error: the error berg


TRANSCRIPT

Page 1

11 November 2019

constructive error: the error berg

M Gaetani & C Parshuram ICU Clinicians· Trainee & Faculty

Page 2

Christopher S Parshuram | Department of Critical Care Medicine | Center for Safety Research | Child Health Evaluative Sciences | SickKids | University of Toronto

Gaetani: none relevant to this presentation

Parshuram: named inventor, Bedside Paediatric Early Warning System; shares in Bedside Clinical Systems, a clinical decision support company in part owned by SickKids

disclosures

Page 3


intention Experimentation with influenza virus

error staph. aureus on unwashed petri dishes

observation after vacation new ‘mould’ inhibited bacteria.

penicillin

Page 4


intention build a heart rhythm recording device

error faulty resistor emitted electric pulses

observation pulses paced the heart.

pacemaker

Page 5


intention chemist studying plastic

error knocked over cellulose nitrate

observation broken glass stayed in place.

safety glass

Page 6


intention coal tar analysis.

error didn’t wash hands after work in the lab.

observation everything touched at dinner was sweet.

saccharin

Page 7


intention hypertension and angina

error no effect seen on angina

observation improved / sustained erection.

sildenafil

Page 8


intention good

error happened

observation unexpected benefit (>lack of harm).

a moment of reflection

Page 9


loss of control in the controlled environment

medical error executing the plan for a controlled airway

measure of safety / quality of care

focus of safety activities. ...... OK.

unplanned extubation

Page 10


458 unplanned extubation events
263 (57%) PPV <24 hours
a lot of bad clinical events:
52 (11%) bradycardia
63 (14%) stridor
9 (2.0%) cardiac arrest
1 (0.2%) aspiration
& prevention efforts ++

Figure 1

Annual incidence and rate of unplanned extubation

[Figure: y-axes, Number of events (0-80) and Events/100 intubation days (0.0-0.8); x-axis, Year of event, 2004-2014; series: No PPV, Non-Invasive PPV, Intubation]

2008 Quality Leader appointed

2011 Quality improvement project

2012 Unplanned Extubation Huddle

2013 ETT securing devices

Page 11


195 (43%) no PPV at 24 hours.

✓ fewer events, lower rates
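The proportions quoted on these two slides can be recomputed from the raw counts; a minimal Python sketch (the counts are from the slides, the helper function and names are illustrative):

```python
# Outcome counts from the 458 unplanned extubation events (2004-2014),
# as reported on the slide.
TOTAL_EVENTS = 458
counts = {
    "PPV within 24 hours": 263,
    "no PPV at 24 hours": 195,
    "bradycardia": 52,
    "stridor": 63,
    "cardiac arrest": 9,
    "aspiration": 1,
}

def pct(n: int, total: int = TOTAL_EVENTS) -> float:
    """Share of all events, as a percentage rounded to one decimal place."""
    return round(100 * n / total, 1)

for label, n in counts.items():
    print(f"{label}: {n}/{TOTAL_EVENTS} ({pct(n)}%)")
```

These reproduce the slide's figures (57%, 43%, 11%, 14%, 2.0%, 0.2%) to within whole-percent rounding.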

Page 12


& we wondered...

...did this ‘safety event’ provide benefit ?

... was the error in planning to keep the child intubated ?

... could we learn - by identifying low risk extubations ?

Page 13

Page 14


medical error: action different than ideal or intended

& leading to... patient injury & death, the second victim, the quest for zero error as patient safety goal

Page 15


Heinrich industrial accident ratios: "600-30-10-1". Theory articulated in 1931; empiric data in 1941: 1.7 million 'accidents'; pyramids consistent across 297 companies / 3 billion worker-hours

a triangle ‘pyramid’

>> accident elimination prevents death <<

a stitch in time saves nine / an ounce of prevention is better than a pound of cure / ...two wrongs don't make a right
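Because Heinrich's ratio is a fixed proportion, the whole pyramid can be scaled from any one observed tier; a brief sketch (the 600-30-10-1 ratio is from the slide; the scaling helper is an illustrative assumption, not Heinrich's own method):

```python
# Heinrich's "600-30-10-1" pyramid: for every major/fatal injury,
# roughly 10 serious, 30 minor, and 600 no-injury incidents.
HEINRICH_RATIO = {"no injury": 600, "minor": 30, "serious": 10, "fatal/major": 1}

def project_pyramid(no_injury_events: int) -> dict:
    """Scale every tier from an observed count of no-injury incidents."""
    scale = no_injury_events / HEINRICH_RATIO["no injury"]
    return {tier: round(n * scale) for tier, n in HEINRICH_RATIO.items()}

# Twice the base of the ratio doubles every tier.
print(project_pyramid(1200))
# {'no injury': 1200, 'minor': 60, 'serious': 20, 'fatal/major': 2}
```

This is the logic behind ">> accident elimination prevents death <<": shrinking the base of the pyramid is assumed to shrink the apex in proportion.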

Page 16


error pyramid

Harmful

Theoretical harm

Potential

Near miss

Page 17


errors coincide = facilitating the error trajectory

cheese model of harmful error

key element of a reporting culture and this, in turn, requires the existence of a just culture—one possessing a collective understanding of where the line should be drawn between blameless and blameworthy actions.5

Engineering a just culture is an essential early step in creating a safe culture.

Another serious weakness of the person approach is that by focusing on the individual origins of error it isolates unsafe acts from their system context. As a result, two important features of human error tend to be overlooked. Firstly, it is often the best people who make the worst mistakes—error is not the monopoly of an unfortunate few. Secondly, far from being random, mishaps tend to fall into recurrent patterns. The same set of circumstances can provoke similar errors, regardless of the people involved. The pursuit of greater safety is seriously impeded by an approach that does not seek out and remove the error provoking properties within the system at large.

The Swiss cheese model of system accidents

Defences, barriers, and safeguards occupy a key position in the system approach. High technology systems have many defensive layers: some are engineered (alarms, physical barriers, automatic shutdowns, etc), others rely on people (surgeons, anaesthetists, pilots, control room operators, etc), and yet others depend on procedures and administrative controls. Their function is to protect potential victims and assets from local hazards. Mostly they do this very effectively, but there are always weaknesses.

In an ideal world each defensive layer would be intact. In reality, however, they are more like slices of Swiss cheese, having many holes—though unlike in the cheese, these holes are continually opening, shutting, and shifting their location. The presence of holes in any one "slice" does not normally cause a bad outcome. Usually, this can happen only when the holes in many layers momentarily line up to permit a trajectory of accident opportunity—bringing hazards into damaging contact with victims (figure).

The holes in the defences arise for two reasons: active failures and latent conditions. Nearly all adverse events involve a combination of these two sets of factors.

Active failures are the unsafe acts committed by people who are in direct contact with the patient or system. They take a variety of forms: slips, lapses, fumbles, mistakes, and procedural violations.6 Active failures have a direct and usually short-lived impact on the integrity of the defences. At Chernobyl, for example, the operators wrongly violated plant procedures and switched off successive safety systems, thus creating the immediate trigger for the catastrophic explosion in the core. Followers of the person approach often look no further for the causes of an adverse event once they have identified these proximal unsafe acts. But, as discussed below, virtually all such acts have a causal history that extends back in time and up through the levels of the system.

Latent conditions are the inevitable "resident pathogens" within the system. They arise from decisions made by designers, builders, procedure writers, and top level management. Such decisions may be mistaken, but they need not be. All such strategic decisions have the potential for introducing pathogens into the system. Latent conditions have two kinds of adverse effect: they can translate into error provoking conditions within the local workplace (for example, time pressure, understaffing, inadequate equipment, fatigue, and inexperience) and they can create long lasting holes or weaknesses in the defences (untrustworthy alarms and indicators, unworkable procedures, design and construction deficiencies, etc). Latent conditions—as the term suggests—may lie dormant within the system for many years before they combine with active failures and local triggers to create an accident opportunity. Unlike active failures, whose specific forms are often hard to foresee, latent conditions can be identified and remedied before an adverse event occurs. Understanding this leads to proactive rather than reactive risk management.

To use another analogy: active failures are like mosquitoes. They can be swatted one by one, but they still keep coming. The best remedies are to create more effective defences and to drain the swamps in which they breed. The swamps, in this case, are the ever present latent conditions.

Error management

Over the past decade researchers into human factors have been increasingly concerned with developing the tools for managing unsafe acts. Error management has two components: limiting the incidence of dangerous errors and—since this will never be wholly effective—creating systems that are better able to tolerate the occurrence of errors and contain their damaging effects. Whereas followers of the person approach direct most of their management resources at trying to make individuals less fallible or wayward, adherents of the system approach strive for a comprehensive management programme aimed at several different targets: the person, the team, the task, the workplace, and the institution as a whole.3

High reliability organisations—systems operating in hazardous conditions that have fewer than their fair share of adverse events—offer important models for what constitutes a resilient system. Such a system has

[Figure: The Swiss cheese model of how defences, barriers, and safeguards may be penetrated by an accident trajectory, from hazards to losses]

We cannot change the human condition, but we can change the conditions under which humans work

Education and debate. BMJ Volume 320, 18 March 2000, p. 769. www.bmj.com

Page 18


action different than ideal or intended

1 'perfect intent' can be wrong: PA Catheter. Tight Glucose Control. Keep intubated. 'Requires oxygen'.

2 controlled environments in ICU & OR predicated on adherence to intent rather than absolute knowledge of "ideal" (perfect evidence).

3 error is routinely conflated with patient harm: 'An expert panel from the Institute of Medicine, part of the National Academy of Sciences, found that medical errors kill from 44,000 to 98,000 Americans each year' (BMJ 1999)

4 error reducing technologies may not improve outcomes: CPOE, CDSS, bar coding, electronic health records (they reduce apparent errors).

error: misconstrued

Page 19


‘a goal’ ... increasingly challenged.

resilience / rescue / error mitigation-correction. zero ICU mortality... and zero seeking statistical models.

(aka quality & safety)...

zero error /harm

Thomas EJ. BMJ Qual Saf 2019;0:1–3. doi:10.1136/bmjqs-2019-009703 1

Editorial

McGovern Medical School at The University Texas Health Science Center at Houston, Houston, Texas, USA

Correspondence toDr Eric J Thomas, McGovern Medical School at The University Texas Health Science Center at Houston, Houston, TX 77030, USA; eric. thomas@ uth. tmc. edu

Accepted 20 September 2019

To cite: Thomas EJ. BMJ Qual Saf Epub ahead of print: [please include Day Month Year]. doi:10.1136/bmjqs-2019-009703

► http:// dx. doi. org/ 10. 1136/ bmjqs- 2019- 009443

The harms of promoting ‘Zero Harm’

Eric J Thomas

© Author(s) (or their employer(s)) 2019. No commercial re-use. See rights and permissions. Published by BMJ.

In this issue, Amalberti and Vincent1 ask 'what strategies we might adopt to protect patients when healthcare systems and organizations are under stress and simply cannot provide the standard of care they aspire to'. This is clearly a critical and much overdue question, as many healthcare organisations are in an almost constant state of stress from high workload, personnel shortages, high-complexity patients, new technologies, fragmented and conflicting payment systems, over-regulation, and many other issues. These stressors put mid-level managers and front-line staff in situations where they may compromise their standards and be unable to provide the highest quality care. Such circumstances can contribute to low morale and burn-out.

The authors provide guidance for addressing this tension of providing safe care during times of organisational stress, including principles for managing risk in difficult conditions, examples for managing this tension in other high-risk industries, and a research and development agenda for healthcare. Leaders at all levels of healthcare organisations should read this article.

These authors join others2 who advise that we should shift our focus from creating absolute safety (meaning the elimination of error and harm) towards doing a better job of actively managing risk. I want to expand on this point to explore how an excessive focus on absolute safety may paradoxically reduce safety.

Striving for absolute safety—often termed 'zero harm'—is encouraged by some consultants, patient safety experts and regulators. Take for example the recently published book, 'Zero Harm: How to Achieve Patient and Workforce Safety in Healthcare',3 edited by three leaders of Press Ganey, a large organisation that works with over 26 000 healthcare organisations with the mission of helping organisations improve patient experience, including improving safety.

The book states, 'We will only reduce serious safety events, and improve organizations' overall performance, if every US healthcare system commits to zero harm as a sacred core value' (Harm, p254).3 The book is commendable for presenting many accepted and effective practices for improving patient safety (many of which do not explicitly seek or argue for zero harm goals). Nine well-known leaders of exemplary US healthcare systems endorse the book. However, when I reflect on the field of patient safety research and my experience as a leader of efforts to improve patient safety, I can identify not only challenges, but potential harms of overemphasising zero harm goals.

Before I discuss these potential harms, I will first review the types of harms in healthcare and point out that some harms are inevitable and impossible to eliminate. This alone should cause reconsideration of zero harm goals. The patient safety movement began with studies that classified harms as either unpreventable or preventable (including negligent) adverse events.4 5 Unpreventable harms include harms that are actually intended and necessary to treat disease—for example, the harm from a surgical incision to remove a ruptured appendix, and harms such as adverse drug reactions in a patient who has never received the medication before, or postoperative complications that we currently do not have the knowledge to prevent. Of course today's unpreventable harm may with more research be tomorrow's preventable harm. But nevertheless, at any given moment in history, there are harms for which we do not have the knowledge to prevent. It is primarily the responsibility of researchers and improvement experts, not the typical clinician or healthcare organisation, to understand how to prevent the unpreventable.

Preventable harms historically included those due to human errors, such as slips and lapses, negligent care, and those for which interventions have been tested and proven effective at preventing them, such

BMJ Qual Saf: first published as 10.1136/bmjqs-2019-009703 on 9 October 2019. Downloaded from http://qualitysafety.bmj.com/ on November 9, 2019 by guest. Protected by copyright.


“The safety II approach focuses on successes and adaptation in addition to examining failures”

“Safety I posits that we can identify causal chains of events that lead to harm, and prescribe clear-cut interventions to prevent the harm.”

Page 20


Planning errors are errors in what to do (& may be occult). Execution error is not following the plan. These errors may coincide.

                       Planning error
                       No             Yes
Execution error   No   controlled ✓   error
                  Yes  error          2 errors

planning & execution errors
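The 2x2 above can be encoded directly; a small Python sketch (the function name and labels are mine, the cell values are from the slide):

```python
def classify_event(planning_error: bool, execution_error: bool) -> str:
    """Return the cell of the planning-vs-execution 2x2 for one event."""
    if planning_error and execution_error:
        # e.g. unplanned extubation (execution error) of a patient whose
        # plan to stay intubated was itself wrong (planning error)
        return "2 errors"
    if planning_error or execution_error:
        return "error"
    return "controlled"  # right plan, followed faithfully

print(classify_event(False, False))  # controlled
print(classify_event(True, True))    # 2 errors
```

Note the slide's point: the "2 errors" cell is where planning and execution errors coincide, and, as in the unplanned extubation example, that coincidence can sometimes benefit the patient.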

Page 21


error berg

[Figure: two opposed triangles along a Clinical Consequence axis, Benefit above and Harm below; each runs from Theoretical to Actual consequence, with Near miss and Potential bands; the apices are Beneficial and Harmful]

ANESTHESIOLOGY, V 130 • NO XXX XXX 2019 1

IMAGES IN Anesthesiology Brian P. Kavanagh, M.B., F.R.C.P.C., Editor

From the Department of Critical Care Medicine, (M.G., C.P.), the Center for Safety Research (M.G., C.P.); and Child Health Evaluative Sciences, The Research Institute, Hospital for Sick Children (C.P.); the Department of Paediatrics (M.G., C.P.), the Interdepartmental Division of Critical Care Medicine (M.G., C.P.), the Institute of Health Policy Management and Evaluation, the Center for Quality Improvement and Patient Safety, and the Faculty of Medicine (C.P.), University of Toronto, Toronto, Ontario, Canada.

Copyright © 2019, the American Society of Anesthesiologists, Inc. Wolters Kluwer Health, Inc. All Rights Reserved. Anesthesiology 2019; 130:00–00

The Error-berg: Reconceptualizing Medical Error as a Tool for Quality and Safety

Melany Gaetani, M.B.B.Ch., B.A.O., Christopher Parshuram, M.B.Ch.B., D.Phil.

Medical errors occur when there is deviation from the preconceived "ideal" plan or action. Classical doctrine associates error with harm and conceptual models generally don't consider empiric observations that medical errors sometimes benefit patients.1 In the image above, the "Error-berg" represents the spectrum of consequences of medical error. At the base of each triangle are medical errors that have no patient impact (positive or negative); the apex of each represents the smaller number of errors that have important patient consequences.

Consider unplanned extubation following a surgical procedure under general anesthesia where a patient is deemed unsuitable for extubation and transferred to the intensive care unit for ongoing mechanical ventilation. Adverse consequences may manifest as need for reintubation, airway injury, cardiac arrest or death (lower triangle).2 Learning, in these situations, traditionally focuses on prevention of unplanned extubation in order to improve patient safety.

Some patients, however, in whom unplanned extubation occurs, remain safely extubated, benefitting from "error" (upper triangle). Here, two errors occur: one error in planning (continue mechanical ventilation) and a second in execution (unplanned extubation). Openness to learning from situations where delivered care has unexpected consequences, and reassessing preconceived assumptions about medical error, may help improve quality of future care.3

The Error-berg also provides a potential explanation for the perpetuation of error in health care. Identification of incidental variation (errors), that highlight patient situations where the optimal plan or execution is uncertain, may lead to better care. Actively seeking beneficial medical errors can help astute clinicians redefine “ideal” care and improve practice and outcome.

Competing Interests

The authors declare no competing interests.

Correspondence

Address correspondence to Dr. Parshuram: [email protected]

References

1. Reason J: Human error: Models and management. BMJ 2000; 320:768–70

2. Al-Abdwani R, Williams CB, Dunn C, Macartney J, Wollny K, Frndova H, Chin N, Stephens D, Parshuram CS: Incidence, outcomes and outcome prediction of unplanned extubation in critically ill children: An 11 year experience. J Crit Care 2018; 44:368–75

3. Boring RL, Lew R, Ulrich TA, Savchenko K: When human error is good: Applications of beneficial error seeding, proc. 13th Int. Conf. Probabilistic Safety Assessment and Management (PSAM 13). 2016, pp 2–7


ANESTHESIOLOGY, V 130 • NO XXX XXX 2019 1

IMAGES IN Anesthesiology Brian P. Kavanagh, M.B., F.R.C.P.C., Editor

From the Department of Critical Care Medicine, (M.G., C.P.), the Center for Safety Research (M.G., C.P.); and Child Health Evaluative Sciences, The Research Institute, Hospital for Sick Children (C.P.); the Department of Paediatrics (M.G., C.P.), the Interdepartmental Division of Critical Care Medicine (M.G., C.P.), the Institute of Health Policy Management and Evaluation, the Center for Quality Improvement and Patient Safety, and the Faculty of Medicine (C.P.), University of Toronto, Toronto, Ontario, Canada.

Copyright © 2019, the American Society of Anesthesiologists, Inc. Wolters Kluwer Health, Inc. All Rights Reserved. Anesthesiology 2019; 130:00–00

The Error-bergReconceptualizing Medical Error as a Tool for Quality and SafetyMelany Gaetani, M.B.B.Ch., B.A.O., Christopher Parshuram, M.B.Ch.B., D.Phil.

Medical errors occur when there is deviation from the preconceived “ideal” plan or action. Classical doc-

trine associates error with harm and conceptual models generally don’t consider empiric observations that medical errors sometimes benefit patients.1 In the image above, the “Error-berg” represents the spectrum of consequences of medical error. At the base of each triangle are medical errors that have no patient impact (positive or negative); the apex of each represents the smaller number of errors that have important patient consequences.

Consider unplanned extubation following a surgical procedure under general anesthesia, where a patient is deemed unsuitable for extubation and is transferred to the intensive care unit for ongoing mechanical ventilation. Adverse consequences may manifest as need for reintubation, airway injury, cardiac arrest, or death (lower triangle).2 Learning in these situations traditionally focuses on prevention of unplanned extubation in order to improve patient safety.

Some patients in whom unplanned extubation occurs, however, remain safely extubated, benefitting from "error" (upper triangle). Here, two errors occur: one in planning (continue mechanical ventilation) and a second in execution (unplanned extubation). Openness to learning from situations where delivered care has unexpected consequences, and reassessment of preconceived assumptions about medical error, may help improve the quality of future care.3

The Error-berg also provides a potential explanation for the perpetuation of error in health care. Identification of incidental variations (errors) that highlight patient situations where the optimal plan or execution is uncertain may lead to better care. Actively seeking beneficial medical errors can help astute clinicians redefine "ideal" care and improve practice and outcomes.

Competing Interests

The authors declare no competing interests.

Correspondence

Address correspondence to Dr. Parshuram: [email protected]

References

1. Reason J: Human error: Models and management. BMJ 2000; 320:768–70

2. Al-Abdwani R, Williams CB, Dunn C, Macartney J, Wollny K, Frndova H, Chin N, Stephens D, Parshuram CS: Incidence, outcomes and outcome prediction of unplanned extubation in critically ill children: An 11 year experience. J Crit Care 2018; 44:368–75

3. Boring RL, Lew R, Ulrich TA, Savchenko K: When human error is good: Applications of beneficial error seeding. Proc. 13th Int. Conf. on Probabilistic Safety Assessment and Management (PSAM 13), 2016, pp 2–7

[Figure: The Error-berg. Two opposed triangles plot the clinical consequence of error, actual versus theoretical, spanning harmful, potential-harm, and near-miss outcomes.]

Copyright © 2019, the American Society of Anesthesiologists, Inc. Wolters Kluwer Health, Inc. Unauthorized reproduction of this article is prohibited. doi:10.1097/ALN.0000000000002707

explicitly acknowledges potential risk of benefit & harm from medical error.

based on theoretical and empiric observations.
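The consequence categories on the Error-berg can be sketched as a small classification routine. This is a toy illustration only; the event fields (deviated, reached_patient, outcome) are hypothetical assumptions, not part of the published model:

```python
from dataclasses import dataclass

@dataclass
class ErrorEvent:
    # All fields are illustrative assumptions, not part of the published model.
    deviated: bool         # care deviated from the preconceived "ideal" plan
    reached_patient: bool  # the deviation actually reached the patient
    outcome: str           # "harm", "benefit", or "none"

def classify(event: ErrorEvent) -> str:
    """Place an event on the Error-berg: harmful and beneficial errors
    at the apexes, near misses and no-impact errors in the broad base."""
    if not event.deviated:
        return "no error"
    if not event.reached_patient:
        return "near miss"
    if event.outcome == "harm":
        return "harmful error"
    if event.outcome == "benefit":
        return "beneficial error"
    return "no-impact error"
```

Under this sketch, an unplanned extubation after which the patient remains safely extubated would classify as a "beneficial error" (upper triangle), while one requiring emergency reintubation would classify as a "harmful error" (lower triangle).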

Page 22


1 Software development: error seeding. Learning from multiple errors intentionally placed in software code during development.

2 Public Projects: Hirschman’s ‘Hiding Hand’ principle. ‘Impossible’ public projects are started (planning error) and the major obstacles are overcome (due to committed project leaders). Thus society benefits from completion of the impossible.

3 Drug Regulation: the "precautionary principle." Mandated drug product regulation can constrain the acquisition of helpful knowledge; thus when a 'mandated' requirement (a planning error) is not followed (an execution error), the result may be beneficial.

precedent

Page 23


potential applications, due to planning errors in...

Unplanned extubation: extubate/intubate decisions

Patient safety and medical errors: schedules for trainees / staff / frontline

Medication precision in children: dosing guidance

Trainee transition to independence: supervision-autonomy balance

> Learning about planning errors ('best' practice becomes better)

i.e., good outcomes may result from execution 'errors' that rest on an (unappreciated) planning error.

application

Page 24


1 Models of medical error are incomplete and problematic.
2 Zero-error goals may paradoxically limit learning.
3 Embracing errors has led to improvement in healthcare.
4 Potential applications exist to learn from beneficial errors.
5 The Error-Berg is a new conceptual model that explains the persistence of error in healthcare, and how medical error can improve quality of care.

summary

Page 25


thank you

M Gaetani & C Parshuram