


CLINICAL PRACTICE

Cognitive errors detected in anaesthesiology: a literature review and pilot study

M. P. Stiegler1*, J. P. Neelankavil1, C. Canales2 and A. Dhillon1

1 Department of Anaesthesiology, David Geffen School of Medicine, UCLA, 757 Westwood Blvd-Suite 3325, Los Angeles, CA 90095-7403, USA
2 Department of Anaesthesiology and Perioperative Care, UC Irvine School of Medicine, Irvine, CA, USA

* Corresponding author. E-mail: [email protected]

Editor’s key points

† This paper deals with a novel subject of cognitive errors and psychology of decision-making in anaesthesia.

† An anaesthesia-specific catalogue of cognitive errors was created using qualitative methodology.

† The errors of key importance were seen in >50% of simulated emergencies.

† The study provides deeper insight into error generation and possible strategies to prevent them.

Background. Cognitive errors are thought-process errors, or thinking mistakes, which lead to incorrect diagnoses, treatments, or both. This psychology of decision-making has received little formal attention in anaesthesiology literature, although it is widely appreciated in other safety cultures, such as aviation, and other medical specialities. We sought to identify which types of cognitive errors are most important in anaesthesiology.

Methods. This study consisted of two parts. First, we created a cognitive error catalogue specific to anaesthesiology practice using a literature review, modified Delphi method with experts, and a survey of academic faculty. In the second part, we observed for those cognitive errors during resident physician management of simulated anaesthesiology emergencies.

Results. Of >30 described cognitive errors, the modified Delphi method yielded 14 key items experts felt were most important and prevalent in anaesthesiology practice (Table 1). Faculty survey responses narrowed this to a ‘top 10’ catalogue consisting of anchoring, availability bias, premature closure, feedback bias, framing effect, confirmation bias, omission bias, commission bias, overconfidence, and sunk costs (Table 2). Nine types of cognitive errors were selected for observation during simulated emergency management. Seven of those nine types of cognitive errors occurred in >50% of observed emergencies (Table 3).

Conclusions. Cognitive errors are thought to contribute significantly to medical mishaps. We identified cognitive errors specific to anaesthesiology practice. Understanding the key types of cognitive errors specific to anaesthesiology is the first step towards training in metacognition and de-biasing strategies, which may improve patient safety.

Keywords: cognition; decision-making; diagnostic errors/prevention and control; medical mistakes; physicians/psychology

Accepted for publication: 22 August 2011

Increasingly, attention is being focused on the prevention and management of medical errors that have been estimated to account for 44 000–98 000 deaths annually in the USA.1 Recent studies suggest that cognitive errors, a subset of medical errors involving faulty thought processes and subconscious biases, are important contributors to missed diagnoses and patient injury.2–4 Indeed, according to Groopman,5 ‘a growing body of research shows that technical errors account for only a small fraction of incorrect diagnoses and treatments. Most errors are mistakes in thinking. And part of what causes these cognitive errors is our inner feelings, feelings we do not readily admit to and often don’t even recognize’. Thus, the psychology of decision-making is becoming increasingly appreciated in the non-medical and medical literature.

Cognitive errors have been described in safety culture industries, such as aviation and nuclear power plants,6 7 and in the literature of medical education and other medical specialities, including internal medicine, emergency medicine, family practice, radiology, neurology, and pathology.2 8–15

In the 1990s, early pioneers of the ‘Crisis Resource Management’ paradigms at Harvard and Stanford introduced this topic by including fixation error (also known as ‘anchoring’ or ‘tunnel-vision’) in their curricula.11 16 17 Numerous additional cognitive errors have been identified and this concept has been greatly expanded in other disciplines since then. A comprehensive review of cognitive errors or decision-making psychology has not yet been published in contemporary anaesthesiology literature. This manuscript sought to provide an expanded introduction to the concept of cognitive

British Journal of Anaesthesia 108 (2): 229–35 (2012)
Advance Access publication 8 December 2011. doi:10.1093/bja/aer387

© The Author [2011]. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: [email protected]



errors, and to identify the cognitive errors that may be most important in anaesthesiology practice.

Review of cognitive errors

Cognitive errors fall under the domain of psychology of decision-making. According to Croskerry,18 who has presented the most comprehensive medical catalogue of cognitive errors to date, the literature on cognitive bias in decision-making is ample, but ‘ponderously slow to enter medicine’. Perhaps this is because cognitive errors are considerably less tangible than procedural errors. They are described as ‘low-visibility’; that is, they are generally unable to be witnessed or recorded and usually occur with low awareness on the part of the thinker. Thus, they are not conducive to root-cause analysis. However, these types of thinking mistakes are thought to be potentially highly preventable.19 Cognitive errors are thought-process errors, usually linked to failed biases or heuristics. According to

Groopman,5 ‘while heuristics serve as the foundation for all mature medical decision making, they can lead to grave errors. The doctor must be aware of which heuristics he is using, and how his inner feelings may influence them’. Importantly, these are errors that are made despite the availability of knowledge and data needed to make the correct decision. Thus, they are distinct from knowledge gaps. It is important to note that heuristics and biases are frequently useful in clinical medicine; they allow experts to arrive at decisions quickly and (usually) accurately. However, cognitive errors occur when these subconscious processes and mental shortcuts are relied upon excessively or under the wrong circumstances. Below, we will introduce several examples of specific cognitive errors that are commonly described in the literature and easy to understand. Table 1 lists our catalogue of 14 cognitive errors, along with definitions and examples for each.

‘Premature closure’ describes the cognitive error of accepting the first plausible diagnosis before it has been

Table 1 Cognitive error catalogue

Anchoring
  Definition: Focusing on one issue at the expense of understanding the whole situation.
  Illustration: While troubleshooting an alarm on an infusion pump, you are unaware of sudden surgical bleeding and hypotension.

Availability bias
  Definition: Choosing a diagnosis because it is in the forefront of your mind due to an emotionally charged memory of a bad experience.
  Illustration: Diagnosing simple bronchospasm as anaphylaxis because you once had a case of anaphylaxis that had a very poor outcome.

Premature closure
  Definition: Accepting a diagnosis prematurely; failure to consider a reasonable differential of possibilities.
  Illustration: Assuming that hypotension in a trauma patient is due to bleeding, and missing the pneumothorax.

Feedback bias
  Definition: Misinterpretation of no feedback as ‘positive’ feedback.
  Illustration: Belief that you have never had a case of unintentional awareness, because you have never received a complaint about it.

Confirmation bias
  Definition: Seeking or acknowledging only information that confirms the desired or suspected diagnosis.
  Illustration: Repeatedly cycling an arterial pressure cuff, changing cuff sizes and locations, because you ‘do not believe’ the low reading.

Framing effect
  Definition: Subsequent thinking is swayed by leading aspects of the initial presentation.
  Illustration: After being told by a colleague, ‘this patient was extremely anxious preoperatively’, you attribute postoperative agitation to her personality rather than low blood sugar.

Commission bias
  Definition: Tendency toward action rather than inaction; performing un-indicated manoeuvres, deviating from protocol. May be due to overconfidence, desperation, or pressure from others.
  Illustration: ‘Better safe than sorry’ insertion of additional unnecessary invasive monitors or access, potentially resulting in a complication.

Overconfidence bias
  Definition: Inappropriate boldness, not recognizing the need for help, tendency to believe we are infallible.
  Illustration: Delay in calling for help when you have trouble intubating, because you are sure you will eventually succeed.

Omission bias
  Definition: Hesitation to start emergency manoeuvres for fear of being wrong or causing harm; tendency towards inaction.
  Illustration: Delay in calling for chest tube placement when you suspect a pneumothorax, because you may be wrong and you will be responsible for that procedure.

Sunk costs
  Definition: Unwillingness to let go of a failing diagnosis or decision, especially if much time/resources have already been allocated. Ego may play a role.
  Illustration: Having decided that a patient needs an awake fibreoptic intubation, refusing to consider alternative plans despite multiple unsuccessful attempts.

Visceral bias
  Definition: Counter-transference; our negative or positive feelings about a patient influencing our decisions.
  Illustration: Not trouble-shooting an epidural for a labouring patient, because she is ‘high-maintenance’ or a ‘complainer’.

Zebra retreat
  Definition: A rare diagnosis figures prominently among possibilities, but the physician is hesitant to pursue it.
  Illustration: Trying to ‘explain away’ hypercarbia when MH should be considered.

Unpacking principle
  Definition: Failure to elicit all relevant information, especially during transfer of care.
  Illustration: Omission of key test results, medical history, or surgical events.

Psych-out error
  Definition: Medical causes for behavioural problems are missed in favour of a psychological diagnosis.
  Illustration: Elderly patient in the PACU is combative; prescribing restraints instead of considering hypoxia.




fully verified.8 If a patient is hypotensive after induction, the anaesthetist may attribute this to the first potential explanation that comes to mind—such as the effect of deep anaesthesia without adequate stimulation—and not consider the other key items on the differential diagnosis. Just as easily, an anaesthetist may diagnose anaphylaxis and launch into epinephrine resuscitation without considering other key items that are alternative diagnoses. Importantly, to understand the concept of cognitive error, premature closure represents the thought-process mistake that occurs when a preliminary diagnosis is selected without consideration of other possibilities and without verification of the selected diagnosis. Anaesthetists may have variability in terms of what specific diagnosis comes to their minds most readily; the error lies not in the nature of the preliminary diagnosis itself, but in the premature cessation of investigation. It is worth repeating that cognitive errors are not knowledge deficits. The anaesthetist in the above example is assumed to be competent, knowledgeable, and capable of generating a list of potential diagnoses and then evaluating each and selecting the best. Yet it is the overreliance on a thought-process shortcut that leads the anaesthetist astray.

‘Feedback bias’ is another cognitive error that may be particularly pertinent to anaesthesiology. This describes the process that occurs when significant time elapses between actions and consequences, or outcome data are never reported back to the anaesthetist.18 When important information does not return to the decision maker, it is impossible to shape future decisions based upon that information. As such, the absence of feedback is subconsciously noted as positive feedback. Potential clinical examples of this abound in anaesthesiology, as we often have limited patient continuity. If a patient suffers an infection from central catheter placement, or a corneal abrasion from inadequate eye protection, or even intraoperative awareness, these issues may not be communicated back to the anaesthesiologist. The anaesthesiologist may remain ignorant of these complications indefinitely, believing the techniques used to be adequate.

‘Confirmation bias’ is an error characterized by seeking confirming evidence to support a diagnosis while subconsciously discounting contrary evidence, despite the latter often being more definitive. Sometimes this will manifest by trying to force data to fit a desired or suspected diagnosis.6 20 A simple example in anaesthesiology might be the repeating of arterial pressure measurements, changing cuff sizes and locations, in an effort to get a reassuring reading, instead of recognizing the hypotension as real. Similarly, this may occur with nearly any monitoring device, including misdiagnosing oesophageal intubation in favour of ‘fixing’ the capnograph. Another example might be diagnosing malignant hyperthermia based upon hyperthermia and hypercarbia, and not looking for (or discounting the absence of) metabolic acidosis. As with other cognitive errors, the fault lies in the thought process—subconsciously selecting only certain information and attempting to force those data to ‘satisfy’ a chosen diagnosis. Crucial to the concept of cognitive errors is the lack of deliberate thinking. Often, clinicians must sift through extraneous data that are indeed irrelevant to a diagnosis. Upon careful consideration, they may discard them. However, confirmation bias is not conscious, deliberate selection of data; it is a largely unconscious process of (incorrect) pattern recognition.

‘Availability bias’ is an error in diagnosis due to an emotionally memorable past experience (an adverse patient outcome, or lawsuit, for example). Physicians may be subconsciously affected by these emotional experiences, causing the diagnosis to be very readily available at the forefront of the mind, which has the result of inflating the pre-test probability for that diagnosis. In the future, the physician may unintentionally ignore important differences between the current presentation and that prior experience, falling prey to the availability bias error.

Study methods

Having reviewed the literature and noting the absence of an anaesthesiology-specific set of cognitive errors, we set out to perform two tasks. First, we sought to create an anaesthesiology-specific catalogue of errors. Next, we created a pilot study to observe the prevalence of those errors during simulated anaesthesia emergencies. Institutional Review Board approval was obtained before study implementation.

Creation of an anaesthesiology-specific catalogue of cognitive errors

We performed a Medline literature search using the key words ‘cognitive error’, ‘cognitive psychology’, ‘diagnostic error’, ‘heuristic’, and ‘decision making’ alone and in combinations. We limited our search to articles from peer-reviewed journals written in English, screening abstracts for appropriateness. Since we were interested in the modern paradigm of thought-process errors, we limited our initial search to items published during or after 1999. We then screened all abstracts identified by Medline to be ‘related citations’ of articles we found to be relevant, and we screened relevant articles cited in the articles we reviewed, including those published before 1999. In all, we found that the concept of cognitive errors was described in 28 articles, and specific cognitive errors were identified and defined in 23 articles. From all the specific cognitive errors identified, we examined the definitions and combined items that appeared to be synonyms. For example, ‘anchoring’, ‘fixation’, and ‘tunnel vision’ are all terms that essentially describe the process of giving exclusive attention to only one problem at the expense of fully understanding the situation, and we chose to refer to this concept as ‘anchoring’. Instead of creating a new taxonomy, we chose among existing terminology options for use in this study. We listed all the separate errors, along with definitions, and we added illustrative anaesthesiology examples.




Using the expert opinion of eight faculty on the quality assurance committee (who routinely investigate errors and near-misses) and crisis simulation faculty (who have extensive experience observing the natural course of errors allowed to evolve unchecked) from the corresponding author’s academic Anaesthesiology department, we narrowed the catalogue using two rounds of a modified Delphi technique. This process consisted of interviews during which the faculty reviewed the list of errors with an investigator (M.P.S.). They were asked to comment on the relevance of each error to the practice of anaesthesiology in a hypothetical manner and also in terms of personal experience and vicarious experiences. They were also asked to comment on the likelihood of the error based upon level of clinical experience; that is, they were asked whether an error is likely to be relevant to trainees only or also to experienced anaesthesiologists. Finally, they were asked whether education about a particular error and its prevention strategy would be useful to their own practice or to that of other anaesthetists. After each faculty member had been interviewed once and revisions made, the same faculty were invited to comment on the proposed exclusions and the remaining catalogue. In particular, because many cognitive errors have a degree of conceptual overlap, they were asked if any items appeared to be synonymous, or if their defining features were too subtle to differentiate. They were also asked to comment on the construction of the list and whether it was self-explanatory. Based upon this information, we narrowed the catalogue to 14 items, which are listed, defined, and illustrated in Table 1. Finally, this 14-item catalogue, including error definitions and illustrative clinical examples, was presented via e-mail to all faculty members (77 in total). They were asked to identify up to five items that they believed to be most pertinent to anaesthesiology practice. Again, they were asked to consider both their own personal experiences and those they observed or were told about from colleagues or trainees. They were also invited to comment on any item they wished, and in particular, they were asked to comment if they felt a particular error did not occur in anaesthesiology. As is discussed in the Results section, this process ultimately led to a catalogue of 10 items felt to be especially relevant to anaesthesiology.

Observation of cognitive errors

We wished to observe whether the cognitive errors identified in our catalogue indeed occurred in clinical practice. As observation of errors in the operating theatre without intervening would be unethical, we elected to observe residents in an established simulation course where errors may be allowed to evolve unchecked and transpire along their natural course. We did not alter the standard simulation curriculum in any way for this study, and we recruited participants based upon availability. The following is a brief description of our standard simulation curriculum and environment. As part of standard training, anaesthesiology resident physicians at our institution practice managing simulated emergencies several times per year. Residents participated in high-fidelity scenarios, using the SimMan simulator (Laerdal Medical, Wappingers Falls, NY, USA), and actors (trained simulation centre staff) represented other team members (surgeon, nurse). Scenarios were selected from our library of 20 cases by the simulation faculty based on appropriateness for the resident level of training and prior exposure to scenarios. They contained scripted emergencies, including anaphylaxis, malignant hyperthermia, difficult airway (‘can’t intubate, can’t ventilate’), and pulmonary embolism. Before the simulation, residents were fully oriented to the simulation environment, capabilities, and limitations. Participants were instructed to act as the lead physician managing patient care, and were also informed they could ask for help, diagnostic exams, or any resource commonly available in the operating theatre. For each case, participants were given a vignette stem and the ability to ask questions, just as they would in a preoperative evaluation. Each case was ~20 min in length and was immediately followed by a debriefing that lasted ~40 min. All of this is our standard educational curriculum. It should be emphasized that we did not alter the simulated events in any way to attempt to cultivate error, nor did we study any error prevention or recovery strategies. This study was solely to confirm (or refute) whether the cognitive errors identified in our catalogue appeared during clinical decision-making exercises.

Between October 2009 and February 2010, 32 resident physicians consented to participate in the study. As cognitive errors are thought to be ubiquitous processes and not situation-specific, we did not control for the simulated vignette subject matter or script, nor did we ask the faculty instructors to deviate from their usual practices. As is standard in our curriculum, resident physicians were asked to manage the emergency as if in real life. While observing the live scenarios, investigators made note of error behaviours. Those behaviours were subsequently explored during the debriefing, so that the thought processes could be elicited. The investigators recorded all cognitive errors that were identified; a resident physician could make any number of defined cognitive errors, including none or all, in a given scenario.

Expert rating

Two expert investigators evaluated each participant after the simulated case and debriefing, using a five-point Likert-style survey that queried performance of cognitive errors and non-technical skills, such as communication and teamwork behaviours. This instrument was developed using the cognitive errors in the first part of this study, was demonstrated to be reliable (Cronbach’s α = 0.81), and is described in detail elsewhere.21 Each expert rater received instructions and a calibration session with the principal investigator. Experts completed the cognitive error survey after the debriefing process, so that they could ascertain both the degree of medical knowledge possessed by the resident physician




and the thought process by which decisions were made. Scores were given after debriefing because a defining feature of cognitive errors is the absence of a knowledge deficit; thus, it was essential to gain insight into the decision-making thought process before scoring. As previously discussed, cognitive errors cannot be detected by observation alone.
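The reliability figure quoted above (Cronbach’s α = 0.81) comes from the authors’ separate validation study; purely as a worked illustration of what that statistic measures, α can be computed from raw rating data as below. The function and the sample scores are our own sketch, not the published analysis.

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a multi-item rating instrument.

    items: one list per survey item, each holding the scores that every
    rated encounter received on that item (all lists the same length).
    """
    k = len(items)
    item_vars = [statistics.pvariance(col) for col in items]  # per-item variance
    # Variance of each encounter's total score across all items
    total_var = statistics.pvariance([sum(row) for row in zip(*items)])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Two items that always agree give perfect internal consistency (alpha = 1.0)
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))
```

Values near 1 indicate that the survey items vary together across encounters, which is the sense in which the instrument is called reliable.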

Statistical analysis

Expert raters indicated whether or not a particular cognitive error was made by a given resident physician using a five-point Likert scale (1, strongly disagree that error occurred; 2, disagree that error occurred; 3, neutral or unsure; 4, agree that error occurred; 5, strongly agree that error occurred). For our analyses, we averaged investigator scores for each error and created a binary variable as follows: an average of >3 indicated an error occurred, whereas an average of ≤3 indicated no error occurred. Then, using SPSS version 17 (SPSS Inc., Chicago, IL, USA), we performed frequency analysis for each error.
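The averaging and binarization rule just described is simple enough to sketch directly. The helper below is illustrative only: the encounter data are invented, and a plain counter stands in for the SPSS frequency analysis.

```python
from collections import Counter

NEUTRAL = 3  # midpoint of the five-point Likert scale

def error_occurred(rater_scores):
    """Average the expert raters' scores; an average above 3 marks the error as present."""
    return sum(rater_scores) / len(rater_scores) > NEUTRAL

# Hypothetical data: one dict per simulated encounter, mapping each
# cognitive error to the two raters' Likert scores.
encounters = [
    {"anchoring": [4, 5], "sunk costs": [2, 2]},
    {"anchoring": [4, 3], "sunk costs": [5, 4]},
    {"anchoring": [3, 3], "sunk costs": [4, 4]},
]

# Frequency of each error across encounters (the per-error analysis step)
freq = Counter(err for enc in encounters
               for err, scores in enc.items() if error_occurred(scores))
print(freq)  # anchoring present in 2 of 3 encounters, sunk costs in 2 of 3
```

Note that an average of exactly 3 (e.g. scores of 3 and 3) counts as no error, matching the ≤3 rule above.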

Results

Cognitive error catalogue

Thirty-two of the 77 faculty physicians queried responded (41.5% response rate). Our faculty consensus identified the following 10 cognitive errors as being highly relevant to the practice of anaesthesiology: anchoring, availability bias, omission bias, commission bias, premature closure, confirmation bias, framing effect, overconfidence, feedback bias, and sunk cost. The percentage of responders who identified a given error as being among the five most important is given in Table 2. Throughout the modified Delphi process, the faculty overwhelmingly affirmed their perception that these kinds of thought-process errors occur frequently and are likely to contribute to actual error behaviours and, ultimately, adverse outcomes. Certain items were noted to be less likely, and they involved patient behaviours. Psych-out error and visceral bias both depend largely on dynamic interactions with awake patients, so it is not surprising that they figured less prominently in the final catalogue; yet they were identified by our faculty members who specialize in obstetric anaesthesiology or critical care environments, including the post-anaesthesia recovery unit.

Cognitive errors chosen by at least 25% of respondents, for a total of 10 items, were selected for the pilot study. However, specifically for our simulation-based portion of the study, we omitted feedback bias, since this kind of error requires a prolonged time lapse. Thus, nine of the errors from Table 2 were assessed in the second part of the study: anchoring, availability bias, omission bias, commission bias, premature closure, confirmation bias, framing effect, overconfidence, and sunk cost.
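The cut-off applied above amounts to keeping any error selected by at least a quarter of the 32 responders. A minimal sketch, using an abbreviated subset of the Table 2 vote counts (the variable names are ours):

```python
RESPONDERS = 32

# Abbreviated vote counts from Table 2: number of faculty who listed
# each error among their five most important.
votes = {
    "anchoring": 27,          # 84.4%
    "availability bias": 17,  # 53.1%
    "sunk costs": 8,          # exactly 25% -- still makes the catalogue
    "visceral bias": 4,       # 12.5% -- excluded
}

selected = [err for err, n in votes.items() if n / RESPONDERS >= 0.25]
print(selected)  # ['anchoring', 'availability bias', 'sunk costs']
```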

Free text responses by faculty overwhelmingly affirmed the importance of cognitive errors in anaesthesiology practice. Almost all responders indicated that listing only five was difficult because they all occur with significant frequency. Several responders suggested that the most important errors might vary based upon experience, with some more likely to occur among trainees and others more likely among experienced clinicians. As an example, availability bias was suggested to be more likely among experienced anaesthesiologists, since they would have more emotionally memorable experiences from which to draw. Overconfidence was also suggested to be more likely among experienced anaesthesiologists, while omission bias was suggested to be more likely among less experienced clinicians. No responder indicated that cognitive errors were unimportant or did not significantly contribute to medical errors.

Observation of cognitive errors

Thirty-two resident physicians (PGY 2: 16; PGY 3: 8; PGY 4: 8) consented to participate in this pilot study. Over a period of 5 months, 38 simulated encounters were observed in real time. As is our standard practice, these encounters are also recorded; these recordings were available to study personnel if needed. Six resident physicians participated in simulation training twice over the course of the data collection, and one resident physician was scheduled three times. However, no resident physician participated in the same scenario more than once, and no education about cognitive errors was presented; thus, repeat participants had neither an advantage nor a disadvantage compared with other participants.

Table 3 lists the nine cognitive errors observed during the 38 simulation encounters, along with the corresponding prevalence data. By far, our faculty felt that anchoring was the most common error, followed by availability bias, premature closure, and feedback bias. These are errors that have been frequently identified in other fields of medicine. Seven

Table 2 Important cognitive errors as perceived by anaesthesiology faculty. *This column represents the number of faculty, out of the 32 responders, who identified this error as being among the five most important to anaesthesiology practice

Cognitive error        Faculty selection [% (n)]*
Anchoring              84.4 (27)
Availability bias      53.1 (17)
Premature closure      46.9 (15)
Feedback bias          46.9 (15)
Confirmation bias      40.6 (13)
Framing effect         40.6 (13)
Commission bias        31.3 (10)
Overconfidence bias    31.3 (10)
Omission bias          28.1 (9)
Sunk costs             25 (8)
Visceral bias          12.5 (4)
Zebra retreat          12.5 (4)
Unpacking principle    6.25 (2)
Psych-out error        6.25 (2)

Cognitive errors BJA

233


of the nine errors we studied occurred in at least half of the simulation sessions.

Discussion

The observations during our pilot study were consistent with the high ratings of our faculty; premature closure and confirmation bias were the most commonly observed cognitive errors in simulated scenarios. Framing effect and availability bias were observed the least. In fact, errors perceived by faculty to be important to anaesthesiology were indeed observed frequently among trainees in a simulated environment.

Some items that were scored very highly by anaesthetists were observed relatively infrequently in simulation, and vice versa. As a general phenomenon, there are a few possible explanations for this. First, faculty were asked which errors they believed were most pertinent or important, not necessarily which were most frequent. Next, the Delphi process involved perceptions of errors, but those perceptions are likely based upon observational experience alone, without the benefit of cognitive mechanism debriefing. In contrast, during the simulated exercises, our raters were able to inquire about thought processes and identify errors with added insight into cognitive mechanisms. Another possible explanation is that our faculty were asked to reflect on their past experience and identify the errors they thought were most important, while the investigators in the simulation portion were looking prospectively and had the assistance of prompts from the cognitive error survey. Additionally, it is possible that our faculty strongly identified with errors they found easier to understand and relate to, and that unfamiliarity with other error types may itself have biased their responses. However, since most errors in our catalogue were observed in simulation, results from both parts of the study are likely to represent an important first look at cognitive errors in anaesthesiology practice.

Identification of the most common cognitive errors is crucial to developing appropriate training strategies for management and prevention. With this catalogue as a starting point, more research should be performed to confirm the importance of these errors and to test error prevention and recovery methods. Although a thorough treatment of metacognition and strategies for cognitive error prevention and recovery is beyond the scope of this manuscript, it deserves some brief mention here. Put simply, metacognition is thinking about thinking, and studies of metacognitive training have demonstrated improved decision-making processes and decision outcomes.22 Basic features of metacognitive practice include: recognition of the limitations of memory, the ability to reflect on and appreciate a broader perspective, a good capacity for self-critique and harnessing overconfidence, and the ability to select specific strategies for best decision-making.19 Practitioners should perform deliberate self-checks, asking the questions: 'Am I using the best decision-making strategy right now?' and 'Am I relying too heavily on pattern recognition or bias?'. While some urgent situations require immediate action, more accurate diagnosis and appropriate therapy may result from active generation of alternative interpretations of evidence and identification of hidden assumptions when time permits. Physicians should be deliberate in deciding when it is appropriate to think more or think differently.22

A limitation of this consensus is that it was based on the practice of faculty at a single large academic centre involving supervision of trainees. It is possible that perceived frequencies and the relative importance of various errors would vary with type of practice and level of experience. Further, regarding the simulation portion of the study, due to our relatively small sample size and the diversity of simulated encounters, we cannot comment on the potential relationships between years of training, types or prevalence of error, and the clinical 'outcome' of simulated emergencies. Finally, it is not known whether these same errors occur at similar rates among experienced anaesthesiologists, nor whether they occur at similar rates during real clinical emergencies. These are all areas that warrant further study.

The concept of cognitive errors is relatively new to the medical literature, and relatively novel to the anaesthesiology literature. As such, there is a large amount of work still to be done. A larger study of anaesthesiologists' opinions on cognitive errors should be done to include those in other practice models. Further, studies that seek to stratify by novice, intermediate, and experienced anaesthesiologists would help identify whether certain errors are more prevalent among certain groups. Additionally, the observation of these errors in real clinical events should be attempted, although this would present several challenges (self-reporting is generally poor, and it would be unethical to observe errors without intervening). As noted, we did not study any error prevention or recovery strategies, but simply observed for the occurrence of these errors. However, the applied psychology concepts of metacognition and 'de-biasing' strategies have been demonstrated in other fields to be effective in cognitive error prevention and recovery.18 23 With common cognitive errors in anaesthesiology identified, specific strategies to minimize them can be developed and disseminated. Ultimately, studies addressing the effectiveness of metacognitive training and debiasing strategies in preventing these errors among anaesthesiologists must be performed.

Table 3 List and prevalence of cognitive errors observed during 38 anaesthesiology simulation sessions

Cognitive error      Frequency observed [% (n)]
Premature closure    79.5 (31)
Confirmation bias    76.9 (30)
Sunk costs           66.7 (27)
Commission bias      66.7 (27)
Omission bias        61.5 (24)
Anchoring            61.5 (24)
Overconfidence       53.8 (21)
Framing effect       23.7 (9)
Availability bias    7.8 (3)


To our knowledge, this is the first study to identify cognitive errors specific to anaesthesiology practice and to observe them prospectively. We were able to identify 10 key errors perceived to be especially pertinent to anaesthesiology practice, and were then able to observe their occurrence consistently and frequently in simulated anaesthesia scenarios. We believe that these are likely to occur in real clinical anaesthesia scenarios as well, as they do in other fields of medicine. This application of cognitive psychology is important because it follows logically that errors of thought lead to errors of behaviour, and consequently contribute to morbidity and mortality.

We advocate that anaesthesiologists must have insight into their own decision-making processes and deliberately abandon intuitive reasoning for an analytic approach when necessary. To achieve this, educational training in cognitive errors, metacognition, and debiasing strategies is needed. However, there are still many questions regarding which errors are most important to address and which metacognitive strategies are most appropriate and effective in anaesthesiology. Further research in this area is needed to reduce decision-making errors and improve patient safety.

Acknowledgements

The authors wish to thank the UCLA Simulation Center faculty and staff for their participation and support of this project. The authors also wish to thank Dr Sebastian Uijtdehaage and Dr Craig Fox for their contributions to this manuscript.

Declaration of interest

None declared.

Funding

Support for this paper was provided solely from institutional/departmental resources (David Geffen School of Medicine at UCLA, and the Department of Anesthesiology at UCLA).

References

1 Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human: Building a Safer Health System. Washington, DC: Institute of Medicine, National Academy Press, 1999

2 Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002; 9: 1184–204

3 Mamede S, Van Gog T, Van Den Berge K, et al. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. J Am Med Assoc 2010; 304: 1198–203

4 Tokuda Y, Kishida N, Konishi R, Koizumi S. Cognitive error as the most frequent contributory factor in cases of medical injury: a study on verdict's judgment among closed claims in Japan. J Hosp Med 2011; 6: 109–14

5 Groopman J. How Doctors Think. Boston, MA: Houghton Mifflin, 2008

6 NEA Issue Brief: an analysis of principal nuclear issues. Report No. 2, January 1998

7 Malakis S, Kontogiannis T, Kirwan B. Managing emergencies and abnormal situations in air traffic control (part II): teamwork strategies. Appl Ergon 2010; 41: 628–35

8 Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med 2008; 121: S2–23

9 Borrell-Carrio F, Epstein RM. Preventing errors in clinical practice: a call for self-awareness. Ann Fam Med 2004; 2: 310–6

10 Fandel TM, Pfnur M, Schafer SC, et al. Do we truly see what we think we see? The role of cognitive bias in pathological interpretation. J Pathol 2008; 216: 193–200

11 Garber MB. Diagnostic imaging and differential diagnosis in 2 case reports. J Orthop Sports Phys Ther 2005; 35: 745–54

12 Harasym PH, Tsai TC, Hemmati P. Current trends in developing medical students' critical thinking abilities. Kaohsiung J Med Sci 2008; 24: 341–55

13 Kempainen RR, Migeon MB, Wolf FM. Understanding our mistakes: a primer on errors in clinical reasoning. Med Teach 2003; 25: 177–81

14 Pescarini L, Inches I. Systematic approach to human error in radiology. Radiol Med 2006; 111: 252–67

15 Vickrey BG, Samuels MA, Ropper AH. How neurologists think: a cognitive psychology perspective on missed diagnoses. Ann Neurol 2010; 67: 425–33

16 Holzman RS, Cooper JB, Gaba DM, Philip JH, Small SD, Feinstein D. Anesthesia crisis resource management: real-life simulation training in operating room crises. J Clin Anesth 1995; 7: 675–87

17 Howard SK, Gaba DM, Fish KJ, Yang G, Sarnquist FH. Anesthesia crisis resource management training: teaching anesthesiologists to handle critical incidents. Aviat Space Environ Med 1992; 63: 763–70

18 Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003; 78: 775–80

19 Croskerry P. Cognitive forcing strategies in clinical decision making. Ann Emerg Med 2003; 41: 110–20

20 Redelmeier DA, Shafir E, Aujla PS. The beguiling pursuit of more information. Med Decis Making 2001; 21: 376–81

21 Stiegler MA. Anesthesiologists' Nontechnical and Cognitive Skills Evaluation Tool. UC Los Angeles: Center for Educational Development and Research, 2010

22 Flin R, O'Connor P, Crichton M. Safety at the Sharp End: A Guide to Non-Technical Skills. Aldershot: Ashgate, 2008

23 Bond WF, Deitrick LM, Arnold DC, et al. Using simulation to instruct emergency medicine residents in cognitive forcing strategies. Acad Med 2004; 79: 438–46
