When Our Minds Lead Us Astray: Cognitive Bias in the Emergency Department


Written by: Annette Dekker, MD (University of Chicago EM PGY-1, Northwestern University Feinberg School of Medicine 2017) Edited by: Michael Macias, MD (UC San Diego Ultrasound Fellow, NUEM 2017 Graduate); Expert Commentary by: Nahzinine Shakeri, MD


As anyone who has spent time in the emergency department (ED) can attest, emergency physicians are faced with a constant stream of decisions to make. In few other situations is the “decision density” as high as it is in emergency medicine [1]. This high decision density is further complicated by uncertainty and time constraints. In order to help navigate this challenging milieu, experienced emergency physicians rely on cognitive shortcuts.

Daniel Kahneman, winner of the Nobel Prize in economics, describes these cognitive shortcuts as part of an innate mode of cognitive processing. Per Kahneman, the mind works in two modes: System 1 and System 2 [2]. System 1 is fast, emotional, and instinctive, whereas System 2 is slower, more deliberate, and logical. For new or challenging problems, the mind relies on System 2. When challenging situations are encountered repeatedly, the mind shifts from System 2 to efficient System 1 processing [3]. For example, when one first learns to drive a car, the mind relies on System 2. As one becomes more familiar with navigating the road, the mind transitions to System 1 processing. The mind is inherently lazy; if it can use System 1, it will. Cognitive shortcuts are a feature of System 1.

The same cognitive processing occurs in emergency medicine. A learner may need to use the slower, more deliberate processing of System 2 to establish the differential diagnosis for a patient with chest pain. An experienced attending, on the other hand, will likely rely on the more automated System 1 processing.

When things go wrong

Cognitive shortcuts associated with System 1 processing are not inherently bad. As discussed, these shortcuts allow a physician to make quicker decisions, which is essential in the context of the ED. However, such heuristics are also more prone to error.

Nine common errors encountered in emergency medicine that result from cognitive biases are listed below [1,4,5]. The list is not exhaustive, nor are the biases mutually exclusive.

Availability bias

  • What is it: An assumption that what most readily comes to mind is most relevant
  • What goes wrong: Judging the likelihood of a disease based on the ease of recall
  • Example: During flu season, an emergency physician is seeing numerous small children with influenza-like symptoms and assumes that the 10th patient of the day with the same symptoms has the flu, missing a case of severe sepsis.
  • Tag line: Recency effect; common things are common; the sound of hoofbeats means horses
  • More: This bias can also be driven by an emotionally charged event or vivid memory of a recent case.

Anchoring heuristic

  • What is it: Fixation on initial impressions
  • What goes wrong: Failure to incorporate additional information into the diagnostic process
  • Example: A patient is brought to the emergency department for a presumed witnessed seizure, and the physician misses long QT-related syncope by discounting the ECG findings that do not fit the first-impression diagnosis.
  • Tag line: Tram-lining; first impression; jumping to conclusions

Base rate neglect

  • What is it: Failure to incorporate the true prevalence of disease into diagnostic reasoning
  • What goes wrong: Overestimation of the pre-test probabilities of specific diseases, leading to unnecessary testing and imaging
  • Example: An emergency physician overestimates the rate of pulmonary embolism, working up many patients with essentially no risk for this entity, resulting in increased costs to the patient and the health care system, false positives, and downstream harm to patients in the form of anticoagulation and unnecessary procedures.
  • Tag line: Worst-first mentality
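Base rate neglect is, at its core, a failure of Bayesian reasoning: when disease prevalence is low, even a fairly sensitive test yields mostly false positives. The sketch below works this out with Bayes' theorem; the test characteristics (95% sensitivity, 50% specificity, loosely D-dimer-like) and the prevalence figures are illustrative assumptions, not numbers from this post.

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """Bayes' theorem: P(disease | positive test)."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# Hypothetical test: sensitivity 95%, specificity 50%.
low_risk = positive_predictive_value(0.01, 0.95, 0.50)   # ~1% pre-test probability
mod_risk = positive_predictive_value(0.20, 0.95, 0.50)   # ~20% pre-test probability

print(f"PPV at 1% prevalence:  {low_risk:.1%}")   # ~1.9%
print(f"PPV at 20% prevalence: {mod_risk:.1%}")   # ~32.2%
```

The same positive result means roughly a 2% chance of disease in the near-zero-risk patient versus roughly 32% in the moderate-risk patient, which is why indiscriminately testing very-low-risk patients mostly generates false positives and downstream harm.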

Blind obedience

  • What is it: Undue importance given to expert opinion or test results
  • What goes wrong: Failure to think independently of test results or expert opinions
  • Example: The emergency physician is concerned that a patient is having an ST-elevation MI and discusses the case with the on-call cardiologist, who is not concerned by the ECG and presentation. The emergency physician admits the patient to the cardiology floor without pushing back, and several hours later the patient goes into ventricular fibrillation requiring defibrillation and emergent cardiac catheterization.
  • Tag line: Nuremberg defense

Diagnostic momentum

  • What is it: A previous diagnosis is assumed to be definite
  • What goes wrong: Assuming a previously assigned diagnosis is correct without full workup or consideration of alternate possibilities
  • Example: A patient is referred to the ED from a primary care office with a presumed diagnosis of influenza. The patient is given fluids and discharged home. The patient returns the next day and is found to have bacterial meningitis.
  • Tag line: Diagnostic creep; labels are sticky 
  • More: This bias is often associated with referrals, transfer of care and handoffs. 

Framing effects

  • What is it: Perceptions are influenced by the presentation and contextualization of others
  • What goes wrong: Comments made by others “frame” the patient or clinical scenario in a context which may affect clinical decision making
  • Example: A patient with severe abdominal pain is labeled by EMS as a “drug seeker.” The emergency physician discharges the patient without further testing. The patient returns to the ED and is found to have an intestinal perforation.
  • Tag line: Labeling; framing

Gambler’s fallacy

  • What is it: Belief that past events affect the probability of future events
  • What goes wrong: Perceiving that previous patients’ diagnoses affect the probability of future patients’ diagnoses
  • Example: An emergency physician diagnoses three patients in a row with acute coronary syndrome and assumes that the fourth patient with chest pain is therefore unlikely to have it.
  • Tag line: Monte Carlo fallacy; law of averages; sequence effect

Premature closure

  • What is it: Acceptance of a diagnosis before it is fully verified
  • What goes wrong: Providers commit to a presumed diagnosis and further thinking and consideration of alternate diagnoses stops
  • Example: A patient presents with altered mental status and intoxication. The emergency physician immediately attributes the patient’s altered mental status to intoxication, missing evidence of head trauma and an underlying intracranial hemorrhage.
  • Tag line: Counting chickens before they hatch

Yin-yang out

  • What is it: Perception that everything that can be done has been done
  • What goes wrong: Assumption that a patient who has been tested “up the yin-yang” is unlikely to have new diagnoses 
  • Example: A patient with chronic abdominal pain of unknown etiology and a recent negative abdominal CT scan, colonoscopy, and endoscopy presents to the ED with abdominal pain. The patient is discharged, and acute appendicitis is missed.
  • Tag line: Acute on chronic

Now what? Is there any hope?

It is important to emphasize that cognitive shortcuts do not always lead to failure and misdiagnosis. When they do, however, the results can have tremendous consequences. There are two approaches to reducing the mistakes that result from heuristics: individual awareness and system improvements.

The individual approach emphasizes increasing physician awareness of cognitive biases. Experts advocate educating physicians about common mistakes and providing strategies to avoid them, such as delaying a diagnosis until complete information is available (to avoid anchoring) or deliberately weighing base rates (to resist availability bias and base rate neglect) [1,6]. Several obstacles to this strategy exist, including resource and time constraints. The more challenging barrier, however, is physician overconfidence [7]. Deciding when to invoke a more nuanced workup requires self-awareness of diagnostic uncertainty. One study of patients who died in the ICU compared physician diagnosis, physician-rated diagnostic uncertainty, and autopsy results [8]. Physicians who stated that they were “completely certain” of the diagnosis were wrong 40% of the time [8]. This mismatch was even more pronounced in the training setting. Medical students were least accurate and least confident, while attending physicians were most accurate and most confident. Residents showed the greatest mismatch between confidence and accuracy, with higher confidence yet lower accuracy than attending physicians, placing them in the highest-risk group for error [9].

Ideally, system improvements would include a better practice environment as well as computer-based resources to support or correct initial diagnoses. Current technologies are expensive and slow, and they often still require physicians to recognize the situations in which they are useful [7]. The solution is more likely to lie in a synergy of physician education, feedback that allows physicians to calibrate their diagnostic accuracy, and more efficient system checks.

Expert Commentary

Thank you for this excellent overview of cognitive biases in emergency medicine, Annette. 

Not surprisingly, the emergency department has been described as a “natural laboratory of error”: the decision density, constant switching between tasks, high emotional and cognitive load, frequent transitions of care between providers, and circadian disturbances inherent in shift work make emergency physicians particularly vulnerable to cognitive missteps [6]. Furthermore, emergency physicians have limited opportunities to receive feedback on many of the clinical decisions they make, and lack of feedback makes it challenging to calibrate one’s decision making.

Despite these challenges, emergency physicians can protect themselves from cognitive traps using the individual approach mentioned above.

The first step is to recognize that pattern recognition, or System 1 processing, may fail us. When the history, physical exam, and diagnostic testing results don’t fit together neatly, we should force ourselves into slower, more deliberate System 2 processing. But there is something to gain even in seemingly classic presentations where everything fits; a moment of reflection in System 2 may bring additional ideas to light and prevent us from missing rarer or more elusive diagnoses.

Implementing these so-called cognitive forcing strategies, which deliberately shift our thinking from System 1 to System 2, can be as simple as asking ourselves a single question with each patient encounter:

  • What else could this be?
  • And for more challenging cases:
    • What doesn’t fit?
    • Have I considered life/limb/sight threats?
    • Could there be more than one process at work?
  • Are there biases present? [4,6]

A criticism of this approach follows predictably: How can emergency physicians integrate slow, methodical thinking into a busy shift? One strategy is to make a habit of briefly reflecting on what might have been missed (what else could this be?) when completing the medical decision making (MDM) component of each patient’s chart. Arguably, this approach adds only a few seconds to our existing workflow and may trigger us to consider alternate, potentially life-saving, possibilities.



Nahzinine Shakeri, MD

Instructor; Medical Education Scholarship Fellow

Department of Emergency Medicine, Northwestern University



How to Cite this Blog Post

[Peer-Reviewed, Web Publication] Dekker A, Macias M (2017, July 18). When Our Minds Lead Us Astray: Cognitive Bias in the Emergency Department. [NUEM Blog. Expert Commentary by Shakeri N]. Retrieved from http://www.nuemblog.com/blog/cognitive-bias


  1. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med. 2002;9(11):1184-1204.
  2. Kahneman D. Thinking, Fast and Slow. Macmillan; 2011.
  3. Croskerry P. Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ Theory Pract. 2009;14(1):27-35.
  4. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78(8):775-780.
  5. Redelmeier DA. Improving patient care. The cognitive psychology of missed diagnoses. Ann Intern Med. 2005;142(2):115-120.
  6. Croskerry P. Cognitive forcing strategies in clinical decisionmaking. Ann Emerg Med. 2003;41(1):110-120.
  7. Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med. 2008;121(5):S2-S23.
  8. Podbregar M, Voga G, Krivec B, Skale R, Parežnik R, Gabršček L. Should we confirm our clinical diagnostic certainty by autopsies? Intensive Care Med. 2001;27(11):1750-1755.
  9. Friedman CP, Gatti GG, Franz TM, Murphy GC, Wolf FM, Heckerling PS, Elstein AS. Do physicians know when their diagnoses are correct? Implications for decision support and error reduction. J Gen Intern Med. 2005;20(4):334-339.