Administration and Leadership, Patient Care, Training

Escape Faulty Thinking: How to minimize the influence of bias in patient assessment

Issue 11, Volume 33.

In EMS, we talk a lot about quality assurance, continuous quality improvement and other means of reviewing the care we give to learn from our mistakes. But we talk far less about why we make these mistakes and how to avoid making them in the first place. Physicians often point to the influence of cognitive errors in diagnosis and to strategies to minimize those errors. It’s time EMS discussed these errors, too: why we make them and how to reduce the number of errors we make as a result of faulty thinking processes. In the world of risk management, these are called “pre-loss” strategies: things we can do before a loss occurs.

Most errors in EMS ultimately result from incomplete or poorly performed patient assessments. However, instead of focusing our training on honing our assessment, critical thinking and critical reasoning skills, EMS training tends to focus on manual skills performed according to algorithmic protocols.

The reduction of errors in patient assessment is a worthwhile goal for any quality assurance program. Several studies indicate that eliminating adverse events in EMS should be a priority for all systems. Thorough, unbiased patient assessments, and frequent reassessments, can lead us down the right path instead of the wrong one.

Prime Examples

In 1990, the case of Wright v. City of Los Angeles awakened the world of EMS to what can happen when patients are categorized according to their initial presentation, rather than the thorough patient assessment we were taught to perform in EMT training. In the call at the center of the Wright case, paramedics found an African-American man lying in the street with an altered level of consciousness. Instead of doing a complete secondary survey, they assumed he was intoxicated and provided very little in the way of care for him. Unfortunately, the patient wasn’t intoxicated. In fact, Wright was suffering from a sickle cell crisis, and the paramedics did little to help him. Wright died on scene.

Fast forward 16 years. The 2006 case of New York Times reporter David Rosenbaum is a rerun of the same bad movie. Rosenbaum was jogging in Washington, D.C., when unknown assailants attacked and beat him into unconsciousness, presumably for his iPod. EMS responded to the call, but the EMS report, made public by the media, is a shameful example of a complete lack of proper assessment. The assessment of Rosenbaum’s condition reads only “ETOH.” The word “found” is spelled “fond,” and “trauma” is spelled “trama.” Although Rosenbaum’s Glasgow Coma Scale score was listed as 6, which, according to local EMS protocols, should have classified him as a Priority 1/unstable patient, he was assessed as a Priority 3, which generally describes a stable patient who isn’t a transport priority. ALS wasn’t requested, and 23 minutes passed before a BLS unit arrived to transport him. He later died as a result of a significant brain injury. His family subsequently filed a $20 million lawsuit against the District of Columbia and Howard University Hospital, claiming medical malpractice.
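The priority rule described here is mechanical enough to express in a few lines of code, which underscores how little judgment should have been involved. A minimal sketch in Python (the GCS cut-offs below are illustrative assumptions for demonstration, not the District's actual protocol):

```python
def transport_priority(gcs: int) -> int:
    """Map a Glasgow Coma Scale score to a transport priority.

    The thresholds are illustrative only; actual cut-offs come from
    local EMS protocols. The point is that the rule is mechanical:
    a documented GCS of 6 can never yield a low-priority patient.
    """
    if gcs <= 8:        # severely decreased level of consciousness
        return 1        # Priority 1: unstable, immediate transport
    elif gcs <= 13:     # moderately impaired
        return 2        # Priority 2: potentially unstable
    return 3            # Priority 3: stable, not a transport priority

# Rosenbaum's documented GCS of 6 should have forced Priority 1,
# not the Priority 3 he was actually assigned.
print(transport_priority(6))  # 1
```

Applying the documented score to the documented rule produces Priority 1; the error was not in the rule but in the providers' failure to let the assessment drive it.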

What causes these gross errors in patient assessment, and how can we change the fate of our patients in the future? The good news is we can reduce patient assessment errors by understanding how they happen in the first place. We can minimize cognitive errors by understanding and acknowledging the biases that lead to them and forcing ourselves to think outside the box.

Underneath Biases

Heuristics are defined in psychology as simple, efficient rules that explain how people make decisions, come to judgments and solve problems. In the fast-paced world of medicine, they’re indispensable shortcuts and rules of thumb. They’re particularly prevalent in emergency medicine; without them, most emergency room patient flows would come to a halt. But heuristics can also cause serious errors by causing us to miss what’s right before us. We make errors when we assess patients with the understanding that they’ll fall into certain categories, and their presentations will be typical for patients in those categories. Heuristics are typically used when we’re facing complex problems or when we’re confronted with incomplete information. Welcome to the world of EMS, right? Take this example of a heuristic in the non-medical setting: People think expensive wine tastes better than inexpensive wine — but switch the labels and the price tags, and most people will still say the “expensive” one, which is actually the less expensive one, tastes better.

The problem with heuristics is that they can predispose us to respond to certain situations in certain ways. These predispositions are called cognitive dispositions to respond (CDRs). One goal of our error-reducing programs should be to unmask cognitive errors in the patient assessment process and allow for the development of debiasing techniques. A number of factors can lead to assessment errors, but knowing that these factors exist and learning how to avoid them is probably the most important step in changing these statistics.

Emergency medicine has been called a “natural laboratory of error.” Approximately half of the litigation brought against emergency physicians results from delayed or missed diagnoses. Errors in diagnosis occur when we work in noisy environments, we’re exhausted or we work too quickly, failing to consider all available information. Particularly in the prehospital setting, we often have limited access to information and very limited time to process the information we have. Add to the mix the uncontrolled environment in which we work, and we have a recipe for error.

CDRs that Lead to Error

Many CDRs that may lead to error have been identified in studies of physicians, and several of them are particularly applicable to EMS. “Anchoring” involves the tendency for EMTs to lock onto the patient’s initial presentation too early in the diagnostic process and fail to adjust the initial impression in light of information acquired later. We’ve all experienced the “tunnel vision” phenomenon that causes us to see one patient at an MCI but not others who may be more critically injured. This CDR may be compounded by another, called the “confirmation” bias, a type of cognitive “cherry picking,” in which we tend to look for evidence that confirms the assessment we’ve already made and fail to consider persuasive evidence that changes or refutes that assessment.

In “diagnosis momentum,” once we’ve attached a label to a patient, it tends to stick — even if it’s erroneous — right into the emergency department (ED). What began as a possibility in a quickly performed assessment in the field may end up resulting in the exclusion of other potential diagnoses in the ED. Interestingly, an assessment error made by a field provider may lead to a patient being misdiagnosed down the line when busy ED physicians trust the EMS assessment rather than performing a complete history and physical examination of their own.

Example: EMS providers brought what they described as a full-term pregnant trauma arrest patient into an ED. Because the woman had just arrested moments earlier, the ED mobilized to try to save the baby. A crash C-section revealed an empty uterus. The patient was obese; she wasn’t pregnant.

And, there’s the EMS favorite: “attribution error.” In this one, we can be judgmental and actually blame patients for their illnesses rather than fully examining the circumstances that brought about the EMS call. We’re particularly vulnerable to this CDR when dealing with psychiatric patients, homeless patients and minority patients, especially those with cultural differences. In psychiatric patients, “psych-out error” occurs when we overlook serious co-morbid medical conditions, such as hypoglycemia, hypoxia and metabolic abnormalities. A corollary to this CDR is gender bias, in which we may believe gender is a determining factor in the probability of a diagnosis. Recent studies on women and heart disease have brought to light the tendency of medical providers to discount complaints in women who may have atypical presentations of acute coronary syndromes.

Example: A 52-year-old woman calls EMS and complains that a muffin she recently ate seems to be stuck in her throat. She says she feels like she just needs to wash it down, but after several glasses of water, the sensation persists. Her vital signs are within normal limits, her ECG shows no abnormalities, and she has diabetes. The astute EMTs suspect a potential cardiac event and transport her with an IV and cardiac monitor in place. A few hours later, she’s diagnosed with an acute myocardial infarction (AMI). These EMTs overrode the tendency toward gender bias and remembered that female and diabetic patients often have atypical presentations of AMI.

Another EMS favorite: “overconfidence bias” — the tendency to believe we know more than we actually do. EMS training, even at the paramedic level, provides us with only a sliver of the knowledge of physicians, who spend many years in intense training for their practice.

Overconfident EMS personnel may act on incomplete information and intuition. When we fail to gather evidence in a careful and systematic manner, errors can result. We’ve all known cavalier EMS providers who tend to make assessment errors in exactly this way. As we all know, things aren’t always what they seem. Just because a patient is extremely intoxicated doesn’t mean he won’t need advanced airway management, or that he didn’t take a fall that resulted in a subdural hematoma.

And, for you search-and-rescue types: “search satisfying” — the tendency to call off a search once something is found. Co-morbid conditions, second gunshot wounds or exit wounds and additional trauma may well be missed when we call off the “search” too early.

In EMS, we often fail to completely disrobe our patients for a thorough physical examination. It’s important to examine a patient’s back and the lateral aspects of the body. We deal with patients who, because of intoxication or mental illness, may not be able to fully describe their injuries, and we must become “detectives” to find out for ourselves.

Example: EMS received a call for a gunshot wound at a bar in a rural area. Upon arrival, law enforcement officers inform EMS that they don’t think anyone has been shot. The patient is a homeless man wearing several layers of shirts and two pairs of pants. He repeatedly tells EMS he’s been shot, but when they ask where, he points to his head, where there’s no gunshot wound. After several minutes, one of the paramedics places her gloved hand underneath the multiple layers of shirts, and her hand emerges with a streak of blood. When the shirts are removed, there’s a small-caliber gunshot wound in the patient’s left upper chest. It’s later found that the bullet is lodged in his left ventricle.

Strategies to Reduce Bias

What strategies can we employ to reduce EMS errors that occur as a result of bias? Among several worthwhile strategies, scenario-based training and simulations have been found helpful in breaking down biases. We need to focus less on algorithmic protocols and more on developing critical thinking and reasoning skills that develop truly talented paramedics.

First, we have to recognize our biases and work with the insight we gain from this admission. Next, we need to remind ourselves to always consider alternatives to our initial assessment, constantly asking ourselves what else might be going on with a patient. We also need to learn to step back from difficult problems and reflect on our thinking process, even momentarily at a chaotic scene. We can decrease our reliance on memory by using cognitive aids, such as storing drug dosage protocols on an electronic device that can be easily accessed in the field. Although we’ll probably always have to live with the compressed timeframes of prehospital care, we can try to take a minute to be clear in our thoughts and reconsider our assessments.
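As an example of such a cognitive aid, a drug dosage reference can be as simple as a lookup table, so the arithmetic is read and computed rather than recalled under stress. A minimal sketch (the drug names and per-kilogram values below are hypothetical placeholders, not clinical guidance; real values must come from local protocols):

```python
# Illustrative weight-based dosing table: {drug: (mg_per_kg, max_mg)}.
# The names and numbers are placeholders for demonstration only,
# NOT clinical values; a real aid would load the local protocol.
DOSING = {
    "drug_a": (0.1, 10.0),
    "drug_b": (5.0, 300.0),
}

def field_dose_mg(drug: str, weight_kg: float) -> float:
    """Return the weight-based dose, capped at the protocol maximum."""
    mg_per_kg, max_mg = DOSING[drug]
    return min(mg_per_kg * weight_kg, max_mg)

print(field_dose_mg("drug_a", 20))  # 2.0
print(field_dose_mg("drug_b", 80))  # 300.0 (capped at the maximum)
```

The value of such an aid isn't sophistication; it's that a stressed provider verifies a number against a stored rule instead of trusting memory.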

To reduce errors in patient assessment, EMS must learn the value of critical thinking. We often carry biases that certain patients will present in certain ways, and our judgment may be clouded by these biases. Accepting them, and taking steps to change them, will go a long way toward error reduction. Careful, thoughtful patient assessments, frequent reassessment, and open-mindedness will help reduce errors. JEMS

W. Ann Maggiore, JD, NREMT-P, is an attorney and a paramedic in Albuquerque, N.M. She has been a full-time paramedic, an assistant fire chief and a state EMS administrator. Currently, she’s a shareholder in the Albuquerque law firm of Butt, Thornton & Baehr, P.C., where she practices law full time, defending physicians, police and EMS personnel against lawsuits. She’s a frequent lecturer on EMS legal issues at national conferences and holds a clinical faculty appointment at the University of New Mexico School of Medicine, where she teaches legal issues to faculty, residents and paramedic students. Contact her via e-mail at [email protected]

Learn more from W. Ann Maggiore at the EMS Today Conference & Expo, March 2–6 in Baltimore.


1. Wright v. City of Los Angeles, 268 A.2d 309 (1990).

2. Washington Post article: “The Death of David Rosenbaum” by Colbert

3. Croskerry P: “The importance of cognitive errors in diagnosis and strategies to minimize them.” Academic Medicine. 78(8):775–780, 2003.

4. Croskerry P: “Achieving quality in clinical decision-making: Cognitive strategies and detection of bias.” Academic Emergency Medicine. 9(11):1184–1204, 2002.

5. Rosen MA: “Promoting teamwork: An event-based approach to simulation-based teamwork training for emergency medicine residents.” Academic Emergency Medicine. 2008 Jul. 14 [Epub ahead of print]

6. Vincent DS: “Teaching mass casualty triage skills using immersive three-dimensional virtual reality.” Academic Emergency Medicine. 2008 Aug. 10 [Epub ahead of print]