Administration and Leadership, EMS Insider

Critical Thinking, Part Two


Mindsets are lenses through which the information we receive passes.  Mindsets help us to understand and make sense of information; they shape the world we see.  Walter Lippmann explained it best when he wrote: “We do not first see and then define – we define first and then see.”  For example, if the station is told that Bob is cooking tonight, some will view that with joy and others will grimace.  Mindsets help us sort through complex information and make quick decisions; without them, the world would be too complex to comprehend. 

The most important influences on our mindsets are the identity groups we belong to, such as race, gender, religion, generation and ethnicity.  Organizational groups also help us to form mindsets, such as our occupation (public safety provider), specialty area (emergency medical services), years of experience, marital status or parental status.  Personal experience, education, values and what we perceive as “organizational norms” also contribute to mindset.

A good example of how culture affects our mindset is the color yellow.  In the United States, yellow is perceived as a warning (such as caution road signs) or as cowardice (“they are yellow-bellied!”).  In Malaysia, yellow is perceived as royalty, and in Venezuela it means luck. 

Advantages and Disadvantages

There are advantages and disadvantages to mindsets.  On the plus side, they are easy to form, allow us to absorb new information quickly, provide insight into complex information and eventually allow us to become experts.  But with advantages come disadvantages: mindsets are largely unconscious, resistant to change and easy to form, which creates the potential for tunnel vision, filters out important information, and leaves us with biases that become harder to unlearn the more experienced we become.

Critical thinkers do not eliminate mindsets; rather, they develop a deep understanding of their own and others’ mindsets, and reflect on how those mindsets shape their understanding and perspective. 

When we gain a new insight from understanding our mindsets, it’s important to share that perspective with our colleagues.  John Seely Brown wrote: “Instead of pouring knowledge into people’s heads, you need to help them grind a new set of eyeglasses so they can see the world in a new way.”


A bias is a particular tendency or inclination.  If mindsets are lenses, then biases are filters, and they can lead us into mistakes.  We all have biases in life, many from experience, others inherited from those we trust.  Moral and religious views, for example, are biases that are a part of our everyday life.  If you observe a parent spanking a child, you may view it as discipline or abuse.  There are many cognitive biases, and we will focus on those that most affect us as EMS providers.

The Ambiguity Effect

The ambiguity effect occurs when we avoid a path or option because we lack complete information or the probability of its outcome is uncertain.  People tend to prefer certainty over ambiguity, so decisions are made not necessarily for the best outcome, but because one option has more information behind it than the others.  We may decide with our patients to “just load and go,” not because it will get them to a surgeon more quickly, but because we don’t feel comfortable committing to a treatment plan.

The Clustering Illusion or Non-Existent Pattern Bias

Another bias is the clustering illusion or non-existent pattern bias.  Humans are uncomfortable with randomness and chance, so we try to link things together and create patterns that don’t exist.  When we see that there isn’t a pattern, we tend to attribute this to a lack of information.  Others explain randomness as fate or God’s will, things which are preordained. 

Sometimes with this bias, in an effort to form a pattern, we will ignore data or information that does not fit the pattern we see and highlight or stress information that does fit.  A good example is diagnosing a patient upon first contact (they are drunk) and ignoring signs and symptoms suggesting it might be something else (hypoglycemia).

The Hindsight Bias

The hindsight bias is another cognitive error that can affect critical thinkers.  The hindsight bias stems from faulty memory: we tend to overestimate how accurate our past judgments were.  You may have encountered this when an event happens and someone tells you, “I saw it coming!”  In reality, that person didn’t have some or all of the information, most likely didn’t form an opinion, or didn’t share their prediction with anyone before the event occurred.  When we recall past events to make decisions, that false reconstruction can lead to poor outcomes.


As we come to the end of this article, spend some time reflecting on the mindsets and biases covered.  Where can you apply these concepts in your own work?  Have you experienced examples of these mindsets and biases?  Think of ways to remind yourself of their existence and how you can avoid their pitfalls.


Scott Cormier is a nationally registered paramedic and serves as the Vice President of Emergency Management, Environment of Care, and Safety for Medxcel Facilities Management.  He is also a board member of the International Association of EMS Chiefs.  Please join us for the IAEMSC annual summit in Washington, DC, in October.  Visit the IAEMSC website for further details.