EMS Insider, Expert Advice

Meaningful Safety Policies

Ask any EMS leader if safety is one of their organization’s core values and it’s unlikely that you will hear anything other than, “Of course the safety of our providers and patients is one of our top values.”


If you look at safety from the provider’s perspective, the risks come mostly from driving (particularly emergency driving); caring for people with contagious bacterial or viral infections; dealing with violent patients or bystanders; PTSD; and lifting-related injuries. Patients share some of the same risks, including vehicle crashes and acquiring a viral or bacterial infection from a sick EMS provider, or from another patient’s bugs spread by poor EMS hygiene practices. Patients also face their own special set of risks, including being dropped, assessment or treatment errors, and electronic medical record data-entry errors.


If an EMS organization is truly committed to a 100% safe workplace, then all driving should cease, helicopters should never be used and all patients should be treated with Ebola-level precautions. Clearly this approach would conflict with a commitment to good care and service. The reality is that an organization truly committed to 100% risk-free safety should not be in the EMS business.


So if an EMS organization is committed to safety and good care/service, how do leaders strike a balance that produces the best result? The key lies in two primary domains: building safety into the system and creating a healthy organizational culture.


When it comes to building safety into systems and processes, there are several things we can do: install back-up cameras in all emergency vehicles, use stair chairs designed to minimize strain on providers’ bodies, put disinfecting diaphragm covers on every stethoscope in the system, and make hand sanitizer available at every door of the ambulance and on providers’ belts.


However, with every system there typically comes a behavioral expectation of some sort. We can provide the hand sanitizer, but we also need to explain to employees the risks of not using it and put an expectation in place (often a rule, policy or procedure) that is clearly communicated to them.


Not surprisingly, what we are learning about safety systems is that there is a probability associated with whether an employee will use the system or bypass it. This probability is directly tied to whether the employee sees the true risk (which is different from a fear of being punished) in not using the system or following the rule, and whether a more dominant competing value makes following the rule difficult. These values and risks are communicated to our employees overtly (we clearly say what they are) and covertly (we communicate contrary values without intending to).


For example, we say “Drive with due regard. Follow the rules. Stop completely at all stop signs.” That’s the overt message. We also discuss with our employees performance expectations, maximum allowable travel times for emergency responses and survival rates that increase when definitive care arrives early and fast. Covertly, we are saying “Get there quickly! Your patient’s life may depend on it! Your job may depend on it!”


When we deconstruct the behavioral choices that lead to errors within our systems, our current methods of root cause analysis do not anticipate or consider competing values and probabilities. The work ahead of us to change our organizational cultures will depend on whether we can see beyond the rules we created, and whether we can consider how our employees’ behavior is shaped or incentivized by the systems we developed and by both the overt and covert messages we have been sending.


Putting up a poster that says “Safety First” does not improve anything but your decor. Making actual safety improvements requires structural changes in your systems and processes. It also requires a workplace where people know what the safe actions are, how to perform them and why they matter. We need workplaces where providers trust that they will be treated fairly if they make a mistake, even one that causes harm.


Paul LeSage, former fire chief and flight medic, is with Critical Decision Partners LLC (cdm-hro.com) and conducts post-incident analysis using the “Just Culture” framework.


Mike Taigman, lifelong student and general manager for AMR’s Ventura County (Calif.) and Gold Coast operations, is a certified improvement advisor with the Institute for Healthcare Improvement.