If you’re confused about the difference between the term “near hit” and “near miss,” check out the YouTube video of George Carlin on near misses. Our co-author Paul LeSage recently had the opportunity to review and update an organization’s event reporting system. The risk and quality folks proudly walked him through the detailed database and entry system that would be used, they hoped, to report “events.” They had also structured the system to look for near misses. Apparently they had not watched the Carlin video.
Rather than “event reporting systems,” we really need “risk reporting systems.” In other words, why wait for an event to occur? We need to teach our employees how to see risk in our systems and behaviors, report what they see and suggest resolutions. Unfortunately, most systems are retrospective in nature—something has to happen before we record it and take action.
Too often, reporting systems work against the reporter. Blind reporting systems, while showing us overall trends, are nearly useless for determining the causal and contributing factors behind risk. They offer no way to follow up and gain the context and important detail surrounding the systemic and behavioral issues that led to the risk (or event). Systems that require reporters to name themselves (and to name those who made the error or had the near miss) need to be carefully administered to ensure that bias, injustice and wrongful conclusions don’t conspire to undermine the data and intimidate the reporters or their peers.
Reporting systems need to be created and installed with the real involvement of front-line employees. Without a good discussion of what will be done with the information reported, and without robust training on how the system is used, it is unlikely to yield accurate information. Think of what a near-hit reporting system would look like if you were required to report these situations while driving your car, but had no training on exactly what constitutes a near hit, and no personal assurance that reporting your own behavior wouldn’t implicate you. What would you report? It’s unlikely you’d report your own speeding (a behavioral choice that increased risk), or your own failure to use a turn signal (an inadvertent act, perhaps influenced by distraction). Instead, you’d report the other knucklehead who ran the stop sign and almost hit you. You’d report the guy who cut you off in traffic.
As we continue to explore ways to quantify and qualify risk in EMS, we need systems we can rely on to collect data on risk and events, and we need to be willing to invest in looking carefully at how the systems we have developed drive or incentivize the behaviors of our employees.