By policy of our board and by long-standing tradition, the EMS division of the Montgomery County Hospital District (MCHD) is committed to providing the best prehospital care possible. As part of that commitment, our system has an aggressive, non-disciplinary quality improvement (QI) program that allows us to provide our community with a high level of service.
Over the years, we've followed accepted EMS industry approaches to data analysis. When we found opportunities for improvement on operational and quality issues, we made changes and rolled them out to our staff via quarterly CE sessions. In general, these approaches were successful at maintaining and improving the overall quality of our service.
But there was one concern: The level of improvement was inconsistent with the level of effort involved. Despite our best efforts, we weren't completely satisfied with the level of progress in certain areas. Something was wrong, and we weren't sure what. We knew it wasn't a data problem; MCHD is data-centric. We've used a host of data-collection tools, including computerized medical records, fully integrated computer-aided dispatch, satellite tracking of our deployed resources, electronic billing and many other data sources. So we got to thinking.
Several years ago, a theory emerged. Perhaps our approach to quality had a systemic problem. Perhaps the processes we, and most others in our industry, were using didn't allow us to see and react to the "story behind the story."
Frankly, our data and analytical processes allowed us to identify issues, but they didn't give us the level of understanding necessary to uncover the root causes. We also began to realize our methods of communicating issues and remedies to staff weren't up to the task at hand. There had to be a better way.
We determined the answer lay in revising our approach, adopting different methods for analyzing data and communicating the results. We had significant amounts of data on almost every element of operations and patient care, but we needed a better framework for analyzing that data, as well as a better set of principles for passing the information along to those who needed it.
We investigated quality management in other industries and found that underlying the success of most QI programs in the manufacturing and business world was a consistent and well-defined analytical framework. Leading organizations worldwide were using these frameworks to reduce time, costs, errors and redundancy. These programs include Total Quality Management, Lean, Six Sigma and many others. After reviewing our options, we chose Six Sigma as our new foundation for managing quality.
Although it's not our intention to go into detail about the Six Sigma methodology, it's helpful to understand what it is and how it works. Six Sigma was originally developed by Motorola as a strategy for quality management. At its heart are data-driven quantitative techniques that integrate various quality and performance tools and approaches into a well-organized package.
The foundation for the Six Sigma methodology is referred to as "DMAIC": Define the process you wish to improve; measure it by developing appropriate data sets; analyze your performance by employing appropriate quantitative and statistical tools; improve that process by taking the appropriate steps; and, once improvement is noted, control it to ensure it's sustained (see circle graphic, January 2009 JEMS, p. 44).
Although Six Sigma's goal (to not exceed a failure rate of 3.4 defects per million in produced output) is clearly aimed at the manufacturing world, using the tools and techniques offered by this methodology and driving failure rates as low as possible is a worthy goal for EMS.
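The defects-per-million figure can be made concrete with a little arithmetic. The sketch below (a minimal illustration, not part of MCHD's actual toolset; the failure counts are hypothetical) converts a defect count into defects per million opportunities (DPMO) and then into the conventional "sigma level," which includes the customary 1.5-sigma shift:

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit=1):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    """Short-term sigma level, using the conventional 1.5-sigma shift."""
    yield_fraction = 1 - dpmo_value / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + 1.5

# Hypothetical example: 7 failed first-attempt intubations in 250 attempts
rate = dpmo(defects=7, units=250)   # 28,000 DPMO
level = sigma_level(rate)           # roughly 3.4 sigma
```

A DPMO of 3.4 works out to a sigma level of about 6.0, which is where the methodology gets its name.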
To bring the new quality management approach to our organization, we needed a rallying point to engender ownership among our stakeholders. Thus was born an initiative we call “MCHD Measures.”
With the quantitative strategies of Six Sigma, adapted for use in the EMS environment, we set about to close the loop by improving our communication with the information end users. We adopted an age-old model regarding the learning process. That model holds that "analyzed data yields information, which when appropriately disseminated, results in wisdom" (see Figure 1, p. 44).
In the past, we had acted more as "data aggregators," compiling statistics until we had a clear vision of a trend. Then, and only then, did we attempt corrective action. This approach meant operations and clinical managers knew issues that might have adverse effects on quality were developing, but they weren't passing that information down to the ultimate user: the field staff and other operations and clinical staff. There was nothing deceitful in this discontinuous flow; it was just how we, and most of our industry, did things.
For MCHD Measures to be successful, we needed to transition from data aggregators to information distributors. We needed to find ways to speed up the entire process of moving from data collection to corrective action and follow-through. The key to this increase in timely information flow was getting quality and performance feedback, in the appropriate context, back to the end users as quickly as possible.
For the more critical quality and operational issues, we developed a new communication timeline. On day one, we collect the relevant data. On day two, we analyze and review the collected data to extract information. Then, on day three, we take the acquired knowledge and disseminate it among staff members to create a learning experience. It's a simple concept: Getting the right information, to the right people, at the right time allows that information to have appropriate corrective impact. In many cases, our target is to get information into the hands of the end user by their next shift.
Since the introduction of Six Sigma, we've found that when a paramedic is provided with specific feedback on their performance in a timely and non-threatening way, they're able to understand the effects of their actions and are therefore more likely to do things better the next time.
In the clinical areas where we've used this approach (e.g., airway management), we've seen considerable and sustained improvements (read "Airway Data Improves Outcomes," January 2008 JEMS, p. 26). However, MCHD Measures has broader uses than just tracking clinical quality.
We also use the tools and techniques in our operational and support services. For instance, our deployment committee recently went through a rapid cycle change process involving our deployment plan. They used data about unit call volume, unit hour activity and post moves to assess the impact of their changes. By applying the Six Sigma analytical strategies to this data, they were able to see trends in real time, develop responses and take appropriate corrective actions more quickly. The committee continues to monitor these metrics via an intranet dashboard in order to make sure their corrections remain effective.
One of the MCHD Measures key operational successes was related to response times. Again, using the tools and approaches of Six Sigma, we were able to examine significant response-time issues in far greater statistical detail. As a result, we changed our pre-alert processes and lowered our average response time by 45 seconds.
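The article doesn't specify which statistical tools were applied to the response-time data, but a standard Six Sigma choice for monitoring a metric like this over time is an individuals (I) control chart. The sketch below, using made-up daily response-time averages, computes 3-sigma control limits from the average moving range:

```python
from statistics import mean

def individuals_chart_limits(samples):
    """Control limits for an individuals (I) chart.

    Estimates process spread from the average moving range (MR-bar)
    divided by the standard d2 constant for subgroup size 2 (1.128).
    """
    center = mean(samples)
    moving_ranges = [abs(a - b) for a, b in zip(samples, samples[1:])]
    sigma_est = mean(moving_ranges) / 1.128   # d2 constant for n=2
    return center - 3 * sigma_est, center, center + 3 * sigma_est

def out_of_control(samples):
    """Return the samples falling outside the 3-sigma limits."""
    lcl, _, ucl = individuals_chart_limits(samples)
    return [x for x in samples if x < lcl or x > ucl]

# Hypothetical daily average response times, in seconds
times = [512, 498, 505, 530, 495, 650, 501, 490, 507, 499]
flagged = out_of_control(times)   # the 650-second day is flagged
```

Points outside the limits, like the 650-second day here, signal special-cause variation worth investigating, as opposed to routine day-to-day noise.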
Another example of the success of MCHD Measures is the positive impact we had on our billing function. We developed and verified information that allowed us to improve policies and procedures, substantially increasing our collection percentages and, therefore, our cash flow.
We've demonstrated improvements in many other areas as well. So far we've used the Six Sigma method to reduce our time to first pain medication, more appropriately use CPAP, justify new staff, improve our hiring practices and use resources more efficiently.
It's important to note that MCHD Measures isn't without costs and implementation issues. For instance, although the data collection part of the strategy was already built into our day-to-day operations, transferring the data into the tools used for analysis is labor intensive. We're taking steps to automate the process, but until then we'll continue to commit staff resources to it.
MCHD Measures also required some changes in our organizational culture and the way we deal with the quality function. Speeding up the process of getting from data to remediation required substantial changes in how we evaluated significant clinical events. It was also necessary to make sure end users, who are now quickly called to after-action meetings and shown specific information on a particular clinical issue, know they're not being disciplined, but taught. Finally, staff members needed to accept that we monitor their performance metrics in detail.
Despite the resource allocation issues and the needed cultural changes, everyone, from our board members to the end users, agrees that the value brought by Six Sigma and MCHD Measures has far outweighed the costs.
By changing how we approach quality and operational issues, we've found a way to break through some problems that had stumped us for years. The statistical and quantitative methodologies of Six Sigma, along with our new approach to communicating with staff, have allowed us to shed light on details that were previously hidden from immediate view by the sheer quantity of uncorrelated data.
MCHD Measures demonstrates that a good QI function should be driven by data and follow a consistent, organized process of detailed analysis. We've also learned the importance of moving information to the end user quickly and non-punitively, while giving them the opportunity to own and take responsibility for their specific performance.
We know MCHD Measures is working for us. We have the outcome data and other statistics to prove it. Our employee surveys show a marked increase in staff knowledge and positive attitudes about projects. As an organization, we're committed to expanding MCHD Measures to include additional operational and clinical issues.
If you have questions or would like more information about the tools and techniques we've implemented, please feel free to contact us. You can also watch a video about MCHD at www.jems.com. JEMS
Kelly Curry, RN, EMT-P, is the deputy administrator at the Montgomery County (Texas) Hospital District. Contact him at [email protected].
Allen Sims, EMT-P, is the EMS director at the Montgomery County Hospital District. Contact him at [email protected].
Acknowledgement: Michael Lambert, president of the consulting firm PaladinSG, contributed to this article.