Expert Advice

National Data Collection Efforts Pose Challenges for Many EMS Agencies

There is increasing pressure on all EMS services to participate in national data collection efforts such as the Cardiac Arrest Registry for Enhanced Survival (CARES),1 Get with the Guidelines,2 National Emergency Medical Services Information System (NEMSIS)3 and the EMS Compass Initiative.4 The basic premise of these federal initiatives is that improvements to the efficiency and effectiveness of patient care should be data driven. Prehospital data provided by EMS services is one important data source in understanding how to improve system response.5

The responsibility for national data collection is often passed on to the states, which in turn pass the responsibility to local EMS agencies. The purpose of this article is to share some of the challenges this model creates (especially for rural EMS services), the consequences of these challenges to data validity and the role EMS leadership can play in preventing and mitigating these problems.

Rural Challenges of Federal Registries

Many rural EMS agencies are volunteer-dependent, experience high turnover and struggle financially.6 These agencies are either unable to dedicate staff to collect needed data or must rotate data collection responsibilities. The latter adds training costs and introduces data entry reliability issues. Further, data entry responsibilities are often assigned to field EMS staff, which is especially problematic in rural, volunteer-dependent agencies where leadership fears losing a volunteer if he or she is assigned the tedious task of data collection.7

EMS staff responsible for data entry require training and release time, the costs of which are absorbed by the EMS agencies.1 To compound matters, EMS agencies are often charged fees to participate in the national data collection efforts. Given this significant cost burden and the fact that many EMS agencies are struggling financially, the expectation of participating in unfunded national data collection efforts a) is often simply impossible to fulfill, b) is at a minimum met with resistance, and c) poses an ethical concern (i.e., nonmaleficence), as finite resources are diverted away from patient care to support data collection.8

Assuming a rural EMS agency can afford to train staff, it is still challenged to sustain high-quality data collection and entry. The high turnover in rural EMS services means continual retraining, which many rural EMS services cannot afford. Further, the time-critical events for which national data is required are relatively infrequent for rural EMS services. Thus, there are few opportunities to keep data collection and entry skills honed. Therefore, when an event does occur, entering the information takes significantly longer and is prone to skill-decay errors.

Rural EMS agencies are also asked to collect many of the same data elements for multiple registries. For example, stroke, trauma and cardiac arrest data are collected by three different registries, but all require basic time stamp and treatment data. One solution being used by rural EMS and state agencies is to develop local/state databases and then export data to the various registries. Unfortunately, this cost-saving strategy is offset by the significant and, in the authors' opinion, unethical fees software vendors charge to merge data with the national registries.

Exporting data is further complicated by the plethora of database software programs in use by EMS agencies nationwide and by the varied ways data is collected (e.g., manual entry versus transmitted ECG). The resulting lack of database interoperability creates data integrity concerns during export.9,10
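The consolidate-then-export pattern described above, in which an agency collects core elements once and renames them to each registry's expected labels, can be sketched as a small mapping layer. The field and registry names below are purely illustrative and are not actual CARES, NEMSIS or Get With The Guidelines element names:

```python
# Hypothetical sketch: one locally collected record exported to multiple
# registry formats. All field names are illustrative, not real schemas.

LOCAL_RECORD = {
    "call_911_time": "2015-03-01T14:02:00",
    "dispatch_time": "2015-03-01T14:03:10",
    "enroute_time": "2015-03-01T14:04:45",
    "treatment": "CPR",
}

# Each registry wants the same core time stamps under different names.
REGISTRY_FIELD_MAPS = {
    "cardiac_registry": {"call_911_time": "Call911DateTime",
                         "dispatch_time": "DispatchDateTime"},
    "stroke_registry": {"call_911_time": "time_of_call",
                        "enroute_time": "time_enroute"},
}

def export_record(record, registry):
    """Rename locally collected fields to a registry's expected element names."""
    field_map = REGISTRY_FIELD_MAPS[registry]
    return {dest: record[src] for src, dest in field_map.items()}
```

The point of the sketch is that the agency enters each element once; the interoperability problem arises when vendors charge for, or obstruct, exactly this renaming step.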

Consequences of Challenges

Our qualitative evaluation of EMS services in seven rural states shows that the lack of financial compensation, coupled with additional time demands on already burdened rural EMS staff, lowers motivation to collect data for the national registries. This lack of motivation has two predictable effects. First, many agencies simply will not participate. This may explain why the data needed to improve survival rates for time-critical events is being collected by only approximately one-fifth of EMS agencies.1 Second, when pressured to provide national data, EMS staff confess to falsifying or reconstructing the needed data elements.

Our quantitative analysis of EMS databases in several rural states substantiates our qualitative findings. Missing data was a prevalent issue in 1,130 cardiac arrest responses collected from 100 EMS agencies in two rural states from 2012 to 2015. For example, over 40% of cases were missing key data elements such as 9-1-1 call time, dispatch time, en route time, time departed scene, time arrived at hospital and odometer readings. The topic of data quality and usability of prehospital data is gaining more attention, as evidenced by a national study recently undertaken by Saint Louis University.11
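The missing-data finding above amounts to counting, per case, whether any key element is absent. A minimal sketch of that check, with hypothetical element names standing in for the registry's actual labels:

```python
# Hypothetical sketch of the missing-data check described above.
# Key element names are illustrative, not the registries' actual labels.

KEY_ELEMENTS = ["call_911_time", "dispatch_time", "enroute_time",
                "time_departed_scene", "time_arrived_hospital", "odometer"]

def missing_rate(cases):
    """Fraction of cases missing at least one key data element.

    A case is a dict of element name -> value; absent keys, None,
    and empty strings all count as missing.
    """
    if not cases:
        return 0.0
    incomplete = sum(
        1 for case in cases
        if any(case.get(field) in (None, "") for field in KEY_ELEMENTS)
    )
    return incomplete / len(cases)
```

Run against a state extract, a rate above 0.4 would correspond to the "over 40% of cases" figure reported above.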

The Role of EMS Leadership

Leadership is a key attribute for system change.5,12 Here are some suggestions for leadership when confronted with requests to participate in national data collection efforts:

1. Ask why the data is needed and what questions it will answer. Researchers must be able to explain how the data will be used to answer quality improvement questions. The relationship between the data collected at the national level and local quality improvement isn't always clear. Lack of a common understanding of the data collection goals negatively impacts rural EMS agency participation.13 Researchers should be deliberate in requesting data needed to answer questions based on sound theory.14 Researchers often engage in a shotgun approach to data collection: gathering data that might be useful at some unspecified juncture down the road. This approach violates the ethical principle of respect for persons15 and is simply poor science.16 Understanding why data elements are needed is critical to creating a culture of documentation, especially when these duties are added job responsibilities and/or assigned to volunteers.17,18

2. Participate in bottom-up approaches to identifying data elements. To improve data quality, EMS agencies must perceive the national data elements to be of value and utility.19 To do this, a bottom-up, user-driven model—as opposed to the current top-down, research-driven approach to defining national data elements—is essential. A bottom-up approach is supported by the Institute for Healthcare Improvement (IHI). In a report on leadership and quality improvement, the IHI noted that, within an organizational context, any potential change is more likely to be successful when it is developed in collaboration with stakeholders at all levels.20 The importance of the process by which national registry data elements are defined cannot be overstated. When data elements are prescribed, there is resistance. Approaches used by initiatives such as EMS Compass show great promise in honoring EMS agency-level data needs.4

3. Opt for local control that consolidates multiple national registry data elements. Rural EMS agencies struggle to meet the demands of multiple national registries. One effective solution currently in development in Nebraska is a registry that combines data normally held in separate national silos (CARES, Get with the Guidelines, CathPCI). The single super-registry allows immediate feedback on metrics and performance relevant to local needs (including urbanicity), reduces the price per data point, improves security by eliminating data redundantly copied into multiple databases, and is not held by a proprietary entity charging additional membership fees. The challenges to completing this type of super-registry are educating states about the benefits of pooling their data and sharing large de-identified data sets, and working with vendors to identify the core elements needed in a super-registry so as to minimize new costs.

4. Insist on local benchmarking ability. An effective national registry provides its users with credible, relevant, specific, timely and sufficiently frequent feedback.5,21 However, national registries are challenged to meet the criteria for successful feedback. For example, the CARES evaluation report states, "Often, agency staff who are involved in the manual entry of CARES data elements do not see the hospital outcomes data, nor do the hospital nurses see the data elements collected from the ePCR."22 Closing system feedback loops by providing sufficiently frequent, understandable and usable information to the local EMS subsystem user increases data transparency and motivation to participate.5 Our evaluation substantiates that many rural EMS agencies feel national registry data are useless because they are urban-biased (a function of call volume). Rural EMS leadership should insist that data collection software be configurable to allow for local-level benchmarking. Our evaluation suggests two key configurability parameters are necessary to increase motivation to participate in a national registry. First, rural EMS agencies must be able to benchmark against their own previous performance. Local EMS agencies are intimidated by comparisons to other EMS agencies, and such comparisons can have the reverse effect, fostering feelings of inadequacy and decreasing motivation. Second, EMS agencies desire the ability to compare themselves to other EMS agencies with similar demographics.
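Benchmarking an agency against its own prior performance, as recommended above, can be as simple as comparing a current-period metric to the agency's own historical baseline rather than to other agencies. A hedged sketch, where the metric (median dispatch-to-scene minutes) and the data shapes are assumptions for illustration:

```python
# Hypothetical sketch: benchmark an agency against its own past performance
# rather than against other (often urban, high-volume) agencies.

def self_benchmark(history, current):
    """Compare a current-period metric (e.g., median dispatch-to-scene
    minutes) against the agency's own historical average.

    history -- list of the same metric from the agency's prior periods
    current -- this period's value
    Returns None when there is no baseline yet to compare against.
    """
    if not history:
        return None
    baseline = sum(history) / len(history)
    return {"baseline": baseline,
            "current": current,
            # For a response-time metric, lower is better.
            "improved": current < baseline}
```

The design choice mirrors the recommendation: the comparison group is the agency itself, so a low-volume rural service sees progress (or regression) on its own terms instead of an intimidating urban ranking.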

5. A new local/state EMS database is NOT the solution. One solution many states pursue when confronted with data collection issues is to purchase new EMS data collection software. Be warned: new software alone will not solve data reliability and validity issues. It is not a silver bullet. Engaging EMS leadership to support system change, increasing intrinsic motivation to participate by providing timely and credible feedback through highly configurable interfaces, and removing unnecessary system redundancies are among the strategies necessary to sustain a viable data collection system.

6. The compensation model must be reversed. If the national registry requires the data, and these data are important to the well-being of all taxpayers, then data collection should be a federally funded mandate. As currently designed, most registries are essentially unfunded mandates.

7. Remove punishment for not participating. The IOM recommends state and local health departments "[m]andate tracking and reporting of all cardiac arrest events."1 We question the wisdom of mandating participation. It is unethical to force participation at any level.23 It is evident that many of the rural EMS services in the states we evaluated do not have the resources to comply. Adding this burden may be the proverbial straw that breaks the camel's back—forcing them to close or encouraging data falsification. Moreover, there is no historical evidence of which the authors are aware that mandating states to gather data increases compliance or data integrity.


References

1. Institute of Medicine of the National Academies. (June 2015) Strategies to improve cardiac arrest survival: A time to act. Retrieved June 7, 2016, from http://iom.nationalacademies.org/Reports/2015/Strategies-to-Improve-Cardiac-Arrest-Survival.aspx.

2. Foltysova J. Stroke Get with the Guidelines Preliminary Evaluation Results [report prepared for state Dept. of Health]. North Dakota, 2015.

3. National EMS Information System (NEMSIS). (March 15, 2013) Goals and objectives. Retrieved June 14, 2016, from https://www.nemsis.org/theProject/whatIsNEMSIS/goalsAndObjectives.html.

4. EMS Compass. (n.d.) About This Initiative. Retrieved June 14, 2016, from http://www.emscompass.org/about-ems-compass/.

5. Renger R. System evaluation theory (SET). Evaluation Journal of Australasia. 2015;15(4):16–28.

6. Freeman V, Slifkin R, Patterson P. Recruitment and retention in rural and urban EMS: results from a national survey of local EMS directors. Journal of Public Health Management and Practice. 2009;15(3):246–252.

7. Cole D. The Challenges Posed by Rural EMS Volunteers [conference session]. National Rural EMS Conference, 2015.

8. Forester-Miller H, Davis T. (1996) A practitioner’s guide to ethical decision making. American Counseling Association. Retrieved June 14, 2016, from http://www.counseling.org/docs/default-source/ethics/practioner’s-guide-to-ethical-decision-making.pdf?sfvrsn=0.

9. Granillo A, Renger R, McPherson M, et al. Redfield-Aberdeen-Sioux Falls cardiac arrest drill [after action report/improvement plan]. 2014.

10. Granillo A, Renger R. South Dakota patient data flow drill [after action report/improvement plan]. 2016.

11. Garza A. (2016) ePCR usability study. Saint Louis University. Retrieved June 14, 2016, from https://slu.az1.qualtrics.com/SE/?SID=SV_78a6FoQDINLYBmd.

12. Eisenberg M: Resuscitate! How your community can improve survival from sudden cardiac arrest. University of Washington Press: Seattle, 2013.

13. Reed L, Reed J. Minot, N.D., continuous quality improvement study meeting [personal communication]. 2015.

14. Johnson B, Turner L: Data collection strategies in mixed methods research. In A Tashakkori, C Teddlie (Eds.) Handbook of mixed methods in social and behavioral research. Sage Publications: Thousand Oaks, CA, 297–319, 2003.

15. U.S. Department of Health & Human Services. (n.d.) Belmont Report. Retrieved June 14, 2016, from http://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/#.

16. Chase S. The art of diagnosis and treatment. In L Dunphy, J Winland-Brown, B Porter et al. (Eds.) Primary care: the art and science of advanced practice nursing. F.A. Davis Company: Philadelphia, 2015.

17. Mahoney J. Leadership skills for the 21st century. Journal of Nursing Management. 2001;9(5):269–271.

18. Owen J. How to Lead: What you actually need to do to manage, lead, and succeed. Prentice Hall: Upper Saddle River, NJ, 2005.

19. Patton M. Utilization-focused evaluation. Sage Publications: Thousand Oaks, CA, 2008.

20. Kabcenell A, Nolan T, Martin L, et al. (2010) The Pursuing Perfection Initiative: Lessons on Transforming Health Care. IHI White Papers. Retrieved June 14, 2016, from http://www.ihi.org/resources/Pages/IHIWhitePapers/PursuingPerfectionInitiativeWhitePaper.aspx.

21. Chen H, Hailey D, Wang N, et al. A review of data quality assessment methods for public health information systems. International Journal of Environmental Research and Public Health. 2014;11(5):5170–5207.

22. Barron-Simpson R, Elmi J, Valderrama A. (Dec. 16, 2011). Evaluation of the Cardiac Arrest Registry to Enhance Survival (CARES): Evaluation Report. Retrieved June 14, 2016, from https://mycares.net/sitepages/uploads/2013/04/CARES_Evaluation_Report_Final.pdf.

23. Elliott D, Stern J (Eds.). Research ethics: A reader. University Press of New England: Hanover, NH, 1997.