EMS as a system of care was born out of the need to improve outcomes of ill and injured patients. The document that fueled the EMS movement, Accidental Death and Disability: The Neglected Disease of Modern Society (1966), focused on the clinical outcome of survival after trauma.1 Effective operations and system design were recognized as contributors to the clinical outcomes of preventing death and permanent disability.

EMS systems provide healthcare using a wide variety of models, but very little is known about the quality of the care they provide.2 The medical specialty of EMS has made great strides in measuring and improving the quality of clinical care, but identifying and systematically addressing the multifactorial barriers to quality improvement efforts at all levels remains crucial.

Recently, the National Association of EMS Physicians (NAEMSP) published a position statement entitled Defining Quality in EMS to identify what constitutes quality within EMS.3 The ten elements of the position statement address the multidimensional nature of quality in EMS. Our goal is to provide context on the specifics of each element of the position statement and to offer insight and opportunities that agencies, systems and leaders can use to build a culture of quality.

“Quality in EMS must prioritize patient outcomes. The complexities of EMS and the diversity of the practice environment require attention to structural and process measures to build improved care delivery; however, the EMS community must strive to develop, promote and implement measures that capture meaningful effects on patient outcome.”3

EMS quality measures have historically focused on operational metrics that are easily quantifiable and publicly visible, such as the eight-minute response time metric derived from some of the original studies on prehospital cardiac arrest.4 Before AEDs were available, studies demonstrated improved outcomes with more rapid CPR and defibrillation. However, these results were widely misconstrued as evidence that ALS care should be available within eight minutes in the urban EMS environment.5-7

Subsequent studies have found that exceeding the eight-minute response time in an urban system did not negatively impact survival of critically ill or injured patients.8,9 Practices such as the use of lights and sirens to decrease response times have been shown to produce clinically insignificant time savings, yet lights-and-sirens operation is cited as a significant factor in fatal ambulance crashes, creating risk to patients, providers, and the public.10-12

Efforts to improve quality in patient care must address numerous interdependent factors that impact ultimate patient outcome. The complex relationship between patient, structure, and process factors is ultimately responsible for patient outcome.13 (See Figure 1.)

Structural factors include the setting where care is provided, which is impacted by system design, geography, equipment, staffing, provider level and knowledge base. Process data are more complex and include the interaction of the EMS provider and the patient. For example, in cardiac arrest, process measures include compression fraction, time to defibrillation and appropriate destination decision.

The term “performance measures” has been used frequently in the EMS quality literature and in national initiatives relating to quality in EMS to describe successful care.4,14 Performance measures are equivalent to “process measures” in classic quality measure terminology. The EMS community must broaden how we define quality, and the umbrella term “quality measures” must incorporate the triad of structure, process and outcome measures specifically designed for our unique healthcare environment. (See Table 1.)
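To make the idea of a process measure concrete, consider compression fraction: the proportion of a resuscitation attempt during which chest compressions are actually being delivered. The sketch below is a minimal, hypothetical illustration of how such a measure could be computed from timestamped monitor data; the interval representation and example numbers are assumptions, not any specific registry's definition.

```python
# Hypothetical sketch: computing chest compression fraction, a common
# process measure in cardiac arrest care. Interval format and example
# values are illustrative assumptions.

def compression_fraction(compression_intervals, total_seconds):
    """Fraction of the resuscitation spent delivering compressions.

    compression_intervals: list of (start, stop) times in seconds.
    total_seconds: total duration of the resuscitation attempt.
    """
    compressing = sum(stop - start for start, stop in compression_intervals)
    return compressing / total_seconds

# Example: compressions from 0-110 s and 120-290 s of a 300 s resuscitation,
# i.e., a single 10-second pause (perhaps for rhythm analysis).
ccf = compression_fraction([(0, 110), (120, 290)], 300)
print(round(ccf, 2))  # 0.93
```

Because it is computed directly from what providers did, a measure like this suggests an immediate course for improvement (shorten pauses), which is exactly why process measures are more actionable than outcomes alone.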

Although outcome-based measures are ultimately the most important to patients, the current limitations of data integration between EMS and hospital records often dictate that quality efforts focus on process measures instead. Process-based measures are still enormously useful, as they inherently suggest specific courses for improvement and are generally more actionable, but they should be grounded in high-level clinical evidence that tightly links them to patient-centered outcomes.15

Several national efforts to define EMS quality measures have had variable success, including EMS Compass for EMS agencies, and GAMUT QI for critical care transport agencies. Systems of care projects such as CARES, TQIP and Mission: Lifeline have incorporated elements of EMS/prehospital care in their quality measures.16-19 (See Table 2.)

The National EMS Quality Alliance (NEMSQA) was formed in 2018 to unify and advance this process. As the ability of EMS to link to hospital records and measure patient outcomes improves, more comprehensive measures of the impact of EMS on clinical outcomes will become feasible.

“Quality efforts are dynamic. A high-quality EMS system should be continuously advancing toward a safer system that improves patient, provider, and population outcomes.”3

EMS quality efforts have historically focused on retrospective quality assurance to maintain a desired level of quality rather than prospective quality improvement. Such a static approach is insufficient given the rapidly evolving knowledge base. For example, our expectations of survival or recovery from conditions such as sepsis, stroke and out-of-hospital cardiac arrest have changed dramatically over the last few decades. The quantifiable outcome targets of today should not define those of tomorrow.

To improve quality on a continuous basis, healthcare has adapted many important lessons from the industrial quality improvement field.20,21 The Plan-Do-Study-Act (PDSA) cycle is one of the most commonly used approaches for continuous improvement in healthcare.22 After identifying what the system aims to accomplish, the four-phase cycle involves planning ideas for improvement, piloting the intervention, studying its effects, and then incorporating this information to refine the change and move forward. Changes can be piloted as small experiments, ideally by the EMS providers who propose them, with guidance from leadership.20 Success depends on establishing a culture of improvement, where it’s understood that progress happens through system changes rather than blaming individual providers.
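The "study" and "act" phases of a PDSA cycle can be reduced to a simple decision: adopt the change, adapt and re-pilot it, or abandon it. The sketch below illustrates that logic; the measure (prehospital 12-lead ECG acquisition rate), the aim of 90%, and the example numbers are hypothetical assumptions, and a real program would use agency data and statistical process control rather than a single comparison.

```python
# Minimal sketch of the study/act decision in one PDSA cycle applied to an
# EMS process measure. All names, rates, and the 0.90 aim are hypothetical.

def pdsa_cycle(baseline_rate, pilot_rate, aim=0.90):
    """Compare pilot results to baseline and the stated aim.

    Returns the next action: adopt the change, adapt it, or abandon it.
    """
    if pilot_rate >= aim:
        return "adopt"    # aim met: spread the change system-wide
    if pilot_rate > baseline_rate:
        return "adapt"    # improvement, but aim not met: refine and re-pilot
    return "abandon"      # no improvement: plan a different change

# Example: prehospital 12-lead ECG acquisition rose from 72% to 85% during
# the pilot. The aim was not yet met, so the change is refined and re-piloted.
print(pdsa_cycle(0.72, 0.85))  # adapt
```

The point of structuring the decision this way is that "abandon" is an acceptable outcome: small experiments that fail cheaply are part of continuous improvement, not evidence of individual failure.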

Continuous quality improvement should take place at the local, regional and national levels. While variation in EMS system models, hospital resources, and geography makes it difficult to devise “one size fits all” solutions, the public is best served when EMS is integrated into the larger healthcare system. Initiatives in EMS require examining every step of the process of health at the community level, from public education to the patient’s most appropriate destination.

To take an example, the care of patients with ST elevation myocardial infarction (STEMI) has been greatly improved through national, regional, and local initiatives. Iterative projects such as the Mission: Lifeline Accelerator have decreased time to reperfusion and lowered mortality for patients with STEMI by coordinating care on a regional level and promoting best practices, such as prehospital ECG acquisition and cardiac catheterization lab activation.16,23,24 Such regional efforts not only promote but also rely heavily on continuous quality improvement efforts at the local or agency level for success.25 Quality STEMI care relies not only on a regional system of care, but also on adequate provider education, timing and interpretation of prehospital ECGs and destination decision. Individual EMS agencies are foundational within the system of care, and thus their meaningful involvement is critical. (See Figure 2.)

Figure 2: High-quality patient care is a result of continuous quality improvement efforts at a number of levels.

“Quality EMS care should embrace current evidence-based practice in all EMS domains from system design to clinical practice. EMS leaders should promote timely knowledge translation through the development, dissemination, implementation, and monitoring of evidence-based guidelines that inform practice at the national, state, and local levels.”3

The role of research in the quality improvement process is to define system goals and identify evidence-based process measures that impact patient outcomes. As an example, research has demonstrated the negative impact of hypoxia, overventilation and hypotension on outcomes of patients with traumatic brain injury.26,27 Although historically, “expert opinion” suggested that patients with severe traumatic brain injury benefited from intubation, research has shown variable effects of this procedure on patient outcome, likely due to associated physiologic insults such as overventilation and peri-procedural hypoxia.28 Therefore, although improvement in intubation success rates for patients with severe traumatic brain injury may have been a historical goal, we now realize that refocusing on effective airway management and minimizing hypoxia and hypotension have greater potential for positive impact on patient outcome.29

Out-of-hospital research is particularly challenging for several reasons, including issues around informed consent, lack of uniform data definitions, inadequate linkage of prehospital and hospital data, and a paucity of funding and EMS researchers.2,30,31 Moreover, even when high-quality evidence exists, it is often not incorporated into clinical practice; the process of moving research evidence into practice is known as “knowledge translation.”32,33

Evidence-based guidelines (EBGs) are practice recommendations based on rigorous review of existing research evidence. During EBG development, the benefits and harms of alternative care options are assessed. In the EMS realm, several national efforts have developed (and continue to develop and revise) EBGs to inform best practices in prehospital care. (See Table 2.)30,34-36 Importantly from the quality perspective, many of these efforts concurrently develop quality measures which can be tracked at the individual agency level or higher.

EBGs will not impact the quality of clinical care if they don’t change the understanding and behavior of frontline EMS providers.37 Changing care begins with initial EMS education, which should include basic research and quality improvement concepts as well as emphasize that clinical practice should change when the evidence does.2,30,38 On an ongoing basis, education of EMS providers represents the greatest challenge in guideline implementation.39,40 EMS leaders (and in particular, medical directors) should partner with frontline providers to identify barriers to change and specific educational needs so that they can be addressed.

“Adequate infrastructure to support quality efforts must be developed and supported at local levels. It should include the following features:

  • Imbued with methodology that promotes continuous improvement
  • Developed in partnership with EMS operational leadership, providers, and medical directors
  • Adequately resourced to enable medical directors and quality personnel to perform data review and outcomes reporting
  • Integrated into daily operations
  • Linked to education and evaluation”3

“EMS should support and develop quality improvement training and/or certification for personnel dedicated to this effort.”3

“EHRs and reporting systems must support quality improvement monitoring and reporting requirements. Agencies of all sizes should have access and be able to implement this technology. Improving data capture for quality improvement will enable EMS agencies to analyze data and will allow regulatory and governmental agencies to understand the effects of EMS care.”3

Infrastructure for EMS quality improvement encompasses staff, education, data collection and integration instruments, and technology. With regard to staff, a significant number of EMS agencies have limited or no dedicated quality staff.41 Quality improvement staffing should be incorporated into administrative staffing considerations. In addition, just as we don’t expect EMS personnel to perform procedures that they have not been trained to do, quality staff and organizational leadership (including medical directors) should receive training in just culture, quality methods, metrics and improvement science.22,42,43

To maximize the limited time of quality staff, automated reporting on national evidence-based quality metrics should be embedded within the EHR.43 The data entry burden on the field provider must be balanced with their primary focus of patient care.44 Data elements should be limited to critical components that are mandatory nationwide or deemed locally important, and device manufacturers should enable cross-platform information transfer to facilitate accurate data sharing.45

Such integration would decrease the documentation burden on the individual provider and increase the accuracy of recorded times and specific interventions provided to the patient. The integrity of the entered data is critical; as such, field providers should receive education regarding the goals and mechanics of the quality process and the importance of data integrity.45,46

“Quality efforts in EMS require seamless, automatic, large-scale bidirectional information sharing of patient data and outcomes. This should be supported via provincial, state, and national regulations as well as in partnership with local health entities.”3

Routine integration of EMS data within the greater healthcare record is the exception, not the rule, within the United States. The current voluntary cooperation and integration with hospital EHRs is often obstructed by HIPAA concerns on the hospital side41,43 and complicated by the expense of linking the EMS and hospital EHRs. Moreover, within the out-of-hospital environment itself, there’s a plethora of pertinent data not incorporated in the EHR, including telemedicine consultation, prescription drug information, social services interventions and community public health information. The inaccessibility and proprietary nature of even EMS-specific data resources, such as AED monitor data and dispatch measures, stifles the full potential of quality improvement initiatives.

We can’t expect the quality of EMS medical care to improve if we’re unable to link our interventions to patient outcomes. Therefore, quality improvement efforts, outcomes research and cost effectiveness of the healthcare system need consistent, accessible solutions that integrate EMS EHRs bidirectionally into the larger healthcare medical record.47,48 Although traditionally, this goal is thought of as the integration of EMS and hospital data, healthcare also includes public health, social services and other linked organizations. Ideally, a unique patient identifier would allow for tracking of patients throughout the health care continuum.49

Given the little headway that has been made in this area, empowering regulation is needed to encourage or require hospital and other healthcare entity cooperation in the interest of assuring the timely availability of patient information. Depending on EMS agencies to fund this integration or expecting local/regional hospital systems to do so restricts this access to well-funded and forward-thinking agencies that can dedicate financial and personnel resources to this effort. Removing barriers at the federal level and incentivizing integration from the receiving health entity side of the system may be a solution.50,51

“EMS should adopt uniform quality terminology and definitions. This will improve the ability of EMS medical directors, leaders, regulators, and policymakers to compare results between systems, regions, and countries.”3

Quality improvement efforts require adoption of a uniform set of definitions, terminology and reporting formats. Fundamentally, we have to be sure that we are comparing “apples to apples” if we’re to promote meaningful comparisons among disparate care systems and over time—a process known as system benchmarking.52

For example, if we compare scene times in major trauma between systems as a process measure for trauma care, how we define this interval is important (e.g., does it start with “at scene” or “at patient”?). A classic example of the utility of uniform definition implementation is the Utstein criteria for cardiac arrest, which not only advanced research, but enabled quality improvement initiatives that have revolutionized cardiac arrest care.53,54 Candidate variables from the Utstein template, along with the National EMS Information System (NEMSIS) and the Resuscitation Outcomes Consortium, were used to design the core elements of the Cardiac Arrest Registry to Enhance Survival (CARES).17 CARES data are used by individual agencies to benchmark themselves against others, to suggest ideas for performance improvement, and to measure the impact of performance improvement interventions on patient outcomes as part of a PDSA cycle. Registry data has been used to highlight regional disparities in cardiac arrest survival and to suggest performance improvement initiatives such as bystander education and implementation of team-focused CPR.55,56
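The value of a uniform definition is that every agency computes the same rate over the same denominator. As a hedged illustration, the sketch below computes a survival rate restricted to a commonly used Utstein comparator group (bystander-witnessed arrests with an initially shockable rhythm); the field names and example records are assumptions for illustration, not the registry's actual schema.

```python
# Illustrative sketch of Utstein-style benchmarking: a survival rate over a
# uniformly defined comparator group. Field names and data are hypothetical.

def utstein_survival_rate(cases):
    """Survival to hospital discharge among Utstein-comparator cases."""
    comparator = [c for c in cases
                  if c["witnessed"] and c["shockable_rhythm"]]
    if not comparator:
        return None  # no comparator cases: rate is undefined, not zero
    survivors = sum(1 for c in comparator if c["survived_to_discharge"])
    return survivors / len(comparator)

cases = [
    {"witnessed": True,  "shockable_rhythm": True,  "survived_to_discharge": True},
    {"witnessed": True,  "shockable_rhythm": True,  "survived_to_discharge": False},
    {"witnessed": False, "shockable_rhythm": True,  "survived_to_discharge": True},
    {"witnessed": True,  "shockable_rhythm": False, "survived_to_discharge": False},
]
print(utstein_survival_rate(cases))  # 0.5
```

Note that the unwitnessed and non-shockable arrests are excluded before the rate is computed; without that shared denominator, a system serving a sicker population would look worse than one serving a healthier one, and the comparison would be meaningless.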

Benchmarking performance with like agencies through utilization of uniform definitions shouldn’t be limited to cardiac arrest, but should be broadened to apply to a range of clinical situations, as it has been for airway management and trauma.57,58 As demonstrated by the Utstein example, established benchmarks enable EMS systems to evaluate their performance in comparison to other agencies, systems, or regions. This process can reveal invaluable opportunities for quality improvement by identifying high-performing systems so that their innovations may be adopted by others. NEMSIS provides a mechanism to capture standardized U.S. data regarding the full spectrum of EMS clinical care and the ability to compare similar agencies’ performance, though this capability is significantly underutilized.

Although these initiatives are still plagued by non-uniform participation and missing data elements, they represent an important step forward in EMS quality and will continue to improve in utility as long as individual agencies and states continue to develop capacity for data sharing.59

“Quality improvement methodology and work requires partnership between the operational and medical community using a system-based approach in which patient / provider safety and quality care are highlighted.”3

“EMS leaders need to promote a culture of safety. Leaders must emphasize that the highest quality of care is only achieved when the process improvement program rewards those who identify and seek to prevent errors before they occur.”3

Quality in every aspect of EMS operations and patient care must become a core value of our culture.60 Such cultural change may initially run counter to many of the deep-seated beliefs within EMS and requires a comprehensive approach.61 Quality can’t merely be a single exercise to meet regulatory or accreditation requirements, but rather must be an iterative cycle of assessment, intervention, evaluation and reassessment to improve practice. In order for quality improvement to be both successful and sustained, an organization must engage both its leadership and frontline EMS providers in establishing a culture of improvement.62

On multiple levels, it must be understood that progress happens through system changes rather than blaming of individual providers. This is done not only by incorporating Just Culture principles into the quality management process, but by reliably adhering to these principles when adverse events occur.63 Foundational understanding at all organizational levels of the quality process is central to meaningful implementation. Continuing medical education should focus not on repetition of core topics, but rather on case discussions that illustrate principles of just culture, cognitive error/bias and discussion of system-level solutions to prevent error and improve quality.64

Reinforcing that system-level faults contribute significantly to most quality issues is just the first step in culture change. Design and implementation of any quality initiative should include a range of stakeholders, from management to field providers, to promote meaningful objectives, broad-based acceptance and integration into daily practice. Expectations of providers should be clearly delineated, feedback should be timely, and engagement should be openly rewarded and recognized. One way to accomplish this is through performance measure care bundles.65 Since attitudes towards EMS providers have a significant impact on behavior, EMS medical directors should collaborate with other affected health entities for ongoing quality improvement.39 Intentional efforts to promote buy-in from other health entities and hospital personnel are important to advancing these initiatives.

Finally, operations and medical care are two sides of the same coin; operational decisions that impact provider wellness in turn affect clinical decision making and patient care. As it stands, there’s inconsistency in workplace safety culture between EMS agencies66 and in mechanisms to address fatigue in the workplace.67 Operational initiatives focused on the workforce, including individual provider wellness, adequate support, ergonomics, safety equipment, and reasonable work conditions and expectations, provide a solid foundation for higher-level quality initiatives.43 Fatigue, provider turnover and burnout, high unit hour utilization, and red light and siren use are operational factors shown to directly and negatively impact not only morale, but also patient care and risk of error.36

The EMS Culture of Safety Strategy project answered federal-level recognition that a coordinated effort to improve safety across the spectrum was needed, and it outlined a vision emphasizing the need for a culture shift to prioritize safety considerations and risk awareness and make them a part of everyday practice.60 Empowerment and education of the individual provider is central to actualizing this vision to ingrain safety and quality as a part of daily practice and not merely as a declared aim. Identifying and managing areas of risk prior to an error occurring is the ideal, and reflective of an organization operating at the highest levels on both the operational and medical plane.68


The practice of EMS medicine was born out of the realization that medical care in the out-of-hospital environment can positively impact health outcomes of the ill and injured. As a system, we must never lose sight of this primary objective. Quality in EMS, therefore, is ultimately defined by patient-centered outcomes. Research, knowledge translation, information systems, operations, collaborative practice and education form the tools to define and achieve an ever-evolving target of the best available patient care. Operational quality elements such as a safer operating environment and provider wellness are critical to facilitating this goal. Quality shouldn’t be restricted to the motivated but should be pervasive and accessible to every EMS system, and a culture of improvement should become synonymous with the culture of EMS.


1. Accidental Death and Disability. National Academies Press: Washington, D.C.; 1966.

2. Committee on the Future of Emergency Care in the United States Health System Board on Health Care Services. Emergency medical services: At the crossroads. National Academies Press: Washington, DC; 2007.

3. Defining Quality in EMS. Prehosp Emerg Care. 2018;22(6):782-783. Retrieved March 29, 2019, from www.naemsp.org/home/news/naemsp-position-paper-defining-quality-in-ems/.

4. Myers JB, Slovis CM, Eckstein M, et al. Evidence-based performance measures for emergency medical services systems: A model for expanded EMS benchmarking. Prehosp Emerg Care. 2008;12(2):141-151. doi:10.1080/10903120801903793

5. Larsen MP, Eisenberg MS, Cummins RO, et al. Predicting survival from out-of-hospital cardiac arrest: A graphic model. Ann Emerg Med. 1993;22(11):1652-1658. http://www.ncbi.nlm.nih.gov/pubmed/8214853.

6. Stout JL. Contracting for emergency ambulance services: A guide to effective system design – Executive summary. 1994.

7. Weston CF, Wilson RJ, Jones SD. Predicting survival from out-of-hospital cardiac arrest: A multivariate analysis. Resuscitation. 1997;34(1):27-34.

8. Blackwell TH, Kline JA, Willis JJ, et al. Lack of association between prehospital response times and patient outcomes. Prehosp Emerg Care. 2009;13(4):444-450. doi:10.1080/10903120902935363

9. Pons PT, Markovchick VJ. Eight minutes or less: does the ambulance response time guideline impact trauma patient outcome? J Emerg Med. 2002;23(1):43-48.

10. Ho J, Casey B. Time saved with use of emergency warning lights and sirens during response to requests for emergency medical aid in an urban environment. Ann Emerg Med. 1998;32(5):585-588.

11. Hunt RC, Brown LH, Cabinum ES, et al. Is ambulance transport time with lights and siren faster than that without? Ann Emerg Med. 1995;25(4):507-511.

12. Kahn CA, Pirrallo RG, Kuhn EM. Characteristics of fatal ambulance crashes in the United States: An 11-year retrospective analysis. Prehosp Emerg Care. 2001;5(3):261-269.

13. El Sayed MJ. Measuring quality in emergency medical services: a review of clinical performance indicators. Emerg Med Int. 2012;2012:161630. doi:10.1155/2012/161630

14. EMS Compass: Improving systems of care through meaningful measures. (2017) National Association of State EMS Officials. Retrieved March 27, 2019, from www.nasemso.org/projects/ems-compass/.

15. Kahn JM, Gould MK, Krishnan JA, et al. An official American thoracic society workshop report: Developing performance measures from clinical practice guidelines. Ann Am Thorac Soc. 2014;11(4):S186-95. doi:10.1513/AnnalsATS.201403-106ST

16. Bagai A, Al-Khalidi HR, Sherwood MW, et al. Regional systems of care demonstration project: Mission: Lifeline STEMI Systems Accelerator: Design and methodology. Am Heart J. 2014;167(1):15-21.e3. doi:10.1016/j.ahj.2013.10.005

17. McNally B, Stokes A, Crouch A, et al. CARES: Cardiac Arrest Registry to Enhance Survival. Ann Emerg Med. 2009;54(5):674-683.e2. doi:10.1016/j.annemergmed.2009.03.018

18. Nathens AB, Cryer HG, Fildes J. The American College of Surgeons Trauma Quality Improvement Program. Surg Clin North Am. 2012;92(2):441-54, x-xi. doi:10.1016/j.suc.2012.01.003

19. O’Connor R, Nichol G, Gonzales L, et al. Emergency medical services management of ST-segment elevation myocardial infarction in the United States—A report from the American Heart Association Mission: Lifeline Program. Am J Emerg Med. 2014;32(8):856-863.

20. Lee CS, Larson DB. Beginner’s guide to practice quality improvement using the model for improvement. J Am Coll Radiol. 2014;11(12 Pt A):1131-1136. doi:10.1016/j.jacr.2014.08.033

21. Model for improvement. (2019) Associates in Process Improvement. Retrieved March 28, 2019, from www.apiweb.org.

22. Langley GJ, Moen RD, Nolan KM, et al: The improvement guide: A practical approach to enhancing organizational performance, 2nd edition. Jossey-Bass: San Francisco, 2009.

23. Jollis JG, Al-Khalidi HR, Roettig ML, et al. Regional systems of care demonstration project: American Heart Association Mission: Lifeline STEMI systems accelerator. Circulation. 2016;134(5):365-374.

24. Jollis JG, Al-Khalidi HR, Roettig ML, et al. Impact of Regionalization of ST-Segment-Elevation Myocardial Infarction Care on Treatment Times and Outcomes for Emergency Medical Services-Transported Patients Presenting to Hospitals With Percutaneous Coronary Intervention: Mission: Lifeline Accelerator. Circulation. 2018;137(4):376-387. doi:10.1161/CIRCULATIONAHA.117.032446

25. Daudelin DH, Sayah AJ, Kwong M, et al. Improving use of prehospital 12-lead ECG for early identification and treatment of acute coronary syndrome and ST-elevation myocardial infarction. Circ Cardiovasc Qual Outcomes. 2010;3(3):316-323. doi:10.1161/CIRCOUTCOMES.109.895045

26. Chi JH, Knudson MM, Vassar MJ, et al. Prehospital Hypoxia Affects Outcome in Patients With Traumatic Brain Injury: A Prospective Multicenter Study. J Trauma Inj Infect Crit Care. 2006;61(5):1134-1141. doi:10.1097/01.ta.0000196644.64653.d8

27. Spaite DW, Hu C, Bobrow BJ, et al. Association of Out-of-Hospital Hypotension Depth and Duration With Traumatic Brain Injury Mortality. Ann Emerg Med. 2017;70(4):522-530.e1. doi:10.1016/j.annemergmed.2017.03.027

28. Davis DP, Dunford J V, Poste JC, et al. The impact of hypoxia and hyperventilation on outcome after paramedic rapid sequence intubation of severely head-injured patients. J Trauma. 2004;57(1):1-8; discussion 8-10. http://www.ncbi.nlm.nih.gov/pubmed/15284540.

29. Spaite DW, Bobrow BJ, Stolz U, et al. Evaluation of the impact of implementing the emergency medical services traumatic brain injury guidelines in Arizona: The Excellence in Prehospital Injury Care (EPIC) study methodology. Acad Emerg Med. 2014;21(7):818-830. doi:10.1111/acem.12411

30. Martin-Gill C, Gaither JB, Bigham BL, et al. National Prehospital Evidence-Based Guidelines Strategy: A Summary for EMS Stakeholders. Prehosp Emerg Care. 2016;20(2):175-183. doi:10.3109/10903127.2015.1102995

31. Ripley E, Ramsey C, Prorock-Ernest A, et al. EMS Providers and Exception from Informed Consent Research: Benefits, Ethics, and Community Consultation. Prehosp Emerg Care. 2012;16(4):425-433. doi:10.3109/10903127.2012.702189

32. Straus SE, Tetroe J, Graham I. Defining knowledge translation. Can Med Assoc J. 2009;181(3-4):165-168. doi:10.1503/cmaj.081229

33. Graham ID, Logan J, Harrison MB, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26(1):13-24. doi:10.1002/chp.47

34. Brown KM, Macias CG, Dayan PS, et al. The development of evidence-based prehospital guidelines using a GRADE-based methodology. Prehosp Emerg Care. 2014;18(Suppl 1):3-14. doi:10.3109/10903127.2013.844871

35. Lang ES, Spaite DW, Oliver ZJ, et al. A national model for developing, implementing, and evaluating evidence-based guidelines for prehospital care. Acad Emerg Med. 2012;19(2):201-209. doi:10.1111/j.1553-2712.2011.01281.x

36. Patterson PD, Higgins JS, Lang ES, et al. Evidence-Based Guidelines for Fatigue Risk Management in EMS: Formulating Research Questions and Selecting Outcomes. Prehospital Emerg Care. 2017;21(2):149-156. doi:10.1080/10903127.2016.1241329

37. Wilkinson SA, Hough J, Hinchliffe F. An Evidence-Based Approach to Influencing Evidence-Based Practice in Allied Health. J Allied Health. 2016;45(1);41-48.

38. Traffic safety plan for older persons. (December 2013.) National Highway Traffic Safety Administration. Retrieved March 28, 2019, from www.nhtsa.gov/sites/nhtsa.dot.gov/files/older_people_811873.pdf.

39. Fishe JN, Crowe RP, Cash RE, et al. Implementing Prehospital Evidence-Based Guidelines: A Systematic Literature Review. Prehosp Emerg Care. 2018;22(4):511-519. doi:10.1080/10903127.2017.1413466

40. Sholl M, Taillac P, Adelgais KM, et al. (2016). Statewide implementation of a prehospital care guideline: Final report to the National Highway Safety Traffic Administration. National Association of State EMS Officials. Retrieved March 28, 2019, from www.nasemso.org/wp-content/uploads/EBG_NHTSA_FinalReport.pdf.

41. Redlener M, Olivieri P, Loo GT, et al. National Assessment of Quality Programs in Emergency Medical Services. Prehosp Emerg Care. 2018;22(3):370-378. doi:10.1080/10903127.2017.1380094

42. Greiner AC, E Knebel: Health Professions Education: A Bridge to Quality. National Academies Press: Washington, D.C. 2003.

43. National EMS Advisory Council. (2017). Advisory: Successful Integration of Improvement Science in EMS. Retrieved March 27, 2019, from www.ems.gov/pdf/nemsac/NEMSAC_Final_Advisory_Successful_Integration_Improvement_Science.pdf.

44. Poissant L, Pereira J, Tamblyn R, et al. The Impact of Electronic Health Records on Time Efficiency of Physicians and Nurses: A Systematic Review. J Am Med Inform Assoc. 2005;12(5); 505-516. doi:10.1197/jamia.M1700

45. V3 Data Dictionaries & XSD. (2016) National EMS Information System (NEMSIS). Retrieved March 27, 2019, from www.nemsis.org/technical-resources/version-3/version-3-data-dictionaries/.

46. Wang HE, Prince DK, Stephens SW, et al. Design and implementation of the Resuscitation Outcomes Consortium Pragmatic Airway Resuscitation Trial (PART). Resuscitation. 2016;101:57-64. doi:10.1016/j.resuscitation.2016.01.012

47. Carlin CS, Dowd B, Feldman R. Changes in Quality of Health Care Delivery after Vertical Integration. Health Serv Res. 2015;50(4):1043-1068. doi:10.1111/1475-6773.12274

48. McWilliams JM, Chernew ME, Zaslavsky AM, et al. Delivery system integration and health care spending and quality for Medicare beneficiaries. JAMA Intern Med. 2013;173(15):1447-1456.

49. Appavu SI. (1997). Analysis of Unique Patient Identifier Options. U.S. Department of Health and Human Services. Retrieved March 27, 2019, from www.ncvhs.hhs.gov/wp-content/uploads/2014/08/APPAVU-508.pdf.

50. Balas A, Al Sanousi A. Interoperable electronic patient records for health care improvement. Stud Health Technol Inform. 2009;150:19-23.

51. Shekelle PG, Morton SC, Keeler EB. Costs and benefits of health information technology. Evid Rep Technol Assess (Full Rep). 2006;(132):1-71.

52. Cummins RO, Chamberlain DA, Abramson NS, et al. Recommended guidelines for uniform reporting of data from out-of-hospital cardiac arrest: the Utstein Style. A statement for health professionals from a task force of the American Heart Association, the European Resuscitation Council, the Heart and Stroke Foundation of Canada, and the Australian Resuscitation Council. Circulation. 1991;84(2):960-975.

53. Cone DC, Jaslow DS, Brabson TA. Now that we have the Utstein style, are we using it? Acad Emerg Med. 1999;6(9):923-928.

54. Perkins GD, Jacobs IG, Nadkarni VM, et al. Cardiac Arrest and Cardiopulmonary Resuscitation Outcome Reports: Update of the Utstein Resuscitation Registry Templates for Out-of-Hospital Cardiac Arrest. Resuscitation. 2015;96:328-340. doi:10.1016/j.resuscitation.2014.11.002

55. Girotra S, van Diepen S, Nallamothu BK, et al. Regional Variation in Out-of-Hospital Cardiac Arrest Survival in the United States. Circulation. 2016;133(22):2159-2168. doi:10.1161/CIRCULATIONAHA.115.018175

56. Pearson DA, Nelson RD, Monk L, et al. Comparison of team-focused CPR vs standard CPR in resuscitation from out-of-hospital cardiac arrest: Results from a statewide quality improvement initiative. Resuscitation. 2016;105:165-172. doi:10.1016/j.resuscitation.2016.04.008

57. National Trauma Databank. (2019). American College of Surgeons. Retrieved March 27, 2019, from www.facs.org/quality-programs/trauma/ntdb.

58. Wang HE, Domeier RM, Kupas DF, et al. Recommended Guidelines for Uniform Reporting of Data from Out-of-Hospital Airway Management: Position Statement of the National Association of EMS Physicians. Prehosp Emerg Care. 2004;8(1):58-72. doi:10.1197/s1090-3127(03)00282-x

59. Mann NC, Kane L, Dai M, et al. Description of the 2012 NEMSIS public-release research dataset. Prehosp Emerg Care. 2015;19(2):232-240. doi:10.3109/10903127.2014.959219

60. National Highway Traffic Safety Administration. (Oct. 3, 2013). Strategy for a National EMS Culture of Safety. EMS.gov. Retrieved March 27, 2019, from www.ems.gov/pdf/Strategy-for-a-National-EMS-Culture-of-Safety-10-03-13.pdf.

61. Juran JM, DeFeo JA: Juran’s Quality Handbook, 6th edition. McGraw-Hill: New York, 2010.

62. Deming WE: Out of the Crisis. MIT Press: Cambridge, Mass., 1986.

63. Boysen PG 2nd. Just culture: A foundation for balanced accountability and patient safety. Ochsner J. 2013;13(3):400-406.

64. Trowbridge RL, Dhaliwal G, Cosby KS. Educational agenda for diagnostic error reduction. BMJ Qual Saf. 2013;22(Suppl 2):ii28-ii32. doi:10.1136/bmjqs-2012-001622

65. Prehospital Care Bundles. (May 5, 2018). Monroe-Livingston Regional EMS Council. Retrieved March 27, 2019, from www.mlrems.org/provider/performance-measures/.

66. Patterson PD, Huang DT, Fairbanks RJ, et al. Variation in Emergency Medical Services Workplace Safety Culture. Prehosp Emerg Care. 2010;14(4):448-460. doi:10.3109/10903127.2010.497900

67. Patterson PD, Weaver MD, Frank RC, et al. Association between poor sleep, fatigue, and safety outcomes in emergency medical services providers. Prehosp Emerg Care. 2012;16(1):86-97. doi:10.3109/10903127.2011.616261

68. Pickard K, Fowler RL, Lippman M. Risk Management. In Cone DC, Brice J, Delbridge TR, et al (Eds.): Emergency Medical Services Clinical Practice and Systems Oversight, Volume 2. Wiley, pp. 192-200, 2015.