Myths and Misinformation about the National Registry Examinations

Greg Applegate, PhD, MBA

We only use 10 percent of our brain. Lightning never strikes the same spot twice. Cracking your knuckles gives you arthritis. The National Registry uses trick questions on its examinations. These statements all have one thing in common: they are among the most widely believed myths out there. 

As the National Registry’s Chief Science Officer, I am fortunate enough to speak to many EMS professionals from all backgrounds. But during these conversations, I sometimes find myself debunking myths and misinformation that have been passed along to them. In the interest of transparency and sharing knowledge, here is information about the National Registry examinations — backed by science, not myths and misinformation.  

A quick note about terminology. Throughout this article I will refer to EMS professionals. An EMS professional is a licensed or otherwise clinically qualified individual who works in the EMS field. Some get paid and some volunteer, but their commitment and dedication to professional behavior and clinical competence make them professionals. 

The people who write the questions have obviously never practiced in the field or they are trying to trick us. 

This is the most common issue I hear. Examination content is written and reviewed by practicing EMS professionals from across the country. It is not written by the National Registry staff. A group of EMS professionals is selected once or twice each month to participate in an examination development panel at the National Registry offices in Columbus, Ohio.  

John Corley, a program manager on the Examinations Team, is responsible for selecting panel participants based on their role and experience in EMS. John works to get a diverse group of people who represent all of EMS on each panel. This includes EMS providers, educators, physicians and state EMS officials. It also includes choosing people from different parts of the country, with different backgrounds, and from different types of EMS services: urban, suburban, rural and frontier. 

Panelists are trained in best practices for question writing prior to participating. They are then tasked with writing items for the examinations. Panelists are encouraged to write clearly, concisely and to avoid trick questions or trivia. The focus of the items is on major concepts and application of EMS knowledge. A good question is clear and allows a candidate to demonstrate their knowledge. 

The National Registry staff is trained to facilitate the content development panels but intentionally remains neutral to avoid biasing the examination content. Their job is to ensure the process proceeds as designed and to help participants develop good questions. If you are interested in participating in an examination development panel, you can sign up at  

There is more than one right answer to a National Registry question. 

This one used to be true. Many of you who participated in writing items in the past may remember when candidates were expected to choose the “best answer.” This has not been National Registry practice for the last couple of years. All newly developed questions have one right answer. Older items are being reviewed and updated so that they also have only one right answer. 

What changed? Best practices in the high-stakes testing field have evolved and the National Registry is evolving with them, including having only one correct answer to each item. Further, the National Registry hired a full-time psychometrician to work with its examinations. A psychometrician is a scientist who specializes in making sure tests are fair and accurate. The National Registry’s psychometrician is Dr. Mihaiela Gugiu and she continually reviews the results of the examinations to ensure they remain accurate and fair. She also reviews new items for fairness before she allows them to be used to count for a candidate’s score. Full disclosure, I am trained as a psychometrician. 

The way we have to teach for the National Registry examination is different from the way we do it in the field. 

Of all the myths and misinformation about the examinations, this one is the most persistent and frustrating. First, the National Registry examinations are designed to avoid detailed questions about protocols and local practice. The goal is to determine a candidate’s knowledge of EMS principles and practice. For many issues, there are multiple ways to accomplish a positive patient outcome. The National Registry recognizes this and encourages item writers to focus on sound clinical judgment rather than specific processes. 

Second, examination content is designed to reflect current practice in clinical care as defined by EMS professionals. After an item has been developed, it is reviewed by the National Registry program managers. Program managers are Nationally Registered paramedics with a variety of field experience (typically they have been field training officers and/or educators). The program managers are responsible for ensuring that the items are written according to best practices. The item development process is complicated and takes months. Here are the major steps: 

  1. A panel of subject matter experts is trained in item writing and asked to write questions. 
  2. The questions are reviewed by Examinations Program Managers who check the content by looking up the answer in the current medical literature. 
  3. A professional editor reviews the items to ensure they are clear, concise and grammatically correct. 
  4. A separate panel of subject matter experts reviews the items to ensure they are current and within the scope of practice. 
  5. The items are included on examinations so that response data can be collected but the items do not count toward a candidate’s score.  
  6. The response data for each item is reviewed by a psychometrician to ensure it meets the quality assurance guidelines for examination questions. 
  7. Items that pass each step of this process are now eligible to be included on an examination. 

Despite this stringent process, there is still controversy over item content. Medicine is always changing and determining when to add new content or remove old content is not always an easy decision. This is why it is especially important to have practicing EMS professionals develop, review, and approve content.  

There were paramedic items on my EMT examination because (lots of theories about why). 

Putting inappropriate content on an examination is never acceptable. Examinations are designed to be fair and accurate assessments of a candidate’s knowledge, skills, and abilities for their level. There is no reason to include out-of-scope content on an examination and the National Registry has a process in place to prevent this from occurring.  

I occasionally receive complaints about out-of-scope content on an examination. This is typically for one of two reasons. First, the EMR, EMT and Paramedic examinations are adaptive examinations. This means that the examination adjusts to a candidate as they progress. If a candidate gets a question correct, the computer algorithm selects the next question to be a little harder. If a candidate gets a question incorrect, the computer algorithm selects the next question to be a little easier. An adaptive format is used because it reduces the cost to candidates and it provides greater security. The cost to candidates is reduced because fewer items need to be administered to determine if a candidate has passed the examination. Examination security is improved because no two candidates get the same examination.
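The correct-harder, incorrect-easier adjustment described above can be sketched as a toy rule. This is illustrative only; the function name and the fixed step size are my own assumptions, and real adaptive testing engines select items using item response theory models rather than a simple increment:

```python
def next_target_difficulty(current_difficulty, answered_correctly, step=0.5):
    """Toy adaptive-selection rule (illustrative, not the actual algorithm).

    A correct answer raises the target difficulty of the next item;
    an incorrect answer lowers it. Real computer-adaptive tests use
    item response theory to pick the most informative next item.
    """
    if answered_correctly:
        return current_difficulty + step
    return current_difficulty - step
```

A strong candidate who answers a streak of items correctly is therefore quickly routed toward the hardest items in the bank, which is exactly the experience described in the next paragraph.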

One of the downsides of an adaptive format is it can be a challenging experience for a candidate. A strong candidate will often get many of the first questions correct on their examination. As a result, they begin receiving the hardest questions within the item bank. The questions are within scope but represent content above the knowledge typically possessed by an entry level provider. Some candidates interpret the content as being out of scope because their instructor spent little or no time discussing these topics. In truth, the content is within scope, but the candidate is doing well on the examination and does not realize it. This is an issue I emphasize with instructors when I speak with them. It is important that candidates understand how the process works. Understanding can help reduce a candidate’s stress level and improve their performance on the examination.

On the other hand, sometimes the complaints I receive about out-of-scope content are from candidates who failed the examination. When I speak to them about their concerns, I usually find that they weren’t prepared or their instructional program left out or spent less time on areas which are included in the National Scope of Practice model. All areas included in the National Scope of Practice model are included on the National Registry examinations. Unfortunately, there are situations where it is not the candidate’s fault that they were not taught the information. However, full knowledge of the clinical practices included in the National Scope of Practice is required to become Nationally Registered.

One final note on this issue. Despite our best efforts, we are still human. There was an incident in early 2019 where a few inappropriate items were included on the AEMT examination. The issue was caught and addressed quickly but it did affect a handful of candidates. Each affected candidate had their attempt nullified (it did not count against them), was refunded their money, and was given a free retest. We learned some valuable lessons through this error and adopted further practices which have made our processes even stronger. Additionally, we made the decision to immediately communicate with affected candidates and stakeholders about the issue. Being transparent and straightforward was the right decision and helped us quickly mitigate the issue and further build trust within the EMS community.

I randomly received the maximum number of questions on my examination. 

Nobody receives the maximum number of test questions randomly. Except for the AEMT examination (which is the same number of questions for everyone), the number of questions on each examination is determined by the candidate’s performance. After each question, the computer algorithm scores the examination. Once the minimum number of items has been administered, the examination stops if the score is high enough (the candidate has passed). If the score falls below a certain level, the examination stops because the candidate has failed. If the score is between the high and low cutoffs, the candidate receives another question. This continues until the score is above the high cutoff, below the low cutoff, or the maximum number of questions has been given. If a candidate receives the maximum number of questions, it is because their score is too close to the passing standard for an early decision to be made. 
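The stopping rule just described can be summarized in a short sketch. All names and thresholds here are placeholders of my own, not the National Registry’s actual parameters, and the tie-breaking comparison at maximum length against a single passing standard is an assumption based on the paragraph above:

```python
def exam_decision(score, items_given, min_items, max_items,
                  low_cutoff, high_cutoff, passing_standard):
    """Toy version of the variable-length stopping rule described above.

    Returns "pass", "fail", or "continue". Parameter values are
    illustrative assumptions, not actual examination settings.
    """
    if items_given < min_items:
        # Minimum length not yet reached; always ask another question.
        return "continue"
    if items_given >= max_items:
        # Score stayed too close to the passing standard for an early
        # decision; a final decision is made at maximum length.
        return "pass" if score >= passing_standard else "fail"
    if score >= high_cutoff:
        return "pass"
    if score <= low_cutoff:
        return "fail"
    # Score is between the cutoffs: administer another question.
    return "continue"
```

Under this sketch, reaching the maximum number of questions simply means the score never cleared either cutoff early, which is the point the paragraph above makes.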

Final Thoughts

The goal of the National Registry is to protect the public by ensuring EMS professionals are qualified to provide safe and effective entry-level care at their level. To accomplish this, examination content is developed and maintained by the EMS community. Our goal is to facilitate the process, by holding ourselves accountable, communicating openly, and being transparent. The National Registry is committed to providing fair and unbiased examinations that allow each candidate to demonstrate their knowledge, skills, and abilities related to EMS. 

If you want to learn more about the news and happenings of the National Registry, follow us on social media. We’re on Facebook, Twitter, Instagram, YouTube and LinkedIn. We also invite you to talk to our staff members at many of the local, state and national EMS conferences we attend each year. If you have a specific question for me, you can contact me at My staff and I will work to ensure you get responses to your questions.  
