From time to time, I like to pretend that I'm still in academia. That's why I was excited when my friend Dr. Dennis Vincenzi, assistant professor of human factors and systems at Embry-Riddle Aeronautical University, invited me out for a beer a few weeks ago. It wasn't just any beer. Dennis was one of the key organizers for the Human Performance, Situation Awareness and Automation conference held here in Daytona Beach; a personal invitation from a conference chair meant that appetizers and drinks were free.
Human factors engineers are an interesting lot. They think about those things that we don't. Although that sounds flippant, it really isn't. The human factors guys are the ones who know why you might pay attention to a red light rather than a blue light, why knobs rotate right or left and whether switches should flip up or down. Most of us don't even think about where these things originate. (It's kind of like the line from one of my favorite British comedies, The Black Adder, where Rowan Atkinson notes, "I am one of those people who are perfectly happy to wear cotton, but have no idea how it works.") So when you talk to a human factors engineer, you're struck by how brilliant and yet how trivial they seem. Once you break through the scientific jargon, their research involves a lot of common sense. But it's the kind of common sense that's incredibly uncommon, the type that slaps you in the face and makes you wonder, "Why didn't I think of that?" (There are some great human factors stories in Set Phasers on Stun by Steven Casey. It's a quick, fun and fascinating read.)
While I was waiting for Dennis (apparently you have to mingle and stuff when you're the chair), I had a chance to peruse the poster presentations. For those of you who haven't been to a "scientific" meeting before, new research is usually presented in one of two forms. Most works are displayed as posters, which are often left up throughout the conference for people to review at their leisure. Research may also be presented orally in specially scheduled sessions. The Prehospital Care Research Forum, as well as most major emergency medicine meetings, follows this model.
Walking into the main conference room, I was really pleased to see a few old friends. I met some new ones as well. My favorite acquaintance, with whom I developed such a close bond that his name totally escapes me, sat down at the table where I was scarfing down multicolored nachos. After introductions, I asked which presentation was his. "Oh, hell, none of them are mine," he replied with a good-natured shrug. "I'm with that girl over there. I don't know a damn thing about all this." Honesty is always to be admired.
Many of the posters were genuinely interesting. Some were simply fun to look at, like the research on the causes of road rage. (Turns out that stress is the No. 1 cause. Go figure.) But one presentation appealed to the EMS person in me. Entitled "Skill Acquisition with a Virtual Reality Simulator for Phlebotomy," it explored the use of a computer-based virtual reality system for instruction in venipuncture.
We know that training students in clinical procedures is a difficult issue. Even setting aside the problem of too many students competing for too few procedures, concerns about liability, supervision, reimbursement and technical problems remain. Speaking strictly about venipuncture, there are only so many people (translated as none where I'm the potential victim) who will volunteer to be prodded by neophyte leeches in search of their first blushing vein. Models featuring plastic arms made of "lifelike" materials are also suboptimal. It's nobody's fault. It's hard to recreate a natural system, especially with the inherent variations that occur in nature. In addition, most simulators are designed to encourage success. (There's a reason why the IV arms have ropes for veins.) Finding a more lifelike training model free of the physical and emotional complications that come with practice on live individuals is certainly a laudable goal.
Virtual reality simulators have been touted as the next generation in training devices. These educational aids have a strong history of use in aerospace and military settings. The proliferation of VR systems to address procedural training in clinical medicine has been identified as a potential solution to the "see one, do one, teach one" method of instruction. (I kind of thought this was one of our quiet little secrets, but the human factors folks knew all about this. I told you they were smart.) But despite the multitude of articles touting the use of VR, a literature scan reveals minimal efforts to truly validate the efficacy of this training.
The team at Old Dominion University (which is in Virginia; if it were in Florida, it would be called "Old Retired Yankee Golf and Bingo University") studied the use of a virtual reality simulator for training students in phlebotomy. Known as the CathSim Vascular Access Simulator (Immersion Medical Inc.), it's an immersion trainer that uses visual, auditory and haptic (tactile) displays to provide information and feedback to the student. Ten graduate students participated in the study. Each subject was given five hours of training consisting of six patient cases. A pre-test and post-test were conducted using a Life/Sim intravenous therapy training arm to evaluate the efficacy of the simulator.
The results were interesting. Performance did improve after training with the CathSim, but there were limits to the learning. The majority of the increase in skills occurred during the first three simulated attempts. Continued practice resulted in only minimal performance benefits. When looking at the improvements in performance between the pre-test and post-test, the gain was relatively small. The investigators concluded that although the VR trainer did work, the ability to transfer the skill acquired through training was limited.
The implications of this study, assuming the results will be confirmed by other works, have some profound consequences for EMS. For years we have been increasingly dependent upon artificial devices (manikins, simulators, etc.) to conduct our procedural training. However, if it's true that no matter how hard we train on a simulator our results have limited application to other settings, like real life, it raises the question of whether simulator training actually meets the intent of the effort. I think that virtually all educators would agree with Marvin Gaye and Tammi Terrell that there "ain't nothing like the real thing." We need access to patients to achieve excellence in procedural skills. The question becomes how many patients we need to see, and how many times a procedure must be performed, in order to ensure competency in that technique. More on that subject next week ...