
CHAPTER 9 Bibliography

So there’s a bazillion articles on Simulators, and each article has a bibliography as long as your arm. Where do you start? What do they all mean? Do you pound through each and every one and accrete knowledge like a tree adds growth rings? Is there any theme to them other than, “Simulators are really cool, grab your phone, a credit card, and order before midnight tonight and we’ll send you a free Thighmaster”? Is there a way out of this chaos? Yes.

Since 1969 there have been well over 1000 articles published on simulation. The BEME collaboration* (we’ll come back to that later) took more than 3 years to identify, collect, read, and evaluate all of these articles. Do not worry—we’ll help you through this.

We begin this chapter with a brief description of our general search strategy for articles so you have an idea about how we found all of them. Next we briefly review the current areas of simulation research. Although this chapter focuses on the use of simulators for education, training, and assessment, we provide references for the other areas in case you are interested. The heart of this chapter contains an annotated bibliography separated into interesting themes.

OUR LITERATURE SEARCH

We wanted to provide you with the mother of all simulation bibliographies. So we began the search with references from 1969, when Abrahamson published the seminal article about simulation in medical education, and proceeded all the way to June 2005. We searched five literature databases (ERIC, MEDLINE, PsycINFO, Web of Science, and Timelit) and employed a total of 91 single search terms and concepts and their Boolean combinations (Table 9-1). Because we know that electronic databases are not perfect and often miss important references, we also manually searched key publications that focused on medical education or were known to contain articles on the use of simulation in medical education. These journals included Academic Medicine, Medical Education, Medical Teacher, Teaching and Learning in Medicine, Surgical Endoscopy, Anesthesia and Analgesia, and Anesthesiology.

Table 9-1 Search Terms and Phrases.

In addition, we manually searched the annual Proceedings of the Medicine Meets Virtual Reality Conference, the annual meeting of the Society for Technology in Anesthesia (now the International Meeting on Simulation in Healthcare), and the biennial Ottawa Conference on Medical Education and Assessment. These Proceedings include "gray literature" (e.g., papers presented at professional meetings, doctoral dissertations) that we thought contained the most relevant references related to our review.

We also performed several basic Internet searches using the Google search engine—an invaluable resource to locate those articles you cannot find anywhere else (it reviews every CV on the web—so you are bound to find even the most obscure reference). Our aim in doing all this was to perform the most thorough literature search possible of peer-reviewed publications and reports in the unpublished “gray literature” that have been judged at some level for academic quality.

Not all of the 91 search terms could be used within each of the five databases because the databases do not have a consistent vocabulary. Although each database also has unique coverage and emphasis, we did attempt to use similar text-word or keyword/phrase combinations in the searches. Thus the essential pattern was the same for each search, but adjustments were made for databases that enabled controlled-vocabulary searching in addition to text-word or keyword/phrase searching. This approach acknowledges the role of "art" in information science, recognizing that information retrieval requires professional judgment coupled with high-technology informatics—and a whole lot of extra time on your hands. [Ojala M. Information professionals as technologists. Online 2002;26(4):5.]
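If you would like to see that "essential pattern" in concrete form, here is a minimal sketch in Python. The term lists are invented for illustration only (the real 91 terms live in Table 9-1); the point is simply that synonyms within a concept are ORed together and the concept groups are then ANDed:

```python
# Minimal sketch of the Boolean search pattern described above.
# Term lists are illustrative placeholders; see Table 9-1 for the real terms.
simulator_terms = ["simulator*", "simulation", "manikin*", "mannequin*"]
education_terms = ['"medical education"', '"clinical skills"', "training"]

def build_query(*concept_groups):
    """OR the synonyms within each concept group, then AND the groups."""
    ored = ["(" + " OR ".join(group) + ")" for group in concept_groups]
    return " AND ".join(ored)

print(build_query(simulator_terms, education_terms))
# (simulator* OR simulation OR manikin* OR mannequin*) AND
# ("medical education" OR "clinical skills" OR training)
```

In practice, each database got its own dialect of this query—controlled vocabulary where it was available, plain text words where it was not.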

GENERAL AREAS OF SIMULATION RESEARCH

For the past 36 years, the primary focus of medical simulation research has been to justify its use as a training and assessment method. Nearly all of the articles begin with the obvious comparison of medicine to aviation and clinicians to pilots. Then they spend the rest of the time in a defensive tone justifying simulation as a valid training method, to the point that you think simulators are the ugly stepsister of books, lectures, small group discussions, and patient rounds. We believe it is time to stop all of this defensive research and start moving forward—let's end the meaningless studies comparing simulators to other unproven methods and begin determining the most effective ways to use simulation for training and assessment. We have an important responsibility as the current generation of trainers who have seen simulation develop and become integrated with traditional training (we are, in a sense, simulation immigrants). We need to start planning to train the next generations of clinicians who have grown up with simulation (simulation natives) and not worry about previous generations of clinicians (simulation Luddites) who have looked at simulation as some threat to their unproven, outdated, and unsafe "see one, do one, teach one" philosophy. Let us heed the words of Eric Hoffer: "In a time of drastic change, it is the learners who inherit the future. The learned usually find themselves equipped to live in a world that no longer exists."

Simulators for Training and Assessment

How do you categorize the studies? How do you evaluate the effectiveness of simulation as a training and/or assessment tool? We are in luck. Donald Kirkpatrick devised a very useful system for evaluating the effectiveness of training programs—four levels of evaluation—that has since been modified for direct application to simulation (Kirkpatrick DL. Evaluating Training Programs: The Four Levels, 2nd ed. San Francisco: Berrett-Koehler; 1998). Although originally designed for training settings in varied corporate environments, the concept was later extended to health care education. Kirkpatrick's framework as adapted for health care education includes all four of these levels, with some levels split into subcategories (Freeth D, Hammick M, Koppel I, Reeves S, Barr H. A critical review of evaluations of interprofessional education. London: Centre for the Advancement of Interprofessional Education, 2002. http://www.health.ltsn.ac.uk/publications/occasionalpaper02.pdf. Accessed March 10, 2006).

The higher the level, the greater the demonstrated impact of simulation on training.

Unfortunately, there are no studies at the "benefits to patients" level and very few at the "change in organizational practice" level—an example of the latter is the FDA's decision to grant approval for the use of carotid stents only to clinicians who are trained on a Simulator. As we will demonstrate, there are far more studies in each lower category.
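If it helps to keep the hierarchy straight while you read, here is one way to jot it down—a minimal sketch in Python that records only the levels this chapter actually cites, with descriptions paraphrased from the discussions below (a reading aid, not the official taxonomy):

```python
# The modified Kirkpatrick levels as used in this chapter, paraphrased
# from the text; useful for tagging articles as you work through them.
KIRKPATRICK_LEVELS = {
    "1":  "Reaction: how learners felt about the experience",
    "2a": "Change in attitudes about the training's importance and relevance",
    "3a": "Change in individual habits and behavior",
    "3b": "Change in what groups and organizations are doing",
    "4":  "Benefits to patients: are patients safer?",
}

def tag(article: str, level: str) -> str:
    """Label an article with its highest Kirkpatrick level."""
    return f"{article} -> Level {level}: {KIRKPATRICK_LEVELS[level]}"

# The FDA carotid-stent decision discussed above sits near the top.
print(tag("FDA carotid stent training requirement", "3b"))
```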

Now that we have everything organized, we will provide a more friendly approach to reading the literature by grouping articles into themes and even linking some of these to the Kirkpatrick criteria. Truth to tell, those Kirkpatrick criteria are a little tough to wade through. You feel yourself falling into "education PhD—speak" and not so much "regular old doctor teaching another doctor—speak."

Simulator articles fall into five main “themes.”

1. It stands to reason: Logic dictates that a Simulator makes sense. You wouldn’t want someone flying a plane without flying a “pretend” plane first. You wouldn’t want someone manipulating nuclear reactors without practicing first. So, darn it, it just seems inescapable that a Simulator is the way to go in anesthesia too.

Articles from aviation and industry fit into the “it stands to reason” column. Educational theory gives us some “it stands to reason” arguments as well. Teach with a “death-proof” patient—how can you say no to that? Teach with a patient who can “do what you want” at the stroke of a key. Teach in a setting where the learner has to act, to speak, to interact. Teach where the student has an emotional investment. They’ll learn better. It just plain “stands to reason.”

What would an “anti-Simulator” person say to these “it stands to reason” articles? “Nice. I’m glad a Simulator seems like a good idea. Lots of things seem like good ideas. Has anyone proven it’s a good idea, or are we to go on a hunch? A hunch with, lest we forget, a half million dollar price tag?”

Articles related to this theme fall into the Level 1 category—how the learners felt about participating in the simulation experiences ("This was the best learning experience in my career—it sure beats listening to the program director talk about this stuff")—and the Level 2a category—did the experience change how they felt about the importance and relevance of the intervention ("I now realize how many things can go wrong and how aware I have to be at all times to prevent mishaps"). Also in this theme are editorial discussions and descriptive articles about the use of simulators for training and testing and about comparing medicine to other high-risk industries—aviation, the military.

5. Salvation: These are the articles that matter, the Holy Grail of Simulator literature. Yes, it's great that there are "it stands to reason" articles. A solid logical base for simulators is comforting. "Canary in the mineshaft" articles help too. We are all looking for better ways to teach. Intellectual honesty demands that we probe for our own weaknesses and failings. If the Simulator can tell me where to shore up my teaching, then thank you Mr. Simulator. "Gee whiz, golly, I belong too" articles merit a place at the table. Simulators are new, and they are expensive. We should ask the hard questions of so pricey a technology. When scholarly detractors speak up, we should listen. These are not Luddites throwing their wooden shoes into the looms. These are serious educators who want proof that simulators help. Detractors focus on simulator research. If simulator champions take an "us versus them" approach, the simulator debate sinks into a squabble. If simulator champions take a "let's hear them out" approach, the simulator debate elevates into a living, breathing academic discussion. "Halfway to the station" articles serve as necessary stepping stones. We have to examine simulators in the "in vitro" setting. Lab proof precedes clinical proof, and the simulator is a "lab" of sorts. But "Salvation" articles are the real deal. Pure gold. Precious. Salvation articles show that simulators made a difference in the real world. Because someone trained in the Simulator, someone else did better.

A patient didn’t have an MI.

A patient didn’t have a stroke.

Someone lived, who would have died. And the Simulator made it happen.

How could you ever design a study to prove that?

That explains why "Salvation" articles don't fall out of the sky every day. Truth to tell, that explains why there are no real salvation articles. The closest we can come is articles that suggest salvation. And they are rare indeed. But oh man do they carry some heft.

Articles related to this theme fall into the Level 3a category—did residents actually change their habits after taking a course—and the Level 3b category—have any groups changed what they are doing. Finally, Level 4—does all this really mean anything important—are patients safer?

So there they are: the major themes of simulator articles. Of course, these articles don't neatly plop into their pigeonholes. An article's main idea may be "gee whiz golly, I belong too," but you extract a "canary in the mineshaft" idea. So this classification system is a little arbitrary and whimsical. But what the heck.

Articles Touching on the Theme “It Stands to Reason”

The articles included in this section say "it stands to reason" that simulators are good things. You read them and you just can't help but blurt it out. "It stands to reason" that a simulator is a good way to teach because you can't hurt a patient while practicing on it. "It stands to reason" that reproducible scenarios you can "dial in" anytime you want are a good way to train medical professionals.

Then here are the gigantic “leaps of faith” implied by these articles: it stands to reason that it’s a better way—pay tons of money to buy one; it stands to reason that it’s a better way—pay tons of money and devote hundreds of staff-hours to support one.

In a world of infinite resources and infinite money, we wouldn't even bring up these leaps of faith. But that is not the world we live in. So as you read these articles, ask yourself, "OK, so it stands to reason that simulators are good, but just how good, given the cost and time necessary to keep them afloat?"

If simulators make so much sense, why is their use so recent? Haven't humans been engaging in risky behavior (risky to themselves or others) since long before the Wright Brothers proved powered flight was possible?

The answer is yes—of course they have. It stands to reason that previous generations of humans must have wanted to practice their skills or to practice protecting themselves. "Historically, whenever the real thing was too expensive, too dangerous, too bulky, too unmanageable, or too unavailable, a stand-in was sought."

In a comprehensive review of anesthesia simulators as they were available during the late 1980s and early 1990s, Good and Gravenstein (the original developers of the METI Human Patient Simulator at the University of Florida) provide an example of simulators from antiquity.

The field—warfare. The simulator—a quintain. What's a quintain? Quintains originated as tree stumps upon which soldiers would practice their sword fighting. These were fitted with shields and features to resemble adversaries. By the Middle Ages, quintains were mounted on poles to simulate a joust. The quintain even provided feedback: if the novice failed to attack his "enemy" correctly, a weighted arm on the pole would swing around and smack him on the back. Sometimes we wish we could do this with some of our students and residents. But alas, we live in a kinder, gentler time.

Good and Gravenstein then cite Andrews, who differentiated between simulators and training devices. A simulator "attempts to … [r]epresent the exact or near exact phenomena likely to occur in the real world; [simulators] are good for trainee and expert practice but are not necessarily good for systematic learning of new skills and knowledge."

A training device, by contrast, "systematically presents to the trainee only the necessary training stimuli, feedback, reinforcement, remediation, and practice opportunities appropriate to the trainee's learning level and style. It uses fidelity only as necessary to enhance the learning process." These are commonly referred to as task trainers.

Just as in aviation, there is a right blend of simulators and training devices, much like tackling dummies and practice scrimmages in football, or a punching bag and a sparring partner in boxing.

The remainder of the article reviews the educational applications of anesthesia simulators and training devices. The following examples of training devices (task trainers) are listed here along with the original citations for further reading:

Simulators

SIM ONE

See below.

CASE

See below.

GAS

While there is evidence that simulators were used for military training in ancient Rome, their use in medicine did not occur until the mid-sixteenth century. Although it can be argued that Italian physicians such as Mondino de'Luzzi (1275–1326) used "simulators" when they employed cadavers to complement lectures, the idea of using simulation methods to demonstrate rare conditions or a difficult procedure did not occur until the 1540s.

Why then? At the time, many institutions were starting to become concerned about the safety of women during childbirth. Although physicians (all men) had the knowledge to deliver babies, it was considered a social taboo for a man to perform a task that was the responsibility of the midwives. However, midwives had no formal training and were graduates of the famous "see one, do one, teach one" university. Initial attempts at formal instruction consisted of lectures with illustrations. This did not improve infant and maternal mortality rates; and more than 100 years later, a father-and-son physician team from France did something about it—they developed an obstetric simulator.

The Gregoires' Simulator was crude by today's standards—a human skeletal pelvis contained in a wire basket, with oilskin to simulate the genitalia and coarse cloth to simulate the remaining skin. "Real fetuses, likely preserved by some means, were used in conjunction with the manikin." The Simulator could reproduce the birth of a child and some complications that would require a trained person to fix.

And yes—there were complaints regarding its validity and transfer to real patients, but for the first time someone said, “it stands to reason we can do a better job and not allow these poor women and children to die.”

Over the next two centuries, there were additional obstetric simulators developed in England and the United States—and they appeared to have enjoyed support from lay people and some other physicians. However, some very familiar factors limited their widespread adoption.

You would think that after 400 years we would have adequately addressed these issues! Even when the majority of students in the late nineteenth century graduated from medical school (there was no such thing as residency) without any direct experience with childbirth, available simulators were not adopted, even though "the use of the simulator would provide medical students with at least some experience with birthing techniques and with some of the related complications." But no—we would have to wait 80 years before another attempt at simulation for training.

So what was the purpose of this Simulator, built before Neil Armstrong took his famous walk?

So we had a Simulator that could do many of the things modern simulators can do, and Denson and Abrahamson had identified all of the potential benefits for simulators that we are talking about now! They performed one formal study involving 10 anesthesia residents for endotracheal intubation (the study is described later, in the Halfway to the Station section).

Over the years, Denson and Abrahamson went on to train many more health care providers, including medical students, interns, inhalation therapists, nurses, nursing students, and ward attendants. In addition to intubation, trainees practiced ventilator application, induction of anesthesia, intramuscular injection, recovery room care, and pulse and respiration measurement (HOFFMAN KI, ABRAHAMSON S. The "cost-effectiveness" of Sim One. J Med Educ 1975;50:1127–8).

Although additional simulators were planned, funding dried up and the culture was not ready for this type of training—the old guard was skeptical of technology, and there was no appreciation of the need to reduce medical errors and improve patient safety, although Denson and Abrahamson clearly made a case for it. In the words of Abrahamson, the factor that led to Sim One's demise was "internal administrative problems," which means a lack of university support. As a result, "the funding agencies were no longer interested," and there was a growing "low esteem the academic world was developing for education." Ouch! (ABRAHAMSON S. Sim One: a patient simulator ahead of its time. Caduceus 1997;13(2):29–41).

What is the legacy of Sim One? As Abrahamson states, “the effectiveness of simulation depends on the instructional method with which the simulation is being compared … if there is no alternative training method available (limited patient availability or restrictions on the use of patients), the effectiveness of a simulation device probably depends on the simple fact that the device provides some kind of learning experience as opposed to none.” Thus, Abrahamson was saying 30 years ago that it stands to reason we should be using these devices if nothing else exists or if traditional training is too dangerous.

What did they think about this Simulator at the time?

“From an anesthesiologist’s point of view, SIM 1 might represent man’s most impressive attempt, thus far, to manufacture himself from something other than sperm and ovum.”

“The appropriateness of the anesthetist’s response to each stress is automatically recorded for his later bemusement and education.”

“The next phase, Sim II, would appear to be an automated trainer to eliminate the need for a flesh-and-blood instructor, and the obvious finale is to simulate the learner as well.”

This is not a community-based practitioner reminiscing about the good old days of ether and a biting stick; this was the official response of the Association of University Anesthesiologists! [HORNBEIN TF. Reports of scientific meetings. Anesthesiology 1968;29:1071–7.]

We would have to wait until the late 1980s to pick up from where these pioneers left off.

✓ Gaba DM, DeAnda A. A comprehensive anesthesia simulation environment: re-creating the operating room for research and training. Anesthesiology 1988;69:387–94.

This article describes the rediscovery of full-body simulators for anesthesia training and introduced Gaba as a player in the wild, wooly world of simulation. You will see his name again and again in this bibliography. Gaba is based out of Stanford, home of lots of smart people, so it comes as no surprise that he, too, is smart and on a mission to see simulators reach their potential.

Way back in 1988, Gaba laid out how to do a simulation, and he made clear the argument that it just plain “stands to reason” that simulation is a good way to train. He described their setup and how they went through simulations. He argues that a “total simulation” requires the complete capabilities for noninvasive and invasive monitoring. Also, other tasks are performed using standard operating room equipment so the scenario recreates the anesthesiologist’s physical as well as mental task environment.

Gaba and DeAnda described a script, actors in the field, "on the fly" decisions by the simulator director, a high-fidelity mannequin—basically all the stuff we do now in the Simulator. He ran 21 people through the Simulator, and they all judged the experience to be highly realistic. This article did not actually report a study; it just laid out how simulations are done and how much the participants liked them. Finally, Gaba proposed that simulation has "major potential for research, training, evaluation, and certification." Amen to that, Dr. Gaba.

SCHWID HA, O'DONNELL D. The anesthesia simulator-recorder: a device to train and evaluate anesthesiologists' responses to critical incidents. Anesthesiology 1990;72:191–7.

Dr. Schwid has shown us that simulators come in all shapes, sizes, types, costs, and ranges of feasibility. This multicenter study evaluated the acceptance of a computer-based anesthesia simulator that uses sophisticated mathematical models to respond to user-controlled interventions, including airway management, ventilation control, and fluid and drug administration (53 different agents).

The Simulator also provided detailed feedback that tracked all of the user's and the Simulator's responses—this could be used for formative feedback during training or for summative evaluation to determine whether the learner has mastered the key critical events. The Simulator was evaluated by 44 residents and attendings at seven institutions. Feedback was very positive: nearly all participants found the patient's responses to management interventions realistic and judged it a good device for testing anesthesiologists' responses to critical events. A significant and important finding was that there were no differences in response among any of the institutions—demonstrating the practical transferability of this training device.

It is always tempting to compare this Simulator with the full-body, comprehensive simulator environments developed by Gaba and by Good and Gravenstein. To do so misses the point! A comprehensive training environment depends as much on the faculty facilitator, the debriefing feedback sessions, and the involvement of the "team" as it does on the Simulator.

Schwid’s computer-based Simulator and others similar to it have several advantages.

Finally, the two following extreme cases illustrate the use of these devices.

Anesthesia has consistently looked to aviation as its “model” for training. Well, aviation manufacturers, including Boeing and Airbus, are now “equipping” pilots with computer-based simulators to master prior to attending the full-scale simulator. Rather than compare one simulator type with another, we should focus on the most effective methods in the best mix for training.

GABA DM. Improving anesthesiologists’ performance by simulating reality [editorial]. Anesthesiology 1992;76:491–4.

Gaba starts out by discussing a screen-based Simulator study by Schwid. Schwid discovered that residents made errors.

Although Gaba never draws the analogy between the simulation and the aforementioned canaries in the mineshaft, we can see how they fulfill this crucial function. If deadly methane gas had seeped out of the coal deposits, the canaries would suffer a severe case of death, alerting miners to the danger. Maybe simulators should be our “canaries.” Instead of waiting for a methane explosion in the mine (a patient catastrophe in the operating room), we should see how the canary’s doing (run residents through the Simulator and uncover their weaknesses).

Usually, we analyze cases retrospectively, after disaster has befallen. This analysis is clouded by incomplete records, failed memories, and, who knows, perhaps a little defensiveness? “I have no idea what went wrong!” So, looking at stuff after the fact isn’t too good.

We could videotape cases as they occur and, in effect, see disasters during the fact. Only problem with that is that most of the time nothing happens. We’d be looking at millions of hours of people sitting on a chair. It would be like watching the medical equivalent of C-SPAN. We might save a few patients that way, but we’ll kill scores of people with boredom. So, looking at stuff during the fact is no good.

How about looking at stuff before the fact? Time travel. Back to the Future instead of C-SPAN. Only the Simulator can provide that kind of time travel. “It stands to reason” that the Simulator is a good idea. You don’t have to wait until a patient is hurt (the retrospective way); you don’t have to wade through miles of stultifying tape (the real-time way); you can “create the problems” without patient harm. You do it ahead of time (the prospective way).

Gaba also reviewed the limits of Simulators: despite their sophistication, they can never create a patient with all of the inherent variables seen in clinical medicine—but so long as they are "reasonable" representations of real patients, they can be considered valid by experienced anesthesiologists.

Another limitation is that the trainee is never convinced the simulation is 100% real—leading to hypervigilance in which the poor resident is always worried that something bad is going to happen. This would be okay, except that many errors may result, in reality, from the very boredom and fatigue that occur in real practice. At the other end of the spectrum are the smart alecks who believe that they can do whatever they want because no real patient is at risk.

However, the same is true in other industries, and they have made successful use of simulation. In medicine, the validation of simulation will be even more difficult than in aviation because no two patients are alike (unlike 747s), and the effects of training should be measured over years of training and remediation, not after a single training session. Gaba summarized his editorial by making the important point: "No industry in which human lives depend on the skilled performance of responsible operators has waited for unequivocal proof of the benefits of simulation before embracing it." I say we embrace it too.

✓ Gaba DM. The future vision of simulation in health care. Qual Saf Health Care 2004;13(Suppl 1):i2–10.

In this article, Gaba shows why he is the maven of high-fidelity simulation in health care. He describes a comprehensive framework for future applications of simulations as the key enabling technique for a revolution in health care—one that optimizes safety, quality, and efficiency.

Gaba begins by making the case that simulation addresses current deficiencies of the health care system.

To address these problems, Gaba proposes that Simulators must be integrated into the fabric of health care delivery at all levels, which is much more complex than piling it on top of the existing system. To do so, he outlines 11 dimensions (and gradients within each) that can take us there. Next, Gaba outlines the various social entities, driving forces, and implementation mechanisms that could forward the agenda of simulation in medicine. Finally, he paints two possible scenarios (he has had lots of practice at developing scenarios) for the fate of simulation in health care.

Optimistic scenario

Pessimistic scenario

Although we certainly take the optimistic view, we know it stands to reason that Simulators will have a significant future in medical training because of the dedication and hard work of individuals who will ensure that it happens.

HELMREICH RL, DAVIES JM. Anaesthetic simulation and lessons to be learned from aviation [editorial]. Can J Anaesth 1997;44:907–12.

This editorial points out that simulators have a lot of potential for serving as tests. All the usual arguments hold—you don't put a patient at risk, and you can reproduce the scene. But this editorial goes on to point out a crucial problem with using a Simulator as a "test vehicle." A key problem is the idea of "equifinality"—that is, different techniques can give you the same end result. (The article does not mention the following example; we made it up just to illustrate the point.) One anesthesiologist may use epinephrine to achieve a goal, whereas another may use dobutamine. Both achieve the same goal—better cardiac output. So, in the Simulator, what do you do? Grade someone wrong who uses epinephrine because the "simulator grade book" says you should use dobutamine? The editorial finishes by saying "there is a need to provide opportunities for practice and assessment until the culture supports the fairness of the assessment process." In other words, it "stands to reason" that a Simulator is a good way to test, but we haven't quite gotten there yet.
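If you like seeing the problem concretely, here is a minimal sketch of one workaround: score the physiologic goal achieved rather than the specific path taken. The drug names come straight from our made-up epinephrine/dobutamine example above, so treat this as an illustration, not a validated scoring scheme:

```python
# Sketch: grade on the goal achieved, not the drug chosen—one possible
# answer to the "equifinality" problem described above. The mapping is
# illustrative only, built from the chapter's own made-up example.
ACCEPTABLE = {
    "improve cardiac output": {"epinephrine", "dobutamine"},
}

def grade(goal: str, intervention: str) -> str:
    """Give credit for any intervention that reaches the stated goal."""
    ok = intervention in ACCEPTABLE.get(goal, set())
    return "credit" if ok else "no credit"

print(grade("improve cardiac output", "epinephrine"))    # credit
print(grade("improve cardiac output", "dobutamine"))     # credit
print(grade("improve cardiac output", "normal saline"))  # no credit
```

A checklist built this way stops penalizing equally valid techniques, which is exactly the fairness issue the editorial raises.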

MURRAY WB, SCHNEIDER AJ, ROBBINS R. The first three days of residency: an efficient introduction to clinical medicine. Acad Med 1998;73:595–6.

Dr. Murray and the fine folks at Penn State (you can almost smell the chocolate from the Hershey factory) describe the first 3 days of their anesthesia residency. Rather than just shoveling a ton of stuff at their residents, they make the learning more active, using (what else) the Simulator. Result—a questionnaire showed “improvement in the residents’ confidence in their ability to carry out clinical tasks.”

So, it “stands to reason” that if a Simulator increases the confidence of a resident, a Simulator must be a good thing. A hard-nosed scientific drudge could look at this and say, “This is not rigorous proof.” A skeptic could look at it and say, “So what, what difference does that make, a little more confidence?” But I’ll bet that to those Penn State residents the added confidence made all the difference in the world when they walked into the OR the first day.

MURRAY DJ. Clinical simulation: technical novelty or innovation in education [editorial]. Anesthesiology 1998;89:1–2.

Dr. Murray is the big cheese in Simulation at Washington University in St. Louis. This is a “do we really need Simulators?” editorial. What did we do in the “B.S. (before simulator) era”? We did residency and did a lot of cases with supervision. We did lectures, one-on-ones with attendings. But why use the past tense? That’s what we are doing right now!

So, do we need to throw Simulators into the mix? Yes. You can use Simulators to teach.

Murray goes on to say that a lot of different groups need to work in the Simulator. Anesthesiologists alone can’t keep the thing humming all the time. A Simulator is a Lamborghini—you bought it, now drive it! Don’t let it sit in the garage all day collecting dust. Get that thing on the road.

ISSENBERG SB, MCGAGHIE WC, HART IR. Simulation technology for health care professional skills training and assessment. JAMA 1999;282:861–6.

Dr. Issenberg, who is one of the authors of this book, oversees the development of "Harvey," the Cardiology Patient Simulator, at the University of Miami. In this Special Communication, Issenberg et al. touch on all the simulation technologies that were available in 1999: laparoscopy simulators to train surgeons, their own mannequin Harvey to teach students about 27 cardiac conditions, flat-screen computer simulators, and finally anesthesia simulators.

What does Dr. Issenberg have to say about the anesthesia simulators? “The high cost and requirements for accompanying equipment, space, and personnel have resulted in research to justify the installation of such devices.” (Hence so many “justification of simulators” articles in this bibliography.) If you look at “intermediate” benefits of simulators, Issenberg points out the following.

So, as study after study comes out hinting that simulators can make us better practitioners, do we have to wait for proof positive? No.

GORDON JA, WILKERSON WM, SHAFFER DW, ARMSTRONG EG. “Practicing” medicine without risk: students’ and educators’ responses to high-fidelity patient simulation. Acad Med 2001;76:469–72.

This is a "feel good" qualitative paper about simulators, pure and simple. Altogether, 27 clinical medical students and clerks and 33 educators went through the Simulator and were asked how they felt about it. The medical students were instructed to evaluate and treat two patients: (1) a trauma patient with hypovolemic shock and a tension pneumothorax and (2) a cardiac patient with marginally stable ventricular tachycardia. The educators, on the other hand, were instructed to care for a patient with anaphylaxis. All participants were debriefed in a case discussion afterward and then completed several evaluations to determine whether they liked the experience.

To get back to the “theme” of this group of articles—It “stands to reason” that an educational method that everyone likes should be an educational method we should use. Everyone likes Simulators. Even better than the statistics (85% loving the Simulator) were the “raw comments” that hammer home just how cool Simulators are.

“I think everyone could benefit from this.” “Every medical student should have the opportunity to learn using this Simulator several times each year.”

How can you argue with that?

This study also demonstrates the benefit of relatively small sample sizes—you can collect more qualitative data so you know not only what they liked but, more importantly, why they liked it.

✓ Gordon JA, Oriol NE, Cooper JB. Bringing good teaching cases “to life”: a simulator-based medical education service. Acad Med 2004;79:23–7.

Based on their successful pilot studies of positive learner reactions to simulation-based education, Dr. Gordon and his colleagues set out to develop a comprehensive on-campus simulation program at Harvard Medical School. They provide a descriptive case study of how to develop a simulator program in an undergraduate medical curriculum. And when the folks at Harvard give free advice—we listen.

The authors outline several initial steps that are critical to get a simulation program off the ground and make sure it lasts.

The authors provide practical tips on integrating simulation into the existing medical school curriculum by using existing material rather than "reinventing the wheel." Students in every year of medical school can have meaningful education and training using simulation—you don't need to restrict this to juniors and seniors in medical school.

However, what separates this program from all others is the development and implementation of a “medical education service” dedicated to providing “education on demand” for any student who wants to use the Simulators. Faculty members and residents provide the instruction so students can use whatever “down time” they have to hone their skills.

This has become very successful, as evidenced by a group of 15 graduating students who wrote to the dean, “the Simulator stands out as the most important educational adventure at Harvard Medical School.”

What can be better than that?

GREENBERG R, LOYD G, WESLEY G. Integrated simulation experiences to enhance clinical education. Med Educ 2002;36:1109–10.

Dr. Greenberg and her faithful minions from the University of Louisville Patient Simulation Center at the Alumni Center for Medical Education (see? what did we tell you about the importance of having an impressive name for your simulation center) combined a high-fidelity Simulator with a standardized patient. The ultimate simulatory experience—first you talk with an actor pretending to have a condition, then you go to the Simulator as if the actor has now “become” the mannequin. Great idea!

First, students meet a patient (SP—standardized patient, the actor) about to have an appendectomy. Next, the student follows the patient into the OR and participates in anesthetizing the patient (Simulator) throughout the procedure. Then the student returns to the waiting room to discuss the procedure with the patient’s spouse (SP). Finally, the student examines the patient (SP) 2 weeks later when she presents with a fever. Whew! Faculty like exploring new clinical teaching and testing methods, and the students are more engaged in their education.

This is an educational twist that, it "stands to reason," is a great way to teach. You combine the best of both worlds and give the student a hell of an experience.

EPSTEIN RM, HUNDERT EM. Defining and assessing professional competence. JAMA 2002;287:226–35.

When you think of “medical science” you think of hard data: blood levels of propofol, heart rates that say “beta-blockade achieved,” or gastric emptying times. And even in the “softer” realm of medical education, you still look for “hard data”: test scores, percentage pass rate of a program, and (in our Simulator world) checklists.

This JAMA article takes us even farther into the “soft.” What is competence? How do you assess it? Look at their definition of competence and ask yourself, “Just how could I assess competence?” and, not to be too cagey about it, “Could I use the Simulator to assess competence?”

Competence is “the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community.”

OK, genius, just how in blue blazes do you assess that? (For our nefarious purposes, can a couple of hours in the Simulator fill that tall order?) JAMA tells us that the three biggies for assessing competence are:

Note: Simulators are not mentioned. The million dollar question—Should Simulators be included?

OK, our goal is to assess competence, and we currently have three ways of doing it. Are they any good? (By extension, does a budding Simulationologist see any defects in the current system that the Simulator could fill?)

So here we have the current three methods of assessing competence. Look again at the definition of competence and ask yourself if any of these three really hit the nail on the head. Competence is “the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community.”

Does an attending physician's evaluation of a resident assess "the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community"? Not really.

Does a multiple choice exam assess "the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community"? Not really.

Does a standardized patient assessment evaluate "the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community"? Um, closer. I think.

This whole world is murky and quasi-scientific. Go ahead, try to make a bold and sure statement about assessing competence. “The best method for assessing competence is the standardized patient assessment!” Someone asks you, “Prove it.” You say, uh, you say … what do you say?

So wouldn't it be great if the JAMA article then said, "So the current methods of assessing competence aren't any good. But putting people through the Simulator fits the bill perfectly!" Well, they didn't. Too bad. But they did say that we need to develop innovative ways to assess professional competence. And, who are we kidding, that is exactly what we're trying to do with our Simulators.

DILLON GF, BOULET JR, HAWKINS RE, SWANSON DB. Simulations in the United States Medical Licensing Examination (USMLE). Qual Saf Health Care 2004;13(Suppl 1):i41–5.

This is the article we have been waiting for—the people in charge of providing the assessment requirement for a medical license in the United States predicting the inevitable use of simulators in high-stakes examinations.

They provide a current description of the US medical licensing system and explain how each of its examinations uses some form of simulation.

There—the folks in charge of testing and therefore education and training (testing drives learning) have just stated what we knew all along. Want to go for a ride?

✓ Seropian MA. General concepts in full scale simulation: getting started. Anesth Analg 2003;97:1695–705.

This article is cited later in this book, where we mention, “If you are thinking of starting a simulation center, and you’re looking for a good ‘how-to’ article, this is the one.” Dr. Seropian pays most attention to the person running the Simulator, not so much the Simulator mannequin itself. It’s the live component in the Simulator that makes it happen, so Seropian emphasizes the need to “train the trainer,” especially in the delicate art of debriefing.

OHRN MAK, VAN OOSTROM JH, VAN MEURS WL. A comparison of traditional textbook and interactive computer learning of neuromuscular block. Anesth Analg 1997;84:657–61.

This didn't test a high-fidelity mannequin; rather, it was a test of a flat-screen Simulator (a majorly cool video game, in effect, teaching neuromuscular blockade). Does this have any relevance to a simulator center? Yes indeedy. Any "full service" simulator center would have not just mannequins but all kinds of learning gizmos, including flat-screen simulators. It "stands to reason" that we should use all manner of simulation in a simulation center. So, OK, great, does this neuromuscular video game do the trick? Yes.

A group of 23 residents was divided up: half were taught with textbooks (the same technology used since the Epic of Gilgamesh 5000 years ago), and half were taught with these flat-screen computer Simulators (the new technology used since the Epic of Bill Gates just 20 or so years ago). Result: the computers taught better, as measured by an exam. Fringe benefit: the residents liked the computer experience more than the textbook one.

You see this again and again and again. No matter what the study, no matter what the technique or result, one thing comes through loud and clear. People like this way of learning. If that alone served as justification, there’d be Simulators on every street corner from Miami to Juneau.

BERKENSTADT H, ZIV A, BARSUK D, LEVINE I, COHEN A, VARDI A. The use of advanced simulation in the training of anesthesiologists to treat chemical warfare casualties. Anesth Analg 2003;96:1739–42.

Our colleagues in Israel identified another use of simulation training—preparing anesthesiologists to respond to a weapons-of-mass-destruction attack, in this case chemical weapons. Since the early 1990s, they have used a curriculum that included lectures, hands-on training with simulated patients undergoing decontamination, and simulated treatment delivered while medical personnel were in full protective gear. However, they acknowledge that these courses focused on the logistics of the scenarios and were deficient in providing opportunities for medical personnel to exercise and practice clinical procedures—here comes the use of advanced Simulators to provide opportunities to practice responding to chemical attacks. The study included 25 medical personnel divided into multidisciplinary teams of anesthesiologists and intensive care and postanesthesia care nurses. The catch—all trainees had to be in full protective gear, including gas mask, chemical protective gloves, and a multilayered overgarment!

The tasks included the following.

Outcome measures included checklists for performance assessment (coordination and communication among team members, leadership in clinical decision making and prioritization) and feedback. They were validated by the input of several experts in anesthesia, intensive care, and trauma management. In addition, there were experts in relevant medical fields from such diverse areas as the Israeli Defense Forces Medical Corps NBC Branch and the National Health Authorities. Participants also completed a postcourse questionnaire gauging their perception of several aspects of the course.

They learned that the medical personnel could actually function with the gas mask, although it did interfere with communication within the medical teams. The chemical protective gloves were found to be the limiting factor in the performance of medical tasks. All 25 participants gave favorable ratings to the course. The authors acknowledge that limitations included the lack of pre- and posttesting tools and the absence of quantitative performance evaluations.

This study is important because it demonstrates how existing training and assessment methods can be used to address new needs (response to acts of terrorism) and can be implemented on a national scale. It also highlights the importance of involving all stakeholders in the process of developing outcome measures based on the curriculum. Finally, the study identified two independent variables that affected performance (gas mask—communication; gloves—clinical procedures). This has important implications regarding the assumptions of how prepared medical personnel are.

BERKENSTADT H, GAFNI N. Incorporating Simulation-Based Objective Structured Clinical Examination (OSCE) into the Israeli National Board Examination in Anesthesiology. Anesth Analg 2006;102:853–8.

No kidding, simulation as an assessment tool has arrived. In Israel, the OSCE, using simulator technology, “has gradually progressed from being a minor part of the oral board examination to a prerequisite component of the test.”

In Israel, they asked the question, “What should our anesthesia people know before we certify them?” The answers are as follows.

So, because that’s what residents need to know, that’s what the Israeli board set out to test. They create scenarios for each of these areas, put the examinees in the Simulator, videotape and grade their performance, and accredit those who perform well. During the past 2 years, with 104 candidates, the Israeli board used simulation technology as part of their assessment. Most examinees found the exam reasonable to difficult, and most preferred it to the standard oral examinations.

Is Israel the only place doing this?

If anyone is still wondering whether the Simulator is coming, we’ve got news for you. It’s already here.

BOND WF, DEITRICK LM, ARNOLD DC, KOSTENBADER M, BARR GC, KIMMEL SR, ET AL. Using simulation to instruct emergency medicine residents in cognitive forcing strategies. Acad Med 2004;79:438–46.

Emergency medicine residents are at high risk of making thinking errors because of multiple factors, including high decision density, high levels of diagnostic uncertainty, and high patient acuity, all while having to deal with a large number of distractions. The way to reduce these errors is to instruct clinicians to develop strategies for facing these situations; this is called metacognition. The problem is that one's ability to handle hypotension due to cardiac arrest does not translate to one's ability to manage hypotension that results from septic shock. In other words, your problem-solving ability is disease- (context-) specific. However, the authors point out, "if the resident does not see enough of certain critical problems, he or she may be left with incomplete training."

In an elegant qualitative study (which includes an appendix with the survey instrument), the authors put 15 second- and third-year emergency medicine residents through a complex case they thought would be mismanaged, because it stands to reason that learners pay more attention to a case in which they made mistakes than to one they performed flawlessly. The patient was a 67-year-old woman with renal failure on dialysis who presented to the emergency department with shortness of breath. The case is embedded with "error traps"—for instance, the decision to intubate and use succinylcholine without confirming whether the patient is on dialysis, as evidenced by the shunt on her arm. This leads to worsening hyperkalemia and cardiac arrest.

Residents were debriefed on issues such as omission errors and faulty hypothesis generation, given the option to review the videotape of their case, and asked a series of questions related to their experience. Third-year residents appeared to appreciate the global thinking strategies, whereas second-year residents focused more on concrete issues (knowledge gained about succinylcholine). Most residents commented positively on the opportunity to make errors without injuring patients. So when teaching your residents cognitive forcing strategies—and you need lots of patients to do so—the residents appear to learn from their mistakes. You can't have residents making mistakes on real patients, so it stands to reason that you should use Simulators.

CLEAVE-HOGG D, MORGAN PJ. Experiential learning in an anaesthesia simulation centre: analysis of students' comments. Med Teach 2002;24:23–6.

We learn better when we are doing rather than watching or being told something. There is just no better substitute than hands-on experiences. However, Cleave-Hogg and Morgan, at the University of Toronto, pointed out that this is a problem, especially for medical students because:

It stands to reason that a method that could address these limitations would offer that all-important hands-on experience. Each of the participating 145 fourth-year students was allowed to work through one short case as part of his or her curriculum. The authors asked the fourth-year students how they felt about their use of Simulators as a learning tool. They had a 100% return rate on the questionnaires (they must have offered pizza). The comments contrast with the Bond study in that most students (88%) valued the cognitive issues over the technical skills (10%) learned. Students in general preferred one-to-one feedback to group feedback. Most importantly, "they were involved in learning without fear of harming a patient."

COOPER JB, BARRON D, BLUM R, DAVISON JK, FEINSTEIN D, HALASZ J, ET AL. Video teleconferencing with realistic simulation for medical education. J Clin Anesth 2000;12:256–61.

If Simulators are good training tools for individuals and small groups of learners, what about large groups? If we add video teleconferencing, it stands to reason we can reach a much broader audience, including places without the facilities and resources for these costly tools. Cooper and colleagues explored the feasibility and success of conducting long-distance clinical case discussions with realistic re-enactments of anesthesia critical events. They set up the equipment to allow two-way audio and visual feedback between the simulation suite and the audience. (Details of the technology setup are fully described in the article's Appendix 1.)

The audience (which ranged from 50 to 150 people) was initially given information regarding the case by a real "patient" and family; after a short break, the action moved to the OR, where the Simulator was in place. Participants on both sides were allowed to ask questions and make comments regarding the case. In fact, when the patient's condition deteriorated, participants were allowed to make suggestions regarding the patient's management. Participants were generally enthusiastic about this approach, with 97% rating the educational value of the session highly. Challenges with the study: a few participants questioned the cost, and the authors noted the many technical issues that always need to be monitored.

Although not directly studied, the authors believed that the teleconferenced training sessions could enhance the traditional mode of case-based clinical education, and they do acknowledge the “entertainment value of the program.” There is nothing wrong with being entertained while learning.

SCHWID HA. Anesthesia simulators—technology and applications. Isr Med Assoc J 2000;2:949–53.

Poor Howard. Here he is a full professor, a major element in the Simulator world, and this article in the IMAJ misspelled his name at the bottom of every other page in this article. Go figure. It’s hard to get the respect you deserve. Professor Schwid’s name appears again and again in simulation articles, so keep your eye out for his excellent work from the University of Washington.

This is a review article that lays out all the various kinds of technology available for simulation teaching. Screen-based simulation is, in effect, a high-tech video game where you can study uptake of anesthetic vapors, snake your way through the oxygen flow in an anesthesiology machine, try your hand at neuromuscular blockade pharmacology, or run codes. Mannequin-based simulators win rave reviews from residents (which jibes with my experience—Author), and the hunt is on to “prove the effectiveness of simulators.”

EAVES RH, FLAGG AJ. The U.S. Air Force pilot simulated medical unit: a teaching strategy with multiple applications. J Nurs Educ 2001;40:110–5.

If you can train a learner to manage a single "patient" using a single Simulator, it stands to reason you can train a provider to manage a unit of patients using many Simulators simultaneously. Who has to do this? Nurses, of course!

In this descriptive article, Majors Eaves and Flagg of the U.S. Air Force describe the design, development, and implementation of a Simulated Medical Unit (SMU) consisting of 11 patients—nine medium-fidelity simulators and two live actors. They point out that recent changes in Department of Defense hospitals have resulted in significant downsizing, with far fewer patients, making it difficult to find clinical experiences for learning skills.

The authors set up a medical ward consisting of patients with:

To enhance the realism, nurses were provided expectations of their behavior.

Five nurses spent 3 weeks in the SMU, with progressive responsibilities over time.

The authors pointed out that this allowed them to see not only a variety of conditions but a variety of presentations of the same condition. It also allowed:

The nurses were unanimous regarding their increased ability to perform in a busy inpatient unit. Although this was not formally evaluated, when the nurses first took care of real patients, their preceptors were "amazed" by their ability: their orientation time was cut in half, and they were much more independent than the typical new nurse.

The authors correctly point out the high cost of their exercise (estimated at $1,548,600) and note that few organizations would have the resources to develop this type of learning. However, for large organizations that have to train large numbers of personnel in relatively brief periods of time, the "potential costs savings … are significant if documentation improves and litigation decreases." What else could an organization want?

HAMMAN WR. The complexity of team training: what we have learned from aviation and its application to medicine. Qual Saf Health Care 2004;13(Suppl 1):i72–9.

We read all the time that the promise of simulation in health care is based in large part on its positive effect in the field of aviation. We cannot imagine a pilot flying a large passenger jet without hours of simulator training and retraining. Aside from the technical marvels of modern flight Simulators, what can we learn from the aviation field about how we train providers to make a safer system with fewer medical errors?

In this article, Hamman drew on his vast experience as an aviation training expert to provide a blueprint of what we can do in medicine to match the aviation industry. First, he notes that most errors in medicine, as in aviation, result from a breakdown in the team or system rather than in an individual. Until the late 1970s, aviation training focused on a pilot’s individual skills. In 1978, NASA published its research on the causes of commercial air accidents and concluded that “the majority of disasters resulted not from pilot’s lack of technical skill or mechanical failure, but from error associated with breakdowns in communication, leadership, and teamwork.” Hamman illustrates this point with two examples.

Events such as these led to the obvious conclusion that the way pilots and crews had been trained for the previous four decades would no longer suffice in the modern era. Reports such as the Institute of Medicine’s To Err Is Human have highlighted that the way physicians, nurses, and techs have been trained over the last 100 years is entirely inadequate for today’s complex health care system.

So what can we learn from aviation?

Read this article in full – you will have a clearer picture of where we need to go in medical simulation. Hamman tells us that it will not be easy and will require “much work” but that medicine “should no longer wait.” We agree, Captain Hamman.

HOLZMAN RS, COOPER JB, GABA DM, PHILIP JH, SMALL SD, FEINSTEIN D. Anesthesia crisis resource management: real-life simulation training in operating room crises. J Clin Anesth 1995;7:675–87.

Can a successful simulation program developed at Stanford be transferred across the United States to Boston and be just as successful? This article describes the first adoption of Anesthesia Crisis Resource Management (ACRM) outside Stanford and the Kingdom of Gaba. This is important because it demonstrated the feasibility of transferring simulation training. Once people saw that it could be done in Boston, they started to say, “We can do this too.”

Holzman, Cooper, and their Boston colleagues collaborated with Gaba to set up an analogous simulation program, including a Simulator, a mock OR suite, actors, and evaluators. They enrolled 68 anesthesiologists of varying levels of experience and 4 nurse anesthetists in ACRM training and evaluated their perceptions of the experience. As expected, the overall response was very positive, with more junior attendings rating the course higher than senior attendings. Participants also thought that the course should be repeated periodically. Senior attendings rated their own performance significantly higher than more junior anesthesiologists did.

A 6-month follow-up questionnaire returned by 33 respondents revealed that 8 had been involved in a critical incident since the course and thought the training had prepared them to handle these critical events more effectively. The authors acknowledge that the study did not include a control group and that an adequate controlled evaluation of participants would be difficult, time-consuming, faculty-intensive, and expensive, and would require multiple institutions to develop a national standard. That may be true, but in the process they proved that a novel idea borrowed from aviation could be applied to medicine at more than one institution – and it is now routinely done at hundreds of institutions worldwide.

KURREK MM, FISH KJ. Anaesthesia crisis resource management training: an intimidating concept, a rewarding experience. Can J Anaesth 1996;43:430–4.

This early report from the University of Toronto describes the initial acceptance of Anesthesia Crisis Resource Management (ACRM). The authors sought the opinions of two groups of practitioners: those who likely had never trained on Simulators and those who had participated in ACRM workshops at the University of Toronto.

They sent 150 survey questionnaires to a mixture of community and academic anesthesiologists and residents in training. They received back 59 – a response rate of 39%, less than half the minimum response rate of 80% generally considered necessary to avoid bias in the results.

How did this group feel about simulation? They strongly supported purchasing a Simulator and training residents and faculty with it, were willing to spend unpaid time in the Simulator, and thought it highly relevant to anesthesia training. These responses did not vary much between staff and residents. Both groups, however, anticipated considerable anxiety if trained in a Simulator and did not favor the compulsory use of simulation for recertification.

The authors also sent a survey questionnaire to 36 previous participants in ACRM workshops – 35 were returned (a 97% response rate – excellent). The participants enjoyed all aspects of the course and thought it would benefit anesthesiologists for initial, advanced, and refresher training. They generally thought the course should be taken every 1.5 years.

The authors commented on the anxiety perceived by the larger group of simulation-inexperienced anesthesiologists – particularly the fear that Simulators would be used for evaluation – as a potential barrier to their adoption. It is unfortunate that the authors stated that the evaluation aspect of simulation should be minimized and surmised that “issues of validation and expense make it unlikely that the use of anesthesia simulators will be a viable option for re-certification.” What? That is probably what pilots first said about flight Simulators.

It stands to reason that all health care providers should feel anxiety when they are going to be tested. How many students make themselves sick with worry and panic over multiple-choice exams? Perhaps the anesthesiologists realized that for the first time in their careers someone was actually going to watch their performance – we would all be anxious – but that is not a reason not to do it.

HALAMEK LP, KAEGI DM, GABA DM, SOWB YA, SMITH BC, SMITH BE, ET AL. Time for a new paradigm in pediatric medical education: teaching neonatal resuscitation in a simulated delivery room environment. Pediatrics 2000;106:E45.

Anesthesia is not the only high-risk, dynamic, stressful area of medicine – how about neonatal resuscitation! Alien fetal and neonatal physiology, tiny anatomy for endotracheal intubation, umbilical vessel catheterization – decisions made by the pediatrician carry lifelong consequences for both patients, mother and infant. Unlike the anesthesiologist, the pediatrician does not have the benefit of a sedated, well-monitored patient but must rely on auditory cues such as “crying” (there is no crying under anesthesia), breath and heart sounds, visual cues such as muscle tone and skin color (under anesthesia the patient is draped), and information from the obstetrician, nurse, mother, father, and grandparents, among others. The authors make the case that if Simulators are good for other high-risk industries (aviation) and for anesthesia, they make good sense for neonatal medicine – they are right!

Halamek and his colleagues at Stanford developed a course, “NeoSim,” that integrates traditional instruction (textbooks, lectures, on-the-job training) with technical and behavioral skills training in a simulated environment. They developed several delivery room crises that combined patient problems (meconium aspiration, prenatal depression, hemorrhage, congenital anomalies) with equipment failure and stressful interactions with other delivery room team members. At the time of the study, 38 physicians and nurses had completed the program and overwhelmingly valued the experience. They especially liked the realistic scenarios, the feedback debriefings, and the faculty. Even though many thought the Simulator could have been more realistic, they nonetheless thought that the entire experience effectively recreated real-life situations that tested their technical and behavioral skills.

That is the important message – good simulation is not about the technology and all the fancy gadgets. It is about how the technology is used by the right people – those dedicated to education and training.

REZNEK M, SMITH-COGGINS R, HOWARD S, KIRAN K, HARTER P, SOWB Y, ET AL. Emergency medicine crisis resource management (EMCRM): pilot study of a simulation-based crisis management course for emergency medicine. Acad Emerg Med 2003; 10:386–9.

If CRM works for anesthesia, why not for emergency medicine? Emergency departments are complex, dynamic working environments in which crises can rapidly develop. Reznek and several colleagues at Stanford (where else?) developed the Emergency Medicine Crisis Resource Management (EMCRM) course and evaluated participants’ perceptions of their training.

The course was modeled after the ACRM textbook (Crisis Management in Anesthesiology. New York: Churchill Livingstone; 1994). As with previous iterations of CRM courses in other disciplines, the participants – 13 emergency medicine residents – gave very positive ratings of the course, of the skills they gained from it, and of its suitability for initial and refresher training.

This study did not reveal anything new. It simply demonstrated that what was once the domain of anesthesia is now being adopted in all high-risk fields of medicine – way to go!

GABA DM, HOWARD SK, FISH KJ, SMITH BE, SOWB YA. Simulation-based training in anesthesia crisis resource management (ACRM): a decade of experience. Simulation Gaming 2001;32:175–93.

In this review, Gaba and his colleagues provide a 10-year perspective on the development, successful implementation, growth, evaluation, and ongoing challenges of training health care providers to work as a crew within a larger team. The authors outline the needs the course was designed to meet in the late 1980s and early 1990s – deficiencies in the training of anesthesiologists centered on several critical aspects of decision making and crisis management. The team then used aviation training as a model to design and develop the ACRM course, which trains not only crews within the same discipline but also interdisciplinary teams.

The article details the highlights of this successful curriculum – which in large part has been the driving force for the use of high-fidelity simulation – as well as the ongoing challenges ACRM still faces.

The main message: although it stands to reason that Simulators work, it stands to reason even more when the Simulator is guided by a well developed curriculum rather than by its technical gadgets.

MORGAN PJ, CLEAVE-HOGG D. A worldwide survey of the use of simulation in anesthesia. Can J Anaesth 2002;49:659–62.

I wonder how they use simulation in Amsterdam or Singapore. Do they face the same challenges of obtaining funding and finding the time to do research? How do they balance their clinical responsibilities with their educational duties? A highly effective way to find out is to survey as many centers using high-fidelity Simulators as possible and evaluate the results. This is exactly what Drs. Morgan and Cleave-Hogg did.

They searched the WWW and two large databases of simulation centers (maintained by the University of Rochester and the Bristol Medical Simulation Center) to identify 158 simulation centers worldwide. They sent a 67-item survey (available at: www.cja.jca.org) designed to capture information regarding the use of Simulators for education, evaluation, and research. They received 60 responses, for a rate of 38% (even after a second mailing) – too low to avoid significant biases in the results. Phone calls to the center directors would have dramatically increased the response rate (as has been demonstrated in numerous educational studies).

The authors reported primarily quantitative data from the survey.

The authors provided a “snapshot” – taken in 2001 – of the fewer than half of identified centers that returned the survey. Simulation centers now number several hundred. But what did we learn from this survey? That most centers use Simulators for similar reasons, and most face similar challenges. We are more interested in the centers that were outliers. What distinguishes the 15% of centers that use simulation for assessment – how do they do it? What about the simulation centers that do not rely on university or department funding? How did the small number of centers obtain government funding?

We provide a case example to illustrate why these questions are important. The University of Miami’s Michael S. Gordon Center for Research in Medical Education has been involved in simulation training, assessment, and research for 40 years. In all this time, the Center has received minimal funding from the university or any department; instead, it has raised more than $120 million over the past four decades from federal, state, and local government sources, national and private foundations, and generous individuals. This Center did not receive the survey but could have offered significant advice from its many successes and few failures. Other centers were likely missed as well.

The important message is that when you conduct survey studies you do not learn as much if you limit your search to centers that mirror your own program. Look for the outliers, the vanguards – there are valuable lessons out there!

OWEN H, PLUMMER JL. Improving learning of a clinical skill: the first year’s experience of teaching endotracheal intubation in a clinical simulation facility. Med Educ 2002;36:635–42.

Sometimes “less” is more, and more of “less” is even better. Drs. Owen and Plummer from Flinders University in Adelaide, Australia, point out that endotracheal intubation is a fundamental part of airway management, and airway management “is the scaffolding upon which the whole practice of anaesthesia is built.” The authors contend that we should not wait until a postgraduate or residency program to hone these skills in learners – it stands to reason these skills can be developed in the undergraduate curriculum.

This article and the training it describes are unique in two respects: the training begins at the undergraduate level, and it employs multiple Simulators.

To address the first issue, Owen and Plummer designed a very practical and straightforward approach to training students in endotracheal intubation. Take a look at their Figure 1 – a nice flowchart that outlines the components of the curriculum.

To address the second issue – multiple Simulators – the authors recognized that even though human patient Simulators can simulate different airways, it is a waste of valuable resources to have novice students use a full-body Simulator for single tasks. Instead, they identified and used 13 different adult airway trainers in their curriculum to provide the variation critical for learning skills.

Theirs is a good example of maximizing all of a simulation center’s resources with an approach that ensures basic skills in medical students so they are better prepared for postgraduate training. All of you residency directors should be happy with this!

NURSING EDUCATION

It stands to reason that if Simulators offer so much potential to the physician disciplines of anesthesia, critical care, and surgery, they are just as valuable in nursing education. If one of the primary focuses of medical simulation is interdisciplinary team training, each professional field needs to know what the others are doing.

Enter Drs. Nehring and Lashley from Rutgers, The State University of New Jersey. Together and with their colleagues, they have written several articles on the use of human patient simulation in nursing education. I list them here so you have easy access.

NEHRING WM, ELLIS WE, LASHLEY FR. Human patient simulators in nursing education: an overview. Simulation Gaming 2001;32:194–204.

This is a well written review of how human patient simulators are used in nursing education. It draws on several examples from the anesthesia field, reviewing the educational, evaluation, and research aspects of using simulation in nursing education.

NEHRING WM, LASHLEY FR, ELLIS WE. Critical incident nursing management using human patient simulators. Nurs Educ Perspect 2002;23:128–32.

The authors describe a unique course, “Critical Incident Nursing Management” (CINM) – a derivative of the anesthesia crisis resource management course designed by Gaba. CINM is a competency-based method of nursing instruction in which nursing care is taught in the context of critical health incidents (e.g., dyspnea in an asthmatic patient).

NEHRING WM, LASHLEY FR. Use of the human patient simulator in nursing education. Annu Rev Nurs Educ 2004;2:163–81.

This is another well written review summarizing the many uses of human patient simulators in nursing education and the authors’ personal experience over the past 5 years.

NEHRING WM, LASHLEY FR. Current use and opinions regarding human patient simulators in nursing education: an international survey. Nurs Educ Perspect 2004;25:144–8.

The authors acknowledge the scant literature on human patient simulators (HPSs) in nursing education. As a result, they surveyed all nursing training programs that had obtained a METI HPS prior to January 2002. They sent out more than 215 surveys and received 40 responses (less than a 20% response rate). The survey covered demographic data, curricular content of HPS use, evaluation of competence, continuing education, and other uses.

What did they learn? The HPS is used in more courses, and more often, in community colleges than in university or simulation center settings. The Simulator was used most often to teach diagnostic skills and critical events. All but three schools reported that faculty were very receptive to the use of Simulators in their curricula. Why were the others not receptive?

The authors acknowledge the high cost of simulation as a limiting factor to its growth in nursing education, and they stress that nurses must be fully included in simulation-based training.

We could not agree more – nurses have always played critical roles in patient care, and without their full inclusion in simulation-based training we all will suffer!

Additional Articles on Nursing

Fletcher JL. AANA journal course: update for nurse anesthetists—anesthesia simulation: a tool for learning and research. AANA J. 1995;63:61-67.

Fletcher JL. AANA journal course: update for nurse anesthetists—ERR WATCH: anesthesia crisis resource management from the nurse anesthetist’s perspective. AANA J. 1998;66:595-602.

Henrichs B, Rule A, Grady M, Ellis W. Nurse anesthesia students’ perceptions of the anesthesia patient simulator: a qualitative study. AANA J. 2002;70:219-225.

Kanter RK, Fordyce WE, Tompkins JM. Evaluation of resuscitation proficiency in simulations: the impact of a simultaneous cognitive task. Pediatr Emerg Care. 1990;6:260-262.

Lampotang S. Logistics of conducting a large number of individual sessions with a full-scale patient simulator at a scientific meeting. J Clin Monit. 1997;13:399-407.

Larbuisson R, Pendeville P, Nyssen AS, Janssens M, Mayne A. Use of anaesthesia simulator: initial impressions of its use in two Belgian university centers. Acta Anaesthesiol Belg. 1999;50:87-93.

Lupien AE, George-Gay B. High-fidelity patient simulation. In: Lowenstein AJ, Bradshaw MJ, editors. Fuszard’s Innovative Teaching Strategies in Nursing. Sudbury, MA: Jones & Bartlett; 2004:134-148.

March JA, Farrow JL, Brown LH, Dunn KA, Perkins PK. A breathing manikin model for teaching nasotracheal intubation to EMS professionals. Prehosp Emerg Care. 1997;1:269-272.

McIndoe A. The future face of medical training—ship-shape and Bristol fashion. Br J Theatre Nurs. 1998;8:5, 8-10.

McLellan B. Early experience with simulated trauma resuscitation. Can J Surg. 1999;42:205-210.

Monti EJ, Wren K, Haas R, Lupien AE. The use of an anesthesia simulator in graduate and undergraduate education. CRNA. 1998;9:59-66.

Morgan PJ, Cleave-Hogg D. A Canadian simulation experience: faculty and student opinions of a performance evaluation study. Br J Anaesth. 2000;85:779-781.

Mulcaster JT, Mills J, Hung OR, MacQuarrie K, Law JA, Pytka S, et al. Laryngoscopic intubation: learning and performance. Anesthesiology. 2003;98:23-27.

Murray WB, Henry J. Assertiveness training during a crisis resource management (CRM) session using a full human simulator in a realistic simulated environment. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Nyman J, Sihvonen M. Cardiopulmonary resuscitation skills in nurses and nursing students. Resuscitation. 2000;47:179-184.

O’Donnell J, Fletcher J, Dixon B, Palmer L. Planning and implementing an anesthesia crisis resource management course for student nurse anesthetists. CRNA. 1998;9:50-58.

Peteani LA. Enhancing clinical practice and education with high-fidelity human patient simulators. Nurse Educ. 2004;29:25-30.

Rauen CA. Simulation as a teaching strategy for nursing education and orientation in cardiac surgery. Crit Care Nurse. 2004;24:46-51.

Scherer YK, Bruce SA, Graves BT, Erdley WS. Acute care nurse practitioner education: enhancing performance through the use of clinical simulation. AACN Clin Issues. 2003;14:331-341.

Seropian MA, Brown K, Gavilanes JS, Driggers B. An approach to simulation program development. J Nurs Educ. 2004;43:164-169.

Vandrey CI, Whitman KM. Simulator training for novice critical care nurses. Am J Nurs. 2001;101:24GG-24LL.

Wilson M, Shepherd I, Kelly C, Pitner J. Assessment of a low-fidelity human patient simulator for acquisition of nursing skills. Nurse Educ Today. 2005;25:56-67.

Wong TK, Chung JW. Diagnostic reasoning processes using patient simulation in different learning environments. J Clin Nurs. 2002;11:65-72.

Yaeger KA, Halamek LP, Coyle M, Murphy A, Anderson J, Boyle K, et al. High-fidelity simulation-based training in neonatal nursing. Adv Neonatal Care. 2004;4:326-331.

Additional Articles on “It Stands to Reason”

Abrahamson S. Human simulation for training in anesthesiology. In: Ray CD, editor. Medical Engineering. Chicago: Year Book; 1974:370-374.

Abrahamson S, Hoffman KI. Sim One: a computer-controlled patient simulator. Lakartidningen. 1974;71(20):4756-4758.

Abrahamson S, Wallace P. Using computer-controlled interactive manikins in medical education. Med Teacher. 1980;2(1):25-31.

Adnet F, Lapostolle F, Ricard-Hibon A, Carli P, Goldstein P. Intubating trauma patients before reaching the hospital—revisited. Crit Care. 2001;5:290-291.

Arne R, Stale F, Ragna K, Petter L. PatSim—simulator for practising anaesthesia and intensive care: development and observations. Int J Clin Monit Comput. 1996;13:147-152.

Barron DM, Russel RK. Evaluation of simulator use for anesthesia resident orientation. In: Henson L, Lee A, Basford A, editors. Simulators in Anesthesiology Education. New York: Plenum; 1998:111-113.

Barsuk D, Berkenstadt H, Stein M, Lin G, Ziv A. [Advanced patient simulators in pre-hospital management training—the trainees’ perspective (in Hebrew)]. Harefuah. 2003;142:87-90, 160.

Beyea SC. Human patient simulation: a teaching strategy. AORN J. 2004;80:738, 741-742.

Block EF, Lottenberg L, Flint L, Jakobsen J, Liebnitzky D. Use of a human patient simulator for the advanced trauma life support course. Am Surg. 2002;68:648-651.

Blum RH, Raemer DB, Carroll JS, Sunder N, Felstein DM, Cooper JB. Crisis resource management training for an anesthesia faculty: a new approach to continuing education. Med Educ. 2004;38:45-55.

Bond WF, Kostenbader M, McCarthy JF. Prehospital and hospital-based health care providers’ experience with a human patient simulator. Prehosp Emerg Care. 2001;5:284-287.

Bower JO. Using patient simulators to train surgical team members. AORN J. 1997;65:805-808.

Bradley P, Postlethwaite K. Simulation in clinical learning. Med Educ. 2003;37(1):1-5.

Byrne AJ, Hilton PJ, Lunn JN. Basic simulations for anaesthetists: a pilot study of the ACCESS system. Anaesthesia. 1994;49:376-381.

Cain JG, Kofke A, Sinz EH, Barbaccia JJ, Rosen KR. The West Virginia University human crisis simulation program. Am J Anesthesiol. 2000;27:215-220.

Chopra V, Engbers FH, Geerts MJ, Filet WR, Bovill JG, Spierdijk J. The Leiden anaesthesia simulator. Br J Anaesth. 1994;73:287-292.

Cooper JB, Gaba DM. A strategy for preventing anesthesia accidents. Int Anesthesiol Clin. 1989;27:148-152.

Davies JM, Helmreich RL. Simulation: it’s a start. Can J Anaesth. 1996;43:425-429.

Daykin AP, Bacon RJ. An epidural injection simulator. Anaesthesia. 1990;45:235-236.

Denson JS, Abrahamson S. A computer-controlled patient simulator. JAMA. 1969;208:504-508.

Doyle D, Arellano R. The virtual anesthesiology training simulation system. Can J Anesth. 1994;42:267-273.

Edgar P. Medium fidelity manikins and medical student teaching. Anaesthesia. 2002;57:1214-1215.

Ellis C, Hughes G. Use of human patient simulation to teach emergency medicine trainees advanced airway skills. J Accid Emerg Med. 1999;16:395-399.

Euliano TY. Small group teaching: clinical correlation with a human patient simulator. Adv Physiol Educ. 2001;25(1–4):36-43.

Euliano TY. Teaching respiratory physiology: clinical correlation with a human patient simulator. J Clin Monit Comput. 2000;16:465-470.

Euliano T, Good ML. Simulator training in anesthesia growing rapidly; LORAL model born in Florida. J Clin Monit. 1997;13:53-57.

Euliano TY, Mahla ME. Problem-based learning in residency education: a novel implementation using a simulator. J Clin Monit Comput. 1999;15:227-232.

Fallacaro MD. Untoward pathophysiological events: simulation as an experiential learning option to prepare anesthesia providers. CRNA. 2000;11:138-143.

Fish MP, Flanagan B. Incorporation of a realistic anesthesia simulator into an anesthesia clerkship. In: Henson LC, Lee A, Basford A, editors. Simulators in Anesthesiology Education. New York: Plenum; 1998:115-119.

Flexman RE, Stark EA. Training simulators. In: Salvendy G, editor. Handbook of Human Factors. New York: Wiley; 1987:1012-1038.

Forrest F, Bowers M. A useful application of a technical scoring system: identification and subsequent correction of inadequacies of an anaesthetic assistants training programme. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Freid EB. Integration of the human patient simulator into the medical student curriculum: life support skills. In: Henson LC, Lee A, Basford A, editors. Simulators in Anesthesiology Education. New York: Plenum; 1998:15-21.

Friedrich MJ. Practice makes perfect: risk-free medical training with patient simulators. JAMA. 2002;288:2808, 2811-2812.

Gaba D, Fish K, Howard S. Crisis Management in Anesthesiology. New York: Churchill Livingstone; 1994.

Gaba DM. Anaesthesia simulators [editorial]. Can J Anaesth. 1995;42:952-953.

Gaba DM. Anaesthesiology as a model for patient safety in health care. BMJ. 2000;320:785-788.

Gaba D. Dynamic decision making in anesthesiology: cognitive models and training approaches. In: Evans D, Patel V, editors. Advanced Models of Cognition for Medical Training and Practice. Berlin: Springer; 1992:123-147.

Gaba D. Dynamic decision-making in anesthesiology: use of realistic simulation for training. Presented at the NATO Advanced Research Workshop: Advanced Models of Cognition for Medical Training and Practice, Krakow, August 1991.

Gaba D. Human error in anesthetic mishaps. Int Anesthesiol Clin. 1989;27:137-147.

Gaba DM. Simulation-based crisis resource management training for trauma care. Am J Anesthesiol. 2000;5:199-200.

Gaba DM. Simulator training in anesthesia growing rapidly: CAE model born at Stanford. J Clin Monit. 1996;12:195-198.

Gaba DM. Two examples of how to evaluate the impact of new approaches to teaching [editorial]. Anesthesiology. 2002;96:1-2.

Gaba DM, Small SD. How can full environment-realistic patient simulators be used for performance assessment? American Society of Anesthesiologists Newsletter 1997 (http://www.asahq.org/newsletters/1997/10_97/HowCan_1097.html). Accessed May 22, 2001.

Gaba DM, Howard SK, Small SD. Situation awareness in anesthesiology. Hum Factors. 1995;37:20-31.

Gaba DM, Maxwell M, DeAnda A. Anesthetic mishaps: breaking the chain of accident evaluation. Anesthesiology. 1987;66:670-676.

Garden A, Robinson B, Weller J, Wilson L, Crone D. Education to address medical error—a role for high fidelity patient simulation. N Z Med J. 2002;22(115):133-134.

Girard M, Drolet P. Anesthesiology simulators: networking is the key. Can J Anaesth. 2002;49:647-649.

Glavin R, Greaves D. The problem of coordinating simulator-based instruction with experience in the real workplace. Br J Anaesth. 2003;91:309-311.

Good ML. Simulators in anesthesiology: the excitement continues. American Society of Anesthesiologists Newsletter 1997 (http://www.asahq.org/newsletters/1997/10_97/SimInAnes_1097.html).

Goodwin MWP, French GWG. Simulation as a training and assessment tool in the management of failed intubation in obstetrics. Int J Obstet Anesth. 2001;10:273-277.

Gordon JA. A simulator-based medical education service. Acad Emerg Med. 2002;9:865.

Gordon JA, Pawlowski J. Education on-demand: the development of a simulator-based medical education service. Acad Med. 2002;77:751-752.

Grant WD. Addition of anesthesia patient simulator is an improvement to evaluation process. Anesth Analg. 2002;95:786.

Gravenstein JS. Training devices and simulators. Anesthesiology. 1988;69:295-297.

Grenvik A, Schaefer JJ. Medical simulation training coming of age. Crit Care Med. 2004;32:2549-2550.

Halamek LP, Kaegi DM, Gaba DM, Sowb YA, Smith BC, Smith BE, et al. Time for a new paradigm in pediatric medical education: teaching neonatal resuscitation in a simulated delivery room environment. Pediatrics. 2000;106:E45.

Hartmannsgruber M, Good M, Carovano R, Lampotang S, Gravenstein JS. Anesthesia simulators and training devices. Anaesthesist. 1993;42:462-469.

Helmreich RL, Davies JM. Anaesthetic simulation and lessons to be learned from aviation. Can J Anaesth. 1997;44:907-912.

Helmreich RL, Chidester T, Foushee H, Gregorich S. Anesthesia crisis resource management: real-life simulation training in operating room crises. J Clin Anesth. 1990;7:675-687.

Hendrickse AD, Ellis AM, Morris RW. Use of simulation technology in Australian Defence Force resuscitation training. J R Army Med Corps. 2001;147:173-178.

Henrichs B. Development of a module for teaching the cricothyrotomy procedure. Presented at the Society for Technology in Anesthesia Annual Meeting, San Diego, 1999.

Henriksen K, Moss F. From the runway to the airway and beyond: embracing simulation and team training—now is the time. Qual Saf Health Care. 2004;13(Suppl 1):i1.

Henry J, Murray W. Increasing teaching efficiency and minimizing expense in the sim lab. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Henson LC, Richardson MG, Stern DH, Shekhter I. Using human patient simulator to credential first year anesthesiology residents for taking overnight call [abstract]. Presented at the 2nd Annual International Meeting on Medical Simulation, 2002.

Hoffman KI, Abrahamson S. The ‘cost-effectiveness’ of Sim One. J Med Educ. 1975;50:1127-1128.

Howells R, Madar J. Newborn resuscitation training—which manikin. Resuscitation. 2002;54:175-181.

Howells TH, Emery FM, Twentyman JE. Endotracheal intubation training using a simulator: an evaluation of the Laerdal adult intubation model in the teaching of endotracheal intubation. Br J Anaesth. 1973;45:400-402.

Iserson KV, Chiasson PM. The ethics of applying new medical technologies. Semin Laparosc Surg. 2002;9:222-229.

Iserson KV. Simulating our future: real changes in medical education. Acad Med. 1999;74:752-754.

Jensen RS, Biegelski C. Cockpit resource management. In: Jensen RS, editor. Aviation Psychology. Aldershot: Gower Technical; 1989:176-209.

Jorm C. Patient safety and quality: can anaesthetists play a greater role? Anaesthesia. 2003;58:833-834.

Kapur PA, Steadman RH. Patient simulator competency testing: ready for takeoff? Anesth Analg. 1998;86:1157-1159.

Kaye K, Frascone RJ, Held T. Prehospital rapid-sequence intubation: a pilot training program. Prehosp Emerg Care. 2003;7:235-240.

King PH, Pierce D, Higgins M, Beattie C, Waitman LR. A proposed method for the measurement of anesthetist care variability. J Clin Monit Comput. 2000;16:121-125.

King PH, Blanks ST, Rummel DM, Patterson D. Simulator training in anesthesiology: an answer? Biomed Instrum Technol. 1996;30:341-345.

Kiriaka J. EMS roadshow. JEMS. 2000;25:40-47.

Kneebone R. Simulation in surgical training: educational issues and practical implications. Med Educ. 2003;37:267-277.

Kofke WA, Rosen KA, Barbaccia J, Sinz E, Cain J. The value of acute care simulation. WV Med J. 2000;96:396-402.

Kurrek MM, Devitt JH. The cost for construction and operation of a simulation centre. Can J Anaesth. 1997;44:1191-1195.

Kurrek MM, Devitt JH, McLellan BA. Full-scale realistic simulation in Toronto. Am J Anesthesiol. 2000;122:226-227.

Lacey O, Hogan J, Flanagan B. High-fidelity simulation team training of junior hospital staff. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Lampotang S, Ohrn MA, van Meurs WL. A simulator-based respiratory physiology workshop. Acad Med. 1996;71:526-527.

Lederman L. Debriefing: a critical reexamination of the postexpe-rience analytic process with implications for its effective use. Simulation Games. 1984;15:415-431.

Lederman L. Debriefing: toward a systematic assessment of theory and practice. Simulation Gaming. 1992;23:145-160.

Lewis CH, Griffin MJ. Human factors consideration in clinical applications of virtual reality. Stud Health Technol Inform. 1997;44:35-56.

Lippert A, Lippert F, Nielsen J, Jensen PF. Full-scale simulations in Copenhagen. Am J Anesthesiol. 2000;27:221-225.

Lopez-Herce J, Carrillo A, Rodriguez-Nunez A. Newborn manikins. Resuscitation. 2003;56:232-233.

Mackenzie CF, LOTAS Group. Simulation of trauma management: the LOTAS experience. http://134.192.17.4/simulati.html, pp. 1-10.

Manser T, Dieckmann P, Rall M. Is the performance of anesthesia by anesthesiologists in the simulator setting the same as in the OR? Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Marsch SCU, Scheidegger DH, Stander S, Harms C. Team training using simulator technology in Basel. Am J Anesthesiol. 2000;74:209-211.

Martin D, Blezek D, Robb R, Camp LA, Nauss. Simulation of regional anesthesia using virtual reality for training residents. Anesthesiology. 1998;89:A58.

McCarthy M. US military revamps combat medic training and care. Lancet. 2003;361:494-495.

Meller G. Typology of simulators for medical education. J Digit Imaging. 1997;10(1):194-196.

Meller G, Tepper R, Bergman M, Anderhub B. The tradeoffs of successful simulation. Stud Health Technol Inform. 1997;39:565-571.

Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65:S63-S67.

Mondello E, Montanini S. New techniques in training and education: simulator-based approaches to anesthesia and intensive care. Minerva Anestesiol. 2002;68:715-718.

Morhaim DK, Heller MB. The practice of teaching endotracheal intubation on recently deceased patients. J Emerg Med. 1991;9:515-518.

Mukherjee J, Down J, Jones M, Seig S, Martin G, Maze M. Simulator teaching for final year medical students: subjective assessment of knowledge and skills. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Murray DJ. Clinical simulation: technical novelty or innovation in education. Anesthesiology. 1998;89:1-2.

Murray WB, Foster PA. Crisis resource management among strangers: principles of organizing a multidisciplinary group for crisis resource management. J Clin Anesth. 2000;12:633-638.

Murray W, Good M, Gravenstein J, Brasfield W. Novel application of a full human simulator: training with remifentanil prior to human use. Anesthesiology. 1998;89:A56.

Murray W, Gorman P, Lieser J, Haluck RS, Krummel TM, Vaduva S. The psychomotor learning curve with a force feedback trainer: a pilot study. Presented at the Society for Technology in Anesthesia Annual Meeting, San Diego, 1999.

Murray W, Proctor L, Henry J, Abicht D, Gorman PJ, Vaduva S, et al. Crisis resource management (CRM) training using the Medical Education Technologies, Inc. (METI) simulator: the first year. Presented at the Society for Technology in Anesthesia Annual Meeting, San Diego, 1999.

Norman G. Editorial: simulation—savior or Satan? Adv Health Sci Educ Theory Pract. 2003;8(1):1-3.

Norman J, Wilkins D. Simulators for anesthesia. J Clin Monit. 1996;12:91-99.

O’Brien G, Haughton A, Flanagan B. Interns’ perceptions of performance and confidence in participating in and managing simulated and real cardiac arrest situations. Med Teach. 2001;23:389-395.

Olympio MA. Simulation saves lives. ASA Newsletter. 2001:15-19.

Palmisano J, Akingbola O, Moler F, Custer J. Simulated pediatric cardiopulmonary resuscitation: initial events and response times of a hospital arrest team. Respir Care. 1994;39:725-729.

Paskin S, Raemer DB, Garfield JM, Philip JH. Is computer simulation of anesthetic uptake and distribution an effective tool for anesthesia residents? J Clin Monit. 1985;1:165-173.

Raemer D. In-hospital resuscitation: team training using simulation. Presented at the 1999 Society for Education in Anesthesia Spring Meeting. Rochester, NY, 1999.

Raemer DB, Barron DM. Use of simulators for education and training in nonanesthesia healthcare domains. American Society of Anesthesiologists Newsletter 1997. Available at: http://www.asahq.org/newsletter/1997/10_97/UsesOf_1097.html

Raemer D, Barron D, Blum R, Frenna T, Sica GT, et al. Teaching crisis management in radiology using realistic simulation. In: 1998 Meeting of the Society for Technology in Anesthesia, Orlando, FL, 1998, p. 28.

Raemer D, Graydon-Baker E, Malov S. Simulated computerized medical records for scenarios. Presented at the 2001 International Meeting on Medical Simulation. Scottsdale, AZ, 2001.

Raemer D, Mavigilia S, Van Horne C, Stone P. Mock codes: using realistic simulation to teach team resuscitation management. In: 1998 Meeting of the Society for Technology in Anesthesia. Orlando, FL, 1998, p. 29.

Raemer D, Morris G, Gardner R, Walzer TB, Beatty T, Mueller KB, et al. Development of a simulation-based labor & delivery team course. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Raemer D, Shapiro N, Lifferth G, Blum RM, Edlow J. Testing probes, a new method of measuring teamwork attributes in simulated scenarios. Presented at the 2001 International Meeting on Medical Simulation, Scottsdale, AZ, 2001.

Raemer D, Sunder N, Gardner R, Walzer TB, Cooper J, et al. Using simulation to practice debriefing medical error. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Rall M, Gaba D. Human performance and patient safety. In: Miller RD, editor. Miller’s Anesthesia. 6th ed. Philadelphia: Elsevier; 2005.

Rall M, Manser T, Guggenberger H, Gaba DM, Unertl K. [Patient safety and errors in medicine: development, prevention and analysis of incidents (in German)]. Anasthesiol Intensivmed Notfallmed Schmerzther. 2001;36:321-330.

Rall M, Schaedle B, Zieger J, Naef W, Weinlich M. Innovative training for enhancing patient safety: safety culture and integrated concepts. Unfallchirurg. 2002;105:1033-1042.

Riley R, Grauze A, Chinnery C, Horley R. The first two years of “CASMS,” the world’s busiest medical simulation center. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Riley RH, Wilks DH, Freeman JA. Anaesthetists’ attitudes towards an anaesthesia simulator: a comparative survey: U.S.A. and Australia. Anaesth Intensive Care. 1997;25:514-519.

Rizkallah N, Carter T, Essin D, Johnson C, Steen SN, et al. Mini-sim: a human patient simulator. Presented at the Society for Technology in Anesthesia Annual Meeting, San Diego, 1999.

Robinson B, Little J, McCullough S, Lange R, Lamond C, Levack W, et al. Simulation based training for allied health professionals: physiotherapy respiratory workshop. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Rubinshtein R, Robenshtok E, Eisenkraft A, Vdan A, Hourvitz A. Training Israeli medical personnel to treat casualties of nuclear, biologic and chemical warfare. Isr Med Assoc J. 2002;4:545-548.

Rudolph J, Raemer D. Using collaborative inquiry to debrief simulated crisis events: lessons from action science. Presented at the 2001 International Meeting on Medical Simulation, Scottsdale, AZ, 2001.

Rutherford W. Aviation safety: a model for health care? It is time to rethink the institutions and processes through which health care is delivered if a “culture of safety” is to be achieved. Qual Saf Health Care. 2003;12:162-164.

Sanders J, Haas RE, Geisler M, Lupien AE. Using the human patient simulator to test the efficacy of an experimental emergency percutaneous transtracheal airway. Mil Med. 1998;163:544-551.

Sarman I, Bolin D, Holmer I, Tunell R. Assessment of thermal conditions in neonatal care: use of a manikin of premature baby size. Am J Perinatol. 1992;9:239-246.

Satish U, Streufert S. Value of a cognitive simulation in medicine: towards optimizing decision making performance of healthcare personnel. Qual Saf Health Care. 2002;11:163-167.

Schaefer JJ, Gonzalez RM. Dynamic simulation: a new tool for difficult airway training of professional healthcare providers. Am J Anesthesiol. 2000;27:232-242.

Schaivone K, Jenkins L, Mallott D, Budd N. Development of a comprehensive simulation experience: a faculty training project. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Scherer YK, Bruce SA, Graves BT, Erdley WS. Acute care nurse practitioner education: enhancing performance through the use of clinical simulation. AACN Clin Issues. 2003;14:331-341.

Schlindwein M, von Wagner G, Kirst M, Rajewicz M, Karl F, Schochlin J, et al. Mobile patient simulator for resuscitation training with automatic external defibrillators. Biomed Tech (Berl). 2002;47(1):559-560.

Schweiger J, Preece J. Authenticity of the METI anesthesia patient simulator: medical students’ perception. Crit Care Med. 1995;23:432-433.

Schwid HA, O’Donnell D. The Anesthesia Simulator Consultant: simulation plus expert system. Anesthesiol Rev. 1993;20:185-189.

Schwid H. An educational simulator for the management of myocardial ischemia. Anesth Analg. 1989;68:S248.

Shapiro MJ, Simmons W. High fidelity medical simulation: a new paradigm in medical education. Med Health R I. 2002;85:316-317.

Sheplock G, Thomas P, Camporesi E. An interactive computer program for teaching regional anesthesia. Anesthesiol Rev. 1993;20:53-59.

Sikorski J, Jebson P, Hauser P. Computer-aided instruction simulating intraoperative events in anesthesia resident training. Anesthesiology. 1983;59:A470.

Skartwed R, Ferguson S, Eichorn M, Wilks D. Using different educational modalities to optimize efficiency in an interdisciplinary simulation center. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Small S. What participants learn from anesthesia crisis resource management training. Anesthesiology. 1998;89:A71.

Small SD, Wuerz RC, Simon R, Shapiro N, Conn A, Setnik G. Demonstration of high-fidelity simulation team training for emergency medicine. Acad Emerg Med. 1999;6:312-323.

Smith B, Gaba D. Simulators in clinical monitoring: practical application. In: Lake C, Blitt C, Hines R, editors. Clinical Monitoring: Practical Applications for Anesthesia and Critical Care. Philadelphia: Saunders; 2001:26-44.

Sowb YA, Kiran K, Reznek M, Smith-Coggins R, Harter P, Stafford-Cecil S, et al. Development of a three-level curriculum for crisis resource management training in emergency medicine. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Stern D. Improving resuscitation team performance using a full body simulator. Presented at the International Meeting on Medical Simulation, Ft. Lauderdale, FL, 2001.

Taekman J, Andregg B. SimDot: an interdisciplinary web portal for human simulation. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Taekman J, Eck J, Hobbs G. Integration of PGY-1 anesthesia residents in simulation development. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Takayesu J, Gordon J, Farrell S, Evans AJ, Sullivan JE, Pawlowski J. Learning emergency medicine and critical care medicine: what does high-fidelity patient simulation teach? Acad Emerg Med. 2002;9:476-477 (abstract 319).

Tan G, Ti L, Suresh S, Lee T-L. Human patient simulator is an effective way of teaching physiology to first year medical students. Presented at the International Conference on Medical Simulation, Ft. Lauderdale, FL, 2001.

Tarver S. Anesthesia simulators: concepts and applications. Am J Anesthesiol. 1999;26:393-396.

Tarver S. A relational database to improve scenario and event design on the MidSim simulator. Presented at the International Meeting on Medical Simulation, Ft. Lauderdale, FL, 2001.

Tebich S, Loeb R. Using patient simulation to train CA-1 residents’ rule-based decision making. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Thompson W, Pinder M, See J, Chinnery C, Grauze A. Simulation training for the medical emergency team of a metropolitan teaching hospital. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Underberg K. Multidisciplinary resource management: value as perceived by nurses. Presented at the International Meeting on Medical Simulation, Scottsdale, AZ, 2001.

Underberg K. Nurses’ perceptions of a crisis management (malignant hyperthermia) with a full human simulator. Presented at the International Meeting on Medical Simulation, Scottsdale, AZ, 2001.

Vadodaria B, Gandhi S, McIndoe A. Selection of an emergency cricothyroidotomy kit for clinical use by dynamic evaluation on a (METI) human patient simulator. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Via DK, Kyle RR, Trask JD, Shields CH, Mongan PD. Using high-fidelity patient simulation and an advanced distance education network to teach pharmacology to second-year medical students. J Clin Anesth. 2004;16:142-143.

Von Lubitz D. Medical training at sea using human patient simulator. Presented at the International Meeting on Medical Simulation, Scottsdale, AZ, 2001.

Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357:945-949.

Watterson L, Flanagan B, Donovan B, Robinson B. Anaesthetic simulators: training for the broader health-care profession. Aust N Z J Surg. 2000;70:735-737.

Weinger MB, Gonzalez D, Slagle J, Syeed M. Videotaping of anesthesia non-routine events. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Weinger MB, Raemer DB, Barker SJ. A new Anesthesia & Analgesia section on technology, computing, and simulation. Anesth Analg. 2001;93:1085-1087.

Westenskow D, Runco C, Tucker S, Haak S, Joyce S, Johnson S, et al. Human simulators extend an anesthesiology department’s educational role. Presented at the Society for Technology in Anesthesia Annual Meeting, San Diego, 1999.

Woods DD. Coping with complexity: the psychology of human behavior in complex systems. In: Goodstein LP, Andersen HB, Olsen SE, editors. Tasks, Errors, and Mental Models. London: Taylor & Francis; 1988:128-148.

Wright M, Skartwed R, Jaramillo Y. Management of a postpartum hemorrhage using the full body human patient simulator. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: an ethical imperative. Acad Med. 2003;78:783-788.

Ziv A, Donchin Y, Rotstein Z. National medical simulation center in Israel: a comprehensive model. Presented at the International Meeting on Medical Simulation, Scottsdale, AZ, 2001.

“It Stands to Reason” Articles in Nonanesthesia Disciplines

Aabakken L, Adamsen S, Kruse A. Performance of a colonoscopy simulator: experience from a hands-on endoscopy course. Endoscopy. 2000;32:911-913.

Adrales GL, Chu UB, Witzke DB, Donnelly MB, Hoskins D, Mastrangelo MJJr, et al. Evaluating minimally invasive surgery training using low-cost mechanical simulations. Surg Endosc. 2003;17:580-585.

Barker VL. CathSim. Stud Health Technol Inform. 1999;62:36-37.

Barker VL. Virtual reality: from the development laboratory to the classroom. Stud Health Technol Inform. 1997;39:539-542.

Bar-Meir S. A new endoscopic simulator. Endoscopy. 2000;32:898-900.

Baur C, Guzzoni D, Georg O. VIRGY: a virtual reality and force feedback based endoscopic surgery simulator. Stud Health Technol Inform. 1998;50:110-116.

Beard JD. The Sheffield basic surgical training scheme. Ann R Coll Surg Engl. 1999;81(Suppl):298-301, 307.

Bholat OS, Haluck RS, Kutz RH, Gorman PJ, Krummel TM. Defining the role of haptic feedback in minimally invasive surgery. Stud Health Technol Inform. 1999;62:62-66.

Bloom MB, Rawn CL, Salzberg AD, Krummel TM. Virtual reality applied to procedural testing: the next era. Ann Surg. 2003;237:442-448.

Bro-Nielsen M, Tasto JL, Cunningham R, Merril GL. PreOp endoscopic simulator: a PC-based immersive training system for bronchoscopy. Stud Health Technol Inform. 1999;62:76-82.

Bro-Nielsen M, Helfrick D, Glass B, Zeng X, Connacher H. VR simulation of abdominal trauma surgery. Stud Health Technol Inform. 1998;50:117-123.

Bruce S, Bridges EJ, Holcomb JB. Preparing to respond: Joint Trauma Training Center and USAF Nursing Warskills Simulation Laboratory. Crit Care Nurs Clin North Am. 2003;15:149-162.

Burd LI, Motew M, Bieniarz J. A teaching simulator for fetal scalp sampling. Obstet Gynecol. 1972;39:418-420.

Cakmak HK, Kühnapfel U. Animation and simulation techniques for VR-training systems in endoscopic surgery. http://citeseer.nj.nec.com/cakmak00animation.html, 2000.

Caudell TP, Summers KL, Holten J 4th, Hakamata T, Mowafi M, Jacobs J, et al. Virtual patient simulator for distributed collaborative medical education. Anat Rec. 2003;270B:23-29.

Chester R, Watson MJ. A newly developed spinal simulator. Man Ther. 2000;5:234-242.

Cotin S, Dawson SL, Meglan D, Shaffer DW, Ferrell MA, Bardsley RS, et al. ICTS, an interventional cardiology training system. Stud Health Technol Inform. 2000;70:59-65.

Dawson S, Kaufman J. The imperative for medical simulation. Proc IEEE. 1998;86:479-483.

De Leo G, Ponder M, Molet T, Fato M, Thalmann D, Magnenat-Thalmann N, et al. A virtual reality system for the training of volunteers involved in health emergency situations. Cyberpsychol Behav. 2003;6:267-274.

Dev P, Heinrichs WL, Srivastava S, Montgomery KN, Senger S, Temkin B, et al. Simulated learning environments in anatomy and surgery delivered via the next generation Internet. Medinfo. 2001;10:1014-1018.

Dev P, Montgomery K, Senger S, Heinrichs WL, Srivastava S, Waldron K. Simulated medical learning environments on the Internet. J Am Med Inform Assoc. 2002;9:437-447.

Eaves RH, Flagg AJ. The U.S. Air Force pilot simulated medical unit: a teaching strategy with multiple applications. J Nurs Educ. 2001;40:110-115.

Ecke U, Klimek L, Muller W, Ziegler R, Mann W. Virtual reality: preparation and execution of sinus surgery. Comput Aided Surg. 1998;3:45-50.

Edmond CVJr, Wiet GJ, Bolger B. Virtual environments: surgical simulation in otolaryngology. Otolaryngol Clin North Am. 1998;31:369-381.

El-Khalili N, Brodlie K, Kessel D. WebSTer: a web-based surgical training system. Stud Health Technol Inform. 2000;70:69-75.

Englmeier KH, Haubner M, Krapichler C, Reiser M. A new hybrid renderer for virtual bronchoscopy. Stud Health Technol Inform. 1999;62:109-115.

Fellander-Tsai L, Stahre C, Anderberg B, Barle H, Bringman S, Kjellin A, et al. Simulator training in medicine and health care: a new pedagogic model for good patient safety. Lakartidningen. 2001;98:3772-3776.

Frey M, Riener R, Burgkart R, Proll T. Initial results with the Munich knee simulator. Biomed Tech (Berl). 2002;47(1):704-707.

Gordon MS. Cardiology patient simulator: development of an animated manikin to teach cardiovascular disease. Am J Cardiol. 1974;34:350-355.

Gordon MS. Learning from a cardiology patient simulator. RN. 1975;38:ICU1, ICU4, ICU6.

Gordon MS, Ewy GA, Felner JM, Forker AD, Gessner IH, Juul D, et al. A cardiology patient simulator for continuing education of family physicians. J Fam Pract. 1981;13:353-356.

Gordon MS, Ewy GA, Felner JM, Forker AD, Gessner I, McGuire C, et al. Teaching bedside cardiologic examination skills using “Harvey,” the cardiology patient simulator. Med Clin North Am. 1980;64:305-313.

Gorman PJ, Lieser JD, Murray WB, Haluck RS, Krummel TM. Evaluation of skill acquisition using a force feedback, virtual reality based surgical trainer. Stud Health Technol Inform. 1999;62:121-123.

Grantcharov TP, Rosenberg J, Pahle E, Funch-Jensen PF. Virtual reality computer simulation: an objective method for the evaluation of laparoscopic surgical skills. Surg Endosc. 2001;15:242-244.

Grosfeld JL. Presidential address: visions: medical education and surgical training in evolution. Arch Surg. 1999;134:590-598.

Gunther SB, Soto GE, Colman WW. Interactive computer simulations of knee-replacement surgery. Acad Med. 2002;77:753-754.

Hahn JK, Kaufman R, Winick AB, Carleton T, Park Y, Lindeman R, et al. Training environment for inferior vena caval filter placement. Stud Health Technol Inform. 1998;50:291-297.

Hanna GB, Drew Y, Clinch P, Hunter B, Cuschieri A. Computer-controlled endoscopic performance assessment system. Surg Endosc. 1998;12:997-1000.

Hanna GB, Drew T, Clinch P, Hunter B, Shimi S, Dunkley P, et al. A micro-processor controlled psychomotor tester for minimal access surgery. Surg Endosc. 1996;10:965-969.

Hanna GB, Drew T, Cuschieri A. Technology for psychomotor skills testing in endoscopic surgery. Semin Laparosc Surg. 1997;4:120-124.

Hasson HM. Improving video laparoscopy skills with repetitive simulator training. Chicago Med. 1998;101:12-15.

Heimansohn H. A new orthodontic teaching simulator. Dent Dig. 1969;75:62-64.

Henkel TO, Potempa DM, Rassweiler J, Manegold BC, Alken P. Lap simulator, animal studies, and the Laptent: bridging the gap between open and laparoscopic surgery. Surg Endosc. 1993;7:539-543.

Hikichi T, Yoshida A, Igarashi S, Mukai N, Harada M, Muroi K, et al. Vitreous surgery simulator. Arch Ophthalmol. 2000;118:1679-1681.

Hilbert M, Muller W. Virtual reality in endonasal surgery. Stud Health Technol Inform. 1997;39:237-245.

Hilbert M, Muller W, Strutz J. Development of a surgical simulator for interventions of the paranasal sinuses: technical principles and initial prototype. Laryngorhinootologie. 1998;77:153-156.

Hochberger J, Maiss J, Hahn EG. The use of simulators for training in GI endoscopy. Endoscopy. 2002;34:727-729.

Hubal RC, Kizakevich PN, Guinn CI, Merino KD, West SL. The virtual standardized patient: simulated patient-practitioner dialog for patient interview training. Stud Health Technol Inform. 2000;70:133-138.

Iserson KV. Simulating our future: real changes in medical education. Acad Med. 1999;74:752-754.

Iserson KV, Chiasson PM. The ethics of applying new medical technologies. Semin Laparosc Surg. 2002;9:222-229.

John NW, Phillips N. Surgical simulators using the WWW. Stud Health Technol Inform. 2000;70:146-152.

John NW, Riding M, Phillips NI, Mackay S, Steineke L, Fontaine B, et al. Web-based surgical educational tools. Stud Health Technol Inform. 2001;81:212-217.

Johnson L, Thomas G, Dow S, Stanford C. An initial evaluation of the Iowa Dental Surgical Simulator. J Dent Educ. 2000;64:847-853.

Johnston R, Weiss P. Analysis of virtual reality technology applied in education. Minim Invasive Ther Allied Technol. 1997;6:126-127.

Jones R, McIndoe A. Non-consultant career grades (NCCG) at the Bristol Medical Simulation Centre (BMSC). Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Karnath B, Frye AW, Holden MD. Incorporating simulators in a standardized patient exam. Acad Med. 2002;77:754-755.

Karnath B, Thornton W, Frye AW. Teaching and testing physical examination skills without the use of patients. Acad Med. 2002;77:753.

Kaufmann C, Liu A. Trauma training: virtual reality applications. Stud Health Technol Inform. 2001;81:236-241.

Keyser EJ, Derossis AM, Antoniuk M, Sigman HH, Fried GM. A simplified simulator for the training and evaluation of laparoscopic skills. Surg Endosc. 2000;14:149-153.

Kneebone R. Simulation in surgical training: educational issues and practical implications. Med Educ. 2003;37:267-277.

Kneebone R, ApSimon D. Surgical skills training: simulation and multimedia combined. Med Educ. 2001;35:909-915.

Kneebone R, Kidd J, Nestel D, Asvall S, Paraskeva P, Darzi A. An innovative model for teaching and learning clinical procedures. Med Educ. 2002;36:628-634.

Knudson MM, Sisley AC. Training residents using simulation technology: experience with ultrasound for trauma. J Trauma. 2000;48:659-665.

Krummel TM. Surgical simulation and virtual reality: the coming revolution. Ann Surg. 1998;228:635-637.

Kuhnapfel U, Kuhn C, Hubner M, Krumm H, Mass H, Neisus B. The Karlsruhe endoscopic surgery trainer as an example for virtual reality in medical education. Minim Invasive Ther Allied Technol. 1997;6:122-125.

Kuppersmith RB, Johnston R, Jones SB, Jenkins HA. Virtual reality surgical simulation and otolaryngology. Arch Otolaryngol Head Neck Surg. 1996;122:1297-1298.

LaCombe DM, Gordon DL, Issenberg SB, Vega AI. The use of standardized simulated patients in teaching and evaluating prehospital care providers. Am J Anesthesiol. 2000;4:201-204.

Ladas SD, Malfertheiner P, Axon A. An introductory course for training in endoscopy. Dig Dis. 2002;20:242-245.

Laguna Pes MP. [Teaching in endourology and simulators]. Arch Esp Urol. 2002;55:1185-1188.

Lucero RS, Zarate JO, Espiniella F, Davolos J, Apud A, Gonzalez B, et al. Introducing digestive endoscopy with the “SimPrac-EDF y VEE” simulator, other organ models, and mannequins: teaching experience in 21 courses attended by 422 physicians. Endoscopy. 1995;27:93-100.

Mabrey JD, Gillogly SD, Kasser JR, Sweeney HJ, Zarins B, Mevis H, et al. Virtual reality simulation of arthroscopy of the knee. Arthroscopy. 2002;18:E28.

Majeed AW, Reed MW, Johnson AG. Simulated laparoscopic cholecystectomy. Ann R Coll Surg Engl. 1992;74:70-71.

Manyak MJ, Santangelo K, Hahn J, Kaufman R, Carleton T, Hua XC, et al. Virtual reality surgical simulation for lower urinary tract endoscopy and procedures. J Endourol. 2002;16:185-190.

Marescaux J, Clement JM, Tassetti V, Koehl C, Cotin S, Russier Y, et al. Virtual reality applied to hepatic surgery simulation: the next revolution. Ann Surg. 1998;228:627-634.

McCarthy AD, Hollands RJ. A commercially viable virtual reality knee arthroscopy training system. Stud Health Technol Inform. 1998;50:302-308.

Medical Readiness Trainer Team. Immersive virtual reality platform for medical training: a “killer-application”. Stud Health Technol Inform. 2000;70:207-213.

Medina M. Formidable challenges to teaching advanced laparoscopic skills. JSLS. 2001;5:153-158.

Medina M. The laparoscopic-ring simulation trainer. JSLS. 2002;6:69-75.

Merril GL, Barker VL. Virtual reality debuts in the teaching laboratory in nursing. J Intraven Nurs. 1996;19:182-187.

Michel MS, Knoll T, Kohrmann KU, Alken P. The URO Mentor: development and evaluation of a new computer-based interactive training system for virtual life-like simulation of diagnostic and therapeutic endourological procedures. BJU Int. 2002;89:174-177.

Molin SO, Jiras A, Hall-Angeras M, Falk A, Martens D, Gilja OH, et al. Virtual reality in surgical practice: in vitro and in vivo evaluations. Stud Health Technol Inform. 1997;39:246-253.

Munro A, Park KG, Atkinson D, Day RP, Capperauld I. A laparoscopic surgical simulator. J R Coll Surg Edinb. 1994;39:176-177.

Munro A, Park KG, Atkinson D, Day RP, Capperauld I. Skin simulation for minor surgical procedures. J R Coll Surg Edinb. 1994;39:174-176.

Neame R, Murphy B, Stitt F, Rake M. Virtual medical school life in 2025: a student’s diary. BMJ. 1999;319:1296.

Neumann M, Mayer G, Ell C, Felzmann T, Reingruber B, Horbach T, et al. The Erlangen Endo-Trainer: life-like simulation for diagnostic and interventional endoscopic retrograde cholangiography. Endoscopy. 2000;32:906-910.

Oppenheimer P, Weghorst S, Williams L, Ali A, Cain J, MacFarlane M, et al. Laparoscopic surgical simulator and port placement study. Stud Health Technol Inform. 2000;70:233-235.

Owa AO, Gbejuade HO, Giddings C. A middle-ear simulator for practicing prosthesis placement for otosclerosis surgery using ward-based materials. J Laryngol Otol. 2003;117:490-492.

Pawlowski J, Graydon-Baker E, Gallagher M, Cahalane M, Raemer DB. Can progress notes and bedside presentations be used to evaluate medical student understanding in patient simulator based programs? Presented at the 2001 International Meeting on Medical Simulation, Scottsdale, AZ.

Pichichero ME. Diagnostic accuracy, tympanocentesis training performance, and antibiotic selection by pediatric residents in management of otitis media. Pediatrics. 2002;110:1064-1070.

Poss R, Mabrey JD, Gillogly SD, Kasser JR, Sweeney HJ, Zarins B, et al. Development of a virtual reality arthroscopic knee simulator. J Bone Joint Surg Am. 2000;82:1495-1499.

Pugh CM, Heinrichs WL, Dev P, Srivastava S, Krummel TM. Use of a mechanical simulator to assess pelvic examination skills. JAMA. 2001;286:1021-1023.

Radetzky A, Bartsch W, Grospietsch G, Pretschner DP. [SUSILAP-G: a surgical simulator for training minimal invasive interventions in gynecology]. Zentralbl Gynakol. 1999;121:110-116.

Raibert M, Playter R, Krummel TM. The use of a virtual reality haptic device in surgical training. Acad Med. 1998;73:596-597.

Riener R, Hoogen J, Burgkart R, Buss M, Schmidt G. Development of a multi-modal virtual human knee joint for education and training in orthopaedics. Stud Health Technol Inform. 2001;81:410-416.

Rogers DA, Regehr G, Yeh KA, Howdieshell TR. Computer-assisted learning versus a lecture and feedback seminar for teaching a basic surgical technical skill. Am J Surg. 1998;175:508-510.

Rosen J, Solazzo M, Hannaford B, Sinanan M. Objective evaluation of laparoscopic surgical skills using hidden Markov models based on haptic information and tool/tissue interactions. Stud Health Technol Inform. 2001;81:417-423.

Ross MD, Twombly A, Lee AW, Cheng R, Senger S. New approaches to virtual environment surgery. Stud Health Technol Inform. 1999;62:297-301.

Rudman DT, Stredney D, Sessanna D, Yagel R, Crawfis R, Heskamp D, et al. Functional endoscopic sinus surgery training simulator. Laryngoscope. 1998;108:1643-1647.

Sackier JM, Berci G, Paz-Partlow M. A new training device for laparoscopic cholecystectomy. Surg Endosc. 1991;5:158-159.

Sajid AW, Ewy GA, Felner JM, Gessner I, Gordon MS, Mayer JW, et al. Cardiology patient simulator and computer-assisted instruction technologies in bedside teaching. Med Educ. 1990;24:512-517.

Sajid AW, Gordon MS, Mayer JW, Ewy GA, Forker AD, Felner JM, et al. Symposium: a multi-institutional research study on the use of simulation for teaching and evaluating patient examination skills. Annu Conf Res Med Educ. 1980;19:349-358.

Satava RM. Improving anesthesiologist’s performance by simulating reality. Anesthesiology. 1992;76:491-494.

Satava RM. The bio-intelligence age: surgery after the information age. J Gastrointest Surg. 2002;6:795-799.

Satava RM. Virtual reality and telepresence for military medicine. Comput Biol Med. 1995;25:229-236.

Satava RM. Virtual reality surgical simulator: the first steps. Surg Endosc. 1993;7:203-205.

Satava RM. Virtual reality, telesurgery and the new world order of medicine. J Image Guid Surg. 1995;1:12-16.

Satava RM, Fried MP. A methodology for objective assessment of errors: an example using an endoscopic sinus surgery simulator. Otolaryngol Clin North Am. 2002;35:1289-1301.

Schreiner RL, Stevens DC, Jose JH, Gosling CG, Sternecker L. Infant lumbar puncture: a teaching simulator. Clin Pediatr (Phila). 1981;20:298-299.

Schreiner RL, Gresham EL, Escobedo MB, Gosling CG. Umbilical vessel catheterization: a teaching simulator. Clin Pediatr (Phila). 1978;17:506-508.

Schreiner RL, Gresham EL, Gosling CG, Escobedo MB. Neonatal radial artery puncture: a teaching simulator. Pediatrics. 1977;59(Suppl):1054-1056.

Sedlack RE, Kolars JC. Colonoscopy curriculum development and performance-based assessment criteria on a computer-based endoscopy simulator. Acad Med. 2002;77:750-751.

Senior MA, Southern SJ, Majumder S. Microvascular simulator—a device for micro-anastomosis training. Ann R Coll Surg Engl. 2001;83:358-360.

Shapiro SJ, Gordon LA, Daykhovsky L, Senter N. The laparoscopic hernia trainer: the role of a life-like trainer in laparoendoscopic education. Endosc Surg Allied Technol. 1994;2:66-68.

Shapiro SJ, Paz-Partlow M, Daykhovsky L, Gordon LA. The use of a modular skills center for the maintenance of laparoscopic skills. Surg Endosc. 1996;10:816-819.

Shekhter I, Ward D, Stern D, Papadakos DJ, Jenkins JS. Enhancing a patient simulator to respond to PEEP, PIP, and other ventilation parameters. Presented at the Society for Technology in Anesthesia Annual Meeting, San Diego, 1999.

Sherman KP, Ward JW, Wills DP, Mohsen AM. A portable virtual environment knee arthroscopy training system with objective scoring. Stud Health Technol Inform. 1999;62:335-336.

Shimada Y, Nishiwaki K, Cooper JB. Use of medical simulators subject of international study. J Clin Monit Comput. 1998;14:499-503.

Sica G, Barron D, Blum R, Frenna TH, Raemer DB. Computerized realistic simulation: a teaching module for crisis management in radiology. AJR Am J Roentgenol. 1999;172:301-304.

Smith CD, Farrell TM, McNatt SS, Metreveli RE. Assessing laparoscopic manipulative skills. Am J Surg. 2001;181:547-550.

Smith CD, Stubbs J, Hananel D. Simulation technology in surgical education: can we assess manipulative skills and what does it mean to the learner. Stud Health Technol Inform. 1998;50:379-380.

Smith S, Wan A, Taffinder N, Read S, Emery R, Darzi A. Early experience and validation work with Procedicus VA—the Prosolvia virtual reality shoulder arthroscopy trainer. Stud Health Technol Inform. 1999;62:337-343.

Sorid D, Moore SK. Computer-based simulators hone operating skills before the patient is even touched: the virtual surgeon. Comput Graphics. 2000;21:393-404.

Stallkamp J, Wapler M. Development of an educational program for medical ultrasound examinations: Ultra Trainer. Biomed Tech (Berl). 1998;43(Suppl):38-39.

Stallkamp J, Wapler M. UltraTrainer—a training system for medical ultrasound examination. Stud Health Technol Inform. 1998;50:298-301.

Stone RJ, McCloy RF. Virtual environment training systems for laparoscopic surgery at the UK's Wolfson Centre for Minimally Invasive Surgery. J Med Virtual Reality. 1996;1:42-51.

Stredney D, Sessanna D, McDonald JS, Hiemenz L, Rosenberg LB. A virtual simulation environment for learning epidural anesthesia. Stud Health Technol Inform. 1996;29:164-175.

Sutcliffe R, Evans A. Simulated surgeries—feasibility of transfer from region to region. Educ Gen Pract. 1998;9:203-210.

Sutton C, McCloy R, Middlebrook A, Chater P, Wilson M, Stone R. MIST VR: a laparoscopic surgery procedures trainer and evaluator. Stud Health Technol Inform. 1997;39:598-607.

Szekely G, Bajka M, Brechbuhler C, Dual J, Enzler R, Haller U. Virtual reality based surgery simulation for endoscopic gynaecology. Stud Health Technol Inform. 1999;62:351-357.

Taffinder N. Better surgical training in shorter hours. J R Soc Med. 1999;92:329-331.

Takashina T, Masuzawa T, Fukui Y. A new cardiac auscultation simulator. Clin Cardiol. 1990;13:869-872.

Takashina T, Shimizu M, Katayama H. A new cardiology patient simulator. Cardiology. 1997;88:408-413.

Takuhiro K, Matsumoto H, Mochizuki T, Kamikawa Y, Sakamoto Y, Hara Y, et al. Use of dynamic simulation for training Japanese emergency medical technicians to compensate for lack of training opportunities. Presented at the International Meeting on Medical Simulation, San Diego, 2003.

Tasto JL, Verstreken K, Brown JM, Bauer JJ. PreOp endoscopy simulator: from bronchoscopy to ureteroscopy. Stud Health Technol Inform. 2000;70:344-349.

Taylor L, Vergidis D, Lovasik A, Crockford P. A skills programme for preclinical medical students. Med Educ. 1992;26:448-453.

Tendick F, Downes M, Cavusoglu CM, Gantert W, Way LW. Development of virtual environments for training skills and reducing errors in laparoscopic surgery. In: Boger MS, Charles ST, Grundfest WS, Harrington JA, Katzir A, Lome LS, et al, editors. Proceedings of Surgical Assist Systems. Bellingham, WA: SPIE Optical Engineering Press; 1998:36-44.

Thomas WE, Lee PW, Sunderland GT, Day RP. A preliminary evaluation of an innovative synthetic soft tissue simulation module (Skilltray) for use in basic surgical skills workshops. Ann R Coll Surg Engl. 1996;78(6):268-271.

Tooley MA, Forrest FC, Mantripp DR. MultiMed—remote interactive medical simulation. J Telemed Telecare. 1999;5(1):S119-S121.

Ursino M, Tasto JL, Nguyen BH, Cunningham R, Merril GL. CathSim: an intravascular catheterization simulator on a PC. Stud Health Technol Inform. 1999;62:360-366.

Vahora F, Temkin B, Marcy W, Gorman PJ, Krummel TM, Heinrichs WL. Virtual reality and women’s health: a breast biopsy system. Stud Health Technol Inform. 1999;62:367-372.

Varghese D, Patel H. An inexpensive and easily constructed laparoscopic simulator. Hosp Med. 1998;59:769.

Verma D, Wills D, Verma M. Virtual reality simulator for vitreo-retinal surgery. Eye. 2003;17:71-73.

Wagner C, Schill M, Hennen M, Manner R, Jendritza B, Knorz MC, et al. [Virtual reality in ophthalmological education]. Ophthalmologe. 2001;98:409-413. (in German).

Waikakul S, Vanadurongwan B, Chumtup W, Assawamongkolgul A, Chotivichit A, Rojanawanich V. A knee model for arthrocentesis simulation. J Med Assoc Thai. 2003;86:282-287.

Walsh MS, Macpherson D. The Chichester diagnostic peritoneal lavage simulator. Ann R Coll Surg Engl. 1998;80:276-278.

Wang Y, Chui C, Lim H, Cai Y, Mak K. Real-time interactive simulator for percutaneous coronary revascularization procedures. Comput Aided Surg. 1998;3:211-227.

Webster RW, Zimmerman DI, Mohler BJ, Melkonian MG, Haluck RS. A prototype haptic suturing simulator. Stud Health Technol Inform. 2001;81:567-569.

Weidenbach M, Wild F, Scheer K, Muth G, Kreutter S, Grunst G, et al. Computer-based training in two-dimensional echocardiography using an echocardiography simulator. J Am Soc Echocardiogr. 2005;18:362-366.

Wentink M, Stassen LP, Alwayn I, Hosman RJ, Stassen HG. Rasmussen’s model of human behavior in laparoscopy training. Surg Endosc. 2003;17:1241-1246.

Whalley LJ. Ethical issues in the application of virtual reality to medicine. Comput Biol Med. 1995;25:107-114.

Wiet GJ, Stredney D. Update on surgical simulation: the Ohio State University experience. Otolaryngol Clin North Am. 2002;35:1283-1288, viii.

Wiet GJ, Stredney D, Sessanna D, Bryan JA, Welling DB, Schmalbrock P. Virtual temporal bone dissection: an interactive surgical simulator. Otolaryngol Head Neck Surg. 2002;127:79-83.

Williams CB, Saunders BP, Bladen JS. Development of colonoscopy teaching simulation. Endoscopy. 2000;32:901-905.

Wilson MS, Middlebrook A, Sutton C, Stone R, McCloy RF. MIST VR: a virtual reality trainer for laparoscopic surgery assesses performance. Ann R Coll Surg Engl. 1997;79:403-404.

Articles Touching on the Theme “The Canary in the Mineshaft”

The next group of articles shows how a Simulator functions as “The Canary in the Mineshaft.” The simulator uncovers clinical weaknesses. By extension, then, once you uncover a clinical weakness you can correct the weakness. Correct the weakness, improve the clinician, improve the care for our patients.

In the previous group of articles, the “It Stands to Reason” articles, you had to make a leap of faith to “buy into” Simulators. You had to say, “It stands to reason Simulators are a good thing, so we should lay out a lot of resources to support a Simulator.” In this batch of articles, you also have to make a leap of faith. You have to say, “The Simulator functions as a canary in a mineshaft, so it can lead to better patient outcomes.”

Simulator as a canary in the mineshaft → better outcome

That’s quite a long jump. Instead, we’re stuck with a multijump argument.

Teach in the simulator → uncover weakness → correct weakness → achieve better outcome

There’s a lot of implied benefit and supposed improvement—you hope that’s how it works out in the end. But that, alas, is where we stand right now, at least with these articles. So read on, and see about that valiant canary, braving deadly fumes in the mineshaft.

BARSUK D, ZIV A, LIN G, BLUMENFELD A, RUBIN O, KEIDAN I, ET AL. Using advanced simulation for recognition and correction of gaps in airway and breathing management skills in prehospital trauma care. Anesth Analg 2005;100:803–9.

Right now Israel and Denmark are moving toward Simulator scenarios as part of their board certification process, so their views have some heft. (“Ready or not, here we come!” the Simulators seem to be saying to us.)

A group of 72 postinternship doctors was divided into two groups of 36, and both groups went through two trauma scenarios (one with the HPS and one with SimMan). The most common airway management mistakes of the first, non-Simulator-trained group were used to develop a 45-minute additional airway training session for the second group of 36. Those trained in the Simulator did better.

In this study, the Simulator came across once again as the canary in the mineshaft. Here, these postinternship doctors, who should know something, were making all kinds of mistakes.

And voila! The Simulator reveals all. Maybe we should call Simulators “truth detectors.”

What this study showed was that Simulators are great intermediate trainers. Simulator-trained people do better in the Simulator world. Does that translate into the real world? Maybe so, maybe no. For example, several doctors made the mistake of not giving drugs before intubating the Simulator. So you might be tempted to say, “In the real world, with a real patient, they would make the exact same mistake.” Well, no. In the real world, the patient would bite down and resist—something the Simulator can’t do. The authors noted the need for (1) studies that demonstrate transfer of skills from simulation to reality and (2) studies that determine the rate of skills degradation over time, so we can set the correct frequency of training. The appendix in the article includes checklists of specific actions essential for safe treatment and successful outcome of severe chest trauma and severe head trauma.

BERKENSTADT H, KANTOR GS, YUSIM Y, GAFNI N, PEREL A, EZRI T, ET AL. Feasibility of sharing simulation-based evaluation scenarios in anesthesiology. Anesth Analg 2005;101:1068–74.

Everything else is globalized, so why not anesthesia scenarios? Dr. Berkenstadt and the Tel Hashomer gang snagged four scenarios from Dr. Schwid.

A group of 31 junior anesthesia residents ran through the gauntlet of those four scenarios. They liked them and rated the scenarios as quite realistic. Graders trotted out their checklists, reviewed the videotapes, and passed Solomonic judgment upon the residents. It worked.

Oh, a little sidelight. The Israeli residents did better than the American residents! Dr. Berkenstadt graciously explains this away, saying our two systems are different and that the Israeli residents may have had more experience in their home countries before immigrating to Israel. The heck you say! Israel kicked our butt, fair and square.

Now it’s time for us to whip our people into shape. Let me at a resident. I’ll teach him a thing or two. I demand a rematch! The World Cup of Simulation. Bring it on!

BYRNE AJ, JONES JG. Inaccurate reporting of simulated critical anaesthetic incidents. Br J Anaesth 1997;78:637–41.

Byrne had previously shown that trainees often misinterpret data presented during a simulated case and make numerous errors when describing their actions. In this study, the authors wanted to determine whether those inaccuracies carry over into the formal paperwork: the anesthetic record chart and the critical incident report.

Why wait for real cases to see how trainees react when we have Simulators to serve as the canary? Eleven trainees (3 to 8 years of clinical experience) entered a simulated case using the ACCESS Simulator. The case was a young patient undergoing an ankle repair. They faced two “crises”—an episode of bradycardia followed by an episode of anaphylaxis with bronchospasm and hypotension. The authors evaluated participants’ ability to record their actions and their accuracy when documenting the two complications in an incident report.

What happened? For the bradycardia episode, 3 of 11 failed to record the event on their paper chart, and 2 of 11 failed to record their treatment of the arrhythmia. Only 4 of the 11 trainees mentioned bradycardia in the critical incident report, and only 1 of the participants accurately documented this event. For the bronchospasm and hypotensive event, the results were worse—none of the trainees mentioned that the arterial pressure had been normal prior to the event, and only 2 of the 11 accurately described the event.

The authors urge caution when studying anesthetic emergencies—much of what we know about their diagnosis and treatment has been built from analysis of critical incident forms, and this study showed that information derived from that source may not reflect actual events. How can we solve this dilemma? Byrne offers, “automated recording of monitoring and videotaping of the case would seem to provide the best solution, but this is unlikely to receive widespread acceptance and has significant cost implications.” You bet it does … there is a high price to pay if our main source of data is full of errors. This time the medical record may also be a canary.

BYRNE AJ, SELLEN AJ, JONES JG. Errors on anaesthetic record charts as a measure of anaesthetic performance during simulated critical incidents. Br J Anaesth 1998;80:58–62.

Byrne and colleagues described “mental workload” as the conscious effort required to carry out a complex task. Experts exert relatively low mental workload while carrying out complex tasks, whereas high mental workload is typical of novices and of those who lose control when faced with stressful, complicated situations. Anesthesiology often requires one to focus on multiple tasks at once. Studies in aviation have shown that low mental workload allows an experienced pilot to carry out both primary tasks (highest priority) and secondary tasks (lower priority). Byrne argues that one’s ability to carry out secondary tasks is therefore a measure of one’s mental workload.

Rather than use a rater’s subjective opinion of residents’ ability, Byrne and colleagues used the record chart from a simulated anesthetic case as a reflection of the secondary tasks (the primary task was managing the patient). Ten trainees went through a simulated case using the ACCESS simulator. It involved a 25-year-old woman undergoing ACL repair. All trainees were exposed to the same 25-minute scenario in the same sequence.

Throughout the case, and for a few minutes after the scenario ended, participants completed the record chart to document the events and data of the case.

What happened? As expected, all trainees treated their “patient” appropriately; however, more than 20% of the values recorded by the participants were in error by more than 25% of the actual values. There was high variability among participants and within the same participant. Two lessons resulted from this study.

It is better to find out that trainees make errors in chart recording during simulated cases rather than waiting for a retrospective investigation of an adverse event.
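To make that finding concrete, here is a minimal Python sketch (ours; the paper publishes no code) of how a chart-error rate like “more than 20% of values in error by more than 25%” could be computed by comparing the simulator’s log against the trainee’s handwritten chart. The heart-rate values and the 25% tolerance below are hypothetical.

    def fraction_grossly_wrong(actual, charted, tolerance=0.25):
        # Fraction of charted values differing from the simulator-logged
        # values by more than the tolerance (as a proportion of the actual value).
        pairs = list(zip(actual, charted))
        errors = sum(abs(c - a) / a > tolerance for a, c in pairs if a != 0)
        return errors / len(pairs)

    # Hypothetical heart rates: what the simulator logged vs. what got charted.
    logged  = [72, 40, 38, 95, 130, 128]
    charted = [75, 60, 40, 90,  80, 125]  # the 40->60 and 130->80 entries are >25% off

    print(f"{fraction_grossly_wrong(logged, charted):.0%} of charted values off by >25%")

With these invented numbers, the sketch reports 33% of charted values grossly wrong; the real study arrived at its figures from actual simulator logs and trainee charts.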

DEANDA A, GABA DM. Unplanned incidents during comprehensive anesthesia simulation. Anesth Analg 1990;71:77–82.

DeAnda and Gaba smoked out a few problems while running their Simulators. (This shows what happens when clever people leap into a new field and keep their eyes peeled. They didn’t set out to study these incidents, but when the incidents happened DeAnda and Gaba were alert to the implications. Fate favors the prepared mind.)

Errors during the simulator scenarios were most often human errors—a lot of them document-fixation errors, that is, fixating on the paper record instead of the patient. (Damn! I’m forever telling residents to worry about the record at the end of the case, when the patient is safely in the hands of the PACU nurse. Take care of the patient first!)

What did those silly bunnies do? Forgot to turn the ventilator back on after hand-ventilating, swapped syringes, turned the stopcock the wrong way. You name it, they found a way to mess it up.

The simulator uncovered mistakes galore. This was one overworked canary. When you see the mistakes they made, it does not become such a gigantic leap of faith to think you could:

Run Simulator → see mistakes made → correct mistakes → prevent repeat of mistake → protect patient from harm

GABA DM, DEANDA A. The response of anesthesia trainees to simulated critical incidents. Anesth Analg 1989;68:444–51.

One of the first studies by Gaba revealed that our residents may not be as good as we presumed. He and DeAnda sent 19 first- and second-year anesthesia residents through five scenarios on their Simulator.

All of the simulations were videotaped and reviewed. The authors measured the response time to detect and initiate correction of the problems. All kinds of errors were made—here are just a few of the most common.

Although second-year residents tended to correct problems faster than the first-year “novices,” there was wide variation in each group. Many in the first year did well, and a few second-year residents did poorly. The authors note, “the imperfect behavior of the outliers may be more meaningful than the mean performance of the group.”

Not all mines were dangerous, but the canaries identified the ones that were—not all anesthesia residents are dangerous, but the Simulator can identify the ones that may be.

GARDI T, CHRISTENSEN UC, JACOBSEN J, JENSEN PF, ORDING H. How do anaesthesiologists treat malignant hyperthermia in a full-scale anaesthesia simulator? Acta Anaesthesiol Scand 2001;45:1032–5.

The Danish team is at it again … this time they studied 32 teams from several university and community hospitals (each team: 1 anesthetist, 9 years’ experience; 1 nurse anesthetist, 8 years’ experience). The authors evaluated the teams on their ability to correctly diagnose and manage a case of malignant hyperthermia according to national guidelines. The 25- to 30-minute scenario consisted of a “routine” case that gradually evolved into fulminant malignant hyperthermia over 15 minutes.

How did the teams do?

An important finding in this study was that the cause of undermanagement was practical rather than cognitive—the teams knew what to do; they just did not execute. The authors concluded that “practical training in full-scale simulators can become a useful part of training for complex treatment procedures.” Yes! These canaries are singing, and we are listening!

HAMMOND J, BERMANN M, CHEN B, KUSHINS L. Incorporation of a computerized human patient simulator in critical care training: a preliminary report. J Trauma 2002;53:1064–7.

It turns out that anesthesiologists are not the only ones who make mistakes. Hammond and his colleagues evaluated eight second-year surgery residents during their critical care rotation. They put the residents through three scenarios on a full-patient Simulator.

Each participant was evaluated on a minimum of 13 preselected tasks. So how did these surgeons do?

This study showed that we have problems not only with the training of our residents (especially regarding tension pneumothorax) but also with our evaluations. How can the resident who saved the patient the fastest have the lowest score? That is the main weakness of checklists—they reward methodical practice but penalize efficiency, and experts always know how to take shortcuts. The solution: add a global rating scale and measure the timing of decision making.
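To see the checklist problem in miniature, here is a small Python sketch (ours, not Hammond’s; every task name, weight, and number is hypothetical). A pure checklist ranks the slow-but-thorough resident above the expert who goes straight to the life-saving needle decompression; blending in a timing term flips the ranking.

    CHECKLIST = [
        "assess airway",
        "give oxygen",
        "order chest x-ray",
        "auscultate chest",
        "needle decompression",  # the life-saving step for tension pneumothorax
    ]

    def checklist_score(actions_done):
        # One point per checklist item performed, regardless of speed.
        return sum(item in actions_done for item in CHECKLIST)

    def combined_score(actions_done, minutes_to_decompression, weight=0.7):
        # Blend checklist completeness with timeliness of the critical action.
        completeness = checklist_score(actions_done) / len(CHECKLIST)
        timeliness = 1.0 / (1.0 + minutes_to_decompression)  # faster -> closer to 1
        return (1 - weight) * completeness + weight * timeliness

    residents = {
        "expert": ({"assess airway", "needle decompression"}, 1.0),  # fast
        "novice": (set(CHECKLIST), 9.0),                             # thorough, slow
    }

    for name, (actions, minutes) in residents.items():
        print(name, "checklist:", checklist_score(actions),
              "combined:", round(combined_score(actions, minutes), 2))

With these numbers the checklist alone scores the novice 5 to the expert’s 2, while the combined score ranks the expert higher (0.47 versus 0.37). How to weight timeliness is exactly the kind of judgment call a global rating scale hands back to the examiner.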

The authors make an important concluding remark: the true value of Simulators lies less in their ability to assess individuals than in their ability to uncover deficiencies in training programs.

JACOBSEN J, LINDEKAER AL, OSTERGAARD HT, NIELSEN K, OSTERGAARD D, LAUB M, ET AL. Management of anaphylactic shock evaluated using a full-scale anaesthesia simulator. Acta Anaesthesiol Scand 2001;45:315–19.

A total of 42 anesthetists in Denmark went through a Simulator session involving an anaphylactic reaction to a drug. Guess what? “Something’s rotten in the state of Denmark.” (I just had to say that.) Nobody pegged it during the first 10 minutes, and only 6 of 21 teams (the 42 people were divided into 21 two-person teams) ever even considered the right diagnosis. And those people needed hints! Ay Chihuahua, or maybe ay Copenhagen.

Either the Simulator didn’t do a good job “conveying” anaphylaxis (the old validity question rears its head again), or no one is teaching anesthesiologists in Denmark to diagnose and treat anaphylaxis. The confounders with anaphylaxis during anesthesia are as follows:

But the conclusion from this was pretty clear: we need to be better prepared to deal with anaphylaxis because right now we’re not. (As you pound through these articles, you can draw whatever conclusion you want. I myself, again and again, see the Simulator as the great “revealer of our teaching inadequacies.”)

LINDEKAER AL, JACOBSEN J, ANDERSEN G, LAUB M, JENSEN PF. Treatment of ventricular fibrillation during anaesthesia in an anaesthesia simulator. Acta Anaesthesiol Scand 1997;41:1280–4.

This is one of the earlier studies from Denmark—this team is not afraid to find out what is wrong with their trainees and is determined to do something about it. The authors point out again that 70% to 80% of accidents in anesthesia are a result of human error. One very serious error is mismanaging ventricular fibrillation.

A group of 80 participants was divided into 40 teams, each comprising one anesthetist and one nurse anesthetist. Each session was videotaped, and although participants knew something was going to happen during the simulation, they did not know what “it” would be. Seven minutes into an uncomplicated case involving a middle-aged man with a gastric tumor, the patient developed ventricular fibrillation.

Teams were evaluated on whether they followed the European Resuscitation Council guidelines for ventricular fibrillation. How well prepared were these teams for this important “emergency”? None of the 40 teams followed the published guidelines, and management varied widely and inconsistently despite said guidelines. Two of the 40 teams did not administer any shocks to the patient, and 27% of the teams did not give the full three shocks. They committed other mistakes as well.
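For flavor, here is a minimal Python sketch (ours; the Danish team scored videotapes by hand) of the kind of adherence check involved. The only guideline element encoded is the one mentioned above, the initial series of three shocks, and the event logs are hypothetical.

    def gave_initial_three_shocks(event_log):
        # True if the first three therapies attempted were all defibrillations.
        therapies = [e for e in event_log if e != "diagnose VF"]
        return therapies[:3] == ["shock", "shock", "shock"]

    team_a = ["diagnose VF", "shock", "shock", "shock", "CPR"]
    team_b = ["diagnose VF", "shock", "CPR"]  # stopped after one shock

    for name, log in [("team A", team_a), ("team B", team_b)]:
        print(name, "gave the full initial three shocks:", gave_initial_three_shocks(log))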

The authors concluded that better education and training are needed for common skills such as ACLS, and Simulators are well suited for this.

MARSCH SCU, TSCHAN F, SEMMER N, SPYCHIGER M, BREUER M, HUNZIKER PR. Performance of first responders in simulated cardiac arrests. Crit Care Med 2005;33:963–7.

Many of these studies involve “mines” located in the operating room, but what happens during critical events “on the floor”? Because nurses are the ones who actually spend the most time with patients, they are usually the first responders—having to page the resident, who is either eating or napping. That is just what Marsch and colleagues studied: they enrolled 20 ICU teams, each comprising three nurses and a stand-by resident. Each team responded to a case on the Simulator—a 67-year-old man with an acute myocardial infarction who had just undergone successful angioplasty of the right coronary artery and was being sent to the ICU. The patient soon had a cardiac arrest from pulseless ventricular tachycardia. The teams had to respond.

Although the nurses called the resident promptly to help diagnose the problem faster, there was considerable delay in basic life support (they teach that to babysitters), which resulted in chest compressions occurring less than 25% of the time. (As an aside, Dr. Gordon Ewy from the University of Arizona is on a crusade—well ahead of the American Heart Association—arguing that in a cardiac arrest you should forget about the two breaths and the AED and just start compressions, 100 per minute; this saves lives!) Back to our story … 33% of the teams failed to provide an adequate number of shocks, and 8 of 20 teams failed to give epinephrine.

The authors noted that the first responders failed to build an effective team structure that would ensure effective management of the patient. This may reflect a cultural attitude in which nurses are reluctant to assume a leadership role in the presence of a resident. The same attitude pervaded aviation until the 1980s, when a couple of plane accidents occurred because flight attendants did not think “it was their place” to bother the pilot about ice on the wings or an engine on fire.

Before they viewed themselves on videotape, the teams thought they had done pretty well. None of the participants had realized or recalled the unnecessary interruptions in basic life support! We call this unconscious incompetence. So much for those code flow sheets and incident reports accurately reflecting what happens. But without these important studies, we would not be moving forward.

MORGAN PJ, CLEAVE-HOGG D, DESOUSA S, TARSHIS J. Identification of gaps in the achievement of undergraduate anesthesia educational objectives using high-fidelity patient simulation. Anesth Analg 2003;97:1690–4.

Tweet tweet! The Simulator uncovered the failings and frailties of 165 medical students in this study. What did the simulator unmask?

Um, this study raises the question: just what, precisely, did the students do? Did the students themselves have a pulse? The authors point out, as we have repeatedly, that residents also make these mistakes. Now that we understand no one is competent, let’s do something about it. Morgan and her colleagues have completely overhauled their anesthesia training program for medical students. What more can you ask?

MORGAN PJ, CLEAVE-HOGG D. Evaluation of medical students’ performance using the anaesthesia simulator. Med Educ 2000;34:42–5.

Not so much a canary uncovering specific mistakes here (“they blew it on the intubation”) as using the Simulator as an overall evaluation tool. Dr. Morgan in Toronto said, “Let’s use the simulator on 24 medical students, run them through the gauntlet (RSI, treating hypoxemia, managing hypovolemia, treating anaphylaxis), and see if we can use this as our testing technique.” Results were a little muddy, truth to tell. Their “simulator grade” did not correlate with their “clinical grade” (how they were rated on the clerkship by the people who worked with them in the real OR).

Hmmm. Simulator as “grading canary”? This becomes problematic. (Too bad, right when you’re on a roll and you think Simulators are perfect in every way, something like this comes along, throwing a wrench in the works, or, more precisely, a wrench in the canary cage.)

OLSEN JC, GURR DE, HUGHES M. Video analysis of emergency medicine residents performing rapid sequence induction. J Emerg Med 2000;18:469–72.

This is not a simulation study but a kind of “canary-esque” training study. To uncover intubation errors, Dr. Olsen and his Chicago buddies videotaped emergency medicine residents during intubations. (By extension to Simulator-land, we use a lot of videotaping to uncover mistakes.) Lo and behold, 45% of the residents don’t do the Sellick maneuver right, and 34% don’t use the all-important end-tidal carbon dioxide detector to make sure the tube is in the right place.

Once again, to beat the drum: