Simulation based training
Are simulation and didactic crisis resource management (CRM) training synergistic?
J B Cooper

Correspondence to:
J B Cooper, Associate Professor of Anaesthesia, Harvard Medical School, Massachusetts General Hospital, Boston, Mass 02114, USA; Director, Biomedical Engineering, Partners Healthcare System Inc; Executive Director, Center for Medical Simulation; jcooper@partners.org

Simulation may have an additive component to classroom style training, at least in the short term

Full environment simulation is achieving wide adoption despite weak evidence of its impact on outcome. It is doing so because it has strong face validity, because it creates much enthusiasm among both students and teachers, and because it is what other high hazard industries do to mitigate errors and to create and maintain a culture of safety. Yet most of us working with simulation technologies and techniques try to maintain our objectivity. We ask ourselves whether it really does what we think it does, how much fidelity is needed to achieve our educational goals, and how we should weigh the costs and benefits. When we use it for non-technical training such as improving teamwork, we want to understand how it should be used to meet the real objective: creating real, lasting behavior and culture changes that will make health care more effective and safer.

The paper in this issue of QSHC by Shapiro and colleagues1 demonstrates a model for using simulation to sustain behavior change and adds some evidence to bolster our general beliefs. But, as often happens with studies of educational and training interventions, we are left with many more questions than answers and are disappointed by a study that is underpowered, if only slightly. That is not the fault of the investigators, whose underlying methods were an advance over what we usually see in the world of non-technical simulation based training. The fault lies with having so few resources to perform the robust research designs needed, and also with the challenges of doing any research on human performance in naturalistic settings.

What is the utility of high fidelity, high realism simulation based training for non-technical skills and culture change? We have plenty of evidence that those who experience it usually feel strongly that it is important for teaching skills which they do not otherwise experience or practice.2,3 We have anecdotes illustrating how it appears to impact on clinical performance.4 Almost anyone who uses simulation to teach or reinforce teamwork or crisis resource management (CRM, or crew resource management as it is called in aviation) has encountered students who say they altered their fundamental way of doing things and of working with their colleagues. I have heard many of these stories first hand, so I know the passion of those who have had such a transformational experience. That is one of the most useful applications of this form of simulation: transformational change for those who need first to recognize the problem before they can start to work on it. But fundamental, lasting, outcome altering organizational change cannot come from a single intervention of one type. The important illustration from this study is how simulation can be coupled with other CRM training techniques to sustain improvements. Neither simulation based nor non-simulation based training is likely to be effective alone for its intended purpose. Aviation, maritime, and nuclear industries all use combinations of stand-up training and simulation based training to establish and maintain human factors programs intended to minimize error, mitigate the error chain, and enhance performance. We get some tantalizing evidence in this new report that simulation has an additive component, at least in the short term, to classroom style training.

The general methodology used by Shapiro et al is illustrative of the kind of trials needed to produce evidence of transfer-of-training. The groups are randomized, there are sound, validated measures of behavior with a measure of inter-rater reliability, and the raters are blinded to which cohort they are observing. But the study also demonstrates the flaws typical of educational studies: the sample size is too small; it is not linked to patient health outcomes (injury, death, reduced length of stay in hospital); there is no cost/benefit measure; and there are many sub-elements in the independent variable (degree of realism, quality of instruction, time of instruction, time between the MedTeams and simulation training) which can strongly influence the effectiveness of training but are not examined in the experiment.

It is easy to criticize educational studies. I have not personally been involved with a successful one that is up to the standards of the “hard” sciences (which themselves often give us answers that further research later proves wrong). This is difficult work, but it needs to be done, even with the flaws, because each piece of evidence adds something to what we know. We also have to be willing to publish negative results and to validate tools and approaches for studying simulation.5,6

Regardless of any criticism I might have of studies of simulation, when it comes to adopting simulation as an integral component of creating high reliability healthcare organizations, I accept and promote Gaba’s observation that “... no industry in which human lives depend on the skilled performance of responsible operators has waited for unequivocal proof of the benefits of simulation before embracing it.”7 Why should health care be different?

REFERENCES

Footnotes

  • Competing interest disclaimer: The author is Executive Director of the center in which the simulation training described in this report was conducted but had no direct involvement in the study.
