Acute care skills in anesthesia practice: a simulation-based resident performance assessment

Anesthesiology. 2004 Nov;101(5):1084-95. doi: 10.1097/00000542-200411000-00007.

Abstract

Background: A recurring initiative in graduate education is to find more effective methods to assess specialists' skills. Life-sized simulators could be used to assess the more complex skills expected in specialty practice if a curriculum of relevant exercises were developed that could be simply and reliably scored. The purpose of this study was to develop simulation exercises and associated scoring methods and determine whether these scenarios could be used to evaluate acute anesthesia care skills.

Methods: Twenty-eight residents (12 junior and 16 senior) managed three intraoperative and three postoperative simulation exercises. In each encounter, designed to recreate an acute perioperative complication, trainees were required to make a diagnosis and intervene. The videotaped performances were scored by six raters. Three raters used a checklist scoring system. Three faculty raters recorded when trainees performed each of three key diagnostic or therapeutic actions during each 5-min scenario. These faculty also provided a global score using a 10-cm line anchored at 0 (unsatisfactory) and 10 (outstanding). The scenarios included (1) intraoperative myocardial ischemia, (2) postoperative anaphylaxis, (3) intraoperative pneumothorax, (4) postoperative cerebral hemorrhage with intracranial hypertension, (5) intraoperative ventricular tachycardia, and (6) postoperative respiratory failure.
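
The abstract does not reproduce the raters' data or scoring instruments, but a minimal sketch of how such multi-source scores might be represented and compared is shown below. The data structures, field names, scoring formulas, and numbers are hypothetical illustrations, not the study's methods or data.

```python
from dataclasses import dataclass
from typing import List
import math

# Hypothetical representation of one rater's scores for one 5-minute scenario.
@dataclass
class ScenarioScores:
    checklist_items_done: int        # checklist items completed (illustrative)
    checklist_items_total: int
    key_action_times_s: List[float]  # seconds until each of 3 key actions (300 s window)
    global_rating: float             # mark on a 10-cm line, read as 0-10

    def checklist_fraction(self) -> float:
        return self.checklist_items_done / self.checklist_items_total

    def key_action_score(self) -> float:
        # Earlier key actions score higher; an unperformed action could be coded as 300 s.
        return sum(1.0 - min(t, 300.0) / 300.0
                   for t in self.key_action_times_s) / len(self.key_action_times_s)

def pearson(x: List[float], y: List[float]) -> float:
    """Plain Pearson correlation, used here to ask whether two scoring
    systems rank the same performances similarly."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative performances only (not the study's data).
performances = [
    ScenarioScores(8, 10, [45, 90, 120], 7.5),
    ScenarioScores(5, 10, [120, 200, 280], 4.0),
    ScenarioScores(9, 10, [30, 70, 110], 8.5),
    ScenarioScores(3, 10, [180, 300, 300], 2.5),
]

checklist = [p.checklist_fraction() for p in performances]
key_action = [p.key_action_score() for p in performances]
global_10 = [p.global_rating for p in performances]

print("checklist vs global rating:", round(pearson(checklist, global_10), 2))
print("key-action timing vs global rating:", round(pearson(key_action, global_10), 2))
```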

Results: The high correlation among the scoring systems and the small variance among raters' scores indicated that all of the scoring systems measured similar performance domains. Although trainees who performed well on one exercise tended to perform well in subsequent scenarios, the scenarios differed considerably in overall difficulty.

Conclusion: This study suggests that simulation can be used to measure the more complex skills expected in specialty training. As in other studies that assess a broad content domain, multiple encounters are needed to estimate skill effectively and accurately.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acute Disease
  • Analysis of Variance
  • Anesthesiology / education
  • Anesthesiology / standards*
  • Clinical Competence / standards*
  • Humans
  • Internship and Residency
  • Manikins
  • Psychometrics
  • Quality Assurance, Health Care / methods*