Developing a tool for assessing competency in root cause analysis

Jt Comm J Qual Patient Saf. 2009 Jan;35(1):36-42. doi: 10.1016/s1553-7250(09)35006-0.

Abstract

Background: Root cause analysis (RCA) is a tool for identifying the key cause(s) contributing to a sentinel event or near miss. Although training in RCA is gaining popularity in medical education, there is no published literature on valid or reliable methods for assessing competency in RCA.

Methods: A tool for assessing competency in RCA was pilot tested as part of an eight-station Objective Structured Clinical Examination conducted at the completion of a three-week quality improvement (QI) curriculum for the Mayo Clinic Preventive Medicine and Endocrinology fellowship programs. As part of the curriculum, fellows completed a QI project to enhance physician communication of the diagnosis and treatment plan at the end of a patient visit. Fellows attended a didactic session on RCA, mapped the flow of information at the project clinic, and then conducted an actual RCA using an Ishikawa fishbone diagram. For the RCA competency assessment, fellows performed an RCA on a scenario describing an adverse medication event and proposed solutions to prevent such errors in the future.

Results: All faculty strongly agreed or agreed that they were able to accurately assess competency in RCA using the tool. Interrater reliability was 0.96 for the global competency rating and 0.85 for checklist scoring. Internal consistency (Cronbach's alpha) was 0.76. Six of the eight fellows found the difficulty level of the test to be optimal.
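For readers unfamiliar with the internal-consistency statistic reported above, Cronbach's alpha is conventionally defined from the variances of the individual checklist items and of the total score; the item-level data are not given in the abstract, so the expression below is only the standard definition, not the authors' calculation:

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)$$

where $k$ is the number of checklist items, $\sigma^{2}_{Y_i}$ is the variance of scores on item $i$, and $\sigma^{2}_{X}$ is the variance of the total checklist scores across examinees.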

Discussion: Assessment methods must accompany education programs to ensure that graduates are competent in QI methodologies and are able to apply them effectively in the workplace. The RCA assessment tool was found to be a valid, reliable, feasible, and acceptable method for assessing competency in RCA. Further research is needed to examine its predictive validity and generalizability.

MeSH terms

  • Curriculum
  • Education, Medical, Graduate
  • Educational Measurement
  • Fellowships and Scholarships
  • Humans
  • Inservice Training*
  • Medical Errors / prevention & control*
  • Minnesota
  • Physicians
  • Pilot Projects
  • Professional Competence*
  • Quality Assurance, Health Care*
  • Risk Management / methods*