
Effect of a quality improvement curriculum on resident knowledge and skills in improvement
Lisa M Vinci(1), Julie Oyler(1), Julie K Johnson(2), Vineet M Arora(1)

  1. Section of General Medicine, Department of Medicine, University of Chicago, Chicago, Illinois, USA
  2. Centre for Clinical Governance Research, University of New South Wales, Sydney, Australia

Correspondence to Dr Lisa M Vinci, University of Chicago, 5841 S Maryland Avenue, MC3051, Chicago, IL 60637, USA; lvinci{at}medicine.bsd.uchicago.edu

Abstract

Background While many residency programmes have implemented quality improvement (QI) training programmes, few have been rigorously evaluated.

Methods Residents at the authors' institution participated in a required course, the Quality Assessment and Improvement Curriculum, during two 1-month-long rotations. The American Board of Internal Medicine (ABIM) Practice Improvement Module (PIM) was used to guide residents through chart reviews for quality measures, surveys of patient satisfaction and an assessment of clinic systems. Residents received 12 h of training in QI skills (ie, using Plan–Do–Study–Act cycles). Residents worked in groups to test the effect of a small QI project of their choosing. Residents completed the Quality Improvement Knowledge Assessment Tool (QIKAT) to assess QI knowledge, and a self-assessment of QI skills. Third-year residents who did not participate in the curriculum served as a historical control group.

Results 87% (26/30) of PGY2s (intervention group) and 83% (24/29) of PGY3 residents (historical controls) completed the self-assessment and QIKAT. PGY2 residents showed a significant improvement in QIKAT scores (Pre: 6.98 (6.23 to 7.72) vs Post: 9.70 (8.92 to 10.50); p<0.001) and in all 12 QI skills. In addition, the post-PGY2 intervention group outperformed the PGY3 historical control group in QIKAT scores (PGY2 Post: 9.59 (8.82 to 10.36) vs PGY3 Control: 7.34 (6.48 to 8.20); p<0.001) and in all QI skills.

Conclusion A QI curriculum using the ABIM PIMs and small-group, resident-chosen QI projects can result in improvements in resident knowledge and self-assessed skills in QI. The use of a historical control group was a helpful way to account for the effects of accumulating experience in the pre-post evaluation of this curriculum.

  • Ambulatory care
  • graduate medical education
  • healthcare quality
  • quality of care
  • teams


Demonstration of competence in quality improvement (QI) is a component of the American Board of Medical Specialties (ABMS) Maintenance of Certification (MOC) Program for US physicians. In addition, QI is a major component of the Accreditation Council for Graduate Medical Education (ACGME) core competencies of Practice-based Learning and Improvement (PBLI) and Systems-based Practice (SBP), which apply to all US physicians in training. Future physicians will need the knowledge and skills required to assess the quality of their clinical practice and then identify and implement needed improvements. Thus, there is an increasing need to identify and implement feasible, accessible, effective and sustainable curricula in QI.

While many internal medicine residency programmes have developed and implemented QI education, few curricula for residents have been rigorously evaluated.1 A systematic review of QI educational interventions which included programmes at all training levels and various providers showed that most published QI curricula show improvements in provider knowledge and confidence in QI.1 However, only four of the 39 studies reported both educational and clinical outcomes, and of these, none addressed internal medicine residency training.1 Furthermore, most evaluations of QI curricula in residency training are pre-post evaluations, and do not account for accumulating resident knowledge and experience during training. Therefore, the aim of this study was to assess the effectiveness of the Quality Assessment and Improvement Curriculum (QAIC), a required curriculum in QI for internal medicine residents, on resident knowledge and skill in QI. We used a historical control group of senior residents who did not participate in the curriculum to account for the effect of accumulating resident experience and knowledge. Clinical results of the curriculum, which showed a beneficial effect on quality of care, have been published previously.2 We hypothesised that the curriculum would result in improved knowledge and self-assessed skills in participating residents, as compared with a historical control group of residents who did not receive it.

Educational curriculum

Incorporating a QI curriculum into residency training poses many challenges. At the faculty-resident interface, these include resident and faculty time constraints, engaging trainees and overseeing multiple simultaneous QI projects. To address these barriers, a four-member faculty team (LV, JO, JJ, VA) was created to teach QAIC during the existing ambulatory rotation, on which PGY2 residents spend two 1-month blocks per year, meeting weekly for 90 min sessions. Using a team of faculty allows at least one instructor to be available to teach any given session. To meet the need for a reliable and valid practice assessment tool, residents use an off-the-shelf product, the American Board of Internal Medicine (ABIM) Practice Improvement Modules (PIMs), which are evidence-based, web-based practice assessment tools initially developed for MOC and adapted for use by resident trainees. To optimise engagement and minimise the number of concurrent projects, residents work in teams on a project of their choice, implemented in their own continuity clinic.

During the residents' first rotation, QAIC faculty teach principles of quality assessment and improvement, and residents collect clinical practice data using the Clinical Preventive Services (CPS) PIM, which is broadly applicable to all resident continuity patients. To complete the PIM, residents review a prospective sequential sample of five of their patient charts for compliance with approximately 25 US Preventive Services Task Force-based screening and preventive care indicators, survey five of their patients about the quality of care they receive from the clinic and complete a systems survey of the clinic. The chart reviews are completed with a web-based abstraction tool which is integrated into the ABIM PIM. The patient satisfaction survey includes questions such as ‘In the past 12 months, how much of a problem has it been to get a prescription refill from this practice?’ and ‘How is this practice at reminding you to get tests to screen for cancer?’ The systems survey directs the residents through a review of the ambulatory clinical system in which they practise and includes questions which assess available information technology, QI activities within the clinic, and support for care management and patient education. The data are submitted electronically to the ABIM, which returns a report to the course directors; the report is reviewed with the residents, and potential areas for improvement are discussed. During the second rotation, residents receive formal instruction in process mapping, writing aim statements and using the Plan–Do–Study–Act (PDSA) cycle.3 The resident team then chooses an area for improvement, develops a specific aim statement and designs and implements a small QI project focusing on a PIM measure. Projects that the residents implemented included an initiative to document body mass index for all patients, improvement of the referral process for tobacco cessation counselling and development of a preventive health screening form. A detailed description of the curriculum and the resulting QI projects has been published previously.2

Methods

The PGY2 residents participated in the curriculum between July 2006 and June 2007. To evaluate the effect of the curriculum on resident knowledge, the Quality Improvement Knowledge Assessment Tool (QIKAT) developed by Ogrinc and colleagues was administered to PGY2 residents on the first day of the curriculum and again upon completion.4 The QIKAT consists of three short case scenarios: a primary care practitioner trying to improve the quality of care provided for diabetics, an emergency department with inadequate bed access and a nephrology consult service with inefficient rounding practices. The resident reviews each scenario, determines an appropriate focus for improvement and then answers the following three open-ended questions for each case: (1) What would be the aim of the improvement activity? (2) What would you measure to assess the situation? (3) Identify one change that might be worth testing. The answers were graded by three independent blinded raters (LV, JO and JJ) using a standardised scoring system (box 1), out of a possible 5 points per scenario for a maximum score of 15. A research assistant assigned an encrypted code to each QIKAT to indicate whether it was from the pre or post period and whether it came from the intervention or the historical control group. QIKAT scores were entered by faculty into a spreadsheet with the identifier, which was then decoded by the research assistant prior to analysis by one faculty investigator who was not involved in the QIKAT scoring (VA). A random sample of 10 scenarios was graded jointly to establish inter-rater reliability, with a resulting κ of >0.8.
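As a side note for readers replicating this reliability check, chance-corrected agreement between raters on categorical rubric scores is commonly quantified with Cohen's κ. The sketch below is purely illustrative: the `cohens_kappa` helper and the rater scores are hypothetical, not the study's data, and Cohen's κ applies to one rater pair at a time (agreement among three raters is usually summarised pairwise or with Fleiss' κ).

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of exact agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal score frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical per-scenario scores (0-5) from two raters
a = [5, 4, 4, 3, 5, 2, 4, 3, 5, 4]
b = [5, 4, 3, 3, 5, 2, 4, 3, 5, 4]
print(round(cohens_kappa(a, b), 2))  # → 0.86
```

With nine agreements out of ten scenarios, κ lands above the 0.8 threshold the study reports; raw percentage agreement alone would overstate reliability because some agreement is expected by chance.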

Box 1 Scoring sheet for Quality Improvement Knowledge Assessment Tool as used by University of Chicago Quality Assessment and Improvement Curriculum

Scoring for each scenario

  • 1 point for a good aim; 2 points for an excellent aim

  • 1 point for a good measure

  • 1 point for a feasible intervention

  • 1 point if all answers are related

  • Total: 5 points possible per scenario

  • Total: 15 points for the Quality Improvement Knowledge Assessment Tool (three scenarios)
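The rubric's arithmetic can be made concrete with a small scoring helper (the function and the example scores below are hypothetical illustrations, not study data):

```python
def score_scenario(aim_points, good_measure, feasible_intervention, answers_related):
    """Score one QIKAT scenario per the box 1 rubric (maximum 5 points).

    aim_points: 0 (inadequate), 1 (good) or 2 (excellent).
    Each remaining criterion is worth 1 point.
    """
    assert aim_points in (0, 1, 2)
    return (aim_points + int(good_measure)
            + int(feasible_intervention) + int(answers_related))

# Total over the three scenarios (maximum 15)
total = sum([
    score_scenario(2, True, True, True),    # 5: excellent aim, all criteria met
    score_scenario(1, True, False, True),   # 3: good aim, intervention not feasible
    score_scenario(1, False, True, False),  # 2: good aim, feasible intervention only
])
print(total)  # → 10
```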

A self-assessment of 12 core QI skills was also administered to PGY2 residents before and after completion of the curriculum (figure 1). Residents were asked to rate their comfort level with 12 core QI skills, such as writing an aim statement and using small cycles of change, on a four-point Likert scale from 1 (not at all comfortable) to 4 (extremely comfortable). Open-ended feedback regarding the value of the course was also elicited. To account for the effect of accumulating experience, these evaluation tools were administered once, in spring 2007, to graduating PGY3 residents who had not participated in the curriculum and who constituted a historical control group (figure 1). The PGY3 residents in the control group received a short introductory lecture on QI after completing the assessment tools.

Figure 1

Study design: pre-post and PGY3 historical controls.

Residents also rated their satisfaction with teaching about QI on a scale from 1 (very unsatisfied) to 5 (very satisfied) on an end-of-the-year clinic satisfaction survey.

Data analysis

Three main analyses were used to assess the effect of QAIC on PGY2 (participant) knowledge and self-assessed skills. First, pre- and postcurriculum PGY2 performance on the QIKAT (figure 2) and the QI skills self-assessment (table 1) was compared, using paired t tests for QIKAT scores and paired Wilcoxon signed-rank tests for QI skills self-assessment scores. Second, postcurriculum PGY2 performance was compared with that of the PGY3 controls who did not receive the curriculum (figure 1); since these data came from two different groups, unpaired t tests were used for QIKAT scores and χ2 tests for the self-assessment of QI skills. Third, PGY3 performance was compared with precurriculum PGY2 performance, using an unpaired t test for QIKAT scores and χ2 tests for the self-assessment scores, to ascertain whether there were any differences between these two groups before the PGY2s received the curriculum (figure 1). For the assessment of satisfaction with QI teaching, the percentage of residents reporting being satisfied (either 4 or 5) was compared between the PGY2 QAIC participants and the PGY3 historical controls using a two-sample test of proportions.
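These comparisons map onto standard statistical routines. The sketch below, using SciPy, is illustrative only: the scores are simulated stand-ins for the study data, and all sample values and group means are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated paired pre/post QIKAT scores for 26 PGY2 residents (hypothetical)
pre = rng.normal(7.0, 1.5, 26)
post = pre + rng.normal(2.7, 1.0, 26)

# Paired t test for the pre-post QIKAT comparison
t_paired, p_paired = stats.ttest_rel(pre, post)

# Paired Wilcoxon signed-rank test, as used for the ordinal skill ratings
w, p_wilcoxon = stats.wilcoxon(pre, post)

# Unpaired t test: post-curriculum PGY2s vs PGY3 historical controls
controls = rng.normal(7.3, 1.6, 24)
t_unpaired, p_unpaired = stats.ttest_ind(post, controls)

# Two-sample test of proportions for satisfaction (20/23 vs 15/26);
# without continuity correction, the chi-squared test on the 2x2 table
# is equivalent to the two-sample z test of proportions
table = np.array([[20, 3], [15, 11]])
chi2, p_prop, dof, expected = stats.chi2_contingency(table, correction=False)
```

Note that the paired tests exploit the within-resident pre-post design, while the unpaired tests are required for the PGY2-versus-PGY3 comparisons because the two cohorts contain different individuals.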

Figure 2

Quality Improvement Knowledge Assessment Tool scores by group.

Table 1

Self-assessment of comfort level with quality improvement tasks: pre-post and Post (PGY2) versus end of year PGY3

Statistical significance was defined as p<0.05 for the QIKAT knowledge assessment. For the multiple comparisons of self-assessed QI skills, statistical significance was defined as p=0.001 after Bonferroni correction. All tests were performed using Stata 10.0 (StataCorp, College Station, Texas, USA).

Results

Eighty-seven per cent (26/30) of PGY2s completed the QIKAT and the QI skills self-assessment before and after the curriculum. Eighty-three per cent (24/29) of PGY3 residents completed the QIKAT and the QI skills self-assessment at the end of their PGY3 year. Few differences were noted between the pre-PGY2 scores and the PGY3 control scores on either the QIKAT knowledge assessment (figure 2) or the self-assessment of QI skills (table 1). Although PGY3s appeared to be more confident in skills such as studying the process and building the next improvement, there were no differences in confidence in QI-specific skills such as writing a clear aim and using PDSA cycles. After the QAIC, PGY2 residents showed a significant improvement in QI knowledge (Pre QIKAT score 6.98 (6.23 to 7.72) vs Post QIKAT score 9.70 (8.92 to 10.50); p<0.001) (figure 2) and in nearly all self-assessed QI skills (table 1). For example, 89% of post-PGY2s rated their comfort level with PDSA cycles as moderate to high, versus 9% of pre-PGY2s (p=0.001).

After the QAIC curriculum, the PGY2 residents also outperformed the PGY3 historical control group in QI knowledge (PGY2 Post QIKAT 9.59 (8.82 to 10.36) vs PGY3 Control 7.34 (6.48 to 8.20); p<0.001) (figure 2) and all self-assessed QI skills (table 1). Given that pre-PGY2 scores were not significantly different from those of the PGY3 controls, these findings are consistent with the possibility that the changes observed are due to the curriculum.

Open-ended feedback comments provided anecdotal evidence of the usefulness of the QAIC curriculum (box 2). After participating in the QAIC, PGY2 participants also reported on an end-of-year clinic satisfaction survey being significantly more satisfied with their training in QI in the clinic than the PGY3 historical controls (20/23 (90%) vs 15/26 (58%); p=0.023).

Box 2 Sample verbatim comments from residents

  • ‘I gained skills to make me feel less frustrated and helpless within the system I work’

  • ‘It is important to have an objective way to evaluate if a change is made’

  • I gained ‘… the ability to focus an aim and design and implement quality improvement projects in a measurable fashion’

Discussion

Our results demonstrate that QAIC is a feasible and effective curriculum which improves knowledge and self-assessed skills in QI. The curriculum is generalisable across a wide range of internal medicine residency training programmes, although it would require local adaptation. Other specialties may also be able to adapt existing Maintenance of Certification tools to graduate education in QI. Building data collection and review into the scheduled educational sessions has been a simple yet key element of QAIC's success, because it minimised the time burden on faculty and residents. Residents estimated that they spent about 90 min outside the scheduled course time on chart reviews during the first block of the curriculum. Furthermore, the course is accessible, as it uses an off-the-shelf, modestly priced educational tool which can be purchased directly from the ABIM over the internet (http://www.abim.org/residency). The PIM chart review tool also provides convenient links to evidence-based literature sources for the quality indicators it includes.

Limitations

Our study was completed at a single site, limiting its generalisability to other institutions. Additionally, we relied on residents' self-assessment of confidence in QI skills rather than an objective measure of skill. QIKAT raters may have been able to infer year of training from the quality of answers provided to the open-ended questions. We do not have data on residents' use of QI skills in practice after residency. While it is likely that resident confidence and skills will decay without additional reinforcement, we believe that exposure to a QI curriculum, coupled with the opportunity to apply improvement skills to their own practice, will prepare resident physicians to build improvement into their daily work.

Conclusion

A QI curriculum using the ABIM PIMs and small-group, resident-chosen QI projects can result in significant improvements in resident knowledge and self-assessed skills in QI. Using a historical control group was a helpful way to account for the effects of accumulating resident experience in the pre-post evaluation of this QI curriculum.

Acknowledgments

The authors would like to acknowledge J Woodruff, S Glavin, H Humphrey, K Alvarez and L Hale for their support of the educational curriculum and the quality improvement projects. We would also like to thank G Ogrinc and E Holmboe for their assistance.

References


Footnotes

  • Funding Supported by an internal grant from the Pritzker School of Medicine, Graduate Medical Education Committee and The University of Chicago Department of Medicine, Excellence in Medical Education and Clinical Care Award.

  • Competing interests VMA receives an honorarium from the ABIM as a member of the Internal Medicine test writing committee.

  • Ethics approval Ethics approval was provided by the University of Chicago Institutional Review Board (Protocol#14982B).

  • Provenance and peer review Not commissioned; externally peer reviewed.