Article Text

A case study of translating ACGME practice-based learning and improvement requirements into reality: systems quality improvement projects as the key component to a comprehensive curriculum
  1. A M Tomolo1,2,3,
  2. R H Lawrence2,
  3. D C Aron1,2,3
  1. Department of Medicine, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, Ohio, USA
  2. Center for Quality Improvement and Research, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Cleveland, Ohio, USA
  3. Department of Medicine, Case Western Reserve University School of Medicine, Cleveland, Ohio, USA
  1. Dr A M Tomolo, Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Assistant Professor of Medicine, Case Western Reserve University School of Medicine, 10701 East Boulevard, 170A(W), Cleveland, OH 44106, USA; anne.tomolo{at}


Background: In 2002, the Accreditation Council for Graduate Medical Education (ACGME) introduced a new requirement: residents must demonstrate competency in Practice-Based Learning and Improvement (PBLI). Training in this domain is still not consistently integrated into programmes, with few, if any, adequately going beyond knowledge of basic content and addressing all components of the requirement.

Aim: To summarise the implementation of a PBLI curriculum designed to address all components of the requirement and to evaluate the impact on the practice system.

Methods: A case-study approach was used for identifying and evaluating the steps for delivering the curriculum, along with the Model for Improvement’s successive Plan–Do–Study–Act (PDSA) cycles (July 2004–May 2006).

Data source: Notes from curriculum development meetings, notes and presentation slides made by teams about their projects, resident exit evaluations of the curriculum, and interviews.

Results: Residents reported high levels of comfort applying PBLI-related knowledge and skills and reported that the curriculum improved their ability to perform various PBLI tasks. The involvement of multiple stakeholders increased. Twelve of the 15 teams’ suggestions with practical systems-relevant outcomes were implemented and sustained beyond the residents’ project periods. Using traditional PDSA cycles was helpful but had limitations.

Conclusion: A PBLI curriculum that is centred around practice-based quality improvement projects can fulfil the objectives of this ACGME competency while accomplishing sustained outcomes in quality improvement. A comprehensive curriculum is an investment but offers organisational rewards. We propose a more realistic and informative representation of rapid PDSA cycle changes.



Physicians in training operate in complex healthcare delivery systems, but many have not been equipped with the knowledge or skills to analyse clinical environments and continually improve patient care.1 Instead, their training emphasises the clinical management of individual patients. The Accreditation Council for Graduate Medical Education (ACGME) acknowledged the changing training needs of physicians when it endorsed two novel core competencies that are typically omitted from formal medical curricula: practice-based learning and improvement (PBLI) and systems-based practice (SBP).23 PBLI and SBP emphasise evaluation and improvement of clinical practices, patient safety, and the influence of systems within and outside healthcare upon patient outcomes. The ACGME’s PBLI competency involves six points (see box 1). The overall objective is competency in analysing residents’ own patient care practices, evaluating and integrating scientific evidence, and improving their practices.


The ACGME was not prescriptive about how to implement and evaluate PBLI, and an established curriculum and validated assessment tool did not exist.4 Some graduate medical education (GME) programmes and clinician educators developed curricula and evaluation tools that emphasised continuous quality improvement (CQI) principles; others struggled to define clear objectives. While helpful, none of these addressed all of the components of PBLI as described by the ACGME.3510

In April 2003, three faculty members with expertise in PBLI content worked on establishing curricular goals and objectives. In July 2003, two faculty members began facilitating four half-day teaching sessions over a month. The initial curriculum emphasised developing, with clinic faculty mentorship, individual CQI projects on a disease-focused subset of each resident’s continuity clinic patients. Examples of projects included improving blood glucose levels, blood pressure and mammogram screening.

Dissatisfaction with the extent of QI and systems knowledge attained by participants prompted changes but concerns remained about whether our curriculum and others adequately addressed the competency requirements.59 Concerns included: (1) lack of depth and ownership of fundamental components of improvement knowledge and skills attained by residents, (2) residents’ inability to actually develop and implement clinically based CQI projects, (3) lack of a systems perspective given the emphasis on improvement in individual provider’s disease-specific clinical outcomes, (4) absence of projects that involved the practice setting and had a sustainable impact on the organisation, (5) residents not developing skills to identify and involve other relevant stakeholders, and (6) lack of meaningful opportunities for residents to facilitate the learning of others in the organisation.


To achieve the goal set by the ACGME, it is necessary to develop a curriculum that moves beyond content knowledge and addresses the gaps noted. That is, quality improvement of the curriculum meant applying the very principles of PBLI.311 We now summarise the development and formative evaluation of our revised curriculum. Since systems-relevant quality improvement projects are the key to our comprehensive PBLI curriculum, we provide information on the projects (supplemental table 1).

Table 1 Summary of Practice-Based Learning and Improvement (PBLI) curriculum


A fundamental component of the conceptual model for development of our PBLI curriculum included the use of rapid cycle change, which is part of the Model for Improvement.1213 The process of multiple PDSA cycles is typically depicted as a smooth linear ramp of improvement across time with advancing complexity.1213 The model has been expanded to address multiple simultaneous changes (ramps).

Box 1 Accreditation Council for Graduate Medical Education outcome project

General competency: Practice-Based Learning and Improvement—residents are expected to:

  • Analyse practice experience and perform practice-based improvement activities using a systematic methodology

  • Locate, appraise and assimilate evidence from scientific studies related to their patients’ health problems

  • Obtain and use information about their own population of patients and the larger population from which their patients are drawn

  • Apply knowledge of study designs and statistical methods to the appraisal of clinical studies and other information on diagnostic and therapeutic effectiveness

  • Use information technology to manage information, access on-line medical information and support their own education

  • Facilitate the learning of students and other healthcare professionals


Description, development and implementation into a practice-based setting

Based on the noted curriculum gaps and discussions with faculty and residents, a revised curriculum was established in July 2004. Table 1 provides an overview of the modules for the 4-week curriculum. Basic information was included in slides for didactic components. Table 2 provides the basic recipe for implementing the PBLI curriculum integrated into the practice setting. The groundwork section summarises the ingredients we identified as necessary to have in place before the “taking off” phase, which in turn summarises the main teaching points learnt during implementation of the curriculum.

Table 2 Launching and delivering the Practice-Based Learning and Improvement (PBLI) comprehensive curriculum

Setting and function

The Internal Medicine Residency Program at University Hospitals of Cleveland and the Louis Stokes Cleveland Department of Veterans Affairs Medical Center (LSCDVAMC) trains an estimated 80 residents annually. Residents spend approximately one-third of their training at the LSCDVAMC. Each year, residents complete a 1-month ambulatory rotation during which they attend a variety of specialty medical clinics. The majority of this rotation occurs at the LSCDVAMC. To decrease teaching burden, the PBLI curriculum is offered on alternate months.


Analytical method: case study

We conducted a case analysis of the development and implementation of our PBLI curriculum, using the PDSA cycle framework to help organise and summarise the processes and data. Data include notes from meetings, notes and presentation slides made by the teams about their projects, resident exit evaluations of the curriculum and interviews. Appendix A (online) provides an in-depth look at a particular resident project to help inform others interested in establishing or refining PBLI curriculum.

Below we summarise the outcomes sought to indicate implementation of a successful comprehensive curriculum regarding (1) process evaluation, (2) curriculum evaluation by residents and (3) evaluation of organisational practice impact.

Process evaluation related to teaching faculty, collaborators, residents and organisation

Teaching faculty

The PBLI curriculum teaching faculty members were evaluated, and a successful curriculum outcome was defined by their ability to help identify and nurture resident ownership of promising practice-based projects.


Process indicators associated with successful involvement of other collaborating stakeholders included willingness to assist in sponsoring clinical projects.


Implementation of a successful curriculum meant that residents needed to demonstrate preparation for and participation in the curriculum and project, exemplified through several factors:

  • discussion of process discoveries and suggestions generated to improve the practice setting

  • evidence of recognising, soliciting and integrating other relevant stakeholders in their team’s work

  • identification of project suggestions with practical components that could be integrated and assimilated into the practice setting

  • attendance at classes and at the Internal Medicine Morbidity and Mortality Conferences (IM MMC), a high-profile conference attended by the Chief of Medicine and the Chief of Staff.


At the organisational level, process evaluation indicators included decisions about continuing resident IM MMC presentations beyond a trial period, active involvement of the audience regarding the residents’ presentations, and recognition by management of the projects in other settings. Notes and summaries of meetings were used to gauge process indicators.

Curriculum evaluation by residents

The curriculum should help the residents feel comfortable with the tools and skills needed to engage in PBLI. During one phase of formative evaluation (from July 2004 to April 2005), an anonymous exit evaluation was distributed to residents (five blocks, alternate months) that provided feedback for continually improving the course. One set of eight close-ended questions asked about level of comfort with skills in various aspects of PBLI on a scale from 1 representing “very uncomfortable” to 5 representing “very comfortable” (middle value represented “neutral”). The skills ranged from defining a clear problem to implementing a pilot study. The other set of close-ended questions asked for ratings of how well the PBLI course improved ability to perform various tasks or skills on a scale from 1 representing “very poorly” to 5 representing “very well” (middle value represented “neutral”). Items assessed such things as using flow charts and understanding the Model for Improvement.1213 Table 3 contains the exact wording. A final open-ended question asked respondents to indicate whether they would use the knowledge and skills gained in these teaching sessions in their work.
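Purely as an illustration of how such five-point Likert items can be summarised (the response values below are hypothetical and not the study’s data), a single comfort item might be tallied as follows:

```python
from statistics import mean

# Hypothetical 1-5 responses to one comfort item
# (1 = very uncomfortable, 3 = neutral, 5 = very comfortable)
responses = [4, 5, 3, 4, 5, 2, 4, 4, 5, 3]

avg = mean(responses)
# Proportion rating themselves "comfortable" or "very comfortable" (4 or 5)
pct_comfortable = 100 * sum(r >= 4 for r in responses) / len(responses)

print(f"mean comfort: {avg:.2f}")        # 3.90
print(f"% rating 4 or 5: {pct_comfortable:.1f}%")  # 70.0%
```

The same tally applied per item reproduces the kind of summary reported in table 3.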

Table 3 Exit evaluation of curriculum by residents (five blocks; N = 42)

Evaluation of organisational practice impact

If the curriculum is comprehensive and integrated with the clinical activities, there should be sustained components from the residents’ projects. Relevant staff members were interviewed and questioned about whether the systems components that emerged from residents’ projects were still used even though the residents’ projects had ended.


To date, a total of 87 residents have participated in this curriculum. Supplemental table 1 provides specifics by projects about some of the evaluation indicators. The teams of residents are organised by the project focus or theme (third column). We have indicated the number of teams and number of team members that were involved in the overall project focus.

Process evaluation

Teaching faculty

Efforts to identify and nurture resident ownership of projects began with emphasising projects related to practice performance rather than to an individual patient or a disease-specific population. Initial project themes evolved from former residents’ system failure analyses.14 As indicated in the “source of project idea” column in supplemental table 1, the developmental trajectory moved from the teaching faculty as the source, to collaborating faculty, to other faculty, and finally to residents themselves identifying handoffs in the Emergency Department, the rapid response team, and inadequate medication supply for the next visit in the resident clinic as relevant topics for a project focus.


The original PDSA cycle began with a plan for collaborating faculty to mentor resident projects. This plan was aborted because of the lack of potential and actual project development in the resident continuity clinic: although faculty were willing to mentor, no projects materialised. Alternative options were developed: identify faculty members with an interest in CQI, incorporate non-physicians as collaborators (eg, the Patient Safety Officer, nurses), and use the clinical setting of one of the teaching faculty. In identifying faculty with an interest in CQI, we succeeded in recruiting clinical champions to facilitate project development in the continuity clinics, growing from two to five faculty members who agreed to champion projects.

Champions were unable to identify focused projects and to adequately monitor resident project development. This provided an opportunity for teaching faculty and residents to teach PBLI to collaborating faculty. To nurture the process further, teaching faculty began reviewing champions’ proposed projects, and together they created a case description of an actual scenario that became the foundation for project development. One of the collaborating faculty attended PBLI resident training sessions.


The curriculum was successful, as evidenced by residents’ growing knowledge of QI and skills in implementing QI projects. The most relevant findings are summarised in the columns of supplemental table 1 labelled “process discoveries by team” and “team suggestions.” Part of the discovery process was the recognition of other stakeholders/professions (medical and non-medical). At first, residents relied on teaching faculty to actively facilitate networking with stakeholders and project development. For example, the initial development of the missed lung mass project required the teaching faculty to identify and arrange contact with project stakeholders in radiology, quality management and pulmonary. As the curriculum evolved, residents became more resistant to the teaching faculty’s attempts to shape the projects, as in the medication refill project in the resident continuity clinic and the rapid-response team project. Residents actively networked with project stakeholders, and as a result new aspects of established projects developed.

Finally, attendance of residents at classes and IM MMC was consistently high.


Teaching faculty recognised the need for increased resident ownership of projects, prompting them to request that residents present their projects at IM MMC. After two informal conversations, the conference directors agreed to a trial period. The transition from a trial basis to a regular part of the schedule occurred after only one set of presentations by residents.

Successful curriculum implementation also meant active involvement of participants at the IM MMC. The discussion points from faculty in attendance included engaging residents about data, decisions, stakeholders, next steps and other strategies. Three new collaborators approached teaching faculty about possible project ideas after attending resident presentations at IM MMC.

Finally, success of the curriculum was evidenced when the local organisation’s management became active promoters of the PBLI curriculum efforts. The resident’s projects were mentioned at LSCDVAMC meetings of quality management, medical service and medical executive committees. The projects were also presented in national forums: Association of Program Directors in Internal Medicine and Society of General Internal Medicine.

Curriculum evaluation by residents

Forty-two of the 43 participants in this evaluation phase returned exit evaluations. Table 3 summarises the findings for the close-ended items. Overall, residents reported high levels of comfort applying knowledge and skills, and reported that the PBLI course improved their ability to perform various tasks; 74.4% of respondents indicated they would use the knowledge and skills gained in the PBLI teaching sessions in their work (39 of the 42 respondents answered this item).

Twenty of the 42 residents who completed the exit evaluation provided written comments and/or suggestions. Table 4 presents all curriculum relevant comments which led to the following adjustments: time devoted to project presentation during sessions was reduced, criteria for collaborating staff were further developed, projects were permitted to continue in subsequent months, the first session emphasised the relevance of PBLI, and presentations of previous groups were made available to new groups.

Table 4 Feedback from residents

Institutional impact evaluation

The last three columns in supplemental table 1 cover the systems/institutional impact evaluation. While not all suggestions led to practical systems outcomes, many of those that did were sustained, albeit in a limited capacity for some. More specifically, all but two project themes had some practical systems outcomes that are still sustained. The last column summarises which institutional personnel verified whether the outcome from the project suggestions was sustained (integrated and part of the practice setting at an institutional level). Clearly the residents were able to take a project theme and develop systems solutions with components that could be, and often were, integrated into the organisation (columns 6 and 7).

Summary and overview of implementation process by stakeholder

It is important to note the curriculum development and implementation from the perspective of different stakeholders. Supplemental table 2 is an effort to provide an overview of the developmental trajectories by summarising the evolution of creating a comprehensive curriculum by stakeholders and identifying next steps. Specifically, supplemental table 2 summarises the various measures or indicators of success associated with the goals organised by major stakeholders related to curriculum development.

Challenges of using PDSA cycles and impact on curriculum development

The repeated use of rapid cycle tests of change1213 was an important component of (1) evaluating curriculum development and implementation processes; (2) the curriculum itself as a continuous quality improvement tool; and (3) the residents’ projects. However, there were limits to the usefulness of the smooth linear ramp of improvement across time.1213 This traditional representation does not portray important nuances and the uneven, dynamic and sometimes messy reality of implementation. That is, contexts, with their challenges and opportunities, interact with change in ways that are rarely neat and linear. Thus, using the PDSA Model for Improvement to teach trainees to perform rapid cycle change as part of their project development was challenging: it did not translate to the application of rapid cycle change in the clinical context. As a result, there were times when we, as teachers, attempted to force strict adherence to the steps outlined in the model (P-D-S-A), inhibiting the natural progression of incomplete cycles (eg, P-D, S-P-A) that may evolve in response to challenges and opportunities identified in the practice setting. Over time, our application of PDSA cycles became more flexible and acknowledged the emerging needs of the project.

Figure 1 provides a variation on the PDSA conceptual model. Our variation acknowledges false starts, misfirings, plateaus, regroupings, backsliding, feedback and overlapping scenarios within the process. The blank sections of a cycle indicate phases that were not relevant to that cycle. The model also acknowledges that opportunities and challenges exist that help propel the process upward, or that potentially stall it or introduce setbacks.

Figure 1

Revised conceptual model of rapid cycle change.

We have used arrows to indicate the major flow of activity within and across phases or cycles, that is, the way the steps in a cycle feed forward. In addition, we have used dashed lines for arrowheads that indicate a lingering (background) impact—for example, an action path aborted due to obstacles (brick wall) continues to feed forward, even though it is aborted. In some instances the background feed forward helps inform new paths (ie, develop new plans) because the obstacles are insurmountable. In other instances, it creates a new approach (or plan) that manifests itself later in project development, and the obstacle is revisited and surmounted.

The size of the circle helps to denote relationships among the cycles. Not all cycles have equal impact on project development, and therefore they vary in size; nor are all cycles completed. Some steps are pivotal and remained so for future cycles, as represented by solid lines and arrows connecting cycles. Further, font size conveys information about components of the cycle. Large font depicts insights that help shape the way modifications are integrated into future cycles, and those cycles need not be adjacent in sequence. Thus, our model acknowledges the multiple impacts of some components of previous cycles in the cumulative history that is brought to bear to create systems solutions. Those steps had a fundamental role in modifying actions into a sustainable systems solution, and as such they are represented by larger letters. Cycles at varying levels of success interact with other cycles in various stages of the PDSA cycle. In the far left of the figure, we depict the scenario where studying leads to further simultaneous studying in a smaller PDSA cycle before continuing, and the actions derived from both inter-related cycles affect the planning in the next chronological cycle.

Moreover, not all cycles necessarily lead to greater complexity in an upward linear fashion. Rather some cycles explore or define limitations or setbacks, and it is not until later cycles that the challenges are harnessed to make improvement possible (and that may not always happen). Thus, fig 1 seems a more realistic and informative representation that recognises the interdependent patterns that continue the interplay toward achieving change which is not necessarily always smooth, defined by complete and equally important cycles, or linear with an upward flow. We suggest that the revised model provides nuances that help in better understanding implementation processes. As is true of all general descriptive models, this is meant as a prototype that alerts one to possibilities rather than depicting every circumstance.
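To make the distinction from the traditional model concrete, the revised model’s allowance for incomplete or reordered cycles can be sketched as a small data structure. The cycle contents below are hypothetical and purely illustrative of the notation, not drawn from any specific resident project:

```python
# Each cycle is recorded as the ordered list of phases actually carried out.
# The traditional Model for Improvement assumes every cycle is exactly P-D-S-A.
TRADITIONAL = ["P", "D", "S", "A"]

cycles = [
    ["P", "D", "S", "A"],  # a complete traditional cycle
    ["P", "D"],            # aborted after Do (eg, an obstacle was hit)
    ["S", "P", "A"],       # regrouping: study first, then re-plan and act
]

def is_traditional(cycle):
    """True only for a full Plan-Do-Study-Act sequence in order."""
    return cycle == TRADITIONAL

complete = [c for c in cycles if is_traditional(c)]
partial = [c for c in cycles if not is_traditional(c)]
print(f"{len(complete)} traditional, {len(partial)} incomplete or reordered")
# prints "1 traditional, 2 incomplete or reordered"
```

Recording cycles this way, rather than forcing each one into the P-D-S-A template, preserves exactly the false starts and regroupings that figure 1 depicts.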

Appendix A (online) contains a detailed record of the application and challenges related to summarising PDSA cycles of a single resident project: the evolution from a concern regarding missing abnormal lab results to a systems process to address the concern.


By identifying and addressing the needs of relevant stakeholders (eg, physician, non-physician clinicians and non-physician managers) at the local level, we were able to create and implement a comprehensive curriculum that addressed all components of the ACGME competency. A curriculum based on systems projects resulted in residents perceiving increases in PBLI knowledge and skills, and demonstrating abilities to analyse and improve their practice, find, appraise and assimilate project-relevant scientific evidence, collect and interpret relevant information from their practice, use research evidence and apply research and statistical methods, apply information technology and facilitate the learning of others. The resulting curriculum engaged residents to take ownership and actively apply PBLI principles, consistent with current thinking about effective educational interventions.1520

Specifically, the curriculum and projects focused on systems change which requires increasing appreciation of the stakeholders. For example, in our initial trial of the PBLI curriculum, several residents focused upon improving the management of blood pressure in a convenience sample of their hypertensive patients. On the surface this appears to be a practice-based improvement; however, when residents completed the ambulatory rotation, there was no evidence of impact at the practice-based level of care. Residents involved in a true practice-based change in blood pressure management (January and March 2006) focused upon the process of care for all patients in their clinic and pilot-tested a change in practice resulting in modification in the evaluation of blood pressure for all patients in the resident continuity clinic in the LSCDVAMC.

Residents became role models and teachers of PBLI through their identification and integration of relevant projects into clinical practice and their presentation of these efforts at IM MMC. Following these presentations, faculty who had not been sought out approached the teaching faculty with project ideas. Moreover, organisational leaders became active promoters, disseminating and discussing the projects in different forums. Our experience of going beyond basic knowledge of CQI and using projects with systems implementation value is consistent with a curriculum recently developed and evaluated for surgery residents.15 In addition, resident comments and the calibre of projects indicated that the curriculum empowered them to work with stakeholders to improve their clinical practice (eg, improve management of abnormal test results) and to identify other areas in need of improvement.

There are several limitations that should be noted. While case study methods are ideal for studying developmental and implementation processes, they have limits regarding generalisability. Thus, we cannot make inferences about the curriculum’s transportability and likelihood of success, given that this curriculum occurred at one institution and was driven by the teaching faculty, particularly in its first year. However, the insights from the in-depth analysis (appendix A online) offer strategies for taking into account context and the dynamics related to various stakeholders.

In addition, while the curriculum was delivered successfully by new faculty beginning in July 2005, the initial teaching faculty members were still involved in the periphery and providing some oversight. It is not clear how much time and investment of various stakeholders is needed to integrate this type of curriculum meaningfully into the residents’ practice setting, and we did not assess this. However, our study suggests that a PBLI curriculum is more likely to be sustainable if it gets integrated at the organisational level in a meaningful way, contributes to the overall practice setting and incorporates key stakeholders.

The focus of this paper was on formative evaluation. As such, we focused on self-assessment of knowledge and did not explicitly evaluate change in knowledge and application of tools. While the success of the projects suggests improvements, it remains important to assess this directly.

Despite these limitations, the curriculum was successful in achieving its proposed goals. Our PBLI curriculum provided residents a meaningful opportunity to be active stakeholders in making a difference in patient care through systems quality-improvement projects while learning basic skills. Additionally, we modified the traditional model of rapid cycle change and created a conceptual model that depicts a more realistic, contextually based representation of the developmental nature of an improvement project.


The authors would like to thank A Caron, for her contributions to the development and implementation of the curriculum. In addition, we would like to thank the Teaching and Collaborating Internal Medicine faculty at the Louis Stokes Cleveland Department of Veterans Affairs Medical Center.


Supplementary materials


  • Additional supplemental tables and an appendix are published online only at

  • Competing interests: None.

  • Ethics approval: Provided by The Louis Stokes Cleveland VA Medical Center's Institutional Review Board Human Studies Subcommittee.