Designing quality improvement initiatives: the action effect method, a structured approach to identifying and articulating programme theory
============================================================================================================================================

* Julie E Reed
* Christopher McNicholas
* Thomas Woodcock
* Laurel Issen
* Derek Bell

## Abstract

**Background** The identification and articulation of programme theory can support effective design, execution and evaluation of quality improvement (QI) initiatives. Programme theory includes an agreed aim, potential interventions to achieve this aim, anticipated cause/effect relationships between the interventions and the aim, and measures to monitor improvement. This paper outlines the approach used in a research and improvement programme to support QI initiatives in identifying and articulating programme theory: the action effect method.

**Background to method development** Building on a previously used QI method, the driver diagram, the action effect method was developed using co-design and iteration over four annual rounds of improvement initiatives. This resulted in a specification of the elements required to fully articulate the programme theory of a QI initiative.

**The action effect method** The action effect method is a systematic and structured process to identify and articulate a QI initiative's programme theory. The method connects potential interventions and implementation activities with an overall improvement aim through a diagrammatic representation of hypothesised and evidenced cause/effect relationships. Measure concepts, in terms of service delivery and patient and system outcomes, are identified to support evaluation.

**Discussion and conclusions** The action effect method provides a framework to guide the execution and evaluation of a QI initiative, a focal point for other QI methods and a communication tool to engage stakeholders. A clear definition of what constitutes a well-articulated programme theory is provided to guide the use of the method and assessment of the fidelity of its application.

* Evaluation methodology
* Quality improvement methodologies
* Quality measurement
* Implementation science
* Communication

## Introduction

### Need for theory in quality improvement

The number of quality improvement (QI) initiatives is increasing in an attempt to improve quality of care and reduce unwarranted variation. It is essential to understand the effectiveness of these initiatives; however, they commonly lack an underlying theory linking a change to its intended outcome, which inhibits the ability to demonstrate causality and hinders widespread uptake.1–3 Programme theory is used to describe an intervention and its anticipated effects and is critical to support both high-quality evaluation and the development of interventions and implementation plans.3–6 Development of programme theory can provide a means to tackle common social challenges of QI, such as creating a shared strategic aim and increasing acceptance of interventions.7 While QI methods for the identification and articulation of theory and causal relationships exist,8,9 there has been little study of their application in practice in healthcare settings. This paper describes the approach developed by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) Northwest London (NWL) to identify and articulate programme theory: the action effect method (AEM).
### NIHR CLAHRC for NWL

The NIHR in the UK commissioned regional research programmes, the CLAHRCs, to support systematic and effective translation of evidence into practice, aiming to improve quality of care and outcomes for patients.10 CLAHRC NWL adopted an approach using a suite of QI methods to support a range of research and improvement initiatives across the region, driven by an overarching research agenda to investigate the application and impact of QI methods in healthcare.11–15 In delivering this programme, CLAHRC NWL developed extensive first-hand experience of supporting initiatives to develop programme theory in diagrammatic form.

## Background to method development

The AEM was developed using co-design through four iterations that took place over four annual rounds of improvement initiatives, building on driver diagrams.9 In total, 43 unique programme theory diagrams (driver diagrams and action effect diagrams (AEDs)) were produced over this time, each for a specific improvement initiative and setting.

During the early phase of CLAHRC NWL, teams used the driver diagram approach (22 diagrams produced in years 1 and 2 collectively). Driver diagrams are intended to offer an easy-to-read and digestible theory of improvement for an initiative. It was observed, however, that teams struggled to work collaboratively to produce useful theory and perceived diagram construction as a low-value activity. Best-practice examples of driver diagrams were provided by expert facilitators,16 but it became evident that there was limited published guidance on what constituted a ‘good’ diagram, the purpose of such diagrams and how individual components were defined. The driver diagrams produced in practice were of limited value as programme theory, which resulted in time delays and rework to identify suitable metrics and develop evaluation plans (personal communication; Huckvale K, Woodcock T, Poots A, 2014).

To overcome these challenges, a more systematic approach was developed, describing the components of the diagram and how it can be used consistently to articulate programme theory clearly. The term ‘driver’ was often deemed confusing by improvement initiative team members, who associated it with strategic influences such as financial and political motivations rather than with the actions that could be undertaken by the initiative. In the new approach, the programme theory diagram was named the action effect diagram to signify its purpose more accurately. The new approach retains the benefits of the driver diagram's visual layout while clearly specifying the necessary programme theory features and their diagrammatic representation. This approach, the AEM, has been applied in the later phase of CLAHRC NWL QI initiatives (21 diagrams produced in years 3 and 4 collectively), as well as in national and regional QI initiatives in England, Scotland and Australia.

## The AEM

### What is the AEM?

The AEM is a facilitated approach to developing an AED, a visual representation of the programme theory for a QI initiative. Programme theory, in the context of improvement initiatives, is defined as the articulation of an overall aim, potential intervention(s) that will be tested in an attempt to achieve this aim, hypothesised cause/effect relationships linking the intervention(s) to the aim and measure concepts that link to the cause/effect chains to support evaluation.
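For teams that maintain their AED in electronic form, the components of this definition can also be captured as a small, typed data structure. The sketch below is a minimal illustration in Python of one way to do so; it is our own sketch, not part of the published method, and all class, field and example names are hypothetical.

```python
# Illustrative data model for the programme theory an AED captures.
# Our own sketch; names and categories below are hypothetical examples.
from dataclasses import dataclass, field

FACTOR_KINDS = {"contributing factor", "intervention", "implementation activity"}
JUSTIFICATIONS = {"evidence", "prediction", "assumption"}  # basis for each arrow

@dataclass
class Factor:
    name: str                      # e.g. "Whether the patient smokes"
    kind: str                      # one of FACTOR_KINDS
    measure_concepts: list[str] = field(default_factory=list)

@dataclass
class Link:
    cause: str                     # factor at the arrow's tail (towards the right)
    effect: str                    # factor, or the aim, at the arrow's head (left)
    justification: str             # one of JUSTIFICATIONS
    evidence: str | None = None    # e.g. a citation for an evidenced relationship

@dataclass
class ActionEffectDiagram:
    aim: str                       # overall, patient-focused aim (far left)
    factors: dict[str, Factor] = field(default_factory=dict)
    links: list[Link] = field(default_factory=list)

# Hypothetical fragment of the COPD example discussed later in this paper.
aed = ActionEffectDiagram(aim="Improved care for patients with COPD")
aed.factors["smoking"] = Factor(
    "Whether the patient smokes", "contributing factor",
    measure_concepts=["% of patients smoking at follow-up"])
```

Holding the diagram in a form like this makes the consistency checks sketched later in the paper straightforward, because factors and arrows can be traversed programmatically.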
The AEM initially engages members of a QI initiative in a group session to share their individual perspectives and aspirations for the project, to develop a shared aim and to identify factors contributing to that aim. The diagram produced in this initial session is developed into a full programme theory diagram as details of the initiative are discussed and agreed. This programme theory may be elaborated further as the initiative develops over time. Through this articulation of programme theory, the AEM acts as a multipurpose methodology for practitioners and researchers; it supports the design and execution of QI initiatives as part of a suite of QI methods, highlights existing evidence where available and informs related evaluation activities.

### Diagram overview

The diagram consists of an aim, factors, cause/effect arrows, measure concepts and evidence annotations. Factors are categorised into contributing factors, interventions and implementation activities. The aim sits to the left of the diagram, and intervention(s) and implementation activities to the right (definitions are provided below and in table 1). Cause/effect chains, made up of factors linked by arrows, indicate how actions could result in the intended effect and, ultimately, improvement(s). Because the aim is on the left-hand side of the diagram, the arrows always point from right to left to indicate cause and effect (figure 1). An example of an AED is presented in figures 2 and 3, based on a CLAHRC NWL chronic obstructive pulmonary disease (COPD) QI initiative. Figure 2 represents the initial diagram produced by the QI team, and figure 3 represents the revised diagram prepared retrospectively at the end of the initiative, incorporating learning gained throughout the initiative.17 Selected examples are given in box 1 to provide further explanation of the diagram's features and to share our practical experience regarding how to construct a high-quality programme theory diagram.18 Further examples are given in online supplementary appendix 1.

[Table 1](http://qualitysafety.bmj.com/content/23/12/1040/T1) Definition of action effect diagram features

[Figure 1](http://qualitysafety.bmj.com/content/23/12/1040/F1) Schematic action effect diagram: guide to interpreting the components and overall structure of a typical action effect diagram.

[Figure 2](http://qualitysafety.bmj.com/content/23/12/1040/F2) Example action effect diagram for chronic obstructive pulmonary disease (COPD) constructed at the start of a Collaboration for Leadership in Applied Health Research and Care Northwest London improvement initiative (before implementation commenced). The diagram depicts the initial programme theory proposed for how the implementation of a care bundle will contribute to the overall aim of improved care for patients with COPD.

[Figure 3](http://qualitysafety.bmj.com/content/23/12/1040/F3) Example action effect diagram for a chronic obstructive pulmonary disease improvement initiative constructed retrospectively at the end of the initiative.
The diagram depicts the programme theory for how the implementation of a care bundle was enacted, drawing on learning held at the end of the initiative. COPD, chronic obstructive pulmonary disease; CNS, Clinical Nurse Specialist; GP, general practitioner; PR, pulmonary rehabilitation.

### Box 1 Selected examples of action effect diagram (AED) features related to the chronic obstructive pulmonary disease (COPD) AED in figures 2 and 3 (additional examples can be found in online supplementary appendix 1)

**Diagram overview**

*Reading the diagram from left to right answers the question, ‘What changes can we make that will result in an improvement?’*

▸ Following a single cause/effect chain, one factor influencing the aim is a patient's self-management of their lifestyle, including *whether the patient smokes*. One factor influencing patient smoking behaviour is *attendance at and engagement with a smoking cessation service* (and so on).

*Reading the diagram from right to left answers the question, ‘What are we trying to accomplish?’*

**Aim**

*The aim should be high-level and patient-focused but specific enough to guide the improvement initiative and subsequent evaluation.*

▸ A general aim, improving quality of care for patients with COPD, could be specified as: *To improve the health, quality of life and experience of care for patients from hospital X who are discharged following an acute exacerbation of COPD*.

**Contributing factors**

*The major contributing factors in column 1 should be of a similar type and form a logical group.*

▸ In the COPD example, we use the stages of patient care: *appropriate care in-hospital*, *self-management post-exacerbation* and *quality of additional clinical care post-exacerbation*. The three factors are all of a similar type and form a logical group, in this case, contexts of care.

**Interventions and implementation activities**

*Interventions are intended to become part of routine service delivery; implementation activities are carried out by the quality improvement team.*

▸ Interventions can aim to improve the consistency of an existing service, modify a current service or introduce a new service. Taking ‘consistency of existing service’ as an example, there may be variation in staff competency and confidence in inhaler technique training, requiring improvement to increase the consistency and equity of services. An associated implementation activity is *Specialised staff education session on inhaler techniques*.

**Cause/effect chains**

*Factors connected by an arrow must be clearly related, with no illogical leaps.*

▸ The causal link between *referral to smoking cessation* and *whether the patient smokes* makes an illogical leap. It is not just the referral itself that influences whether a patient smokes in the future; it also matters what happens following a referral, including whether a patient is motivated to attend or able to complete a smoking cessation programme.

▸ Adding *attendance at and engagement with smoking cessation programme* helps unpack this connection.
**Evidence, predictions, assumptions and measures**

*Cause/effect relationships may be supported by existing evidence.*

▸ An evidence-based cause/effect relationship exists between *attendance at and engagement with smoking cessation* and *whether the patient smokes*.19

*Measure concepts must be clearly associated with the relevant factor or aim.*

▸ The success of *staff training* in influencing *patient education for inhaler technique* can be measured by the proportion of staff designated as inhaler technique providers who have attended the specialised staff training sessions (indicating the extent to which the intervention took place) and by the impact this training had on the percentage of patients who received inhaler education.

Reading from either side, the diagram reflects answers to the questions posed in the model for improvement.9 From left to right, the diagram answers the question, ‘What changes can we make that will result in an improvement?’. From right to left, the diagram answers, ‘What are we trying to accomplish?’. Both questions are key elements of constructing and interpreting the AED. As a consequence of the cause/effect direction, factors represented on an AED move from those under direct control on the right of the diagram (e.g. an intervention or implementation activity that is directly actionable by a member of the improvement initiative, such as training staff), through those that are under direct influence (e.g. by training staff you have direct influence over patients being taught correct inhaler technique), to those that are only under indirect influence (e.g. patients’ self-management overall, or quality of care for patients, will be influenced by many factors, only some of which are under the direct influence of the QI initiative). The resulting diagram should clearly represent the rationale and intention of a QI initiative and be sufficiently comprehensive and detailed for a non-expert stakeholder to interpret readily. Additionally, the AED has explicit connectivity to other QI methods to support their systematic use as a suite of methods rather than as individual activities (table 2).9,19–21

[Table 2](http://qualitysafety.bmj.com/content/23/12/1040/T2) Links to other improvement methodologies

### The aim

At the far left of the diagram, the aim describes the overall objective of the improvement effort and provides the central point towards which cause/effect chains converge. The aim should be of sufficient detail and specificity to guide the improvement initiative and subsequent evaluation efforts, should be patient-centred22 and should represent the most specific aspiration that all members of the initiative can agree on. Using this method, interventions and measure concepts should not be included in the aim. Placement on the left is deliberate, using the natural tendency of readers of Western languages to read from left to right to encourage an initial focus on the aim rather than on potential solution(s).23,24

### Contributing factors

Contributing factors are boxes representing the logical steps required to connect the interventions and the aim; that is, they are caused by the intervention(s), and the achievement of the aim is caused by them. They indicate how the intervention(s) are intended collectively to cause the aim to be achieved. For clarity and objectivity, factors should not include measure concepts or verbs indicating aspiration (e.g. to reduce, to improve).
Factors in the first column, those directly influencing the overall aim, are referred to as major contributing factors. ‘Major’ here refers to the first (major) division of the aim into things (factors) that contribute to its achievement, rather than to their relative importance. They represent a hypothesis: improvements across all major contributing factors are sufficient to achieve the overall aim. Major contributing factors should be of a similar type to one another, or form a logical group, to support review of the programme theory and assessment of missing factors.

### Interventions and implementation activities

Interventions and implementation activities specify changes to care delivery and associated activities. The AEM defines an intervention as a change to service delivery that is intended to become routine. This may be a new practice, a modification to current service delivery or an effort to improve the consistency of current service provision. Distinctions between contributing factors, interventions and implementation activities may be subjective, with the specific needs of an initiative shaping the perceived focus. The definition of an intervention can therefore be problematic, particularly as initiatives spread to different settings and local adaptations are made. The articulation of programme theory is an important step to support intervention definition and to clarify interactions with other factors that influence the overall aim. This also supports reproducibility and transferability of interventions between healthcare settings.25 To facilitate communication, ‘highlighting’ boxes may be added to an AED, surrounding a group of factors that all influence another factor or signifying a collection of factors that make up an intervention (figure 3).

Importantly, given the complex nature of improvement in healthcare and the difficulty of predicting what will work in advance, interventions and implementation activities are not final solutions but ideas to be tested and iteratively modified as feedback is received on their effectiveness.17 This connects the AED to plan-do-study-act cycle methodology (table 2).9,19 The impact of this iterative development can be seen in the difference between the AEDs produced at the outset and at the end of a QI initiative (figures 2 and 3). The interventions and implementation activities referred to on the AED, and their development over time, require detailed description in supporting documents in order to be reproducible and to support transfer of knowledge.

### Cause/effect chains

Interventions are connected (with arrows) through the contributing factors to the overall aim, forming cause/effect chains. There may be any number of factors in a cause/effect chain. The position of each factor on the diagram is determined by its place in the cause/effect chain(s) that link it to the aim, not by an assessment of its importance. The AED does not draw any conclusions about the relative importance of each factor, but it can be used to structure information to support evaluation of the relative importance and impact of different factors. We have developed an aid to support vertical alignment of factors in the diagram once cause/effect chains have been established (see online supplementary appendix 1, example 12). This guidance is only one of potentially several ways to aid the organisation of factors. The AED is intended to be a live document, with iterations or additions made throughout an improvement initiative.
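Because the AED is revised repeatedly, teams holding it in electronic form may find an automated sanity check useful after each revision. The sketch below, again our own illustration with hypothetical factor names drawn loosely from the COPD example, verifies that every intervention remains connected to the aim by an unbroken cause/effect chain.

```python
# Illustrative check (our own, not part of the published AEM): confirm that each
# intervention on an AED reaches the aim via a chain of cause/effect arrows.
# Arrows point right to left, so we follow each link from cause to effect.

AIM = "Improved care for patients with COPD"  # hypothetical aim node

# Each (cause, effect) pair is one arrow; factor names are hypothetical examples.
links = [
    ("Referral to smoking cessation", "Attendance at and engagement with smoking cessation"),
    ("Attendance at and engagement with smoking cessation", "Whether the patient smokes"),
    ("Whether the patient smokes", "Self-management post-exacerbation"),
    ("Self-management post-exacerbation", AIM),
]
interventions = ["Referral to smoking cessation"]

def reaches_aim(start: str, links: list[tuple[str, str]], aim: str) -> bool:
    """Follow cause/effect arrows from `start`; True if some chain ends at the aim."""
    frontier, seen = [start], set()
    while frontier:
        node = frontier.pop()
        if node == aim:
            return True
        if node in seen:
            continue
        seen.add(node)
        frontier.extend(effect for cause, effect in links if cause == node)
    return False

for intervention in interventions:
    status = "connected" if reaches_aim(intervention, links, AIM) else "NOT connected"
    print(f"{intervention!r} is {status} to the aim")
```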
Missing contributing factors, or substantial new evidence (e.g. publication of guidelines or systematic reviews) identified during an initiative, should be added to or amended on the AED. When making changes, users need to consider the influence on evaluation; this may involve decisions in the planning stage as to any factors that will not be changed, to support rigorous evaluation.

### Evidence, predictions, assumptions and measures

In essence, the cause/effect chains represent the notion that improving against a particular factor will cause improvement against another factor, or achievement of the overall aim. The justification for each connection in a cause/effect chain may be based on existing evidence, predictions or assumptions. Predictions may be based on local ideas or on explicit theories of change.26–29 Predicted cause/effect relationships in the diagram can be identified as those in which both factors have measure concepts annotated. Assumed cause/effect relationships are defined as pairs of factors where at least one of the pair is not measured (thereby limiting the ability to assess the cause/effect relationship). Evidenced cause/effect relationships can be annotated to distinguish them from assumed or predicted relationships; for example, in figure 3 we use dotted lines for assumed/predicted relationships and solid lines for definitional/evidenced relationships, with an additional annotation to denote the level of evidence associated with the relationship. Measure concepts can also be annotated on evidenced relationships. Measure concepts associated with cause/effect chains are developed into well-defined measures used to test individual predictions and evaluate the impact of the intervention(s) (personal communication; Woodcock T, Poots A, Huckvale K, et al, 2014). Evaluation of a programme theory is more comprehensive if there is a distribution of measures across the diagram, from process to outcome measures.30,31

## Discussion

The AEM provides a thorough specification of a method for articulating the programme theory of a QI initiative through a clear visual representation of cause/effect relationships between an improvement aim and potential interventions, with annotation of related evidence and measure concepts. As well as clearly identifying the components of programme theory expressed in an AED, links to other QI methods are also demonstrated. The AEM is designed to act as a prospective guide for improvement teams and to support evaluation of the impact, spread and sustainability of QI initiatives by moving away from individually held tacit knowledge.32

Getting a ‘correct’ theory prior to initiation is not the goal of the AEM; the diagram will be iterated throughout use of the method. As QI initiatives develop over time, the strength or weakness of the assumed cause/effect chains becomes apparent, as do other factors that were not considered in the original programme theory.17 This iterative process of theorisation and evaluation helps explain the results of both positive and negative trials and can support both prospective process evaluation30 and ex post analysis.33 The AEM provides a platform for further research to explore what ‘good’ programme theory is and how it might enable the transfer of learning from one project (e.g. figure 3) to another.

The need for programme theories and logic models is well articulated.6,17,34–39 However, there is little practical guidance available on how to construct good-quality diagrams.
The AEM adds to this through explicit articulation of the components of programme theory and their relationship to one another in diagrammatic form, something that other models often lack. Articulating complex concepts in a single diagram plays an important cognitive role, supporting readers to access large amounts of information more readily for problem solving and inference making.40 The AED differs from the ‘cause and effect’ or Ishikawa diagram outlined in the QI literature8 in that its cause/effect chains do not represent only potential problems inhibiting an improvement attempt but, more generally, the hypothesised relationships between actions and the improvement aim, including those necessary for measurement and evaluation. The AEM builds on the key principles of driver diagrams, but with additional clarity regarding the functions and purpose of the different components of the diagram, providing a more scientifically rigorous approach to the development and articulation of programme theory.41 This methodological specification will support further research to evaluate the benefits of using the AEM in practice.19

Our experience of developing programme theory diagrams reveals that the process of construction is as important as the resulting diagram in supporting the planning and delivery of improvement. When the construction process is well facilitated, it enables patients, academics and healthcare professionals to share and make sense of multiple sources of knowledge (including tacit knowledge) and evidence in a manner that minimises conflict, with the AED acting as a boundary object to aid communication between these groups.42 The agreement of a shared aim promotes greater engagement with a wide range of stakeholders and can promote patient-centred conversations. The diagram itself is a powerful communication tool, connecting the strategic and political drivers of senior management with the actions and motivations of frontline staff.43 The AEM does not distinguish or limit scope to certain levels of change but encourages teams to be aware of and consider all relevant factors that can influence the outcome, including those outside the direct control of the QI team. The engagement of diverse stakeholders in the articulation of programme theory remains a significant challenge and is best addressed by expert and neutral facilitation throughout the process. Further research is necessary to assess the range of the social functions of the AEM, both through the use of the diagram as a boundary object and through the process of facilitating its creation.

## Conclusions

The AEM gives structure to the identification and articulation of programme theory, an important step in QI initiative development. It provides a framework to guide execution and evaluation of an initiative, a focal point for other QI methods and a communication tool to engage stakeholders. A clear definition of what constitutes a well-articulated programme theory is provided to guide the use of the method and assessment of the fidelity of its application.

## Footnotes

* Contributors JER, TW and DB identified the need for method development and led the intellectual and practical development of the AEM. CM contributed to the development of the AEM. JER and CM prepared the first draft for publication. TW and LI contributed significantly to clarifying and articulating the concepts presented in the text and to developing the example COPD diagram and text.
All authors contributed to the development and final version of the text.

* Funding This article presents independent research commissioned by the National Institute for Health Research (NIHR) under the Collaborations for Leadership in Applied Health Research and Care (CLAHRC) programme for North West London. JER and TW are supported by Improvement Science Fellowships with the Health Foundation. The views expressed in this publication are those of the author(s) and not necessarily those of the Health Foundation, the NHS, the NIHR or the Department of Health.

* Competing interests None.

* Provenance and peer review Not commissioned; externally peer reviewed.

This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: [http://creativecommons.org/licenses/by-nc/4.0/](http://creativecommons.org/licenses/by-nc/4.0/)

## References

1. Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci 2010;5:14.
2. Shojania KG, Grimshaw JM. Evidence-based quality improvement: the state of the science. Health Aff (Millwood) 2005;24:138–50.
3. Foy R, Ovretveit J, Shekelle PG, et al. The role of theory in research to develop and evaluate the implementation of patient safety practices. BMJ Qual Saf 2011;20:453–9.
4. Grol RPTM, Bosch MC, Hulscher MEJL, et al. Planning and studying improvement in patient care: the use of theoretical perspectives. Milbank Q 2007;85:93–138.
5. Weiss CH. Theory-based evaluation: past, present, and future. New Dir Eval 1997;1997:41–55.
6. Walshe K. Understanding what works—and why—in quality improvement: the need for theory-driven evaluation. Int J Qual Health Care 2007;19:57–9.
7. Dixon-Woods M, McNicol S, Martin G. Ten challenges in improving quality in healthcare: lessons from the Health Foundation's programme evaluations and relevant literature. BMJ Qual Saf 2012;21:876–84.
8. Plsek PE. Tutorial: management and planning tools of TQM. Qual Manag Health Care 1993;1:59–72.
9. Langley G, Moen R, Nolan K, et al. The improvement guide: a practical approach to enhancing organizational performance. 2nd edn. San Francisco, California: Jossey-Bass Publishers, 2009.
10. National Institute for Health Research. 4.5 Collaborations for leadership in applied health research and care (CLAHRCs). http://www.nihr.ac.uk/documents/about-NIHR/Briefing-Documents/4.5-Collaborations-for-Leadership-in-Applied-Health-Research-and-Care.pdf (accessed Sep 2014).
11. Caldwell SEM, Mays N. Studying policy implementation using a macro, meso and micro frame analysis: the case of the Collaboration for Leadership in Applied Health Research & Care (CLAHRC) programme nationally and in North West London. Health Res Policy Syst 2012;10:32.
12. Doyle C, Howe C, Woodcock T, et al. Making change last: applying the NHS Institute for Innovation and Improvement sustainability model to healthcare improvement. Implement Sci 2013;8:127.
13. Howe C, Randall K, Chalkley S, et al. Supporting improvement in a quality collaborative. Br J Healthc Manag 2013;19:434–42.
14. Reed JE, Bell DB. The concept and reality of Evidence Based Implementation. HaCIRIC International Conference; 26–28 September 2011. Manchester, 2011.
15. Renedo A, Marston C, Spyridonidis D, et al. Patient and public involvement in healthcare quality improvement: how organisations can help patients and professionals to collaborate. Public Manag Rev 2014;1–18.
16. NHS Institute for Innovation and Improvement. Driver diagrams. http://www.institute.nhs.uk/quality_and_service_improvement_tools/quality_and_service_improvement_tools/driver_diagrams.html (accessed Feb 2014).
17. Grol R. Beliefs and evidence in changing clinical practice. BMJ 1997;315:418–21.
18. National Clinical Guideline Centre. Chronic obstructive pulmonary disease: management of chronic obstructive pulmonary disease in adults in primary and secondary care. London: National Clinical Guideline Centre, 2010.
19. Taylor MJ, McNicholas C, Nicolay C, et al. Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Qual Saf 2014;23:290–8.
20. Berwick DM. A primer on leading the improvement of systems. BMJ 1996;312:619–22.
21. Deming WE. Out of the crisis. Cambridge, Massachusetts: MIT Press, 1986.
22. Plsek PE. Quality improvement methods in clinical medicine. Pediatrics 1999;103(Suppl E1):203–14.
23. Wheildon C, Warwick M. Type & layout: how typography and design can get your message across—or get in the way. Berkeley, California: Strathmoor Press, 1995.
24. Esain AE, Williams SJ, Gakhal S, et al. Healthcare quality improvement—policy implications and practicalities. Int J Health Care Qual Assur 2012;25:565–81.
25. Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ 2014;348:g1687.
26. Baker R, Camosso-Stefinovic J, Gillies C, et al. Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2010;(3):CD005470.
27. Michie S, Johnston M, Abraham C, et al. Making psychological theory useful for implementing evidence based practice: a consensus approach. BMJ Qual Saf 2005;14:26–33.
28. Michie S, Johnston M, Francis J, et al. From theory to intervention: mapping theoretically derived behavioural determinants to behaviour change techniques. Applied Psychology 2008;57:660–80.
29. Nolan T, Resar R, Haraden C, et al. Improving the reliability of health care. IHI Innovation Series white paper. Boston: Institute for Healthcare Improvement, 2004. http://www.IHI.org
30. Grant A, Treweek S, Dreischulte T, et al. Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting. Trials 2013;14:15.
31. Stetler CB, Mittman BS, Francis J. Overview of the VA Quality Enhancement Research Initiative (QUERI) and QUERI theme articles: QUERI Series. Implement Sci 2008;3:8.
32. Bate SP, Robert G. Knowledge management and communities of practice in the private sector: lessons for modernizing the National Health Service in England and Wales. Public Adm 2002;80:643–63.
33. Dixon-Woods M, Bosk CL, Aveling EL, et al. Explaining Michigan: developing an ex post theory of a quality improvement program. Milbank Q 2011;89:167–205.
34. Bartholomew LK, Mullen PD. Five roles for using theory and evidence in the design and testing of behaviour change interventions. J Public Health Dent 2011;71:S20–33.
35. Davies R. Scale, complexity and the representation of theories of change. Evaluation 2004;10:101–21.
36. Davies R. Scale, complexity and the representation of theories of change: Part II. Evaluation 2005;11:133–49.
37. Douthwaite B, Kuby T, van de Fliert E, et al. Impact pathway evaluation: an approach for achieving and attributing impact in complex systems. Agric Syst 2003;78:243–65.
38. Rogers PJ. Using programme theory to evaluate complicated and complex aspects of interventions. Evaluation 2008;14:29–48.
39. W. K. Kellogg Foundation. Logic model development guide. Battle Creek, MI: W. K. Kellogg Foundation, 2004. http://www.wkkf.org/resource-directory/resource/2006/02/wk-kellogg-foundation-logic-model-development-guide
40. Larkin JH, Simon HA. Why a diagram is (sometimes) worth ten thousand words. Cogn Sci 1987;11:65–100.
41. Walshe K. Pseudoinnovation: the development and spread of healthcare quality improvement methodologies. Int J Qual Health Care 2009;21:153–9.
42. Kislov R, Harvey G, Walshe K. Collaborations for leadership in applied health research and care: lessons from the theory of communities of practice. Implement Sci 2011;6:64.
43. Ferlie EB, Shortell SM. Improving the quality of health care in the United Kingdom and the United States: a framework for change. Milbank Q 2001;79:281–315.