Implementation and de-implementation: two sides of the same coin?
  1. Leti van Bodegom-Vos1,
  2. Frank Davidoff2,
  3. Perla J Marang-van de Mheen3
  1. 1Department of Medical Decision Making, Leiden University Medical Center, Leiden, The Netherlands
  2. 2The Dartmouth Institute, Geisel School of Medicine at Dartmouth College, Hanover, New Hampshire, USA
  3. 3Medical Decision Making, J10-S, Leiden University Medical Centre, Leiden, The Netherlands
  1. Correspondence to Dr Leti van Bodegom-Vos, Department of Medical Decision Making, Leiden University Medical Center, PO Box 9600, Leiden 2300 RC, The Netherlands; l.vanbodegom-vos{at}lumc.nl

Abstract

Avoiding low value care has received increasing attention in many countries, as with the Choosing Wisely campaign and other initiatives to abandon care that wastes resources or delivers no benefit to patients. While an extensive literature characterises approaches to implementing evidence-based care, we have limited understanding of the process of de-implementation, such as abandoning existing low value practices. To learn more about the differences between implementation and de-implementation, we explored the literature and analysed data from two published studies (one on implementation and one on de-implementation) involving the same orthopaedic surgeons. We defined ‘leaders’ as those orthopaedic surgeons who implemented, or de-implemented, the target processes of care and ‘laggards’ as those who did not. Our findings suggest that leaders in implementation share some characteristics with leaders in de-implementation when compared with laggards, such as being more open to new evidence, being younger and having spent less time in clinical practice. However, leaders in de-implementation and implementation differed in other characteristics and were not the same persons. Thus, leading in implementation or de-implementation may depend to some degree on the type of intervention rather than entirely reflecting personal characteristics. De-implementation seemed to be hampered by motivational factors such as departmental priorities, and by economic and political factors such as cost-benefit considerations in care delivery, whereas organisational factors were associated only with implementation. The only barrier or facilitator common to both implementation and de-implementation was outcome expectancy (ie, the perceived net benefit to patients). Future studies need to test the hypotheses generated here and improve our understanding of the differences between the processes of implementation and de-implementation in the people most likely to lead (or resist) these efforts.

  • Implementation science
  • Quality improvement
  • Surgery


Background

In recent decades, abandoning low value care has become more important in many countries. Evidence shows, for example, that an estimated 30% of all medical spending in the USA is unnecessary and adds no value to care.1 The importance of abandoning low value care is underscored by the Choosing Wisely campaign, launched in the USA in 2012 to encourage physicians and patients to engage in conversations about unnecessary tests, treatments and procedures; the campaign is now being adopted in many other countries.2 ,3 A key element of Choosing Wisely is that medical societies create ‘better not to do’ lists of tests, treatments and procedures in their discipline for which there is strong evidence of overuse, potential harm or significant and unjustifiable costs. The next step is to translate these ‘better not to do’ lists into action. A recent study by Rosenberg et al4 has shown that creating such recommendations is not, by itself, enough to achieve abandonment of low value care. Additional interventions are needed to support clinicians and their teams in implementing such recommendations.5 Benchmarking, data feedback, communication training, systems interventions (eg, clinical decision support), patient-focused strategies and financial incentives are examples of interventions that can drive change.4 ,5 However, whereas an extensive literature exists on the implementation of innovations, our understanding of the process of abandoning existing low value care is limited:6 ,7 little is known about the specific agents involved in abandonment, the barriers to and facilitators of abandonment, and effective interventions that accelerate abandonment of low value care.

Terminology and underlying concepts

No common language or conceptual framework is currently available for research on the abandonment of low value care, which makes it difficult to search the literature and to connect the results of relevant research initiatives. Niven et al8 have recently demonstrated the lack of common language in this area; they identified 43 different terms for the process of abandonment, with ‘disinvest*’ (39%) and ‘decrease use’ (24%) being the most frequently cited. Although all such terms are often treated as more or less synonymous, important differences among them exist. To facilitate the connection between future research initiatives, we recommend that future publications distinguish between the different terms for the abandonment of low value care and use the appropriate one. For example, the verbs ‘decrease use’ and ‘withdraw’ describe the direction of a change, but say nothing about what activates the change. Undiffusion, discontinuance and de-adoption are, by contrast, viewed as physician-initiated processes, in which physicians themselves decide to abandon the care.9 Disinvestment and de-implementation, in turn, are managed processes that require specific and different types of activities.7 In disinvestment, healthcare resources are partially or completely withdrawn from an existing healthcare practice, procedure, technology or pharmaceutical.10 In de-implementation, the use of low value care is reduced or stopped on a structural basis in a planned process that uses a set of activities, which can include financial disincentives as well as other activities such as data feedback and systems interventions.
We favour the term de-implementation, because abandoning low value care seems to require managed action that addresses a variety of the factors that drive use of low value care, rather than simply employing financial disincentives.4 This is also seen in quality improvement interventions, which often do not happen automatically but require a strategy that includes multiple elements, such as education, audit and feedback.11

We can distinguish two types of de-implementation: (1) substitution, which occurs when an innovation is replaced by an alternative (eg, radiography replaced by computerised tomography) and (2) disenchantment, which occurs when new information indicates that the existing medical care is not effective, or that its benefits do not justify the costs or adverse effects (eg, adenotonsillectomy has no major clinical benefits over watchful waiting in children with mild symptoms of throat infections).7 ,9 In the case of substitution, de-implementation is sometimes viewed as the ‘other side of the coin’ of implementation, because the abandonment of low value care entails the adoption of a new practice. However, it is increasingly recognised that a pervasive asymmetry in human psychology makes it harder to give up low value modes of care than to adopt new and more promising ones, even when new evidence reveals that the former have little or no benefit.6 ,12 The activities required to abandon low value care, whether through substitution or disenchantment, might therefore not be the simple inverse of those needed for implementation and diffusion.12

To learn more about the differences between the activities needed for implementation and de-implementation, we first need insight into the people and factors that drive each process. After all, activities directed at the implementation of promising innovations, or at the de-implementation of low value care, can be expected to be more effective when they are specifically focused on the appropriate target groups and on the factors that affect implementation or de-implementation.13 In this paper, we therefore try to answer two questions: Are the characteristics of the people involved early on in the implementation of new practices the same as those of the people who are the first to de-implement established practices? And do the same or analogous factors drive implementation and de-implementation? We approach this task by studying the existing literature and analysing available empirical data.

Insights from the literature

Do implementation and de-implementation involve the same group of people?

Theories of behaviour change identify several subgroups within the overall group of agents involved in an implementation initiative. The most familiar classification is that of Rogers, whose analysis concerns the spontaneous process in which adoption of innovations begins among innovators and early adopters (‘leaders’), followed by an early majority, a late majority and laggards.9 Rogers describes innovators and early adopters as being especially willing to take risks, similar in age to laggards, high in social and financial status, above average in education, more favourable than average towards science, actively engaged in information seeking, oriented towards commercial activity and interested in interacting with other innovators.9 According to Rogers, leaders of de-implementation share some characteristics with laggards in implementation: being less educated, having lower socioeconomic status and having less contact than average with change agents.9 Davidoff suggests, in addition, that leaders in de-implementation may never have been fully convinced that the practice in question was effective, safe or both, tend to be cautious and risk averse, and may lack opinion-leadership.6 However, at least some early adopters in implementation might also be early de-implementers, because such people tend to be open to new evidence, have a high degree of opinion-leadership and are generally respected by their peers.6 Empirical evidence is lacking on whether leaders in de-implementation are the same persons as leaders in implementation, and whether the two groups share personal characteristics.

Do the same factors drive change in implementation and de-implementation?

Several models describe the factors that influence the pace and extent of the implementation of an innovation, for example, the implementation framework by Grol et al,14 the model by Cabana et al,15 the determinants of change model by Fleuren et al,16 the consolidated framework by Damschroder et al,17 Bate's six challenges18 and the Model for Understanding Success in Quality by Kaplan et al.19 These models suggest that the following groups of factors influence the pace and extent of implementation:

  • the innovation itself: for example, empirical evidence, attractiveness, credibility, feasibility;

  • the individual health professionals: for example, awareness, knowledge, attitude, outcome expectancy (ie, the perceived net benefit to patients), beliefs, routines;

  • the patients: for example, knowledge, skills, attitude, adherence;

  • the social context: for example, opinion of colleagues, culture of the network, leadership;

  • the organisational context: for example, organisation of care processes, staff, capacities, structures and

  • the economic and political context: for example, financial arrangements, regulations, policies.14

A number of commentaries, editorials and opinion pieces20–25 suggest that the same or similar groups of factors influence the pace and extent of de-implementation. However, many authors also point out that psychological biases are at play in de-implementation that are not present in implementation.12 ,24 ,26 ,27 Examples of these biases are confirmation bias (the tendency of professionals to favour information that confirms prior beliefs) and loss aversion (people are more strongly distressed by losses than they are gratified by similarly sized gains). In addition to these psychological biases, pressure for de-implementation can threaten providers' sense of professional autonomy, which may strengthen their resistance to abandoning low value care.27 Many important questions about de-implementation therefore remain: do the barriers and facilitators within each of the above groupings play comparable roles in implementation and de-implementation? Is each grouping similar in its effect on the pace and extent of implementation and de-implementation?

Empirical data

We use empirical data from two previously published studies, performed at different times: one involving implementation28 ,29 and one involving de-implementation30 ,31 among Dutch orthopaedic surgeons. The implementation study aimed to identify factors influencing the implementation of a step-wise strategy of non-surgical therapy (including medication and referral guidelines) for patients with hip and knee osteoarthritis (OA) before undertaking surgical therapy.28 ,29 The de-implementation study aimed to identify factors influencing the de-implementation of low value blood management practices in primary hip and knee arthroplasties (the use of perioperative cell salvage (CS) and erythropoietin (EPO)).30 ,31 These studies were chosen because they were carried out among the same 86 Dutch orthopaedic surgeons, which enabled us to study whether leaders in implementation are the same persons as leaders in de-implementation. As these data represent a subset of our findings and were not collected for this purpose, we use them here only to generate hypotheses. See online supplementary appendix 1 for more detail about the data collection and the method used to analyse the data.

Are leaders and laggards the same group of people in implementation and de-implementation?

We analysed the data from our implementation and de-implementation studies among orthopaedic surgeons, and first compared leaders and laggards in implementation and de-implementation on several characteristics (table 1). Student's t tests were used for continuous variables and χ2 tests for categorical variables. In addition, we compared leaders and laggards across implementation and de-implementation, using χ2 tests, to assess whether these were the same persons. Leaders were defined as those orthopaedic surgeons who implemented, or de-implemented, the medication guidelines, referral guidelines or low value blood management practices, and laggards as those who did not.
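For readers who wish to reproduce this kind of leader-versus-laggard comparison, the two tests named above can be sketched as follows. All counts and ages below are hypothetical illustrations, not the study data:

```python
from scipy import stats

# Continuous characteristic (eg, age): Student's t test comparing
# hypothetical 'leader' and 'laggard' surgeons.
leader_ages = [41, 44, 46, 43, 48, 45, 42, 47]
laggard_ages = [49, 52, 50, 51, 48, 53, 50, 47]
t_stat, p_age = stats.ttest_ind(leader_ages, laggard_ages)

# Categorical characteristic (eg, practice setting): chi-squared test
# on a 2x2 contingency table of counts.
#                 university  general/private
contingency = [[10, 30],      # leaders
               [8, 38]]       # laggards
chi2, p_setting, dof, _ = stats.chi2_contingency(contingency)

print(f"t = {t_stat:.2f}, p = {p_age:.3f}")
print(f"chi2 = {chi2:.2f}, p = {p_setting:.3f}, dof = {dof}")
```

The same χ2 machinery applies to the cross-tabulation of leader status in implementation against leader status in de-implementation used later in the paper.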

Table 1

Characteristics of leaders and laggards in implementation and de-implementation (n=86)

As shown in table 1, leaders in implementation and de-implementation share some characteristics when compared with laggards. First, leaders in implementation are significantly younger (45.6 vs 50.0 years, p<0.05) and have less work experience (10.1 vs 14.6 years, p<0.05 and 9.7 vs 13.6 years, p<0.05) than laggards. In de-implementation the same trend is observed, although it is somewhat smaller and not statistically significant. A possible explanation is that younger professionals with fewer years of experience received more recent education, with up-to-date knowledge of best evidence practices, whereas older professionals with more years of experience have more deeply ingrained habits and patterns of practice, which are harder to change. Second, leaders in both implementation and de-implementation less frequently view their own clinical experience as more reliable than randomised controlled trials (new evidence) compared with laggards (table 1). This finding supports the idea of Davidoff that people who are open to new research evidence might be early adopters in both implementation and de-implementation.6 Furthermore, we did not find any significant difference between leaders and laggards in the type of practice setting where they worked (table 1). Since surgeons in general hospitals and private clinics usually have higher incomes than those in university hospitals, clinicians' economic orientation does not seem to determine whether they are likely to be a leader in implementation or de-implementation. However, our study sample may have been too small to detect meaningful differences in this area, and one could argue that a strong financial position and commercial orientation are less important in de-implementation than in implementation, since implementation requires greater upfront investments. On the other hand, the savings resulting from de-implementation are frequently accompanied by negative outcomes, including the abandonment of sunk costs and losses of income; achieving those savings can also require substantial investments, including changes in culture, which may not be adequately compensated.25

Besides these shared characteristics, differences also exist between leaders in implementation and leaders in de-implementation. Our orthopaedist leaders in implementation of referral guidelines see on average more new patients with knee OA per month than our orthopaedist laggards in implementation (47.8 vs 26.1 patients, p<0.05), which may be a proxy for the extent of involvement with the patient group. In contrast, leaders in de-implementation seem to see fewer new patients with knee OA per month than laggards, although this difference was not statistically significant (26.9 vs 31.5, p>0.05). The explanation for these findings may lie in two different processes. The first is that professionals seeing more patients may be more inclined to try new innovations for this patient group, than professionals seeing fewer patients. The second is that professionals seeing more patients may be more reluctant to withhold an intervention for these patients in their desire to avoid injustice.

Even though leaders in implementation and de-implementation in our studies show many similar characteristics, they were not the same persons: we did not find a significant association between individual leaders in de-implementation and individual leaders in implementation of medication guidelines (χ2=0.50, p>0.05) or referral guidelines (χ2=1.01, p>0.05). Part of the explanation may be that we are not looking at the implementation and de-implementation of exactly the same intervention. Answering such a research question is probably not feasible, given the lead time between the implementation of an intervention and its subsequent de-implementation once it is shown to be ineffective or becomes obsolete as a result of new developments. These findings suggest that being a leader in implementation or de-implementation may depend to some degree on the type of intervention rather than being entirely a personal characteristic.

Do the same factors drive change in implementation and de-implementation?

We also explored which factors influence implementation and de-implementation, and whether these are the same or different in the two processes. See online supplementary appendix 1 for the method used to identify independent groups of factors associated with implementation and de-implementation.

As shown in table 2, both the implementation of guidelines and the de-implementation of low value blood management practices are independently associated with our orthopaedic surgeons' outcome expectations of the recommended interventions. Surgeons were more likely to have implemented the medication guidelines if they felt that patients would benefit from acetaminophen and non-steroidal anti-inflammatory drugs (OR 2.73, 95% CI 1.46 to 5.08). Similarly, if they were convinced of the effectiveness of CS/EPO or felt a lack of benefit for care delivery, they were less likely to have de-implemented these practices (OR 0.30, 95% CI 0.13 to 0.71). This means that outcome expectancy can influence the pace and extent of both implementation and de-implementation.
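For readers less familiar with these statistics, an odds ratio and its 95% CI of the kind reported in table 2 can be computed from a 2×2 table. The counts below are hypothetical illustrations, not the study data:

```python
import math

# Hypothetical 2x2 table: implementation of the medication guidelines
# by whether the surgeon expected patient benefit.
#                 implemented  did not
a, b = 30, 10    # expected benefit
c, d = 15, 25    # did not expect benefit

or_ = (a * d) / (b * c)                # odds ratio (cross-product)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

The study's ORs were obtained from regression models that adjust for other factors, but the cross-product sketch above conveys the interpretation: an OR above 1 (with a CI excluding 1) indicates that the characteristic is associated with greater odds of implementing, and an OR below 1 with lower odds.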

Table 2

Groups of barriers and facilitators independently associated with implementation and de-implementation

Table 2 also suggests that implementation and de-implementation are potentially driven by different factors. In addition to outcome expectancy, the implementation of guidelines is independently associated in multivariable analyses with the organisational context in which the guidelines need to be adopted. If surgeons had clarity about what the patient had done with the physical therapist, and communication with the dietary therapist was easy, they were more likely to have implemented the referral guidelines (OR 2.54, 95% CI 1.06 to 6.05). This suggests that orthopaedic surgeons who are more positive about collaboration and communication between the disciplines involved in guideline-driven care are more likely to follow the guidelines. Although factors related to the organisational context were also identified as potentially relevant to the de-implementation of low value blood management practices, these factors were not significantly associated with the abandonment of these practices in univariable regression analysis, and multivariable analyses for them are thus not included in table 2.

De-implementation, on the other hand, is independently associated in multivariable analyses with orthopaedic surgeons' motivation to use the low value care and with the economic and political context in which they work (table 2). If orthopaedic surgeons themselves, or their department, considered it very important to prevent blood transfusions, they were less likely to have de-implemented the low value blood management practices (OR 0.28, 95% CI 0.11 to 0.70). Likewise, if they considered it unimportant to save the costs of EPO/CS, they were also less likely to have de-implemented EPO/CS (OR 0.43, 95% CI 0.19 to 0.98). Although economic and political context factors could have contributed to orthopaedic surgeons' motivation to implement guidelines, these factors were not significantly associated with the implementation of guidelines in univariable regression analysis. In summary, orthopaedic surgeons are less likely to abandon low value care (in our case EPO and CS) when their personal motivation is incompatible with abandoning it, and when they are uncomfortable considering the hospital and societal costs of care delivery in their decision making for patients.

Our data suggest that implementation and de-implementation are not simply two sides of the same coin. First, the people involved early on (leaders) in the implementation of new practices are not necessarily the same as the leaders who are first to de-implement established practices, although they share some characteristics, for example, being open to new evidence, being relatively young and having less experience in clinical practice. Second, the factors driving implementation and de-implementation appear to differ in important ways, apart from outcome expectancy, which appears to be important in both processes. De-implementation can be hampered by motivational factors such as departmental priorities, and by economic and political factors such as a lack of cost-benefit considerations in care delivery, whereas organisational factors appear to be associated only with implementation. These findings are congruent with the previous literature about de-implementation, which argues for changes in policies and/or restriction of funding for low value care as effective strategies for de-implementation.8 ,27

What are the most important research questions now?

Because our study sample was small, we may have missed important differences and similarities in the factors that influence implementation and de-implementation. In addition, implementation and de-implementation practices were enumerated from subjective survey responses of orthopaedic surgeons; as a consequence, we do not know whether their responses reflect actual practice or whether they gave socially desirable answers. We therefore call for research in larger study samples, using more objective measures, to confirm, refute or add to the first explorations done in this paper. Based on the literature and our data, future research should focus on: (1) the characteristics of leaders in de-implementation. Do leaders in de-implementation have specific personal characteristics? Or is leadership in de-implementation mainly determined by the fit between the low value care and the outcome expectancy and motivation of the individual professional? Identifying these characteristics would be helpful in the search for clinical champions who can act as accelerators of de-implementation initiatives; (2) the factors that influence the extent and pace of de-implementation. What factors drive de-implementation? Are these factors generalisable to the de-implementation of all types of low value care, or are they related to the attractiveness of the care that needs to be de-implemented? Is it harder to de-implement low value care without substitution than to replace it with an alternative? Do such different types of de-implementation respond to different influencing factors? Insight into these issues could be important in taking the next step: translating ‘better not to do’ lists, and knowledge about low value care, into action to improve the quality and outcomes of care.

Acknowledgments

We thank Stefanie Hofstede, Anja van der Hout, Veronique Voorn and Manon Wentink for gathering the data of the implementation and/or the de-implementation study.

References

Footnotes

  • Twitter Follow Leti van Bodegom-Vos at @lvanbodegomvos

  • Contributors LvB-V and PJM-vdM conceived the study. LvB-V carried out the analyses and wrote the first draft. FD and PJM-vdM critically revised the manuscript for content, and approved the final version.

  • Funding The data of the implementation and de-implementation study used in this manuscript are gathered in two previously executed studies that were financially supported by grants from ZonMw, the Netherlands Organisation for Health Research and Development, grant numbers 837004002 and 837003001, and from the Netherlands Centre for Clinical Transfusion Research, grant number PPOC13-010.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.