Education as a low-value improvement intervention: often necessary but rarely sufficient
  1. Christine Soong1,
  2. Kaveh G Shojania2
  1. 1 GIM, Mount Sinai Hospital, Toronto, Ontario, Canada
  2. 2 Department of Medicine, University of Toronto Faculty of Medicine, Toronto, Ontario, Canada
  1. Correspondence to Dr Christine Soong, GIM, Mount Sinai Hospital, Toronto, ON M5G 1X5, Canada; christine.soong@utoronto.ca

Since the launch of Choosing Wisely in the United States,1 efforts to raise awareness about avoiding low-value care have spread internationally,2 prompting numerous commentaries,3–7 descriptive studies and improvement interventions,8–10 as well as inspiring new hospital job descriptions (eg, Chief Value Officer), journal sections11 and conferences devoted to the ‘Less is More’ paradigm. Low-value clinical care refers to services or interventions that provide little to no benefit to patients in specific clinical scenarios, may cause harm and/or incur unnecessary cost.6 12 13

One example of a commonly encountered low-value practice is the continuation of proton pump inhibitors (PPIs) in patients without an indication for ongoing use. Following completion of a defined period of therapy for appropriate indications (eg, peptic ulcer disease), continued use of PPIs provides little value, yet deprescribing occurs infrequently. Moreover, this low-value use unnecessarily exposes patients to PPI-related adverse events such as pneumonia and Clostridioides difficile infections.14 15 Like many other areas of low-value care, PPI deprescribing is the focus of numerous quality improvement interventions.16–18

In this issue of BMJ Quality and Safety, Bruno and colleagues examined the impact of a national educational intervention aimed at reducing outpatient PPI prescriptions in Australia.19 Australia’s NPS MedicineWise (previously the National Prescribing Service) developed the study intervention, released in association with Choosing Wisely Australia’s similar educational materials and alerts highlighting the importance of reducing or ceasing PPI prescribing in the absence of specific indications. The programme began with mailings to general practitioners (GPs) showing data about their own PPI prescribing compared with other GPs nationwide. The remainder of the programme consisted of educational interventions, which the authors describe as a mixture of ‘passive’ and ‘active’ components. Passive components of the intervention included online educational resources, mailed evidence-practice summary sheets and other informational materials.

Active components of the educational programme reported by Bruno et al included: an online self-audit tool allowing GPs to upload information pertaining to 10 of their patients and receive immediate and dynamic patient-specific recommendations; and interactive clinical scenarios related to PPI therapy, with feedback and expert commentary. Uptake of active components was not monitored. As an incentive to complete these educational activities, GPs could receive continuing professional development points.

Over the course of the study period, the investigators observed a small (1.7%) decrease in dispensings of PPIs, without a reduction in statins, which served as a control comparator. Yet, the authors found no significant changes in the monthly rate of PPI discontinuation or dose reductions—the main outcomes targeted by the intervention. As the authors noted, one aspect of the Australian context which may have in part limited the intervention’s impact is that the publicly subsidised drug system does not cover the lower-strength formulation of esomeprazole, the most commonly prescribed PPI in Australia. Still, as the authors also acknowledged, ‘educational initiatives working alone are unlikely to make the inroads required to curb overuse of PPIs’.

The limited impact of education as an improvement intervention

Numerous studies have reached similar conclusions—that relying on educational interventions to change clinicians’ behaviours tends to produce no improvement, making this category of interventions the most predictably disappointing among improvement efforts. For instance, two systematic reviews examining the effectiveness of quality improvement interventions involving outpatient diabetes management analysed the relative effectiveness of different categories of intervention components (eg, case management, team changes, audit and feedback, and clinician education).20 21 When compared with other intervention types, clinician education had negligible impact on glycaemic control. Other systematic reviews have demonstrated similar findings of little to no improvement when examining the impact of education on physician behaviour and clinical outcomes.22 23 Recognising this reality, the Infectious Diseases Society of America’s guideline on implementing antimicrobial stewardship explicitly recommends ‘against relying solely on didactic educational materials’ as a strategy to reduce antimicrobial overuse.24

In the classic ‘hierarchy of effectiveness’ often shown in human factors engineering (figure 1), education ranks as the least effective intervention, right below new rules and policies and far below more system-focused categories such as forcing functions and automation.20 21 25 26 Passive educational activities, such as didactic sessions, online modules and delivery of informational materials, produce particularly low impact. Active educational strategies such as educational outreach to clinicians (similar to the ‘academic detailing’ undertaken by pharmaceutical representatives) can achieve better results, with improvements comparable to those of audit and feedback and computerised decision support.26 While active educational interventions tend to produce greater improvements than do passive ones, they also require greater investments of personnel time, hence their lower uptake as improvement strategies.

Figure 1

The hierarchy of intervention effectiveness (adapted from the Institute for Safe Medication Practices25 and Patientsafe: Implementing effective safety solutions43).

As a sole strategy, education rarely results in sustained behaviour change, earning it a ‘necessary but insufficient’ status among improvement interventions. A classic adage in quality improvement recommends making the right thing to do the easy thing to do. Education never achieves this. Just as clinical practices can have low value, so can improvement interventions. Passive educational interventions, such as lectures and educational handout materials, often fall into this category. If educational interventions consistently delivered small to moderate improvements, they would have moderate value since they cost so little to deliver. Unfortunately, educational interventions often achieve minimal to zero improvement while requiring at least some resources to implement, hence our characterisation of education as having low value as an improvement strategy.

Below, we describe common scenarios in which over-reliance on low-value educational interventions fails to produce results.

Scenario #1: education applied to problems that do not involve knowledge deficits

When a nurse or physician forgets to check for allergies before administering or prescribing a medication, this oversight usually represents a ‘slip’, not a conscious mistake reflecting a lack of knowledge. Consequently, education about the importance of asking patients about allergies will achieve little to no reduction in the frequency with which clinicians forget to check for allergies. A successful intervention would prompt clinicians to check for allergies at the time of entering a medication order or administering a medication—as with alerts in electronic prescribing or bar-coded medication administration systems. These systems are problematic in other ways,27 28 but they at least address the correct underlying problem.

Hand hygiene provides another example. Few clinicians do not know the recommendation to perform hand hygiene to prevent spread of infections. But, hand hygiene has not become an ingrained habit for most healthcare workers.29 Some investigators have looked at ways to foster the development of this habit.30 But, the need for hand hygiene arises so often throughout a given day that slips can constitute a problem even for clinicians with the habit. Thus, effective educational interventions need to focus on appropriate knowledge targets (eg, common misconceptions about when hand hygiene might not be needed) and must be accompanied by interventions involving cues to remind clinicians as they enter and exit patient rooms, as well as attention to convenient placement of sinks and hand hygiene dispensers.

Forgetting to apply knowledge may also occur as a result of distraction in the midst of other considerations during a given patient encounter. For instance, a GP seeing a patient with diabetes may focus on the patient’s glycaemic control, the need for referral to an eye specialist and other aspects of chronic disease management, but forget to discuss discontinuation of the PPI the patient has been taking for several years.

Various researchers have highlighted the importance of ‘having a theory’ for an improvement intervention—a clearly articulated mechanism for how a proposed intervention addresses the main causes of a target quality problem.31–33 Too often an educational intervention is chosen without a plausible theory for lack of knowledge as the main cause of a quality problem. And, even when a lack of knowledge does play a role, a compelling theory for solving the problem on the basis of education alone seldom exists.

Scenario #2: education makes sense in principle but requires too frequent repetition

Even when a target quality problem clearly involves a lack of knowledge or skills, the opportunity to apply successfully acquired educational content may occur infrequently. In such circumstances, initially successful acquisition of the requisite knowledge or skills may erode over time. For this reason, an intervention to help emergency physicians with a rarely performed but potentially life-saving procedure used a ‘just-in-time’ educational video.34 This video included a brief (30 s) refresher with audio narration of the key steps in the procedure, followed by a step-by-step interactive checklist for performing the procedure. Delivering this information in a lecture or other passive educational intervention would almost certainly have achieved no improvement, as clinicians would simply forget what they had learnt by the time (months or years later) they had to perform the procedure.

Repeated delivery to sustain education directed at uncommon situations clearly represents a low-value proposition. But, education can have low value even when the subject of educational interventions involves more common situations, because staff turnover necessitates periodic re-delivery of the same education. This represents a particular problem in teaching hospitals, where trainees deliver much frontline care yet rotate in and out of units and clinics on a monthly (sometimes weekly) basis.

Scenario #3: other factors impede application of the requisite knowledge or skills

Educating clinicians about the lack of benefit from, say, prescribing antibiotics for the common cold35 does not make it any easier to dissuade a patient who came to the clinic desiring precisely this outcome from the visit.36–39 Similarly, education delivered to clinicians about recommendations against routine screening for prostate cancer in men over 75 years of age40 does not equip clinicians with the materials or communication techniques likely to reassure patients interested in such screening.

System factors may also thwart the objectives of educational interventions. A GP may have taken on board the educational message of a Choosing Wisely initiative not to order advanced imaging for patients who have low back pain without any high-risk features.9 But, she might also know that the only way to obtain a timely consultation with a spine surgeon is to have an MRI available for review. A patient admitted to the hospital with heart failure might have had an echocardiogram 6 months ago at an outpatient facility, but it seems more expedient for the inpatient medical team to order another echocardiogram to have it handy right away.

In teaching hospitals, there is also the issue of ordering tests to show that one knows what one is doing—a trainee may order a number of tests that are unlikely to show anything useful for the patient but that serve to demonstrate to the attending physician that the trainee has considered an appropriately broad differential diagnosis.41 And, of course, financial incentives can drive practice, rewarding clinicians for the volume of care delivered rather than the health outcomes achieved or consistency with best practice.

Conclusion

Passive educational methods such as lectures and distributing informational materials are frequently misapplied to address quality problems that do not primarily reflect a knowledge gap. Even in situations of known deficits in knowledge and/or skills, other system factors diminish the effectiveness of educational interventions. Education can support improvement interventions by engaging clinicians or familiarising them with the justification for an intervention. But education on its own, especially passive education, typically delivers little value as a change strategy. Admittedly, education can also serve the purpose of ‘raising awareness’, as has been the case with educating patients and providers about ‘low-value care’,2 42 such as deprescribing PPIs and other Choosing Wisely targets. But achieving worthwhile impact requires designing high-value improvement interventions featuring more effective systems-based changes. These higher-value improvement strategies make ‘the right thing to do the easy thing to do’ and include education only when it has a clear role to play.

Footnotes

  • Twitter @christinesoong

  • Contributors All authors contributed equally to the design, concept and drafting of the manuscript.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Provenance and peer review Commissioned; internally peer reviewed.
