
A nudge towards increased experimentation to more rapidly improve healthcare
  1. Allison H Oakes1,2,
  2. Mitesh S Patel1,2,3,4
  1. Crescenz Veterans Affairs Medical Center, Philadelphia, Pennsylvania, USA
  2. Penn Medicine Nudge Unit, University of Pennsylvania, Philadelphia, Pennsylvania, USA
  3. Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
  4. Wharton School, University of Pennsylvania, Philadelphia, Pennsylvania, USA

  Correspondence to Dr Mitesh S Patel, University of Pennsylvania, Philadelphia, PA 19104, USA; mpatel{at}


In any healthcare setting, the quality of care depends both on the effectiveness of a given treatment and on the way that treatment is delivered. The complexities of modern healthcare have created gaps in our ability to consistently deliver the most effective and efficient care. As a result, significant undertreatment and overtreatment co-occur.1–3 This reality has led diverse stakeholders to overhaul the environment, context and systems in which healthcare professionals practise. However, while well intentioned, most ‘advances’ in healthcare delivery rely on untested or poorly tested interventions.4 5 This means that effective interventions do not scale as quickly as they should, and that ineffective interventions persist despite providing no benefit. The status quo thus presents an opportunity to improve the delivery of care through a more systematic approach.

Successful innovation requires experimentation. Embedded research teams around the world have begun to systematically test whether subtle changes to the way information is framed, or choices are offered, can nudge medical decision making.6 7 The trial by Schmidtke demonstrates the feasibility and necessity of rapid-cycle, randomised testing within a healthcare system.8 The authors randomly assigned 7540 front-line staff to receive either a standard letter reminding them of influenza vaccination or one of three letters that drew on insights from behavioural economics, framing social norms in different ways to nudge healthcare workers towards vaccination. Despite this effort, all four arms had the same vaccination rate of 43%: none of the social norm interventions led to a meaningful change in behaviour. All too often, policies and programmes that ‘make sense’ are implemented without any kind of formal evaluation. In the Schmidtke trial, however, the rigorous study design allowed the researchers to quickly and decisively conclude that the social norms letters were no better than a …
