I live on an island 70 miles from Boston. I occasionally travel to the US mainland by ferry, but more often via a small airline with nine-passenger planes that are usually flown by a solitary pilot. The pilot’s preflight checklist is a small plastic card, usually reviewed in a matter of seconds—arguably, a simple task. One checklist item is to ensure that the flaps are retracted; unlike jetliners, these small planes do not take off with flaps lowered to increase the wing surface area for lift. On a recent flight, the pilot reviewed the checklist and proceeded down the Boston runway. At an altitude of perhaps 10 m, her right hand momentarily moved off the throttle and quickly activated the lever to fully retract the flaps. In spite of reviewing the checklist, she had initiated takeoff with the flaps down. Was this caused by momentary inattention? Boredom? Perhaps she had recently piloted larger planes. Checklists and professional autonomy are brought to mind with increasing frequency, and, as in this case, not always in reassuring settings.
A COMPLEX CONTEXT FOR SIMPLE TASKS
Given the attention that is appropriately focused on the role for complexity science in healthcare improvement,1 Liu and colleagues2 (see page 93) seek to remind us of the simple and complicated tasks that also offer opportunities for improving healthcare quality and patient safety. Simple solutions—such as standardised order forms and checklists—are invariably embedded in complex systems that make their implementation less straightforward than may appear at first glance. Nevertheless, their report makes the case that such forcing functions can play a role in improving timeliness of antibiotic administration for community-acquired pneumonia (CAP).2
I am not sure that the simple and complicated bits are always so easy. The message that stands out in Liu and colleagues’ report is that health professionals should seek out, analyse deeply and engage in simple and complicated approaches when possible. Mathews and Pronovost remind us3 that simple activities such as checklists, when properly applied, are wise and just obligations for healthcare providers. And yet, the pursuit of such strategies can be less than fully effective. I continue to be bemused that we persist in trying to implement the simple task of handwashing with only mixed success, even though effectiveness data for handwashing have been available since the era of Semmelweis.
A corollary lesson appears to be that the careful identification of a precise, simple strategy tailored to a particular context may be just what is needed to override health professionals’ broad and deep knowledge that otherwise compels them to insist on autonomous—but not always correct—decision making. Mills et al, for example, reported that training and education to avoid adverse drug events (ADEs) actually had a negative effect on ADEs in the absence of alerts and other forcing functions provided by a medication order entry system in the US Veterans Affairs Healthcare System.4
A POSSIBLE ALTERNATIVE TO ALL-OR-NONE RULES
In selecting antibiotic initiation time for CAP, Liu et al picked a much-debated measure that has its share of detractors.5 Yet implicit in their report2 is an important insight regarding all-or-none global practice rules such as the 4-h administration of antibiotics for CAP. Careful dissection of such global rules can lead to simple, complicated and complex options that are embedded in such rules and may direct the provider to the correct application of all or part of the rule in the appropriate patient. All-or-none is then replaced by context-driven, critical, professional judgement about what is appropriate in simple, complicated or complex ways. For example, simple therapeutic rules that are based on evidence usually trump patient preferences5; disordered, complicated patient physiology trumps simplistic therapeutic rules; and, generally, complex social and emotional contexts such as the hypothetical example of the elderly patient with the do-not-resuscitate preference2 trump complicated patient management decisions. In this regard, the question may be less a matter of autonomy versus a global guideline, and more a matter of adapting best-fit components of evidence-based practice to the precise context of the patient at hand.
FEEDBACK PROVIDED BY TEAMS AND SYSTEMS
Amalberti and colleagues suggested that the option to implement simple solutions requires a collaborative or team context to achieve the highest levels of safety.6 Liu et al built real-time feedback into their hospital practice environment, supplying the provider with performance information and clinical outcomes. This system-level feedback—similar to the feedback function found in high-performance teams—helped identify system issues that interfered with a flexible and informed provider–patient relationship. It helped maintain an appropriate balance between autonomy and adherence to clearly defined performance measures.
I often reflect on possible explanations for what was going on that day on the Boston airport runway. One possibility is that performance of simple tasks by highly trained experts may just degrade on frequent repetition. Another possibility—apparently not applicable in this case—is that there are occasions when experts appropriately override rules in the interest of safety.6 However, I think that there is a third, straightforward explanation. An autonomous (solitary) pilot, not unlike the autonomous practitioner, must perform tasks in high-risk contexts that may just be too complicated in the absence of the feedback provided by a high-performance team. I’ll wager the flaps would have been in the correct position had there simply been a copilot in the adjacent seat.
Competing interests: None.