
Spreading human factors expertise in healthcare: untangling the knots in people and systems
Ken Catchpole
Correspondence to Dr Ken Catchpole, Surgical Safety and Human Factors Research, Cedars-Sinai Medical Center, Suite 302, 8797 Beverly Blvd, Los Angeles, CA 90048, USA; ken.catchpole{at}cshs.org


Introduction

Human factors (HF) is a term with which many involved in healthcare delivery are now familiar, even though a decade ago most had never heard of the concept. The majority of clinicians and healthcare improvement specialists have learned of HF through a particular branch of practice that derived from aviation and arose from the need to address error, teamwork and communication issues. This behavioural safety approach, while entirely legitimate and increasingly well evidenced, is limited. Yet it has dominated perceptions of what constitutes HF and has shaped the application of HF principles in healthcare. Frequently espoused by well-meaning clinicians and aviators rather than by academically qualified HF professionals, it has led to misunderstandings about the range of approaches, knowledge, science and techniques that the field of HF can bring to bear on patient safety and quality of care problems.

In this issue, Russ et al1 seek to redress some of the consequences of this misappropriation. They articulate the problems succinctly and expand on earlier calls2 3 for greater integration of HF expertise in healthcare. Repeatedly encountering misunderstandings and misuse of HF undoubtedly concerns academic experts. However, rather than feeling frustration over the 'fictions' discussed by Russ et al,1 HF professionals should be encouraged by the tremendous progress made in recent years: from a state in which clinicians had little exposure to HF work, and even fewer saw its value, to widespread acknowledgement of the value of human-centred systems thinking in healthcare.

Yet the origins of the misunderstandings of HF discussed by Russ et al1 warrant reflection, as they may signal deeper problems both in healthcare and in the ways in which HF experts have worked within it. One simple reason misunderstandings about HF arise so commonly may be that the spread of HF principles and activities in healthcare has involved many non-experts. A second, deeper source relates to the dominance of particular HF practices largely as applied to aviation. The remainder of this commentary explores this history of the importation of aviation-focused HF into healthcare.

In the context of the discussion by Russ et al1 about fictions and misunderstandings of HF among non-experts, this editorial aims to move beyond lines of demarcation about what does or does not constitute legitimate HF principles or practice. The intention is to create a more bidirectional discussion between HF experts and clinicians about how to more productively advance an agenda that many of us regard as fundamentally important for the future of healthcare.

Before proceeding further, though, let me state that my beliefs in the value of HF expertise and human-centred systems design are highly partisan. I place tremendous value on engaging the clinical community in understanding human fallibility and on applying high-quality HF practices and behavioural safety to improve healthcare systems. This editorial became a personal and professional imperative, in part as an apology to those clinicians and HF practitioners whom I may have offended while attempting to achieve these goals. Clinicians and HF professionals should be collectively proud of what has already been achieved. But harnessing the true potential of HF in healthcare demands that we address the problematic ways in which its principles and techniques have been applied to date, and the ways in which HF professionals have tended to work with clinicians.

The aviation model: a double-edged sword

Healthcare safety analogies have been dominated by the aviation model.4 5 Analogies to aviation helped clinicians understand the principles of systems safety and error causation. They also stimulated reflection on hitherto under-acknowledged systemic vulnerabilities that contribute to technical failures and poor outcomes, hopefully supplanting the traditional view that individual clinicians wholly bear responsibility for all patient outcomes. Comparisons with aviation thus underpinned calls to reduce the culture of blame, and they provided a goal for safety that might one day be achieved in healthcare. Comparisons with other industries could have achieved these goals too. Nuclear power, the military and other industries that use HF techniques have also been discussed in healthcare, but aviation is an industry with which we all have at least some familiarity. Indeed, aviation is probably where HF principles have been most successfully applied, and where the profession was born in the 1940s.6 About 40 years later, Crew Resource Management (CRM) training (often termed 'HF training') arose to address behavioural safety, after an ecosystem of system-level HF interventions (table 1) had already been established. In healthcare, however, CRM was among the first HF paradigms to arrive and has been by far the most dominant.5

Table 1 Contrasts between aviation and healthcare

The translation of CRM training for healthcare has delivered better perceived teamwork and positive perceptual changes,7–12 better observed team skills,7 9 12 13 better satisfaction with care,7 improved compliance with briefings,14 better processes,15–17 reduced error rates9 12 18 and better organisational perceptions that help sustain institutional change.8–10 16 Indeed, the behavioural safety approach has received more scrutiny in healthcare than in any other industry, including aviation, amplifying the evidence base and raising the awareness and skills of individual clinicians. It has also led to the development of a range of tools to assist in improving teamwork and communication that go far beyond the limited scope of such training.19–22 However, the promise of relatively simple behavioural solutions to safety problems has led to courses developed not by trained HF professionals but by clinicians or aviators, based purely on the limited CRM models, yet erroneously claiming to teach HF. This development has made many academically qualified HF practitioners uneasy because these courses lack broader approaches to systems theory, HF integration, human-centred design or HF analysis techniques. An early study illustrates this tension between the academic and clinical models of HF training: 'we had the opportunity to note the views of James Reason … on many occasions he appeared to find the content of the course somewhat questionable and in some circumstances inaccurate … On several occasions he appeared quite agitated and perplexed at the way in which the course often "misses the point" … [at lunchtime] he decided to stop watching'.23

In 2011 the British Medical Journal published a head-to-head discussion asking: 'Have we gone too far in translating ideas from aviation to patient safety?'24 25 To an HF practitioner, this discussion seemed moot, highlighting the erroneous view that aviation provided the 'principle', rather than being one of many exemplar applications of deeper, scientifically based principles of HF. These same principles have found application in the design of nuclear power stations, military equipment and a wide range of consumer products, including mobile phones, software, and even toothbrushes, chairs and kettles.26 In this broader view of HF, training is a limited approach to improving safety: just one aspect of a wider systems approach to equipment, task, environment and organisational design. As a result, the benefits of CRM training in healthcare have been mixed.27 28

While CRM courses in aviation were supported by a huge volume of other HF expertise and infrastructure at every level of that highly engineered system, this has not been the case in healthcare (table 1). Maintaining a focus on behavioural change alone can result in a persistent blame-and-retrain mentality that may fit clinical and administrative models of risk management, and the enthusiasm to learn from aviation, but does not fix the underlying organisational, environmental, technological or task-related problems that also predispose to error. As a consequence, the 'human factor' has frequently and erroneously been used as a semantic surrogate for blame, the underlying problems are perpetuated, and the perception of HF is degraded. Thus, despite its successes, applying HF through behavioural safety training alone reflects wider endemic systems problems in healthcare; it disregards the basic concepts that high-quality HF practice should espouse and does a considerable disservice to clinicians, the HF profession, and efforts to improve safety.

From human to system

Healthcare differs from other high-tech industries in which HF has found more widespread application. The transport, defence and nuclear power industries are technology mediated and have largely been engineered over the last 150 years to achieve specific goals. Healthcare still depends largely on direct human interventions and grew more organically over hundreds (even thousands) of years. It has many (sometimes conflicting) goals, from preserving life and relieving distress to a wide range of service qualities related to efficiency and to satisfaction with care and the care environment. It is arguably more complex than any broadly equivalent industry, and it is extremely resource sensitive, which makes the evidence base critical and the return on investment often difficult to gauge.

One can train staff about safety relatively easily (and possibly demonstrate some value thereof), but replacing equipment, redesigning complex processes, and addressing the environmental limitations of hospital buildings present far greater challenges. It takes time to understand and resolve these issues, and investigating these deeper systems problems can be painful and expensive for all involved. This 'systems' view50 ultimately challenges many concepts on which clinical work is based, such as the nature of evidence,51 the fallacy that good outcomes equate to good processes,52 and the fallacy of self-determinism, namely the belief that errors can necessarily be avoided through force of will or more training. It can be disturbing for a clinical population who base their status, professional confidence and sometimes their business model on their individual abilities to realise how much their own performance is shaped by the equipment, tasks, environment and organisation around them. Human performance in healthcare systems is extremely complex, and the aviation CRM model alone, which does not address the individual, goal, task, evidential or conceptual complexity of clinical work, is insufficient to develop improved systems designs or better training.

The desire to use the broader range of HF principles poses a range of problems for funding, developing and using HF expertise. A relatively small number of qualified HF professionals work full time in healthcare anywhere in the world, yet understanding clinical complexities and conducting clinical HF improvement work require considerable investment of time and skill. HF positions in healthcare are far more elusive and transient than in any other high-tech industry, with many HF activities funded through research. However, HF does not suit traditional medical research paradigms; it is often seen as a 'soft' science, which makes it precarious to rely on research funding to support HF work. This situation sharply contrasts with that in aviation, in which there is a clear commitment to, and acknowledged value of, integrating HF professionals with engineering, safety, training, maintenance and service delivery teams.29

From gate keepers of knowledge to trusted colleagues

The speed with which HF ideas have spread in healthcare reflects recognition of the tremendous need for the application of HF expertise. Clearly, many of these issues are not solely problems that concern HF, but relate to wider questions about how good ideas can be spread and deepened without dilution or degradation. Given the complexity of healthcare systems, there is a coherent argument that the aviation approach has been the best place to start. It has built an excellent evidence base, created a shift towards a systems view, spread the concept of HF (no matter how limited) and enabled immediate and successful intervention. So while the term 'HF' has frequently been misappropriated, the spread of reasoning based on these principles can lead to a properly systemic approach. However, the widespread adoption of only a limited set of principles, and the dominance of non-qualified practitioners in this area, is a cause for concern. This is an impassioned topic on both sides of the boundaries of knowledge. I have occasionally managed to upset clinical colleagues who take pride in their interpretation and delivery of HF training, and equally dedicated HF colleagues who feel that such training constitutes misappropriation, a professional slur, and a potential danger to staff and patients. Addressing this discrepancy may be a unique challenge for the HF profession, yet keeping HF solely in the domain of qualified practitioners would be alienating and counterproductive.

Clinical HF is different from HF in any other industry. If we wish healthcare to be fundamentally changed by HF, we must also expect HF to be changed by healthcare. Collaboration between clinicians and HF professionals, with each shaping the views of the other, will develop and extend the use of HF for the unique demands of healthcare. We need to develop accreditation for HF professionals working in healthcare; to establish a greater HF presence in the design of clinical systems and technologies, in accident investigation and in safety management; and to deliver training programmes in behavioural change, in system-level HF and in appropriate analytical techniques. We also need an infrastructure that permanently supports this work, as every other industry that has benefitted from HF has done.

Such developments will require a business model and a commitment to take this expertise forward. The Clinical HF Group (http://www.chfg.org) in the UK has made progress in addressing these issues; it has become influential and trusted precisely because it arose from the aviation model but is now moving towards a wider definition, while involving all stakeholders in these challenging discussions. As this shift towards the deeper HF issues occurs, HF practitioners need to work across disciplinary boundaries to demonstrate and teach the value of what they can do. Then we will be able to untangle these knots and appropriately use HF expertise to create better human-centred healthcare systems.

References

Footnotes

  • Funding Dr Catchpole receives funding from the US Department of Defense, Telemedicine and Advanced Technology Research Center, grant W81XWH-10-1-1039, which seeks to reengineer teamwork and technology for twenty-first century trauma care.

  • Competing interests Dr Catchpole has published several papers on ‘aviation style’ human factors training and in the past received honoraria and consultancy fees for delivering lectures, training courses and analyses on this topic.

  • Provenance and peer review Commissioned; internally peer reviewed.
