
Patient safety
From patients to politicians: a cognitive engineering view of patient safety
K J Vicente

Department of Aeronautics and Astronautics, Massachusetts Institute of Technology; Cognitive Engineering Laboratory, Department of Mechanical & Industrial Engineering, Institute of Biomaterials and Biomedical Engineering, Department of Computer Science, and Department of Electrical & Computer Engineering, University of Toronto, Toronto, Ontario M5S 3G8, Canada; vicente@mie.utoronto.ca


Sizeable and long lasting reductions in adverse events cannot be realized unless decision makers at all levels pay attention to the global system phenomenon of inadvertent harm to patients.

Many healthcare providers now know that threats to patient safety pose a significant risk to public health. The American statistics in particular are frequently cited: preventable medical error is the eighth leading cause of death in the United States, it is responsible for 44 000–98 000 deaths annually in hospitals alone, and it results in patient injuries that cost between $17 billion and $29 billion annually.1

Virtually all of the medical experts who have written on this topic have stated that the key to improving patient safety is to apply system design principles from human factors engineering.1,2 This discipline aims to tailor the design of technology to conform to human nature rather than expect people to contort and adapt to technology. Systems designed in this way are easier for people to work in, which ultimately reduces error. Human factors techniques have been applied in other industries, such as nuclear power and aviation, where they have been very successful in reducing error and improving safety.

If the magnitude of the problem is significant and widely known, and if there is a consensus on the likely remedy, then why has more progress not been made on improving patient safety? One possibility is that human factors engineering has traditionally been concerned primarily with “knobs and dials” or “graphical user interface (GUI)” interventions to improve the usability of equipment and software. Clearly, this narrow focus does not address most of the threats to patient safety; as a result, many healthcare providers see patient safety and human factors engineering as specialist concerns that lie outside their sphere of action and responsibility. This attitude is a legitimate response to the narrow traditional approach to human factors, but it does not apply to cognitive engineering, a newer cognate discipline that has evolved to address the limitations of the traditional approach.3

A state-of-the-art cognitive engineering approach begins with a much broader systems perspective, identifying the various actors—individuals, computer systems, and organizations—in a complex sociotechnical meta-system.4,5 Figure 1 provides a representative example, although the precise number of levels and their labels can vary across industries. In the context of health care, this hierarchy would include, from bottom to top: patients, providers, department managers, hospital CEOs and CFOs, professional regulators and associations, and government (that is, civil servants and politicians). Knowingly or not, each of these actors makes decisions that affect patient safety.
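To make this structure concrete, the hierarchy of fig 1 can be written down as a simple ordered list. The sketch below, in Python, is purely illustrative: the level labels follow the text, while the accessor functions are hypothetical and not part of any cognitive engineering tool.

    # Illustrative only: the levels of fig 1, ordered from the "sharp
    # end" (patients) to the "blunt end" (government). The number of
    # levels and their labels vary across industries.
    LEVELS = [
        "patients",
        "providers",
        "department managers",
        "hospital CEOs and CFOs",
        "professional regulators and associations",
        "government (civil servants and politicians)",
    ]

    def levels_above(level):
        """Levels whose decisions propagate down to the given level."""
        return LEVELS[LEVELS.index(level) + 1:]

    def levels_below(level):
        """Levels whose state of affairs should propagate up to it."""
        return LEVELS[:LEVELS.index(level)]

    print(levels_above("providers"))  # managers, CEOs, regulators, government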

The growing literature on patient safety has started to address each of these levels. At the level of the individual worker, our work at the University of Toronto has shown how medical devices could be designed to make them easier for providers to use, thereby reducing errors that have been described in reports of patient deaths.6 This is the traditional realm of human factors engineering. At the management level, researchers at Stanford University have developed training programs that help anesthesiologists to manage team communication and coordination, thereby complementing more traditional training programs that focus on individual medical skills.7 At the organizational level, the US Veterans Administration has pioneered a radically different risk management policy that has led to a more humane health care environment in addition to reducing legal costs.8 Finally, at the government level, a researcher at Pepperdine University has documented how some aspects of the US legal system provide impediments to improving patient safety, thereby pointing the way towards reform.9

“Horizontal” research at each of these levels is necessary to improve patient safety. Cognitive engineering does not, and cannot, take the place of these multidisciplinary safety initiatives but, because of its broad systems view, it points to a critical factor that is overlooked by all horizontal research efforts—the additional need for “vertical” integration across the levels in fig 1. Decisions at higher levels should propagate down the hierarchy, whereas information about the current state of affairs should propagate up the hierarchy. These interdependencies across levels are critical to the successful functioning of a healthcare system as a whole. Even if researchers do an excellent job of conducting horizontal research on a particular topic, they may have little impact on patient safety unless vertical integration is also achieved. For example, the Stanford team has conducted pioneering research on training crisis resource management skills but, unlike in aviation, this type of training is not yet legally mandated in health care. Because of this mismatch between the management level and the regulatory and government levels in fig 1, research at the management level has not had as much impact on patient safety as it could or should. Many other examples of lack of vertical integration in healthcare systems could be cited. Given the available evidence from other safety-critical industries, there are strong reasons to believe that these mismatches are the most significant contributors to adverse events.4,10 It may be the lack of coordination across levels, not failures within individual levels, that poses the greatest threat to patient safety.
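The crisis resource management example can be restated schematically. In the sketch below, the set contents are invented for illustration; a vertical mismatch is modelled as any practice adopted at one level that the level above neither mandates nor monitors, so its benefits stay local rather than becoming system-wide.

    # Hypothetical illustration of a vertical mismatch. A safety practice
    # adopted at the management level (crisis resource management
    # training) is not mandated at the regulatory level above it, so its
    # impact remains local rather than system-wide.
    regulatory_mandates = {"individual medical skills training"}
    management_practices = {
        "individual medical skills training",
        "crisis resource management training",
    }

    # Upward flow: what the regulator could learn about actual practice.
    reported_upward = set(management_practices)

    # Downward flow is incomplete: practices in use below that no
    # mandate from above addresses constitute the mismatch.
    mismatch = reported_upward - regulatory_mandates
    print(mismatch)  # {'crisis resource management training'}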

“Patient safety is everyone’s business . . . from patients to politicians”

Unfortunately, the holy grail of vertical integration is becoming more important yet more difficult to achieve. As shown on the right of fig 1, the various layers of a complex sociotechnical system are increasingly subjected to external forces that stress healthcare systems. Examples of such perturbations are:

  • changing political climate and public awareness;

  • changing market conditions and financial pressures;

  • changing competencies and levels of education; and

  • changing technological complexity.

In today’s dynamic society, these external forces are stronger and change more frequently than ever before. When different levels of the system are being subjected to different pressures, each operating at different time scales, it is imperative that efforts to improve patient safety within a level be coordinated with the changing constraints imposed by other levels. To take a simple example, if hospital managers decide to reduce nursing staff levels to cope with budget cuts passed on from above, then the mental workload experienced by individual nurses will increase, making it even more important that medical devices be designed to minimize mental effort. Without coordinating the changes at various levels of healthcare systems, the external forces acting on the system may unintentionally be “preparing the stage for an accident”.4
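The staffing example can also be put in rough numbers. The figures below are invented purely to show the coupling between levels; they are not drawn from any study.

    # Invented numbers, for illustration only: a budget decision at the
    # management level raises the workload experienced by each nurse,
    # which in turn raises the stakes for device design at the worker
    # level.
    tasks_per_shift = 120                  # ward demand, unchanged by the cut
    for nurses_on_shift in (12, 10, 8):    # successive staffing reductions
        per_nurse = tasks_per_shift / nurses_on_shift
        print(f"{nurses_on_shift} nurses -> {per_nurse:.0f} tasks per nurse")

The arithmetic is trivial; the point is that a decision taken at one level silently changes the constraints under which another level must operate.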

Patient safety is—and will continue to be—everyone’s business, all the way from patients to politicians. Sizeable and long lasting reductions in adverse events cannot be realized unless decision makers at all levels pay attention, not just to their immediate local concerns, but also to the global system phenomenon of inadvertent harm to patients. The rationale behind this fundamental lesson from cognitive engineering can be revealed by a simple rhetorical acid test: if all you have are (patient) safety departments and specialists, then what does that say about all of your other departments and specialists?

Figure 1

Various levels of a complex sociotechnical system involved in risk management. Adapted from Rasmussen.4

Acknowledgments

The writing of this paper was sponsored in part by the Jerome Clarke Hunsaker Distinguished Professorship at MIT and by a research grant from the Natural Sciences and Engineering Research Council of Canada.

REFERENCES