From patients to politicians: a cognitive engineering view of patient safety
- Department of Aeronautics and Astronautics, Massachusetts Institute of Technology; Cognitive Engineering Laboratory, Department of Mechanical & Industrial Engineering; Institute of Biomaterials and Biomedical Engineering; Department of Computer Science; Department of Electrical & Computer Engineering, University of Toronto, Toronto, Ontario M5S 3G8, Canada
Sizeable and long-lasting reductions in adverse events cannot be realized unless decision makers at all levels pay attention to the global, systemic phenomenon of inadvertent harm to patients.
Many healthcare providers now know that preventable medical error poses a significant risk to public health. The American statistics in particular are frequently cited: preventable medical error is the eighth leading cause of death, it is responsible for 44 000–98 000 deaths annually in hospitals alone, and it results in patient injuries that cost between $17 billion and $29 billion annually.1
Virtually all of the medical experts who have written on this topic have stated that the key to improving patient safety is to apply system design principles from human factors engineering.1,2 This discipline aims to tailor the design of technology to human nature rather than expect people to contort themselves to fit the technology. Systems designed in this way are easier for people to work in, which ultimately reduces error. Human factors techniques have been applied in other industries, such as nuclear power and aviation, where they have proved very successful at reducing error and improving safety.
If the magnitude of the problem is significant and widely known, and if there is a consensus on the likely remedy, then why has more progress not been made on improving patient safety? One possibility is that human factors engineering has traditionally been primarily concerned with “knobs and dials” or “graphical user …