Human factors engineering and patient safety
  1. J Gosbee
  1. Correspondence to:
    J Gosbee, Director, Patient Safety Information Systems, National Center for Patient Safety, Department of Veterans Affairs, 24 Frank Lloyd Wright Drive, Lobby M, Ann Arbor, MI 48106, USA;
    john.gosbee{at}med.va.gov

Abstract

The case study and analyses presented here illustrate the crucial role of human factors engineering (HFE) in patient safety. HFE is a framework for efficient and constructive thinking, and it includes methods and tools to help healthcare teams perform patient safety analyses such as root cause analyses. Several decades of HFE literature offer theories and applied studies that can help solve difficult patient safety problems and design issues. A case study is presented which illustrates human factors design vulnerabilities in a transport monitor. The subsequent analysis highlights how to move beyond the more obvious contributing factors, such as training, to underlying design problems and the establishment of informal norms. General advice is offered to address these issues, and design issues specific to this case are discussed.

  • human factors engineering
  • patient safety

It is common to analyse adverse events by determining why the people involved made the wrong moves or poor decisions. In the US, the UK, and Australia, hospitals commonly use root cause analysis (RCA) teams to address these issues.1 RCA teams are supposed to look beyond blaming people and into systemic vulnerabilities. It is unfortunate that some RCA teams look only for data to support the misconception that their task is to create a list of errors and procedural violations. In fact, errors and violations are only the jumping-off points for a different kind of journey, not the end game.2

Human factors engineering (HFE) is the discipline that uses methods and concepts to understand and build systems that are more efficient, comfortable, and safe (box 1).3 The concepts and tools of HFE can therefore help RCA teams to analyse more fully the events and “close calls” that involve, for example, medical devices, software, and work areas.4 Instead of asking why the people involved were so “odd”, we can use HFE to see why their decisions and actions made sense. The more readily we can see why we would have acted or thought the same way in their situation, the more likely we are to arrive at productive and long-lasting interventions and redesigns. It is easy to say that “we need more ‘systems’ thinking”, but hard to do; HFE provides tools and concepts to help with a difficult job. An HFE analysis of the case study presented in box 2 provides many useful examples.

Box 1

Human factors engineering

Human factors engineering (HFE) is a discipline concerned with the design of tools, machines, and systems that take into account human capabilities, limitations, and characteristics. The goals are to design for safe, comfortable, and effective human use. Ergonomics, usability engineering, and user centred design are considered synonymous. HFE is based on the design related aspects of several biomedical disciplines, including anthropometrics, biomechanics, sensation and perception, anatomy and physiology, and cognitive psychology, which covers models and theories of human performance, memory, and attention.

An HFE process is the foundation of “user centred design”. This design process focuses on user needs, user characteristics, and end user testing of the human–machine interface. Another key characteristic of this user centred design approach is the concept of iterative design and testing. Simply put, the design is repeatedly refined throughout the design cycle based on feedback from user testing (or usability testing), which is also repeatedly conducted, starting from the early stages of the design cycle. This helps to ensure that the system being designed meets its intended purpose and operates in its intended manner. Early testing also helps to ensure that design deficiencies are identified and rectified before the system is fielded.

Box 2

Case study*

Two 4th year medical students stood next to the nursing station chatting about the ICU census. With the new hospital policies for admitting and transferring patients, it seemed as if they were constantly involved in moving patients into and out of the ICU rooms. This was the end of their first week in this intensive care medicine rotation.

Just at that moment an unconscious patient was being brought into the ICU from a remote part of the hospital. He had been admitted 5 days earlier with worsening chronic obstructive pulmonary disease (COPD) and fever. Before his admission he had been living at home being cared for by his wife and visiting nurses for COPD and mild chronic heart failure (CHF). Sputum, blood, and urine cultures were inconclusive upon admission. The fever and increased white blood cell count had been reduced with empirical treatment with intravenous antibiotics. Initial physical examination and chest radiography revealed possible worsening CHF. ECG showed sinus tachycardia and right ventricular enlargement with no evidence of ischemia. Blood gases were consistent with COPD and similar to the values at his last discharge. Subsequent fluid input and output data and daily examinations seemed to confirm worsening CHF since admission. He had several instances of ventricular arrhythmias in the previous 12 hours and it was determined that he needed more invasive and closer monitoring.

For this intra-hospital transport he was receiving oxygen via a mask from a portable oxygen cylinder mounted under the transport bed. There were two peripheral intravenous lines delivering fluids through two infusion pumps. The transport monitor showed a blood pressure of 120/80 mm Hg and a heart rate of 72 bpm.

Both students joined the transport personnel, nurses, and ICU fellow who were moving the gurney into the open ICU room. There was nothing unusual about the fact that many people in this crowded room were doing several things at once to move the patient and transfer lines and equipment. A transport nurse remarked how unusually stable this patient was during transport.

One of the 4th year medical students had recently completed a month on the cardiology service. His attending (teacher) had constructively “counseled” him several times to treat the patient, not the “numbers”. This student noticed and commented to the group that the patient had a respiratory rate of 24 breaths/min. Given the stable BP and heart rate on the transport monitor, this tachypnea seemed out of place. At nearly the same time the ICU nurse who was taking over the care of this patient hooked the leads up to the wall mounted cardiac monitor. Many in the room gasped in surprise to see the “real” heart rate of 140 bpm and BP of 80/60 mm Hg. One of the transport personnel realized she had seen this before while working as an emergency medical technician (EMT): the transport monitor had been left in demonstration (demo) mode. Demo mode is a software program within the device that continuously generates and displays waveforms and numbers to demonstrate the capabilities of the monitor; it is often used during training. The former EMT pointed out the small “D” on the monitor screen to the team, who were now assessing and preparing emergency treatment for this unstable patient.

Over the next few hours the ICU team was successful in stabilizing the patient. Throughout these activities many of them wondered aloud: “How could the transport team have been so careless . . .?”

Postscript

Many interesting facts were later uncovered about how this hospital transported critically ill patients between units, and about related staffing issues. First, there was no overall staff shortage on the day of the event.

It was found that the hospital had informally created special transport teams because some of the transport equipment was “tricky” to use. On that particular day, because there were more transfers than the transport team could handle, nurses and other personnel were called in to perform some of the work. The members of this particular team had received in-service training on the transport monitor within the past 2 years. In this event, however, the newly formed transport team was using the transport monitor for the first time. It was not clear under what conditions the transport monitor had been placed in demo mode. However, it was clear from reviewing the manual and talking to the biomedical engineering personnel that the monitor could stay in demo mode “forever”. A small “D” in the lower right corner of the screen signified this mode. On recreating the steps to place the device into and out of demo mode, it was found that several unclear steps were necessary, and the steps of the process were hard to follow on the device display. Although this had probably happened to many others, only a few citations in the form of letters could be found in a Medline search.5,6 The main points of this event come from a real case, but it is not necessarily a case from within the VA healthcare system.

HFE ANALYSIS

Most RCA teams assessing the case in box 2 would not discipline the person who programmed the demo mode or the transport team who did not notice the small “D” on the display. The RCA teams would probably spend their time answering detailed questions about when, where, and how this event happened. Their remedies would probably include adding more detailed policies to double check for demo mode or prohibiting the use of this mode. Some teams might even find out, to their chagrin, that these policies already existed in their hospital. They would reluctantly add additional targeted training on this “hard to see” pitfall to the basic in-service session on the transport monitor.

If an RCA team were applying HFE principles, it would just be getting started when it found that there were deviations from policies, that training was ineffective because of the complexity of the device, or that groups of people had devised ways to work around equipment issues.7

General issues

Many clinicians were probably doing “work arounds” to use this less than optimally designed transport monitor. Some of them may have assumed that, because diagnosing and treating unstable patients is complex, the operation and interpretation of medical devices must inevitably be complex as well. Without appreciating the consequences, some managers or peers were probably rewarding those providers who could master this needlessly complex transport monitor. It is also likely that this had happened before and had been noticed but not reported. Adapting to misbehaving equipment is one aspect of the “culture of low expectations”.8

Specific issues

Applying HFE principles of design reveals many specific vulnerabilities in this transport monitor. These design issues affected both the transport team that used the monitor and the person who set, and then left, the device in demo mode.9 The “D” signifying demo mode is too small to be noticed easily, and even if the transport team had noticed it, it still had to be decoded. Negative transfer of training could have occurred if the transport team had never used monitors capable of displaying anything but real data. There were no clearly marked exits to allow the person who set the demo mode to navigate the device out of that mode, and that person also received inadequate feedback on this rarely used function, undermining situational awareness. Most importantly, there was no interlock to prevent a confusing mode from being left on indefinitely.

GENERAL RECOMMENDATIONS

In general, improving the design of this device is a better solution than training or labelling.10 One possible design improvement is to change the programming of this monitor so that it automatically exits demo mode after 15 minutes. Another design change could mimic the Space Shuttle and other aviation systems, whose displays show a large “X” in the background whenever simulated data are displayed in real settings (for example, cockpits). Less effective countermeasures include labelling the monitors that are in demo mode, training end users about the hazard, and invoking procedures to use checklists or double checks each time the monitor is placed in demo mode. The first two design changes are sketched below.
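As a rough illustration of how those two defences might be combined, the following Python fragment sketches a monitor display loop with a demo-mode timeout and an unmistakable on-screen banner. The names, the timeout value, and the display strings are assumptions made for this article, not code from any real monitor.

  import time

  DEMO_TIMEOUT_S = 15 * 60  # assumption: demo mode auto-expires after 15 minutes

  def simulated_vitals():
      # Canned numbers produced by demo mode; not patient data
      return {"hr_bpm": 72, "bp_mmhg": "120/80"}

  def format_vitals(v):
      return "HR {hr_bpm} bpm   BP {bp_mmhg} mm Hg".format(**v)

  class TransportMonitor:
      def __init__(self):
          self.demo_started_at = None  # None means demo mode is off

      def enter_demo_mode(self):
          self.demo_started_at = time.monotonic()

      def _expire_demo_mode(self):
          # Interlock: demo mode cannot stay on "forever"; it times out
          # on its own instead of relying on someone to remember it
          if (self.demo_started_at is not None
                  and time.monotonic() - self.demo_started_at > DEMO_TIMEOUT_S):
              self.demo_started_at = None

      def render(self, real_vitals):
          self._expire_demo_mode()
          if self.demo_started_at is not None:
              # Aviation-style cue: simulated data are visually unmistakable,
              # not a small "D" that must first be noticed and then decoded
              return ("*** DEMO MODE - SIMULATED DATA ***\n"
                      + format_vitals(simulated_vitals()))
          return format_vitals(real_vitals)

  monitor = TransportMonitor()
  monitor.enter_demo_mode()
  print(monitor.render({"hr_bpm": 140, "bp_mmhg": "80/60"}))  # banner plus demo numbers

The point of the sketch is the division of labour between the two defences: the timeout removes the hazardous state automatically (an interlock), while the banner makes any remaining exposure obvious to anyone glancing at the screen (a perceptual cue).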

Perhaps just as important is for this hospital to deal with the informal norms and work arounds that developed about this needlessly complex transport monitor. Warning signs include proposals to invoke a new policy where only special teams are allowed to use certain basic equipment. Many groups in a healthcare organization will have to manage harmful incentives, such as clinicians being rewarded when they figure out complex equipment with no training. It is important to start making changes in what Chassin and Becher8 call the “culture of low expectations”.

RECOMMENDATIONS TO SPECIFIC GROUPS

Clinicians (nurses, physicians) who use patient monitoring equipment should understand the inherent hazard of demo modes and, more generally, of any display that carries many modes. Awareness is a very weak defence against future adverse events, but being alert to this vulnerability and teaching it to others has some utility. Clinicians should also keep it in mind if they are members of procurement teams.

Biomedical and clinical engineers in hospitals have many roles and responsibilities in dealing with the above issues. Firstly, they have a leadership role in educating other key personnel involved in selecting and implementing transport monitors. They also need to be alert to the design issues if they are members of RCA teams.

Managers in relevant healthcare organizations (for example, emergency medical services, hospitals) who read this case study should share it with the clinicians and engineers involved in transport, and request confirmation that it has been received and acted upon. If some of the HFE concepts are foreign to key members of their staff, training opportunities should be supported with funding and leadership; managers themselves should be trained if needed.

The management of manufacturers of transport monitors should involve all their engineers, designers, product managers, and human factors engineers in this case and its analysis. Usability testing of new products, and of new versions of existing products, should incorporate the lessons from this case and analysis. This is the right thing to do for the business, and it may also be required by FDA regulations.11,12

CONCLUSIONS

HFE must become a core competency of anyone who has significant involvement in patient safety activities. It is not just another set of principles and techniques; HFE provides a “tried and true” framework for building and strengthening that elusive safety culture. The process of HFE can also be applied to many patient safety activities in healthcare organizations, including procurement of medical equipment, RCAs, and patient safety training activities.13 In some well known healthcare organizations, significant effort has begun.14

Key websites on HFE are shown in box 3.

Box 3

Key HFE websites

Ergonomics Society

The main human factors engineering professional organization in the UK (www.ergo.ac.uk)

Human Factors and Ergonomics Society

The main professional organization in the US (www.hfes.org)

United States Food and Drug Administration Human Factors Section

Several documents available, including the very readable and useful “Do it By Design” (www.fda.gov/cdrh/humanfactors.html)

Key messages

  • Human factors engineering is a discipline of which all healthcare personnel should be aware.

  • Human factors engineering can help personnel involved in patient safety to analyse events and develop workable and effective countermeasures.

  • The case study presented here shows that problems in training personnel to use certain equipment are usually a warning sign of future adverse events.

  • The various recommendations for this case illustrate how many stakeholders are impacted by human factors engineering.

REFERENCES