
P F Jensen, Department of Cardiothoracic Anaesthesia, The Heart Centre, Rigshospitalet, University of Copenhagen, DK-2100 Copenhagen, Denmark; foege{at}

P Barach, Department of Anesthesia and Critical Care, University of Chicago, Chicago, IL 60637, USA



    Once upon a time there were almost no formal investigations of the nature of human errors in hospitals. Today, however, there is an abundance of reports and a plethora of papers publishing robust data on this subject.

The paper by Donchin and colleagues1 was one of the first publications to investigate the nature of human errors in the intensive care unit (ICU), adopting approaches developed by human factors engineering. The ICU is a complex, dynamic environment, subject to constant change, time pressure, and stress. It is filled with high technology equipment intended to facilitate the diagnosis, monitoring, and treatment of patients, but this equipment often creates additional, unexpected demands. The definition of human error in this study was vague, but the same is true of many classifications and taxonomies of human error, with regard to both genotype and phenotype.2,3 Even with more precise definitions it is difficult to achieve satisfactory descriptions and explanations of error in humans, given the complex nature of human interactions. Decision making by healthcare providers is influenced by factors such as workload, economy, ethics, and safety, which makes the true picture much harder to fit neatly into a fixed taxonomy.

A glance through the newspapers of the last three decades makes it evident that human error is of crucial importance. It has been a critical factor in some of the most devastating events that have captured our attention, such as Chernobyl and Three Mile Island.4 Perhaps the most important lesson from these events was that they were the product of many different failures distributed widely in time. The errors or violations committed by the operators at Chernobyl on the night of 26 April 1986 were merely the last ingredients in the making of the disaster, actively breaching various barriers and safeguards. The contributions to the breakdown of a well defended but highly complex system can be divided into two categories: active and latent failures.2 Active failures are errors or violations by the people working directly with the system, while latent failures are the delayed consequences of actions or decisions (or the lack of them) concerning the design, organisation, or structure of the system. It came to be realised that nearly all accidents have organisational and systemic root causes. These latent factors can be diagnosed, and efforts to avoid repetition can then be initiated.2

Fortunately, it has been realised that the work of multidisciplinary teams in analysing the nature of human error in the industrial sector can be transferred to the medical environment. This was prompted by increased attention to medical errors and, especially in the US, by rising malpractice insurance premiums. If this development is to change health care, it will be by reducing adverse events and their causes, and by developing methods to prevent errors or to attenuate their effects.5

Donchin et al1 applied task analysis in an intensive care environment to describe the activities around the patients in a medical-surgical ICU, one of six critical care units in a 650 bed tertiary care teaching hospital. The activities were observed and described by a team of non-medical investigators specially trained for the purpose. Simultaneously, the physicians and nurses were prompted to report any errors occurring in the ICU. Error was defined as “any deviation from standard conduct, as well as addition or omission of action related to standard operating procedures or routines of the unit”. The study demonstrated, on average, 178 activities per patient per day, of which 0.95% were judged by the authors to be erroneous. The staff reported 554 human errors during the 4 month study period, 29% of which were graded as severe or potentially detrimental to the patient’s well being if not corrected in time; 54% of the errors were committed by physicians and 45% by nurses. This was surprising, as the task analysis revealed that physicians carried out only 4.7% of the activities. The activities carried out by the nurses tended to be of a more routine and repetitive character, whereas the physicians were called away for emergency consultations and their contact with the patients was intermittent; hence they had an increased probability of error. In another subset of the task analysis the investigators found that verbal communication between nurses and physicians was the main mode of communication in only 2% of activities, yet these exchanges accounted for 37% of the recorded errors. The authors did not report how often crucial information should have been communicated but was not, but information transfer and degradation was clearly a problem.
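The disproportion between the physicians' share of activities and their share of errors can be made concrete with a quick back-of-envelope calculation. The following sketch uses only the percentages quoted above; the per-activity rate comparison is our own illustration, not a figure reported by the authors:

```python
# Figures quoted from the Donchin et al study summary.
activities_per_patient_day = 178
error_fraction = 0.0095          # 0.95% of observed activities judged erroneous

errors_per_patient_day = activities_per_patient_day * error_fraction
print(f"~{errors_per_patient_day:.1f} errors per patient per day")   # ~1.7

# Physicians committed 54% of reported errors while performing 4.7% of
# activities; nurses committed 45% while performing the remaining ~95.3%.
physician_rate = 0.54 / 0.047    # errors per unit share of activity
nurse_rate = 0.45 / 0.953
print(f"relative error rate, physician:nurse = {physician_rate / nurse_rate:.0f}:1")
```

On these assumptions a physician activity was roughly 24 times as likely to be associated with a reported error as a nurse activity, consistent with the authors' observation that intermittent, non-routine contact carries a higher probability of error.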

    Human factors engineering (HFE) is a discipline all healthcare personnel should know about in order to understand many aspects of patient safety. The concepts and tools of HFE can help an organisation to analyse adverse events and develop workable and effective countermeasures. HFE methods can also benefit healthcare personnel by moving them towards systems thinking and a culture of safety.6

The study by Donchin et al deserves special attention as one of the first and best examples of applying a traditional engineering tool, known for many decades, to health care. Task analysis with trained observers is extremely useful in revealing the true nature of human errors in complex healthcare settings such as the ICU. It shows how error rates are influenced by the diurnal distribution of activities, by poor communication, and by whether the activities are novel or routine. These simple lessons have many implications for the safety and quality of care delivered to our patients. Although we have known them for 20 years, we have yet to implement many of them in the design of our hospitals, medical devices, healthcare work schedules, supervision of trainees, and communication between different healthcare providers.