Checklists, safety, my culture and me
Karthik Raghunathan
Correspondence to Dr Karthik Raghunathan, Department of Anaesthesiology, Baystate Medical Center, Tufts University School of Medicine, 759 Chestnut Street, Springfield, MA 01199, USA; karthik.raghunathan@bhs.org

Abstract

The world is not flat. Hierarchy is a fact of life in society and in healthcare institutions. National, specialty-specific and institutional cultures may play an important role in shaping today's patient safety climate. The influence of power distance on safety interventions is under-studied. Checklists may make negotiations hampered by power distance easier by providing a standardised, aviation-like framework for communication and by democratising the environment. By using surveys and simulation, we might discover patterns of hidden yet problematic interactions that foster maintenance of the error swamp. We need to understand how people interact as members of a group, as this is crucial for the development of generalisable safety interventions.

  • Anaesthesia
  • checklists
  • communication
  • human error
  • patient safety

The world is not flat. Hierarchy is a fact of life in society and in healthcare institutions. Culture may be defined as ‘a way of life’ (Webster's dictionary) distinguishing the members of one human group from another. Culture shapes how we interact within and across hierarchies. Having lived and worked in India and in the USA, I can attest to the vastly different cultural legacies that exist. Interactions within hierarchies (in a hospital or otherwise) may be significantly different across nations. Working with surgeons from different specialties, I can also appreciate the presence of certain specialty-specific cultures. As an anaesthesiologist my specialty's ‘culture’ plays an important role in shaping my interactions. In addition, having worked at different institutions over the years, I have come to appreciate the presence of certain institutional cultures as well.

National, specialty-specific and institutional cultures create the context within which patient safety interventions are applied. My primary hypothesis is that ‘cultural influences play an important role in shaping today's patient safety climate’. The following discussion presents support for this hypothesis and illustrates the relevance of this concept using practical examples.1,2 The aviation industry has served as a model for the patient safety movement in medicine.3 Lessons learnt about the importance of cultural influences in the cockpit may provide a framework for the development of culturally sensitive patient safety interventions in medicine.

National cultural legacy

The Dutch social psychologist Geert Hofstede, in a pioneering study of cultures across modern nations,4 described the four dimensions that are intricately linked to national identity: Power Distance Index (PDI), Individualism, Masculinity and Uncertainty Avoidance Index. While all of these dimensions may be important, I believe the PDI to be especially pertinent within hierarchical healthcare settings.

The PDI score reflects the ‘distance’ across power imbalance in hierarchies,4 essentially the inequality in a pyramidal organisational structure. It is the extent to which less powerful members in a hierarchy accept and expect that power is unequally distributed. The higher the PDI score, the greater the inequality. In high PDI nations, inequality and overt signs of deference to authority are common. The PDI score for India is at the higher end of the spectrum (=77). Respectful deference to authority was the norm when I attended medical college in Chennai, India. Mitigated speech was customary when conversing with superiors on clinical rounds or during case discussions. Data were fed up the totem pole and decisions were handed down. In this environment, specific safety concerns might have been understated. When I began my internship in the USA, hierarchy was still apparent, perhaps to maintain accountability. In general, the culture during my internship was for medical students and interns to be charged with data gathering and for residents, fellows and attending physicians to serve as the synthesisers. The team worked together to implement solutions. However, when an adverse event was reported, I felt that ‘I was being written up’ rather than ‘a problem being reported’. With this perception as part of my cultural heritage, I was often concerned that errors would be taken as a sign of incompetence. The idea that certain problems could be systemic did not seem intuitive. I may have been hesitant to readily report ‘near misses’ (errors that did not actually cause patient harm). As a result, the system lost important learning opportunities. By not being plainspoken about ‘near misses’, I was helping to maintain the error swamp.5 When such ‘near misses’ line up (in Jim Reason's Swiss Cheese error model5), actual but preventable patient harm might occur—error mosquitoes breed. ‘Never events’ such as a procedure on the wrong site or wrong patient, or a wrong surgical procedure, rarely occur de novo.

In his classic paper,5 Reason contrasts two approaches to the human error problem: the ‘person approach’, which focuses on individual imperfections, and the ‘systems approach’, which scrutinises the conditions under which individuals work. For example, consider an error where a tired resident trainee draws up the wrong drug and administers it. There might be several links in this error chain: identical phials being kept close together (systems issue), non-enforcement of duty hour regulations (systems issue) and/or an inattentive trainee (personal issue). In higher PDI cultures, prior near misses (wrong drug drawn up but not administered) might not have been reported, and the ‘blame and shame’ debriefing session might translate as ‘how did you do this?’. In lower PDI cultures, workers may not hesitate to raise a concern, since it is not viewed as a solely personal failure. In this scenario, a prior near miss might be reported—resulting in system redesign to eliminate identical phials being kept together or to enforce duty hour regulations. The corresponding debriefing session might translate to ‘how did this happen?’. This cultural contrast has been described in the airline industry (eg, American/European vs Taiwanese pilots).6

Specialty-specific and institutional cultures

As an anaesthesiologist in an academic institution, I am on the front line—a daily witness to communication breakdowns. Interactions in operating rooms (ORs) are often exercises in demonstrating or negotiating power distance. Much has been said about the traditional role of the American anaesthesiologist and the need to expand this role beyond the OR.7 The present culture does not involve routine anaesthesiologist involvement in patient management in the surgical intensive care unit (ICU). Consequently, patient handoffs may be incomplete.8 In high-pressure settings, such as the ICU or the OR, individuals do not want to be perceived as troublemakers or to imply that others are inefficient. Error or near miss reporting is therefore unlikely to be embraced as an opportunity for improvement—rather, it may be viewed as an admission of failure.

Why some organisations perform better than others may relate to personnel and to system process and design elements, but may also be attributable to perceptions of the institutional culture. Aspects of institutional values that have evolved to promote excellence remain under-studied. The Flawless Operative Cardiovascular Unified Systems—Locating Errors through Networked Surveillance (FOCUS-LENS) project9 is a study currently being run by the Society of Cardiovascular Anaesthesiologists. A central premise is that although individuals will make mistakes, it is possible for teams to be flawless. Current processes, cultures and human factors in six cardiac surgical programmes were initially studied using observations and surveys. A 42-item survey instrument assessing 12 dimensions of the patient safety culture was administered to cardiac surgeons, anaesthesiologists, perfusionists, nurses and surgical technicians across participating sites. Questions such as ‘would staff be afraid to ask questions if something does not feel right?’ and ‘would a mistake that could cause patient harm, but does not, get reported?’ were asked. Open communication and teamwork are summarised and reported as composite scores for each site and then compared across sites. This laudable initiative may capture important elements of specialty-specific and institutional PDIs and the culture of safety.
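As an illustration only (the actual FOCUS-LENS scoring rules are not described above), a per-site composite for a dimension such as ‘open communication’ might simply be the mean of the relevant survey items; the site names, dimension labels and 1–5 Likert scale in the sketch below are assumptions for demonstration, not the instrument itself.

```python
# Minimal sketch: aggregating hypothetical survey responses into per-site,
# per-dimension composite scores. All data and labels are illustrative.
from collections import defaultdict
from statistics import mean

# Each response: (site, dimension, item_score), item_score on a 1-5 Likert scale.
responses = [
    ("Site A", "Open communication", 4),
    ("Site A", "Open communication", 2),
    ("Site A", "Teamwork", 5),
    ("Site B", "Open communication", 5),
    ("Site B", "Teamwork", 4),
]

# Group item scores by (site, dimension), then report the mean as the composite.
composites = defaultdict(list)
for site, dimension, score in responses:
    composites[(site, dimension)].append(score)

for (site, dimension), scores in sorted(composites.items()):
    print(f"{site:7s} | {dimension:18s} | composite = {mean(scores):.2f}")
```

Comparing such composites across sites is one plausible way the survey results could be contrasted; the real project may weight or scale items differently.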

Culture in the cockpit

Cultural legacies10 have informed the study of human error in aviation, as outlined by Malcolm Gladwell in the bestseller Outliers.11 Gladwell discusses how errors made in Korean Airline cockpits may have been a direct result of Korean cultural traditions affecting communication.11 Aviation safety experts reviewed hours of cockpit voice recorder transcripts while investigating the unusually high crash rate of Korean Airlines in the 1990s (several fold higher than the industry average). The insight gained was that critical cockpit communications were indirect and ambiguous: excessive deference to the pilot's authority meant that serious safety threats were not made explicit but were only hinted at, translating into failures to avert disaster. Although Korea does not specifically have a Hofstede PDI score, its closest neighbour China has a high PDI (=80). The intervention to improve safety involved retraining the crew in English, allowing critical communications to occur explicitly; the high PDI in the cockpit was mitigated by unfettered communication in English. A dramatic decrease in serious accidents occurred within a decade. Critical communications might similarly be hindered in high PDI hospitals, ORs and ICUs.

Acculturation in medicine

A quarter of all physicians in the USA are international medical graduates (IMGs). The majority are from India, the Philippines and Pakistan.12 The PDI scores for India (77), the Philippines (94) and Pakistan (55) are higher than the PDI score for the USA (40). At least initially, such a PDI mismatch might mean that IMGs from these countries are working under invalid cultural assumptions, and serious miscommunication might occur as a result. Acculturation involves overcoming this cultural discordance. As IMGs adapt and assimilate, cultural concordance may develop and direct communication becomes easier. Workforce trends in anaesthesiology13 show especially dramatic shifts in IMG demographics between 1994 and 2002. Traditional cultural deference to authority may be magnified in ORs. Consider also the impact of the increasing global migration of nurses14 and the complexity of inter-professional doctor–nurse dynamics: in an increasingly interconnected world, it becomes easier to appreciate that ‘culture eats strategy for breakfast’.15

Checklists, safety and PDI

The Safe Surgery Saves Lives Study Group1 reported that global implementation of a standard WHO surgical safety checklist led to a significant reduction in major post-operative complications. Such checklists have been reported to increase staff satisfaction16 and have also been shown to reduce in-hospital 30-day post-operative mortality.17 Checklist briefings have been shown to reduce the number of communication failures and promote teamwork, and the mortality benefit of checklist implementation appears to depend crucially upon compliance.17 Gawande et al1 discuss the likelihood that the sum of changes in systems, behaviours and team practices could account for the positive effects of checklist implementation. They also promote the notion that checklist programmes have the potential to prevent harm, noting that improvement in their study was not confined to either high-income or low-income sites.

Closer scrutiny of the results1 revealed that significant improvements in outcome actually occurred at only three of the eight sites included in the study, with no significant changes at the other five. This was despite significant changes in checklist processes at all sites. Viewing these study data through the lens of Hofstede's PDI scores1,4 might be illustrative. PDI scores in the study varied considerably: India (77), the Philippines (94), Tanzania (‘East Africa’, 64) and Jordan (‘Arab World’, 80) are similar to each other and obviously different from New Zealand (22), Canada (39), the USA (40) and England (35).1,4 The before–after comparison data for selected checklist indicators were remarkably variable. For example, for a composite of six safety indicators, compliance before and after checklist introduction remained 0% at one site, was essentially unchanged at another (94.1% to 94.2%), and improved from 46.7% to 92.1% at a third. With such dramatic practice variation, a single summary average obscures what actually happened at individual sites. The important role that specific national cultural legacies play in modulating safety efforts needs to be acknowledged. Checklists alone do not fully address safety gaps, especially in high PDI cultures. In these high-pressure environments, structural redundancies that promote safety might be perceived as impairing efficiency.
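To make the point about variability concrete, consider a minimal worked sketch using only the three illustrative compliance figures quoted above (hypothetical code; the site labels are not the study's): the pooled before–after average describes none of the individual sites.

```python
# Illustrative only: three sites echoing the patterns cited in the text
# (no change at 0%, essentially no change at ~94%, large improvement at a third).
from statistics import mean

compliance = {  # (before %, after %) for a six-item safety composite
    "Site 1": (0.0, 0.0),
    "Site 2": (94.1, 94.2),
    "Site 3": (46.7, 92.1),
}

for site, (before, after) in compliance.items():
    print(f"{site}: {before:5.1f}% -> {after:5.1f}%  (change {after - before:+.1f} points)")

pooled_before = mean(b for b, _ in compliance.values())
pooled_after = mean(a for _, a in compliance.values())
print(f"Pooled average: {pooled_before:.1f}% -> {pooled_after:.1f}%")
# The pooled change (~+15 points) matches none of the three site-level trajectories.
```

Reporting site-level changes alongside any pooled figure, as argued above, keeps the cultural heterogeneity visible rather than averaged away.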

Medico-legal ‘culture’ and safety

Patient safety efforts might also be influenced by the litigious medico-legal culture in America. Safety could be improved without increasing exposure to liability, yet error reporting systems remain underused by providers.18 Malpractice remains a contentious component of cost containment initiatives,19 and a hostile medico-legal culture may make providers reluctant to participate in error reporting systems.20 A recent study showed that about one-third of all contributing factors in accepted surgical malpractice claims might have been intercepted by using a comprehensive surgical safety checklist.21 Counter to conventional wisdom, legal claims and costs at a large US health system did not increase in the 10 years following the introduction of a disclosure-with-offer approach to medical errors.22 Liability checklists23 may also augment safety efforts. Systems used to address safety concerns may need to take the legal culture into consideration: findings from one setting may not apply in another, and research into the influence of medico-legal cultures is needed.

Solutions and conclusion

Culture permeates our interpersonal interactions. Hofstede's original studies4 were performed using surveys at IBM offices across different nations. Such surveys might need to be repeated in healthcare settings across the globe: they could uncover high PDIs in particular ORs and ICUs, in certain nations and across institutions, and reveal both positive and negative aspects of institutional and national cultures. Information technology could be leveraged to enable the discovery of inter-institutional and international patterns in error reporting and safety practices.24 Surveys will help us understand cultural context.

Simulation might also be used to identify and overcome communication hindered by high PDIs. Residency training might need to include simulated situations requiring confrontation with superiors or the acknowledgement of error. Effective communication techniques can be demonstrated to IMGs who might not be familiar with culturally accepted practices in the local environment. Team-building exercises and debriefing following simulated errors should stress smooth interpersonal dynamics.

We are aware of certain personality traits that seem to be shared within specialties (eg, the ‘surgical personality’).25 Formal research into PDIs within and across medical specialties is warranted. Negotiations hampered by power distance are made easier by a standardised, aviation-like framework for communication.26 However, checklists cannot distil all the complexity of knowledge, judgement and skill required to perform clinical tasks. Understanding the context in which interventions are being applied is critical. In summary, we need to understand how people interact as members of a group.27 ‘Knowing the culture’ is crucial to adapting safety interventions across various settings.

Footnotes

  • Linked article 000283.

  • Competing interests None.

  • Provenance and peer review Not commissioned; externally peer reviewed.
