When frontline staff do not adhere to policies, protocols, or checklists, managers often regard these violations as indicating poor practice or even negligence. More often than not, however, these policy and protocol violations reflect the efforts of well-intentioned professionals to carry out their work efficiently in the face of systems poorly designed to meet the diverse demands of patient care. Thus, non-compliance with institutional policies and protocols often signals a systems problem, rather than a people problem, and can be influenced, among other things, by training, competing goals, context, process, location, case complexity, individual beliefs, the direct or indirect influence of others, job pressure, flexibility, rule definition, and clinician-centred design. Three candidates are considered for developing a model of safety behaviour and decision making. The dynamic safety model helps to understand the relationship between systems designs and human performance. The theory of planned behaviour suggests that intention is a function of attitudes, social norms and perceived behavioural control. The naturalistic decision making paradigm posits that decisions are based on a wider view of multiple patients, expertise, systems complexity, behavioural intention, individual beliefs and current understanding of the system. Understanding and predicting behavioural safety decisions could help us to encourage compliance with current processes and to design better interventions.
- Cognitive biases
- Decision making
- Risk management
- Safety culture
Studies frequently demonstrate a difference between hospital policy, what clinicians report happens, and what actually happens.1–3 Every day, clinicians must decide upon the safety/efficiency tradeoff that they think will achieve the best care for their patients.4 Sometimes those decisions are made in dynamic, time-pressured environments where clinicians revert to what they know rather than following an analytical step-by-step process.5 Violations and non-adherence therefore are common, not always conscious, not always planned, frequently well meaning, and in many cases allow the system to run smoothly.6 Thus, simply blaming violations of safety procedures on ‘negligent’ or ‘difficult’ clinicians belies the complexity of decisions about safety processes. Safety is ultimately delivered by people—patients, their families, non-clinical workers and especially clinicians—not policies, processes or checklists. Thus, the relationship between what clinicians should do, what they think they do, and what they actually do may tell us how behaviours are created by the decisions that clinicians make, based on the systems that surround them and the constraints of the organisation. A method of understanding and predicting behavioural safety decisions could help us to encourage compliance with current processes, and to design better future interventions.
Compliance and violations in complex systems
Patient safety research initially focused on understanding the true extent and sources of errors, typified by case-note review,7 direct observation studies8 and implementation of basic quality improvement systems.9,10 Eventually this moved towards quantitative methodologies,11,12 the proposition of theoretical models,13,14 and more recently, several major evidence-based safety innovations.15–17 Reliable implementation of these new ways of working has been more difficult. Violations—failures to follow rules, processes and guidelines—are still frequent and quality improvements are not always apparent.18 Compliance with hand hygiene protocols may be observed as low as 2% in the operating room19; mask discipline can be extremely variable20; equipment counts can be disruptive and ineffective21,22; distractions are frequent23; deep vein thrombosis prophylaxis is notoriously variable24,25; checklists and timeouts may be used sporadically,26 with considerable variation in their uptake.27 Though the reader is directed towards detailed considerations of error and violations,28–30 five types of violation are often discussed:
erroneous violations—due to a lack of understanding or inexperience;
exceptional violations—when unusual circumstances require an unusual response;
situational violations—when the environment makes adherence difficult;
routine violations—when a shortcut is taken regularly;
optimising violations—when there is a desire to improve a work situation.
Rather than being deliberate negligence or poor practice, work in other industries illustrates that rules and procedures constitute fragile safety barriers, and violations usually reflect good intentions to complete work efficiently, even if the behaviours are eventually misguided and unsafe. Thus, the apparent simplicity of safer processes belies the complexity of their implementation within the healthcare system.31
Questions of safety must be framed within an understanding of the relationship between people and systems of work. Errors are frequent, accidents normal,32 and both are predisposed by deficiencies within systems which may promote the wrong behaviours and allow errors to perpetuate to catastrophe.6 Indeed, rather than being a hazard, people create safety in complex systems by providing the flexibility to function, despite the risks of errors, in an uncertain, time-pressured, resource-constrained world.4 Under financial and workload pressures, accepted ways of working can migrate toward unsafe boundaries33 through shortcuts and lowered accepted standards. Efficiency and throughput issues can create difficult working environments,34 where manager and clinician clash over the need to trade organisational goals (eg, efficiency) for safety goals for individual patients. New processes or checks require training and implementation, which can add time, financial and managerial pressures to the system, with the monitoring of these requirements further increasing overhead cost.33 Effective implementation processes incorporate hands-on leadership, frontline decision making, dedicated resources, local modification and feedback.35 Sometimes, the implementation of guidelines is hindered by organisational constraints.36 Sometimes people are simply unaware of their deviations.37 Thus, in the same way that errors cannot necessarily be avoided through ‘trying harder’, seeing violations of safety processes as purely due to individual will does not account for the systemic pressures upon the individuals delivering care.
Systems models of compliance
A range of models and techniques are available to help us understand how the systems of work affect behaviour. The system of surgery has been examined in detail by several groups who have been able to observe the effects on behaviour and operative course of different configurations, specialties, distractions, disruptions and unintentional outcomes.11,12,14,20,38 These interactions can be complex39 but demonstrably influence outcomes.8 Models exist for describing the complex relationship between systems components13,40,41 but generally they acknowledge that people, tasks, equipment, tools and aspects of the organisation all interact to produce either successful or unsuccessful processes. From an organisational perspective, the dynamic safety model34 provides a framework examining organisational tendencies toward or away from unacceptable workload, economic failure and safety, thus allowing the qualitative consideration and potential quantification of high-level organisational issues.
Systemic predisposition to error—that is, how workplaces can create the opportunity for people to make and perpetuate mistakes—needs to be mediated by an understanding of individual behaviour and responsibility. A systematic review of violations across a range of industries (including, but not limited to, healthcare) suggests worker-centred design, training, competing goals and rule definition as predictors, alongside individual characteristics.42 For example, in medication administration, violations may be influenced by context, process and hospital location.43 Anaesthesiologists’ decisions to follow or deviate from guidelines are influenced by the beliefs they hold about the consequences of their actions, the direct or indirect influence of others, and the presence of factors that encourage or facilitate particular courses of action.44 Compliance can be observably influenced by case complexity and job pressure.45 Despite the focus on an evidence base, a systematic review of the use of guidelines finds that flexibility of recommendations to local context and concise recommendations are also important aspects of adherence.46 Systemic ambiguities which discourage a clear understanding of how things ‘should’ work may be particularly useful in understanding healthcare violations.47
Behavioural models of compliance
Improving safety is also about achieving behavioural change, which is influenced by knowledge, beliefs, decision processes, context and social influences.48 In fact, the use of procedures and processes may be viewed as counteracting the use of expertise and common sense,49 and thus senior staff can often be the trigger for violations that lead to a migration within the system as a whole.45 Indeed, even exceptional violations can eventually become accepted practice.33 Rule violations may also give the violators an impression of power.50 This is reflected in the different attitudes to rules that doctors and nurses hold,51 and in the cognitive dissonance apparent in the responses of surgeons to the surgical safety checklist, with a higher proportion stating they would like the checklist used during their care than believed it improved safety, improved communication and prevented errors.52 The challenge is to encourage clinicians to make the right decisions that protect each patient they care for, above the financial, social and workload pressures from the higher echelons of the organisation, their peers, legal systems and even professional organisations.
At the ‘sharp end’ of the organisation, possibly the most influential theory in health psychology for understanding individual behavioural intention has been the theory of planned behaviour.53 In addition to generic health-related behaviours such as smoking, drug taking and dieting,54 it has been applied to hand hygiene prediction55 and venepuncture.56 In its simplest form, the theory can be expressed as the mathematical function in box 1. This has been the basis for the Technology Acceptance Model,57 which is not without its critics,58 but illustrates how, with sufficient understanding, compliance and violation might be modelled. If we consider some broad analogies—where perceived behavioural control might relate to implementation processes and organisational pressures; social norms might be examined through teamwork and concepts of professional standards; and attitudes might be measured through consideration of evidence bases, prior experience and knowledge—then it may be possible to build a model of how the nature of change, systems design, and individual and professional influences can lead to violation decisions. However, the precise parameters, the nature of the model and their relationships would need to be defined through observation and experimentation.
Theory of planned behaviour expressed as a testable model

BI = (AB)W′1 + (SN)W′2 + (PBC)W′3

BI: behavioural intention

AB: attitude toward behaviour = Σ b×e [(b): strength of each belief × (e): evaluation of the outcome or attribute]

SN: social norm = Σ n×m [(n): strength of each normative belief × (m): motivation to comply with the referent]

PBC: perceived behavioural control = Σ c×p [(c): strength of control belief × (p): perceived power of control factor]

W′: empirically derived weight/coefficient
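The box 1 formulation lends itself to direct computation. The sketch below is a minimal illustration of the expectancy-value arithmetic, assuming entirely hypothetical belief strengths, evaluations and weights for a clinician deciding whether to complete a checklist; none of the numbers are drawn from the literature.

```python
# Minimal sketch of the theory of planned behaviour as a testable model.
# All belief values and weights below are hypothetical, for illustration only.

def expectancy_value(pairs):
    """Sum of strength x evaluation products over a set of beliefs."""
    return sum(strength * evaluation for strength, evaluation in pairs)

def behavioural_intention(ab_beliefs, sn_beliefs, pbc_beliefs, w1, w2, w3):
    """BI = W'1*AB + W'2*SN + W'3*PBC, each component an expectancy-value sum."""
    ab = expectancy_value(ab_beliefs)    # attitude toward behaviour: sum(b*e)
    sn = expectancy_value(sn_beliefs)    # social norm: sum(n*m)
    pbc = expectancy_value(pbc_beliefs)  # perceived behavioural control: sum(c*p)
    return w1 * ab + w2 * sn + w3 * pbc

# Hypothetical clinician deciding whether to complete a checklist:
ab = [(0.8, 0.9), (0.6, -0.2)]  # (belief strength, outcome evaluation)
sn = [(0.7, 0.5)]               # (normative belief, motivation to comply)
pbc = [(0.9, 0.4)]              # (control belief, perceived power)
bi = behavioural_intention(ab, sn, pbc, w1=0.5, w2=0.3, w3=0.2)
```

In practice the weights W′ would be estimated empirically, for example by regressing observed compliance on the measured AB, SN and PBC components for a given behaviour and population.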
Naturalistic decision making
Compliance is a classic systems problem, with initial violations predisposed at the organisational or managerial level, and perpetuated through a range of measurable socio-technical parameters. The naturalistic decision making paradigm59 helps us to understand how human decision making is mediated by this organisational and environmental context. It posits that decisions are usually made in uncertainty, under time pressure, with ill-defined goals, and are based on expertise, pattern matching or recognition of a given situation, and are thus nonlinear, non-analytical and not necessarily logical, rational or based on risk/benefit considerations. Naturalistic decision making has been extremely influential in the human factors science of decision making, especially to understand and improve tactical decisions in military operations,60 and is beginning to be used in healthcare.
Direct observation in the emergency department (ED)61 found that only about 45% of the decisions were planned, with the rest being opportunistic or forced by interruptions. Moreover, decisions were not being made about individual patients alone, but their care within the wider view of the ED as a whole, and situational factors that defined the local, immediate demands for resources. Indeed, these different goal-directed behaviours can have an interfering influence on the performance of evidence-based behaviour.62 Thus, it is becoming apparent in healthcare systems, as in other industries, that the decision to adhere to guidelines, processes or safety procedures is not necessarily logical, linear and evidence based, but based on a wider view of multiple patients, expertise, systems complexity, behavioural intention, individual beliefs and current understanding of the system.
Behavioural change and violation reduction are keystones of the safety movement, but the mechanisms have been infrequently described and even less frequently quantified. It is in the quantification that we may be able to achieve robust scientific evidence, and a greater ability to predict and balance throughput, cost and safety. Though a great many approaches to human decision making and violations exist, the models selected here take different but complementary perspectives. The Systems Engineering Initiative for Patient Safety (SEIPS) model examines the broad influences of behaviour upon systems, and lends itself to multi-parameter measurement through direct observation,63 but may not directly address or predict individual behavioural outcomes. It might therefore provide a useful way to quantify the context of a work system. The theory of planned behaviour and the technology acceptance model are quantifiable from a behavioural approach, but may be oversimplistic,58 and do not address the broad array of behavioural determinants of SEIPS. Finally, the dynamic safety model and naturalistic decision making paradigm, though largely qualitative, make the links between system and person, directly acknowledging the systemic tradeoffs between safety and efficiency that need to be made every day. They may therefore provide a framework in which systems context and individual behaviour might be brought together in a quantifiable form. Whether that is possible remains to be seen. Clearly this is a complex challenge.
Within a framework that recognises behaviours arise from systems of work,13 organisations are under a range of pressures,34 and that decisions need to be understood from a naturalistic point of view,59 it is possible to explore these relationships from a quantitative and qualitative perspective. The immediate utility would be in developing quantitative models that describe the influence of organisation, context, tasks and tools on systems safety behaviour. The longer-term benefit would be in understanding the mechanisms of safety change and the prediction of successful or unsuccessful interventions. This would provide a better understanding of how we can improve the safety of patients within complex and increasingly pressured healthcare organisations. It may be the piece of the puzzle we are missing.
Funding Dr Catchpole is supported by the Cedars-Sinai OR360 Initiative, funded by Department of Defense, Telemedicine and Advanced Technology Research Center grant W81XWH-10-1-1039, which seeks to reengineer teamwork and technology for twenty-first century trauma care. No other interests are declared.
Competing interests Dr Catchpole is an Associate Editor for BMJ Quality and Safety. He has received honoraria from a variety of healthcare improvement organisations for speaking about human factors, behavioural safety, and quality improvement in complex systems, but has no competing interests.
Provenance and peer review Not commissioned; externally peer reviewed.