A prospective, observational study of the effects of implementation strategy on compliance with a surgical safety checklist
J A Hannam1, L Glass1, J Kwon2, J Windsor2,3, F Stapelberg1, K Callaghan2, A F Merry1,3, S J Mitchell1,3

1 Department of Anaesthesiology, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand
2 Department of Surgery, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand
3 Auckland City Hospital, Auckland, New Zealand

Correspondence to Associate Professor Simon Mitchell, Department of Anaesthesiology, University of Auckland, L12, Auckland City Hospital, 2 Park Road, Grafton, Auckland 1023, New Zealand; sj.mitchell@auckland.ac.nz

Abstract

Background The reported benefits of using the WHO Surgical Safety Checklist (SSC) are likely to depend on compliance with its correct use. Compliance with SSC administration in centres that have introduced the checklist under a research protocol may differ from centres where the SSC is introduced independently.

Objective To compare compliance with SSC administration at an original WHO pilot study centre (Hospital 1) with that at a similar neighbouring hospital (Hospital 2) that independently integrated the SSC with pre-existing practice.

Methods This was a prospective, observational study. One hundred operations were observed at each hospital. We recorded: compliance with administration of SSC domains (Sign In, Time Out and Sign Out) and individual domain items; timing of domain administration; and operating room team engagement during administration.

Results Domain compliance at Hospital 1 and Hospital 2, respectively, was: 96% and 31% (p<0.0005) for Sign In; 99% and 48% (p<0.0005) for Time Out; and 22% and 9% (p=0.008) for Sign Out. Engagement of two or more teams during Sign In and Time Out occurred more frequently at Hospital 2 than at Hospital 1.

Discussion Compliance with administration of SSC domains was lower at Hospital 2, which introduced the SSC outside the context of a strict study protocol. This finding mandates caution in extrapolation of benefits identified in SSC studies to non-study hospitals. Staff engagement was better at Hospital 2, where checklist administration leadership is strategically shared among anaesthetic, surgical and nursing team members, as compared with exclusive nursing leadership at Hospital 1.

Study registry number Australian and New Zealand Clinical Trials Registry: Ref: ACTRN12612000135819, http://www.anzctr.org.au/trial_view.aspx?ID=362007

Introduction

The WHO Surgical Safety Checklist (SSC)1 is a set of prompts designed to prevent errors or omissions in perioperative care, and to promote communication between operating room (OR) team members. The SSC comprises three domains, each consisting of a series of items: ‘Sign In’ is administered when the patient arrives in the OR; ‘Time Out’ is administered just prior to first incision; and ‘Sign Out’ is administered prior to the patient leaving the OR. Administration of each domain is announced, and is designed to be treated as a process to which all OR team members give their undivided attention. Studies investigating either the WHO SSC1 or a suite of quality improvement strategies containing a similar variant2 have demonstrated absolute reductions of 0.7%1,2 in mortality and of 4–5% in complication rates.1,2

The potential benefits of using such a tool are substantial, given that over 230 million operations are performed globally each year.3 It has been demonstrated that at least some of the safety benefits of the SSC are correlated with checklist compliance.4 It is notable that data demonstrating checklist benefits1,2 were gathered in highly supervised study environments in which staff were trained to ensure levels of compliance adequate for the evaluation of checklist efficacy. However, the quality of checklist administration at these centres may not reflect that of hospitals where the SSC was introduced outside of a research setting. Maintaining checklist practice, even after an effective introduction, is also an ongoing challenge.5

Our institution was one of the original sites participating in the WHO SSC pilot study,1 and a modified version of the original SSC has been used for 5 years. Practices at this hospital may not be representative of other centres that have since adopted the SSC but were not involved in the WHO study. We assessed compliance and engagement in SSC administration at our institution, and compared this with another local hospital of similar size and activity that introduced the SSC independently. We hypothesised that if the circumstances under which the SSC is introduced to hospital practice do not influence compliance and quality in its administration, then there should be little or no difference between these sites.

Methods

This prospective, observational study was approved by the Northern Y Regional Ethics Committee (ref: NTY/10/EXP/077) and registered with the Australian New Zealand Clinical Trials Registry (ref: ACTRN12612000135819). Required consultative processes were completed for institutional approval. Data were collected concurrently at both sites between November 2011 and January 2012 using methods previously established by our group.6

Differences between sites and checklist modifications

The study hospitals were two large tertiary teaching hospitals in Auckland City. Prior to roll-out of the SSC, both hospitals used less comprehensive checks in the perioperative period. They employ medical and nursing staff from the same workforce pool with common training and registration requirements. Surgical and anaesthesia trainees rotate between the institutions, and both hospitals receive medical students from the same local university for training.

The first site (‘Hospital 1’) was a study centre for the initial WHO SSC initiative in 2006.1 The SSC was introduced to OR staff as part of participation in that study, through a combination of iterative seminars and written material. There was a sustained effort to establish and maintain compliance with the checklist, and it was widely understood among OR staff that SSC administration was being observed during the WHO SSC study. SSC use became standard practice for all surgical procedures following the study in 2008. The second site (‘Hospital 2’) was not part of the initial WHO SSC study, and use of the SSC was implemented independently in 2010 under the auspices of the hospital's surgical quality and safety committee. The SSC was integrated into existing surgical time-out practice and promoted through education seminars. Compliance was not audited as part of a research project, as it was at Hospital 1.6

Both hospitals now use adaptations of the SSC modified for their individual OR environments. Although largely comparable, some differences between adaptations are notable. The adaptation used at Hospital 2 allows Sign In administration in the preanaesthetic room or the OR. It also specifies that Sign In should be led by the anaesthetist, Time Out by the surgeon and Sign Out by the circulating nurse. At Hospital 1, all domains must be completed in the OR, and the circulating nurse is responsible for initiating and ensuring SSC completion. Full checklist adaptations for Hospital 1 and Hospital 2, and a summary of checklist differences, are given in online supplementary appendices 1–3, respectively.

Data collection

Data were collected prospectively by direct observation of 100 surgical cases at each site (total 200 cases). Adult surgical cases requiring the presence of all three OR teams (anaesthesia, nursing and surgery) were eligible. Two medical students trained as observers rotated weekly between sites to avoid introducing interobserver bias. They attended OR lists primarily as medical students to minimise the impact of observation on staff behaviour. The observers (one at each site) were allocated to an OR list at the beginning of each day by the anaesthetic coordinator, and observed the entirety of that list wherever possible. It is customary for students to attend the non-elective ORs at Hospital 2, and no attempt was made to influence this. However, it must be emphasised that the cases observed were not emergency cases, and all patients had been admitted to a ward and properly evaluated prior to surgery. We recorded the procedure, surgical specialty, the elective or non-elective nature of the case, and which OR team led administration of each SSC domain. No information identifying patients, staff or the OR was collected.

We assessed compliance and engagement in SSC administration using a previously reported checklist Compliance Assessment Tool (CAT).6 We updated this tool to align it with the current SSC versions in use at both hospitals (see online supplementary appendices 1 and 2).

Compliance, engagement and timing

Compliance with administration of each domain item was defined as verbal communication of that item by the checklist administrator or other OR team member during SSC administration. We did not interpret communication of checklist items between team members outside the context of formal SSC administration as compliance with that item. However, since it is likely that such communication is better than omission of the item entirely, we did record and report instances where this occurred.

The timing of domain administration and OR team engagement were recorded. Engagement was rated according to the number of OR teams engaged. At least one team member had to be engaged in SSC administration for the team to be considered engaged, and engagement was defined as listening or contributing to SSC administration with cessation of other activities and conversations. The presence of each team was recorded to ensure engagement was not confounded by team absence (eg, the surgical team was often not present during Sign In). Timing of domain administration was rated as compliant (or not) with the following recommended practices: Sign In to be administered in the OR (Hospital 1) or preanaesthetic room (Hospital 2) prior to any drug administration; Time Out to be administered after surgical site preparation and draping, but prior to the first surgical incision; and Sign Out to be administered while the surgical team was still present in the OR.
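Purely as an illustration of the kind of per-case record these observations imply, the sketch below defines a minimal data structure loosely modelled on the compliance categories described above. The field names and structure are our own invention and do not reproduce the actual CAT form used in the study.

```python
from dataclasses import dataclass, field

# Hypothetical record for one observed case; field names are illustrative
# and only loosely mirror the compliance categories described in the text.
@dataclass
class DomainObservation:
    administered: bool                   # domain compliance: was the domain verbally administered?
    items_completed: int = 0             # applicable items communicated during administration
    items_applicable: int = 0            # applicable items on this hospital's SSC adaptation
    teams_present: set = field(default_factory=set)  # e.g. {"anaesthesia", "nursing"}
    teams_engaged: set = field(default_factory=set)  # teams with >=1 member listening or contributing
    timing_compliant: bool = False       # administered at the recommended point in the case

@dataclass
class CaseObservation:
    hospital: str                        # "Hospital 1" or "Hospital 2"
    specialty: str                       # e.g. "general surgery"
    elective: bool
    sign_in: DomainObservation
    time_out: DomainObservation
    sign_out: DomainObservation
```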

Data quality

Observers underwent training for 1 week prior to the study, during which they attended OR lists together with a senior investigator and completed CATs independently on the same cases. Observations during training were subsequently compared and discussed. Both observers attended the same OR for 10 (5%) cases during the study to allow evaluation of inter-observer reliability. Data from completed CATs were entered electronically. Ten (5%) cases were randomly allocated for re-entry to test data entry accuracy.

Endpoints and analysis

Primary outcomes were compliance (by hospital) with administration of SSC domains and individual domain items, given as ‘domain compliance’, ‘domain completion’ and ‘item completion’. Domain compliance was the percentage of cases in which the domain was administered. Domain completion was the percentage of all eligible items administered in domain-compliant cases. Item completion was the percentage of domain-compliant cases in which each individual item was administered. Those items that appeared on one hospital's SSC adaptation but not the other, and those items considered not applicable to the observed case, were excluded from analysis. Secondary outcomes were the percentage of cases in which the timing of domain administration complied with checklist recommendations, and the percentage of cases in which one, two or three OR teams were engaged during domain administration. The association between hospital and domain completion was investigated for each domain individually using logistic regression with adjustment for potential confounding factors (elective vs non-elective case status and surgical specialty). For this purpose, domain completion included those cases for which the domain was not completed (ie, 0=domain not completed, and 1=all applicable domain items completed).
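As a rough sketch of how these outcome definitions and the adjusted analysis could be computed, the code below uses pandas and statsmodels on synthetic data with hypothetical column names. The study's analyses were performed in SPSS, so this is an assumption-laden re-expression for illustration, not the authors' code or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic per-case data for a single domain; all numbers and column names are invented.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "hospital": rng.choice(["Hospital 1", "Hospital 2"], n),
    "specialty": rng.choice(["general", "orthopaedics", "other"], n),
    "elective": rng.integers(0, 2, n),
    "items_applicable": np.full(n, 7),
})
p_admin = np.where(df["hospital"] == "Hospital 1", 0.95, 0.4)       # invented administration rates
df["administered"] = rng.binomial(1, p_admin)
df["items_completed"] = rng.binomial(df["items_applicable"].to_numpy(), 0.8) * df["administered"]

# Domain compliance: percentage of cases in which the domain was administered.
compliance = 100 * df.groupby("hospital")["administered"].mean()

# Domain completion: percentage of eligible items administered in domain-compliant cases.
compliant = df[df["administered"] == 1]
completion = 100 * (compliant.groupby("hospital")["items_completed"].sum()
                    / compliant.groupby("hospital")["items_applicable"].sum())

# Adjusted analysis: logistic regression on full completion
# (0 = domain not fully completed, 1 = all applicable items completed),
# adjusting for case acuity and surgical specialty.
df["fully_completed"] = (df["items_completed"] == df["items_applicable"]).astype(int)
fit = smf.logit("fully_completed ~ C(hospital) + elective + C(specialty)", data=df).fit(disp=False)
print(np.exp(fit.params))       # odds ratios
print(np.exp(fit.conf_int()))   # 95% CIs
```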

Domain compliance, domain completion and item completion were compared between the two hospitals using a χ2 test or a Fisher's exact test. Compliance with prescribed SSC items within each domain was compared between the two hospitals using an unpaired t test. We did not correct our p values for multiple testing. Results of logistic regression analyses were expressed as ORs and 95% CIs. Analyses were performed using SPSS Statistics V.19.
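The unadjusted between-hospital comparisons can be illustrated in the same spirit: a χ2 or Fisher's exact test on a 2×2 table of domain compliance, with an odds ratio and a Wald 95% CI derived from the same table. The counts below are illustrative only (loosely based on the reported Sign In percentages) and are not the study data.

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# 2x2 table of domain compliance: rows = hospital, columns = administered / not administered.
table = np.array([[96,  4],    # Hospital 1 (illustrative counts)
                  [31, 69]])   # Hospital 2 (illustrative counts)

chi2, p, dof, expected = chi2_contingency(table)
odds_ratio, p_exact = fisher_exact(table)   # preferred when expected cell counts are small

# Wald 95% CI for the odds ratio from the same 2x2 table.
a, b, c, d = table.ravel().astype(float)
log_or = np.log((a * d) / (b * c))
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low, ci_high = np.exp([log_or - 1.96 * se, log_or + 1.96 * se])

print(f"chi2 p={p:.4g}, Fisher p={p_exact:.4g}, "
      f"OR={np.exp(log_or):.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```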

Results

Cases

One hundred cases were observed at Hospital 1, and 104 at Hospital 2. The elective vs non-elective status and surgical specialty of the cases is shown in table 1.

Table 1

The distribution of cases at the two hospitals classified by acuity and surgical specialty

Primary outcomes

SSC domain compliance and mean domain completion are given in table 2. Surgical specialties were grouped for logistic regression analysis as: general surgery, orthopaedics and other (plastics, urology, vascular, neurosurgery and gynaecology). Hospital and surgical specialty were significant predictors of domain compliance and completion (where 0=domain not completed, and 1=all applicable domain items completed) for all domains. The likelihood of the Sign In domain being fully or partially completed was greater at Hospital 1 (OR 7.463, 95% CI 5.382 to 10.381, p<0.001) and for general surgery cases (OR 1.255, 95% CI 1.020 to 1.541, p=0.031). The likelihood of the Time Out domain being fully or partially completed was greater at Hospital 1 (OR 18.102, 95% CI 11.531 to 28.456, p<0.001) and for general surgery cases (OR 1.852, 95% CI 1.398 to 2.452, p<0.001). The likelihood of the Sign Out domain being fully or partially completed was greater at Hospital 1 (OR 9.895, 95% CI 4.595 to 21.285, p<0.001) and for general surgery cases (OR 2.442, 95% CI 1.639 to 3.640, p<0.001), but less for orthopaedic cases (OR 0.096, 95% CI 0.028 to 0.325, p<0.001). Case acuity (elective vs non-elective) was not a significant predictor of domain completion for any domain (Sign In elective cases OR 1.077, 95% CI 0.776 to 1.493, p=0.657; Time Out elective cases OR 0.974, 95% CI 0.620 to 1.531, p=0.910; and Sign Out elective cases OR 0.604, 95% CI 0.298 to 1.228, p=0.164).

Table 2

Domain compliance and mean (range) domain completion

Item completion (the percentage of domain-compliant cases in which each individual item was administered) is given in table 3, which also shows the percentage of domain-compliant cases in which the item was not applicable, and non-compliance was, therefore, expected (column labelled ‘NA’ (not applicable)). For example, the surgical team was often not present during Sign In at either hospital, and so a response from this team to the statement of allergies was not expected. Finally, table 3 (column labelled ‘outside’) shows the percentage of the total cases in which an item was not formally administered as part of the SSC (either because the entire domain or the particular item was omitted) but was discussed at some point outside the context of SSC administration. Note that the denominator here is different to that for the item completion or ‘NA’ columns, so the row totals may exceed 100%.

Table 3

SSC item completion

Secondary outcomes

OR team engagement during administration of the SSC is shown in table 4. No cases were observed in which all members of all teams present were engaged for the administration of a domain. Interpretation of these data must take account of the frequent and accepted absence of the surgical team at Sign In; a member of the surgical team was present in just 11% of cases at Hospital 1 and 19% of cases at Hospital 2. The compliance with recommendations around timing of domain administration is also reported in table 4.

Table 4

Team engagement during domain administration, and timing of domain administration

Data quality

Simultaneous observations for evaluation of interobserver reliability produced 266 assessable data points at Hospital 1, of which 96% were concordant and 275 assessable data points at Hospital 2, of which 90% were concordant. Re-entry of 5% of cases to evaluate data entry quality produced 99% accuracy for the resultant 570 data points.

Discussion

We recorded domain compliance of 96%, 99% and 22% for Sign In, Time Out and Sign Out, respectively, at Hospital 1. Domain compliance was considerably lower at Hospital 2 (table 2), where Sign In was conducted in under a third of cases and Time Out in less than half. Domain completion was also lower at Hospital 2, with the exception of Sign In items, of which 69% were completed on average compared with 59% at Hospital 1. Failures to administer some items during formal use of the checklist at Hospital 2 were partly mitigated by discussion of those items at other times. For example, the item pertaining to thrombo-prophylaxis requirements was administered in 50% of observed Time Out domains and was discussed in 24% of cases at times not clearly linked to checklist administration.

Compliance with SSC administration is important because it appears to be associated with improved patient outcomes.2 ,4 De Vries et al2 introduced a patient safety system with a SSC as a crucial component in a controlled study, and reported complication rates of 7.1% when completion of checklist items was greater than the median of 80% versus 18.8% when it was less than the median. Van Klei et al4 reported a significant reduction in perioperative mortality when the WHO SSC was fully completed, but not when the SSC was incomplete or unused. We did not evaluate the effect of compliance on patient outcome. However, we believe these previous studies provide data that support the contention that compliance affects the potential safety benefits.

Our study hospitals have demonstrated different levels of SSC compliance. One potential reason for the difference may be the way in which the checklist was introduced. Several studies have identified active leadership, a clear rationale for checklist use, exemplars of ideal practice and an ongoing process of discussion, training and feedback as important for successful implementation of checklists in an OR setting.7–9 Hospital 1 participated in the original WHO SSC pilot study.1 This involved extensive engagement with OR staff in an attempt to optimise SSC use during the study, including training seminars by the international study principals, and the wide dissemination of written materials on the premise of the SSC and its correct use. A study coinvestigator and a study nurse were present in the OR on a regular basis over a protracted period to field questions and audit practice. These implementation process factors almost certainly contributed to establishing a ‘checklist discipline and culture’ at Hospital 1 that has persisted over time.

Hospital 2, by contrast, was not part of the study and was largely left to its own devices when rolling out the SSC. A series of internally convened staff education seminars was held, but Hospital 2 did not have the benefit of the comprehensive SSC roll-out programme required for participation in the WHO study. This appears to have fostered an incomplete appreciation of the SSC as a tool that facilitates communication and teamwork in addition to simply preventing items being overlooked. We doubt this is unique to Hospital 2. One recent study made the observation that SSC Time Out appeared to be treated as ‘a double-checking routine that someone should go through (as opposed to a team effort)’.10 Some staff seem to believe that the discussion of checklist items outside the context of formal SSC administration (table 3) is acceptable, and this contributes to a perception of adequate compliance with SSC use at Hospital 2.

Neither hospital exhibited a high rate of domain compliance for Sign Out. In fact, we found poor compliance with Sign Out at Hospital 1 (administration in only 2% of cases) in 2010.6 This observation was fed back and discussed at an OR staff forum to identify a potential solution. There was consensus that, unlike the other domains, Sign Out was not clearly linked to an easily identifiable OR event. The resulting ambiguity about when it should occur frequently meant that it did not occur at all. This issue has also been noted by others,11 and an attempted resolution at our centre involved linking Sign Out to completion of the first swab and instrument count. The process of feedback, discussion and identification of a potential solution has resulted in an improvement in Sign Out compliance from 2% in 2010 to 22% in 2011. This minor improvement illustrates both the challenges of achieving behavioural change and the requirement for more comprehensive interventions when attempts are made to establish and maintain new practice.

Ideally, all OR staff would be fully engaged in checklist administration. We defined ‘acceptable’ engagement as at least one member of each team participating in checklist administration without other activity or conversation. Even by this definition, engagement was generally poor at both hospitals. However, one unanticipated finding was better team engagement in Sign In and Time Out at Hospital 2. Sign In and Time Out domains were administered more often at Hospital 1, but when they were completed at Hospital 2, engagement of two or three teams occurred more frequently (table 4). All domains at Hospital 1 were led by a circulating nurse, whereas at Hospital 2 a member of the anaesthetic and surgical teams led Sign In and Time Out, respectively. This tactic ensured the involvement of at least one member of those teams most central to the processes occurring at that time, and whose failure to properly engage made ideal completion of many checklist items impossible. It had the added advantage that several senior members of the OR team led checklist administration by example. The improvement in engagement achieved by this approach is potentially very important. For example, engagement is crucial in preventing the communication break-down described by Lingard et al12 as ‘audience failures’.

Strengths and weaknesses

We used trained observers to directly observe SSC administration in real time, as opposed to retrospectively reviewing SSC forms completed by OR staff. Our data reflect not only how often SSC domains and items are vocalised, but also when, by whom and to whom. The observers attended roughly equal numbers of cases at both sites, and inter-rater reliability was high. A limitation is that we used multiple tests to investigate our study hypothesis, which is associated with an increased risk of Type I error.

The balance of elective and non-elective cases was substantially different between sites (Hospital 2 cases were predominantly designated ‘non-elective’). This arose because Hospital 2 runs several OR suites, and cases at the suite attended by students are mostly non-elective. However, no cases involved emergency surgery per se where time pressure might affect compliance with the SSC. As previously noted, all patients had been admitted to a ward and evaluated before surgery. It follows that the elective versus non-elective status should not have influenced SSC use. Indeed, domain compliance was similar in the elective and non-elective cases observed at Hospital 1, and regression analysis did not identify case acuity as an independent influence on domain completion for any of the domains. Re-evaluation of data from our 2010 study at Hospital 1,6 which used identical outcome measures, also showed no difference in domain compliance between the 54 elective and 46 non-elective cases observed at that time.

The SSC has been in use for 5 years at Hospital 1 compared with 2 years at Hospital 2, and so a maturation effect may have confounded checklist use at the two sites. However, our experience suggests that enthusiasm and diligence around checklist administration decline rather than increase over time, and others have reported that checklist use declines after active encouragement from a research team or similar group is withdrawn.9,11 Another potential confounder is that a small number of staff probably have worked (or do work) at both hospitals. However, this confounder would serve to reduce any differences between the hospitals rather than magnify them. Finally, we cannot definitively exclude the possibility that other contextual factors, not apparent to us, may have influenced relative compliance at the two sites.

Summary

We investigated compliance and quality in administration of the WHO SSC at two tertiary teaching hospitals. Domain compliance was higher at Hospital 1 where the SSC was introduced as part of a formal study that brought comprehensive training, strong leadership and early support for its use. This finding suggests that checklist roll-out strategies can make a difference, and Hospital 2 is likely to be more representative of institutions implementing the SSC independent of a study protocol. As a consequence, it cannot be assumed that outcome benefits of the SSC demonstrated in formal studies will be achieved simply by SSC adoption unless there is careful attention to maximising compliance.

Comprehensive education, motivation, feedback and engagement of relevant leadership ‘champions’, akin to what took place as part of the original WHO SSC study,1 probably represents a model of what is necessary to establish good checklist practice. However, Hospital 1 has several deficiencies (including poor compliance with Sign Out and suboptimal team engagement), suggesting that, even with a well-planned implementation, repeated audit and feedback are needed to maintain good practice. Optimisation of checklist design to suit local circumstances may also be important, and provides an opportunity to involve checklist users in its development. On the basis of these findings, our group is in the process of developing and evaluating educational strategies that could be adopted for use during roll-out of the SSC at a new site, or for reinvigoration of its use at established sites where compliance is poor.

What is already known on this topic

  • Surgical checklists can reduce perioperative complications and mortality, but the magnitude of such benefits appears greater if compliance with checklist item administration is optimised.

What this study adds

  • Compliance with the WHO surgical safety checklist at a centre where the checklist was introduced as part of a comprehensive investigation (the original Safe Surgery Saves Lives Study) differed from that at a similar and colocated centre which adopted the checklist independently of a study protocol. Compliance at institutions involved in checklist research does not always reflect that in the wider hospital community.

  • Successful integration of checklists into existing operating room systems requires a careful and comprehensive implementation strategy. Strategies used in the inception of the Safe Surgery Saves Lives Study at our institution, such as comprehensive staff education, recruitment of influential checklist ‘champions’, a clear definition of what constitutes correct checklist compliance, and strategic involvement of key team members, may help achieve better checklist compliance.

Acknowledgments

The authors would like to acknowledge Dr Matthew Pawley for his assistance with statistical analyses.

References

Supplementary materials

  • Supplementary Data

Footnotes

  • Contributors KC, JH, AM, SM, FS and JW were responsible for study design. LG, JH and JK were responsible for data collection. FS and SM provided clinical oversight of data collection at the two study sites. SM, AM and FS provided general oversight of the study processes. JH, SM and JW wrote the initial draft of the manuscript. All authors provided critical revision of the manuscript.

  • Competing interests AFM was the anaesthesia lead in the WHO Safe Surgery Saves Lives initiative and is Chair of the Board of the Health Quality and Safety Commission New Zealand.

  • Ethics approval This study was approved by the Northern Y Regional Ethics Committee, New Zealand. Approval number NTY/10/EXP/077.

  • Provenance and peer review Not commissioned; externally peer reviewed.