
Learning from incidents in healthcare: the journey, not the arrival, matters
Ian Leistikow (1,2), Sandra Mulder (1), Jan Vesseur (1), Paul Robben (2,3)

1. Medical Specialist Care, Dutch Healthcare Inspectorate, Utrecht, The Netherlands
2. Department of Health Policy and Management, Erasmus University, Rotterdam, The Netherlands
3. Dutch Healthcare Inspectorate, Utrecht, The Netherlands

Correspondence to Dr Ian Leistikow, Medical Specialist Care, Dutch Healthcare Inspectorate, PO Box 2518, 6401 DA Heerlen, The Netherlands; ip.leistikow@igz.nl


Introduction

Incident reporting is widely recognised as an important method for improving safety in healthcare, and many countries have established their own incident reporting systems.1 However, the actual value of these systems is increasingly subject to debate.2 Reporting systems, both local and national, are overwhelmed by the volume of reports and fall short in defining recommendations for improving healthcare safety: ‘We collect too much and do too little’.3

The purpose of these systems is also under debate. The UK, for example, struggles to clarify whether incident reports should be used to help healthcare organisations learn, or whether they should help regulators and funders to make judgements.4 As healthcare inspectors tasked with running a national hospital incident reporting system (IL, SM and JV), we recognise the issues described above. In this article, we show how the theories in the evolving scientific literature on incident reporting apply to our situation. Our work since 2012 acts as an empirical example of how reporting systems could have an effect if they focus on the learning process within hospitals instead of on solutions for reported safety issues. As TS Eliot is quoted as saying: “The journey, not the arrival, matters.”

Learning how to hit a moving target

The conception of ‘incident’ changes over time

Vincent and Amalberti argued that safety in healthcare is a moving target, because innovation and improving standards in healthcare alter the conceptions of both harm and preventability.5 This dynamic view of healthcare safety can be illustrated by the 1996 Dutch legal definition of a sentinel event (SE), the most serious class of incidents that healthcare organisations are mandated to report (see box 1). Since rising standards of care influence the way incidents are judged, an incident from 2005 can be judged differently in 2015 under the identical definition.6 Moreover, because the healthcare quality community stays up to date on quality improvement innovations, its standards change faster than those of many frontline personnel. One consequence is an endless discussion between hospitals and the healthcare inspectorate as to which incidents qualify as SEs and should therefore be reported. Another is that our national incident reporting system can provide the inspectorate with neither a representative view of clinical safety issues nor a means to measure safety improvement over time.

Box 1

Sentinel event

A sentinel event (Dutch: calamiteit) is defined in the 1996 Healthcare Organisations Quality Act as an unintended and unexpected event, related to the quality of care and having caused death or serious harm to the patient. All healthcare organisations are mandated to report sentinel events to the Healthcare Inspectorate within 3 days after discovery.

Standards for corrective actions change over time

Just as the standards for what is defined as an incident shift, so do the standards for corrective actions. For example, Behr et al7 studied three healthcare incidents that attracted significant public attention: a paediatric cardiology case in 2001, a cardiothoracic case in 2005 and a neurology case in 2009. Although the cases were similar in many respects, the authors found a shift in how the cases were perceived: the 2001 case was seen as a professional problem, the 2005 case as a managerial problem and the 2009 case as a governance problem. Each perception led to corrective actions fitting that view: removal of the professionals, sacking of the board and problems for the regulator, respectively. What was deemed an adequate corrective action in 2001 was deemed inadequate 4 years later. If we are to keep up with advancing safety-related insights, then the goal of reporting systems should not be the corrective action itself, but the ability to determine appropriate corrective actions.

Healthcare providers need to be able to hit moving targets

Incidents are not isolated events in themselves; they are symptoms of larger problems. Instead of targeting the 'symptoms', it seems better in the long term to target the 'causes of the disease': that is, to have healthcare providers learn to cope with the infinite variability of safety issues by learning how to analyse them and how to devise corrective actions that fit their local setting. Incident reporting systems should therefore lead to social and participative learning at the local level.3 Safety issues will keep sneaking up on healthcare providers from all directions, and hitting a moving target over and over again requires a different set of skills from hitting a fixed target once. Being engaged in analysing incidents can serve as a catalyst for changing the way healthcare providers think about risk and can increase their vigilance.8 Healthcare providers will only learn to hit this continuous stream of moving targets once they properly analyse their incidents.

Shifting the goal of incident reporting

Focusing on the learning, not the outcome

Dutch hospitals are mandated by law to report all serious incidents, defined as SEs, to the Healthcare Inspectorate (IGZ) (see box 2). The IGZ receives about 800 SE reports annually from the 93 Dutch hospitals. In 2012, the IGZ decided to shift its focus from what hospitals learn from their SEs to how hospitals learn from their SEs, as the inspectors felt this would improve the effect of their work. In 2010, the IGZ had made a similar shift in its supervision of suicide reports, turning the focus to the organisational learning ability of mental healthcare institutions.9 Organisational learning is defined as the process of creating and applying valid knowledge to enable an organisation to improve.10 The IGZ expected that the learning ability of hospitals would improve by addressing the conditions for learning from SEs, which in turn would contribute to safety.11,12

Box 2

IGZ

The Dutch Healthcare Inspectorate (in Dutch: Inspectie voor de Gezondheidszorg; IGZ) is part of the Ministry of Health, Welfare and Sport. The IGZ oversees and regulates all Dutch healthcare providers and professionals, as well as all medicines, medical devices and medical technology. The IGZ is mandated to use enforcement measures if those inspected do not comply.

The quality of the learning process is now quantified using the 2012 WHO draft report ‘Concise Incident Analysis’, supplemented with extra items on patient engagement (see box 3).13 Each SE analysis report receives a score of between 0% and 100%, indicating the percentage of items addressed adequately. Since July 2013, the scores have been added to a database showing the quality of SE analysis reports over time. These figures are benchmarked to give the IGZ insight into the quality and rate of improvement of each individual hospital compared with other hospitals (see figure 1). These data are discussed with each hospital individually during an annual meeting between the hospital board and the IGZ as one of many agenda items. If the data are a cause for concern, then the IGZ plans a separate meeting with the board and the hospital's SE investigating committee. The data are not publicly accessible.

Box 3

Questionnaire for scoring used by the IGZ

These questions are used by inspectors to judge the quality of a sentinel event analysis report. Points can be scored on each question. Sometimes a question is irrelevant, for example, 'Was external expertise consulted?' when external expertise would not add anything of value to the analysis. The total number of points is divided by the number of relevant questions, yielding a percentage. This percentage is the overall score of the sentinel event analysis report (a minimal sketch of this computation follows the box).

Process

▸ How soon after the event was identified did the investigation start?

▸ Is the investigating committee multidisciplinary?

▸ Were any members of the investigating committee involved in the incident?

▸ Is the method for analysis specified? (eg, root cause analysis (RCA))

▸ Was input sought from all personnel directly involved?

▸ Was input sought from other staff with knowledge about the care process?

▸ Was input sought from the patient/relatives?

Reconstruction

▸ Does the description of the event give a complete picture of the relevant ‘scenes’?

Analysis

▸ Has the question ‘why’ been asked extensively enough to analyse the underlying cause and effect?

▸ Have the investigators searched relevant scientific literature?

▸ Does the report state whether applicable guidelines/protocols were followed?

▸ Was external expertise consulted?

▸ Does the report state whether the medical indication for the provided care was correct?

Conclusions

▸ Does the report identify root causes?

▸ Do the root causes fit the reconstruction and analysis?

▸ Are contributing factors considered and/or identified?

▸ Are contributing factors, not under the control of the hospital, considered and/or identified?

Recommendations

▸ Does the report document recommendations for improving processes and systems?

▸ Do these corrective actions address the identified root causes?

▸ Have the corrective actions been formalised? (eg, Specific, Measurable, Attainable, Realistic and Time-Sensitive (SMART))

▸ Does the hospital have an evaluation plan to determine if the recommendations are implemented?

▸ Will the impact of the recommendations be evaluated?

Aftercare

▸ Is the aftercare for the patient/relatives described?

▸ Is the aftercare for the professionals involved described?

▸ Has the report been shared with the patient/relatives?

Reaction of hospital board

▸ Is the reaction of the board adequate?
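To make the scoring rule in box 3 concrete, the following minimal Python sketch implements it under the simplifying assumption that each question is judged adequate (1 point), inadequate (0 points) or irrelevant (excluded); the exact point values inspectors assign are not detailed in this article, so the function and data are illustrative only.

```python
from typing import Optional

def report_score(judgements: dict[str, Optional[int]]) -> float:
    """Overall score of one sentinel event (SE) analysis report.

    judgements maps each question from box 3 to 1 (addressed adequately),
    0 (not addressed adequately) or None (irrelevant for this report).
    """
    # Irrelevant questions are excluded from both numerator and denominator.
    relevant = {q: p for q, p in judgements.items() if p is not None}
    if not relevant:
        raise ValueError("no relevant questions to score")
    # Total points divided by the number of relevant questions, as a percentage.
    return 100 * sum(relevant.values()) / len(relevant)

# Example: external expertise was judged irrelevant, so only three
# questions count; the report scores 2 out of 3, ie, 67%.
score = report_score({
    "Is the investigating committee multidisciplinary?": 1,
    "Was input sought from the patient/relatives?": 0,
    "Was external expertise consulted?": None,
    "Does the report identify root causes?": 1,
})
print(f"{score:.0f}%")  # 67%
```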

Figure 1

Example of one hospital. The columns represent the quality score (range 0% to 100%) of the 24 individual sentinel event (SE) analysis reports by Hospital X from July 2013 to October 2015. The reports are in chronological order from left to right. The dark line is the moving average, the mean score of Hospital X over the past five reports. The grey lines are the national highest, average and lowest scores (n=1675 SE reports). Hospital X scored below average and then showed a strong increase in the quality of its SE analysis reports, eventually dropping back to an average score. The national average is increasing.
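The trend statistics in figure 1 can be reproduced from per-report scores. The sketch below, again illustrative Python rather than any actual IGZ system, computes the five-report moving average for one hospital and the national lowest, average and highest scores used as benchmarks.

```python
from statistics import mean

def moving_average(scores: list[float], window: int = 5) -> list[float]:
    """Mean of each report's score and the (up to four) reports before it."""
    return [mean(scores[max(0, i + 1 - window):i + 1])
            for i in range(len(scores))]

def national_benchmark(hospitals: dict[str, list[float]]) -> tuple[float, float, float]:
    """Lowest, mean and highest score across all reports nationally."""
    pooled = [s for scores in hospitals.values() for s in scores]
    return min(pooled), mean(pooled), max(pooled)

# Hypothetical chronological scores for one hospital (percentages).
hospital_x = [52.0, 58.0, 70.0, 85.0, 90.0, 88.0, 76.0, 71.0]
print(moving_average(hospital_x))             # the dark line in figure 1
print(national_benchmark({"X": hospital_x}))  # the grey benchmark lines
```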

Between July 2013 and August 2015, the overall average score for the quality of SE analysis reports moved from 64% to 78% (n=1675 SE reports). Shifting the focus from what hospitals learn to how hospitals learn has several advantages:

  • It is independent of changes in the conception of ‘incident’.

  • It gives healthcare professionals and boards the room to devise and apply tailor-made corrective actions.

  • Changes in the learning curve can be tracked with each new analysis report.

  • Data make it possible for the IGZ to focus on hospitals that lag behind.

  • The effect of interventions by the IGZ on specific items (eg, patient engagement) can be tracked over time.

Many have stressed the value of multidisciplinary input, as well as physicians' and hospital boards' engagement, for optimising learning and reinforcing the importance of safety.11,14 Based on the SE analysis reports, the annual meetings with hospitals and the on-site inspections, we can conclude that all Dutch hospitals now have multidisciplinary investigating committees. All committees include physicians (often as chair) and all hospital boards provide support for their committees and reporting systems.

Several explanations can be given for why this learning process has engaged physicians, although we cannot support them scientifically. One possible reason is that the IGZ has more or less mandated hospitals to engage physicians and checks this regularly. Other factors that might play a role are increased awareness of patient safety and greater engagement of physicians in other patient safety activities, creating more interest among physicians in contributing to SE analysis.

Limitations of focusing on learning

However hard the IGZ strives for a 'just culture', defined as a focus on learning as opposed to punishing, there may always remain a sense of fear among healthcare providers that could ultimately influence their openness and thus the validity of their SE analysis reports.15 Since 2013, the IGZ has been pushing hospitals to engage the patient or a patient representative in SE analysis. This provides the IGZ with an external check on the report's validity, but it might also increase the hospital's unease. This aspect deserves further research.

Another concern is that focusing on the process instead of the content might cause the IGZ to miss signals of recurrent safety issues. This is in part mitigated by the team that judges the analyses, whose members can recognise emerging themes. An example of such a theme is serious harm to infants after anaesthesia, which was independently reported by different hospitals and led to the IGZ informing the Dutch Society of Anaesthesia.

Relying on memory is clearly not a sustainable method for a regulator, and it is reasonable to assume that the IGZ misses relevant safety issues. We also recognise that SE analysis is just one step on the path to improvement. The next step is actually implementing the recommended corrective actions and determining whether the improved learning ability of hospitals leads to safety improvement. Furthermore, learning from SEs is just one of many activities needed to improve patient safety, and the score we assign to the quality of SE analysis reports is not a measure of the overall safety in that hospital.

Conclusion

Focusing incident reporting systems on the local learning process of healthcare providers could mitigate many of the problems that have been attributed to reporting systems in the literature. We have redesigned the Dutch national incident reporting system to rate each individual hospital's learning process by scoring the quality of its SE analysis reports. Using the data this generates, we can benchmark hospitals' learning curves, act on hospitals that lag behind and track their subsequent improvement over time.

Although we describe work that is still in progress and under research, preliminary data suggest a measurable improvement in Dutch hospitals' ability to learn from their own SEs. This raises several questions, the most important being whether the quality of SE analysis reports is a true reflection of a hospital's learning process. While the effect on patient safety has yet to be proven, shifting the goal of incident reporting systems from solving specific safety issues to improving the process of learning seems a promising strategy.

Acknowledgments

The authors would like to thank the two anonymous reviewers and Naonori Kodate for their comments and feedback on earlier versions of this article.

References

Footnotes

  • Contributors All authors contributed to the manuscript. SM, JV and IL are involved in the supervisory work described.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.