Establishing a global learning community for incident-reporting systems
  1. Julius Cuong Pham,1
  2. Sebastiana Gianci,2
  3. James Battles,3
  4. Paula Beard,4
  5. John R Clarke,5
  6. Hilary Coates,6
  7. Liam Donaldson,7
  8. Noel Eldridge,8
  9. Martin Fletcher,9
  10. Christine A Goeschel,10
  11. Eugenie Heitmiller,11
  12. Jörgen Hensen,12
  13. Edward Kelley,13
  14. Jerod Loeb,14
  15. William Runciman,15
  16. Susan Sheridan,16
  17. Albert W Wu,17
  18. Peter J Pronovost18
  1. The Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
  2. World Health Organization, World Alliance of Patient Safety, The Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
  3. Agency for Healthcare Research and Quality, Rockville, Maryland, USA
  4. Canadian Patient Safety Institute, Edmonton, Canada
  5. Drexel University, ECRI Institute, Plymouth Meeting, Pennsylvania, USA
  6. Royal College of Surgeons, Dublin, Ireland
  7. World Alliance for Patient Safety, Geneva, Switzerland
  8. Veterans Health Administration, Washington, District of Columbia, USA
  9. National Patient Safety Agency, London, UK
  10. The Johns Hopkins University Schools of Medicine, Nursing and Public Health, Baltimore, Maryland, USA
  11. The Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
  12. National Board of Health, Copenhagen, Denmark
  13. World Health Organization, Geneva, Switzerland
  14. The Joint Commission, Oakbrook Terrace, Illinois, USA
  15. Australian Patient Safety Foundation, Adelaide, Australia
  16. Consumers Advancing Patient Safety, Chicago, Illinois, USA
  17. The Johns Hopkins University Schools of Medicine and Public Health, Baltimore, Maryland, USA
  18. The Johns Hopkins University Schools of Medicine, Public Health and Nursing, Baltimore, Maryland, USA

  Correspondence to Dr Julius Cuong Pham, The Johns Hopkins University School of Medicine, 1909 Thames Street, 2nd Floor, Baltimore, MD 21231, USA; jpham3{at}


Background Incident-reporting systems (IRS) collect snapshots of hazards, mistakes and system failures occurring in healthcare. These data repositories are a cornerstone of patient safety improvement. Compared with systems in other high-risk industries, healthcare IRS are fragmented and isolated, and have not established best practices for implementation and utilisation.

Discussion Patient safety experts from eight countries convened in 2008 to establish a global community to advance the science of learning from mistakes. This convenience sample of experts all had experience managing large incident-reporting systems. This article offers guidance through a presentation of expert discussions about methods to identify, analyse and prioritise incidents, mitigate hazards and evaluate risk reduction.

  • Incident-reporting systems
  • patient safety
  • risk management
  • medical mistakes
  • safety reporting systems
  • medical error
  • qualitative research


Conceptual model

Incident-reporting systems (IRS) are a cornerstone of patient safety improvement efforts.1–4 Such systems are loosely defined as mechanisms for documenting occurrences that are not consistent with routine healthcare operation or patient care.5 During the past decade, organisations in several countries have established local, state or national IRS as repositories of information about hazards, medical mistakes and system failures to learn valuable lessons that will help prevent the recurrence of similar incidents. Through the Patient Safety and Quality Improvement Act of 2005, the USA has established Patient Safety Organizations (PSOs) and a Network of Patient Safety Databases (NPSD).6

Healthcare organisations have learnt and benefited from IRS. Incident-reporting systems have been associated with improved safety culture7 and increased awareness about patient safety, and fostered a better understanding of the extent and profile of safety shortcomings. However, translating reporting into learning and using this knowledge to improve the safety of frontline care is a daunting challenge. Compared with other high-risk industries, healthcare IRS are often fragmented, isolated and relatively immature. There is limited evidence regarding their effectiveness, and no best practices for their implementation and utilisation.8 Adverse events occurring in one institution recur in other institutions, often with the exact same causes and contributing factors. Systems are not learning from each other locally, much less internationally. Clinicians and researchers are struggling to advance the science of patient safety, understand its epidemiology, clarify priorities, implement scientifically sound yet feasible interventions and develop measures to evaluate progress.9

With support from the WHO World Alliance for Patient Safety (WAPS), patient safety experts from eight countries representing active or planned IRS convened in Baltimore, Maryland on 26 and 27 June 2008 to establish a global community to learn how IRS could improve patient safety (online appendix A lists the participants). This convenience sample of experts all had experience in managing large incident-reporting systems. Participants strategised ways to improve the effectiveness and efficiency of reporting systems.

This paper summarises the experiences of various organisations managing IRS and describes a global community to help advance the science of learning from mistakes. While formal methods to establish consensus were not used, the paper notes areas in which group consensus was obvious (box 1). We used a conceptual model for learning from adverse events to organise these findings (figure 1), and explore how reporting systems are used to identify, analyse and prioritise incidents, to mitigate the hazards discovered, and to evaluate whether risk-reduction interventions were effective.

Box 1 Summary of areas with group consensus

  1. IRSs in healthcare are immature compared with similar systems in other high-risk industries.

  2. Data from IRS do not represent rates of event occurrences; they should not be represented as such; nor should these data be expected to perform such a function.

  3. Although free-text data are difficult to analyse, they provide rich information that is an essential component of IRS.

  4. Improvements in patient safety are difficult to demonstrate due to the lack of a clear definition and surveillance system for such events.

  5. The scope and expectation of an IRS should be clearly defined. IRS should be appropriately funded and staffed to meet these expectations.

  6. Some resources should be directed at measuring the effectiveness and (intended and unintended) consequences of IRS interventions.

  7. Nationally and internationally, IRS have much to benefit from sharing resources and experiences.

Figure 1

Patient safety learning communities relate to each other in a gear-like fashion: as the identified hazards require stronger levels of intervention to achieve mitigation, the next learning community is engaged in action, eventually feeding back to the group that provided the initial thrust. Each group (unit, hospital, industry) follows the same four-step process, but they engage unique matrices of stakeholders to mitigate hazards that are within their locus of control.

How do reporting systems identify incidents?

Current identification methods

There is variation among systems/countries relative to who identifies and reports incidents, and what information is collected. Incidents are reported by clinicians or frontline staff, hospital administrators, and safety or clinical risk officers. Generally, clinicians have more detailed first-hand knowledge of the incident, while safety/risk officers have more time and resources to investigate and analyse the incident. Recently, several IRS started allowing patients and their families to report safety incidents.10 11 This provides an exciting and unique perspective that is often missed by healthcare providers and can provide important information on quality.12 13

Systems collect either a broad range of event types or one type of particular interest to the managing organisation (table 1). These events often occur in, but are not limited to, hospitals. Depending on the system, reports can come from outpatient clinics, skilled nursing facilities, nursing homes and rehabilitation centres. Moreover, the information these systems elicit and collect about an incident ranges from free-text narrative accounts to structured taxonomies and comprehensive classification systems (eg, incident types). Structured data are easier to analyse. Nevertheless, participants highlighted the value of narrative data for understanding the ‘story’ of the incident. Finally, systems may collect incidents irrespective of degree of harm (including near misses), or only events that lead to severe harm, disability or death.16

Table 1

Descriptive characteristics of incident-reporting systems

Factors that influenced identification

Whether the IRS was mandatory or voluntary strongly influenced how hazards were identified. Mandatory systems often had a specific purpose that was governed by a regulatory or other entity that implicitly or explicitly held someone accountable.17 Such systems wanted full disclosure of information, and often only examined adverse events leading to severe patient harm. Conversely, voluntary systems typically covered a wide range of reportable events, were often anonymous or confidential to encourage reporting and focused on learning from all adverse events.2 With all systems, tension existed between reporting and accountability; if the latter led to punitive action, reporting decreased.

How do incident-reporting systems analyse and prioritise hazards?

Current analysis methods

An in-depth analysis of individual reports is the most common method of analysis.18 Procedures for in-depth analysis include clinical expert screenings, informal investigations and systematic root cause analyses (RCA).19 Properly conducted RCAs can yield valuable information. While this method may be feasible for local low-volume reporting systems, it is too labour-intensive for national high-volume systems.

To distil information from large amounts of data, organisations reporting and collecting incidents often generate summary rates using a variety of denominators (time, discharged patients, bed-days and dispensed doses).20 They may stratify rates by incident type, contributing factors, hospital work area or incident outcome (eg, proportion of incidents resulting in death). This method provides a broad overview of events in the system. However, these rates are sometimes mistakenly used by regulators and those outside the reporting organisation to judge the success of safety improvement efforts. Meeting participants agreed that information in IRS should not be expressed or interpreted as a valid rate.21 22 Valid measurement of rates requires a robust definition for the event (numerator) and those at risk for the event (denominator), and a standard and active surveillance system. An IRS needs all three attributes to evaluate progress in patient safety. While data patterns (eg, contributing factors, types of incidents and outcomes) over time may offer some insights, reporting bias limits the ability to gauge progress in improving patient safety.14 This is an area that is ripe for further exploration.
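The distinction above, between a summary figure and a valid rate, can be made concrete with a minimal sketch. The report records, field names and bed-day denominator below are entirely hypothetical; the point is only that the denominator must come from an administrative source outside the IRS, and that the resulting figure reflects reporting behaviour, not true occurrence.

```python
from collections import Counter

# Hypothetical reports; a real IRS record would be far richer.
reports = [
    {"type": "medication"}, {"type": "medication"},
    {"type": "fall"}, {"type": "pressure_ulcer"},
]
bed_days = 12_000  # hypothetical denominator for the same period

def rate_per_1000_bed_days(n_events: int, bed_days: int) -> float:
    """Summary rate: reported events per 1000 bed-days."""
    return 1000 * n_events / bed_days

# Stratify by incident type, as described in the text.
by_type = Counter(r["type"] for r in reports)
rates = {t: rate_per_1000_bed_days(n, bed_days) for t, n in by_type.items()}
```

Such a figure can track reporting volume, but because the numerator depends on who chooses to report, it cannot be read as an event rate.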

Participants described informal surveillance mechanisms to detect recurring patterns of harmful or fatal events (eg, ‘…reviewing events on a regular basis…’). This sense-making method allows for the analysis of a large number of incidents while preserving the valued in-depth reviews. However, it relies on the astute attention and knowledge of individuals, which is likely unsustainable, and it adds the bias and variance inherent with multiple reviewers. A few systems use or are developing data-mining software to cluster incidents into categories to identify hazards that need further analysis.23
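The clustering idea mentioned above can be sketched in a few lines. Real systems use natural-language-processing and data-mining software; the keyword lexicon and narratives here are purely illustrative assumptions, chosen to show how free-text reports might be grouped into candidate hazard categories for further analysis.

```python
from collections import defaultdict

# Hypothetical keyword lexicon; a real system would learn these terms.
LEXICON = {
    "medication": {"dose", "drug", "overdose", "prescription"},
    "falls": {"fell", "fall", "slipped"},
    "equipment": {"pump", "monitor", "device", "alarm"},
}

def cluster(narrative: str) -> str:
    """Assign a free-text report to the category with most keyword overlap."""
    words = set(narrative.lower().split())
    best = max(LEXICON, key=lambda c: len(words & LEXICON[c]))
    return best if words & LEXICON[best] else "unclassified"

clusters = defaultdict(list)
for report in ["Patient fell from bed", "Infusion pump alarm ignored"]:
    clusters[cluster(report)].append(report)
```

Even this crude grouping illustrates the trade-off in the text: automated clustering scales to high report volumes, but the narrative detail that makes in-depth review valuable is flattened into a single label.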

Factors that influenced the analytical method

Participants described several factors that affected the analytical method chosen. Foremost was the quality of data. Systems with rich and detailed ‘free-text’ data undertook in-depth single-event analysis, while those with more structured data were limited to summary statistics but could categorise and summarise data more efficiently. Resource availability (funding and analytic personnel) also played a significant role in an organisation's ability to analyse data and its choice of method. The quantity of data, maturity of a system, and experience of the organisation allowed for more complex analyses, such as harm susceptibility analysis.24

Factors that influenced methods to prioritise hazards

Participants were asked, ‘How do you prioritise which hazards you will attempt to mitigate?’ Most agreed that severity of harm was a key factor, with unanticipated deaths or severe patient harm often receiving immediate priority for mitigation. Second was the frequency of occurrence or probability of future recurrence of the event. Therefore, high-harm events with a high probability of recurrence often received the highest priority for risk-reduction efforts. Several IRS organisations (eg, Australia, Denmark) used the US Veterans Administration's risk matrix, and some (eg, England and Wales) examined reports to determine whether a serious incident was a one-off occurrence or among a wider pattern of risk.

Another factor was risk to the organisation's reputation.25 Media attention from high-profile events (eg, MRSA26) and incidents that could serve as nationwide teachable moments often invoked institutional responses. Finally, system-wide problems that consumed resources and could yield broad patient safety dividends were given higher priority than local problems.27 Overall, IRS organisations used a mix of simple, but often subjective, methods to prioritise hazards that are rudimentary compared with the sophisticated mathematical processing methods used in other industries.28
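The severity-by-probability prioritisation described above can be sketched as a simple scoring function. This is in the spirit of the US Veterans Administration's risk matrix mentioned in the text, but the category labels, score bands and escalation rule below are illustrative assumptions, not the official matrix.

```python
# Illustrative ordinal scales, lowest to highest.
SEVERITY = ["minor", "moderate", "major", "catastrophic"]
PROBABILITY = ["remote", "uncommon", "occasional", "frequent"]

def priority(severity: str, probability: str) -> int:
    """Return a mitigation priority from 1 (low) to 3 (immediate).

    Combines the two ordinal scales; catastrophic harm escalates to
    immediate priority regardless of probability (an assumed rule,
    mirroring the text's note that severe harm gets immediate attention).
    """
    total = SEVERITY.index(severity) + PROBABILITY.index(probability)  # 0..6
    if total >= 5 or severity == "catastrophic":
        return 3
    if total >= 3:
        return 2
    return 1
```

A matrix like this makes the subjective judgements in the text explicit and repeatable, which is one step towards the more sophisticated prioritisation methods used in other industries.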

How do incident-reporting systems mitigate hazards?

Current mitigation methods

While there are variations in how event reporting leads to improvements in patient safety, all participants recognised that a stronger link was needed between identifying and mitigating hazards. Methods currently used to draw attention to important issues identified through IRS are outlined in table 2. Tools have been developed at the organisational level to aid risk mitigation.32

Table 2

Current mitigation methods by incident-reporting systems and reporting organisations

Factors that influenced mitigation methods

Several factors influenced the methods organisations used to mitigate hazards. One factor was the level (local unit, hospital, regional, national) at which the hazard existed. For example, events occurring in multiple healthcare organisations required an equally broad-based fix (eg, redesign of equipment by a manufacturer), while a local unit problem required a unit or hospital fix (eg, new procedures). Another factor was the experience/maturity of the organisation. Organisations typically used risk mitigation methods with which they had prior success. While participants stated that their methods were successful, evaluations were largely anecdotal. Using IRS data to implement patient safety improvement efforts is relatively new.

An IRS or reporting institution's philosophical approach to intervening, either implicitly or explicitly, influenced the types of interventions used to mitigate risks. Moreover, whether the institution employed IRS for accountability or for education influenced the method used.33 Organisations motivated by accountability generally instituted performance evaluation methods, while education-oriented organisations generally analysed incidents and disseminated learning. Although the importance of individual accountability (responsibility for actions and decisions) should not be downplayed, attaching punitive actions to reporting systems will likely result in under-reporting. Accountability systems only work if the lessons learnt from incidents are directed towards quality improvement and patient safety.

Available resources also affected an organisation's approach to mitigating hazards. Most IRS organisations had limited resources and faced difficult choices when attempting to meet stakeholder/policymaker expectations, or to lead substantive risk-reduction efforts, particularly given local implementation challenges. Another factor was the scope of influence and political strength of an organisation. Organisations that could influence laws, create policies, and affect reimbursement and licensing employed more active risk mitigation mechanisms (eg, auditing performance). For example, The Joint Commission requires that all accredited facilities engage in failure mode and effects analysis (FMEA) for at least one high-risk process.34 Finally, some incident types are more amenable to specific avenues of risk mitigation. The best solution for a device defect, for example, would be a product recall or regulations requiring that manufacturers redesign it.

How do reporting systems evaluate the effectiveness of risk-reduction interventions?

Current evaluation methods

Overall, IRS organisations believe their work improves patient safety. Qualitative assessment and timely, actionable feedback are used in healthcare and other industries to evaluate IRS effectiveness.35 However, organisations acknowledge that this improvement in patient safety is a challenge to demonstrate objectively. In figure 2, for example, The Joint Commission found an increase in wrong-site surgery reports after implementing a programme to decrease these events. The increase is likely due to better reporting. However, the public and policymakers often misinterpret these data as rates, and use them to evaluate patient safety progress. Few organisations have rigorous methods to accurately interpret the extent to which preventable harm has decreased from their efforts.

Figure 2

Wrong-site surgeries reported by year.

Factors that influenced evaluation methods

Participants discussed difficulties facing IRS when evaluating the effectiveness of their interventions. First, most safety indicators cannot (without substantial resources) be validly measured as rates, since IRS lack clear definitions for the numerator and denominator, and a robust mechanism to identify both.21 36 Second, collecting data to evaluate progress can be difficult and expensive.21 22 Reporting systems must rely on participating organisations to submit data, which may be of insufficient quality for valid analysis. Moreover, a small percentage of incidents are reported, and the type of events submitted varies over time, limiting the use of IRS data to make inferences about progress in patient safety. Also, few IRS organisations have the technical and financial resources to manage data quality and evaluate patient safety improvements. Given the large volume of events reported and interventions undertaken to mitigate identified risks, it would be challenging to evaluate many or all of them. Just as organisations need to prioritise hazards, they need to prioritise what they evaluate.

Third, the dynamics of a country's national health system may limit IRS activity. Some systems do not have the political weight or resources to implement large-scale efforts that evaluate the effectiveness of interventions. Finally, many IRS organisations have funding solely to collect incidents, but policymakers expect them to learn from these events, develop and implement interventions to reduce the risks, and evaluate the effectiveness of the interventions.

Challenges of incident-reporting systems

The participants identified several challenges that slow improvement efforts and must be addressed if IRS are to play a vital role in improving patient safety. First is the debate over using IRS to learn from incidents or to hold someone accountable, which has created ambiguity about their purpose. Policymakers and system managers must first set clear and realistic goals for IRS. In theory, participants felt that learning should be the primary focus. In practice, many systems are seeking to balance learning and accountability goals, although the latter may cause a drop in reporting.

Another challenge is clarifying the scope of IRS and those responsible for developing methods that reduce the risks discovered during data analysis. Hospital leaders and their organisations should partner with IRS organisations to develop and test methods to reduce risk and achieve specific goals. The conceptual model describing patient safety communities could provide guidance (figure 1).37

Organisations managing IRS are also challenged by insufficient technical and financial resources to execute the following goals: develop a standard terminology for event coding (numerator) and standard methods to acquire the denominator (eg, CPT codes from billing databases); manage and analyse the data; and develop, implement and evaluate the effectiveness of interventions. The roles and responsibilities of the organisations managing an IRS, hospitals reporting to the system and entities that perform other crucial roles (eg, measurement of patient safety) should be clarified to accomplish these goals. It is shortsighted to support reporting without also supplying sufficient resources to support efforts to mitigate hazards.

Learning from the data is yet another challenge. Current methods to analyse the large number of incidents in an IRS, and to prioritise those to address, are nascent. A multifaceted approach that incorporates a variety of methods, such as clinical audits, a pooled analysis of findings from incident investigations, sense-making from narrative data and proactive identification of risks, is needed. Also needed are new statistical and analytical methods to prioritise events from large volumes of data and to determine the level of intervention. These methods should recognise the inherent characteristics of incident-reporting data: under-reporting, reporting and surveillance bias, unknown denominators, missing and poor-quality data, and reporter variation.

It is also important to monitor risks or unintended consequences stemming from interventions to mitigate harm.38 Those implementing interventions should monitor for potential benefits, harms and costs, build in safeguards and seek to improve the efficiency and effectiveness of their risk-reduction efforts.

Another challenge is determining how to prevent future harm. The untested methods currently used by organisations to mitigate risk should evolve and include research to determine interventions that will work best for specific causes, contributing factors, types of events and levels of defect.

Path forward

Early experience with IRS has advanced the science of reporting. However, learning in isolation is inefficient, less effective and costly for organisations. Establishing a global community for continued learning and collaboration could accelerate the speed with which this science grows.

The WHO's WAPS can coordinate this global community and be nimble in learning, identifying and sharing system fixes. It could facilitate efforts to advance the science in each domain illustrated in the conceptual model (figure 1). Specific short-term goals discussed were: (1) exploring an international serious adverse event list, (2) exploring and mitigating an international patient safety issue, (3) developing new partnerships with professional societies, (4) undertaking joint pilot testing of interventions, (5) exploring an efficient and effective IRS design and (6) developing the WAPS international classification efforts.


There was group consensus that new and existing reporting and learning systems should collaborate. Participants will explore ways to expedite quality and safety improvement and will lead others down this path. This new virtual global learning community will provide unique opportunities to address all of the issues outlined in this paper. This collaboration is an auspicious beginning in advancing the science of patient safety.

What this forum adds

  • Incident-reporting systems (IRS) in healthcare are fragmented and isolated, and function without best practices for implementation and utilisation. A global community of organisations that manage IRS is needed to advance the science of learning from mistakes, announce safety warnings and take action to reduce risks and improve patient safety.

  • IRS can play a vital role in patient safety improvement if clear and realistic goals are set, the scope of IRS and the roles and responsibilities of IRS stakeholders are defined, and sufficient technical and financial resources to develop an infrastructure are provided.

  • Equally important to the success of IRS are effective and efficient methods to analyse and prioritise incident data, develop and implement mitigation interventions, evaluate the effectiveness of interventions to reduce future harm and disseminate effective interventions around the world.


The authors wish to thank CG Holzmueller, BLA, the Medical Writer/Editor for the Johns Hopkins University Quality and Safety Research Group, for her assistance in editing this manuscript.


Supplementary materials


  • Funding The WHO World Alliance for Patient Safety (WAPS) did not directly influence the meeting agenda, the drafting of the manuscript or the content of the manuscript.

  • Competing interests None.

  • Provenance and peer review Not commissioned; externally peer reviewed.

Linked Articles

  • Quality lines
    David P Stevens