Abstract
The notion that hospitals and medical practices should learn from failures, both their own and others’, has obvious appeal. Yet healthcare organisations that systematically and effectively learn from the failures that occur in the care delivery process, especially from small mistakes and problems rather than from consequential adverse events, are rare. This article explores pervasive barriers embedded in healthcare’s organisational systems that make shared or organisational learning from failure difficult and then recommends strategies for overcoming these barriers, emphasising the critical role of leadership. Firstly, leaders must create a compelling vision that motivates and communicates urgency for change; secondly, leaders must work to create an environment of psychological safety that fosters open reporting, active questioning, and frequent sharing of insights and concerns; and thirdly, case study research on one hospital’s organisational learning initiative suggests that leaders can empower and support team learning throughout their organisations as a way of identifying, analysing, and removing hazards that threaten patient safety.
- PSSC, Patient Safety Steering Committee
- leadership
- learning
- mistakes
- problem-solving
- teams
Highly publicised breakdowns in the healthcare system argue for the necessity of learning from failure so as to avoid recurrence. In the past few years, newspaper accounts of medical error have proliferated. Following the 1999 publication of the Institute of Medicine report detailing the prevalence of quality problems in the United States’ healthcare system, few within or outside the healthcare professions can remain unaware that many patients are harmed each year in hospitals.
Recognising the need to understand and learn from failures, physicians, managers, and regulators tend to advocate for investigative bodies to uncover and communicate causes and lessons from visible and often tragic healthcare failures, such as those seen in the press. In some cases, however, major investigations are of limited use for the goal of organisational learning from failure. Large or consequential failures typically have multiple causes that are deeply embedded in the organisations where the failures occurred, have been ignored or taken for granted for years, and are rarely simple to correct.1 Therefore, an important part of learning from failure in complex organisations such as hospitals is attention to small, everyday process failures, rather than only to sentinel events and formal investigations.2–8 Small failures are early warning signs which, if detected and addressed, may be the key to avoiding consequential failures in the future.9
Failure is defined as deviation from expected and desired results, to include both avoidable errors and unavoidable negative outcomes of experiments or uncertain actions.4 As such, failure encompasses both mistakes (human error) and problems (obstacles and other deviations that thwart expected work processes). Here, as elsewhere,4,7 I have defined failure broadly to include both large and small failures and to range from obvious technical error (for example wrong site surgery) to invisible breakdowns in communication (for example a nurse’s failure to challenge a physician’s questionable medication order). An organisation’s ability to learn from failure is measured by how it deals with both large and small failures, not just by how it handles major, highly visible crises or accidents.4,9
An example illustrates how organisational learning from small failures can work. Recently, at Kaiser Permanente, Kim Adcock, a physician, sought to better understand physicians’ failures in reviewing mammograms. Owing to inherent difficulties in reading mammograms accurately, a 10–15% error rate was expected, even among expert readers. Consequently, discovering that a reader has missed one or even several tumours does not necessarily say anything about that reader’s diagnostic ability and may not provide much incentive for learning from failure. However, extensive longitudinal data can reveal meaningful patterns. Departing from tradition, when Adcock became radiology chief at Kaiser Permanente, Colorado, he utilised longitudinal data from the health maintenance organisation’s records to proactively identify failure and produce detailed, systematic feedback, including bar charts and graphs, for each individual x ray reader.10,11 For the first time, each reader could learn whether he or she was falling near or outside of the acceptable range of errors. He also provided readers with the opportunity to return to the misread x rays to investigate why they missed a particular tumour and to learn not to make the same mistake again.
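The mechanics of this kind of feedback are straightforward to sketch. The following is a minimal illustration only, not Adcock’s actual system: it assumes hypothetical reading records and flags readers whose longitudinal miss rate falls outside the expected 10–15% band, withholding judgement on readers with too few confirmed cases, since, as noted above, a handful of misses says little about diagnostic ability.

```python
# Illustrative sketch only (not Adcock's actual system): compute each
# reader's longitudinal miss rate and compare it with the expected range.
from collections import defaultdict

EXPECTED_RANGE = (0.10, 0.15)  # expected miss rate, even for expert readers
MIN_CASES = 50                 # single misses are uninformative; require volume

def reader_feedback(readings):
    """readings: iterable of (reader_id, missed) pairs, where `missed` is
    True if a later confirmed tumour was present but not flagged."""
    totals = defaultdict(int)
    misses = defaultdict(int)
    for reader, missed in readings:
        totals[reader] += 1
        misses[reader] += int(missed)

    report = {}
    low, high = EXPECTED_RANGE
    for reader, n in totals.items():
        if n < MIN_CASES:
            report[reader] = "insufficient data for feedback"
            continue
        rate = misses[reader] / n
        if rate > high:
            status = "above expected range"
        elif rate < low:
            status = "below expected range"
        else:
            status = "within expected range"
        report[reader] = f"{rate:.1%} miss rate over {n} cases ({status})"
    return report
```

The point of the sketch is that the signal lives in the aggregate: only by pooling each reader’s outcomes over time does an individual miss become interpretable.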
The notion that hospitals and medical practices should learn from failures—small and large, their own and others’—has obvious appeal. Yet healthcare organisations that systematically and effectively learn from failures occurring in the care delivery process are rare. There are pervasive barriers embedded in healthcare’s organisational systems that make shared, or organisational, learning from failure difficult.
My central argument is that hospitals don’t learn from failure because of two interrelated organisational issues. Firstly, at the frontlines of patient care in hospitals the interpersonal climate often inhibits speaking up with questions, concerns, and challenges that might have contributed to catching and correcting human error before patients are harmed.2 Moreover, the culture of medicine more generally discourages admission of error, thereby greatly diminishing a given hospital’s potential to learn from mistakes, whether consequential or not. Secondly, features of the work design and culture of most hospitals make workarounds and quick fixes the dominant response to failures, rather than root cause analysis and systematic problem solving,7 which contribute to organisational improvement and innovation. This article also suggests that both causes of and solutions to these organisational learning failures ultimately lie in leadership.
UNCOVERING THE ROLE OF INTERPERSONAL CLIMATE IN LEARNING FROM FAILURE
In the early 1990s, driven by interest in how organisations learn, I joined a team of Harvard researchers studying the rate and type of medication errors in the hospital setting, focused on assessing the incidence of error in the delivery of drugs to hospitalised patients.12 My role in this larger study was intended to be a straightforward one: to assess the team properties of the nursing units in the study and to relate these measures to the error rates being independently measured by trained medical investigators. I expected to find a negative relationship between teamwork and error. Existing theory had suggested that better coordination among team members should reduce the rate of error.13–15 As part of my strategy for testing the presumed negative relationship between teamwork and error rates, I used previously validated measures of team properties,16 modifying the wording slightly, with help from my nurse and physician collaborators, to be meaningful in the hospital setting.
The central hypothesis that had motivated the collection of data on team attributes, namely that better teamwork would be associated with lower rates of preventable adverse drug events, was not supported. Instead, I stumbled into quite a different discovery. The statistical results obtained were the opposite of those predicted. Well led nursing teams with good relationships among unit members were apparently making more mistakes; at least, that was one interpretation of the significant correlation between teamwork and error rates, which ran in what I initially considered to be the wrong direction.
This presented a puzzle. Did better led nursing teams really make more mistakes? It was difficult to accept this conclusion. Why else might stronger teams have higher error rates? One possibility was that they were more experienced and thus given tougher patients. To test this, I controlled for the severity of patient illness. Not only did the essential result not change; the effect size became slightly stronger.2
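For readers less familiar with this kind of check, the sketch below shows the general form of the analysis. It is purely illustrative: the numbers are invented and this is not the study’s actual analysis. Regressing unit level error rates on the teamwork score while including patient severity as a covariate lets one ask whether the teamwork association survives once severity is held constant.

```python
# Illustrative only: does a teamwork-error association survive
# controlling for patient severity? All data here are invented.
import pandas as pd
import statsmodels.formula.api as smf

units = pd.DataFrame({
    # detected preventable drug errors per 1000 patient days
    "error_rate": [4.1, 6.3, 5.8, 2.2, 7.0, 3.5, 6.1, 5.2],
    # survey based team climate score (higher = better teamwork)
    "teamwork":   [3.9, 4.6, 4.4, 2.8, 4.9, 3.2, 4.5, 4.1],
    # mean patient acuity on the unit (higher = sicker patients)
    "severity":   [2.1, 2.4, 3.0, 2.0, 2.6, 2.2, 2.9, 2.5],
})

# If the teamwork coefficient remains positive and significant with
# severity in the model, sicker patients cannot explain the association.
model = smf.ols("error_rate ~ teamwork + severity", data=units).fit()
print(model.summary())
```

A teamwork coefficient that persists, or strengthens, once severity is in the model is precisely what makes the reporting-climate interpretation discussed next worth pursuing.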
Another possibility was that in well led teams a climate of openness made it easier to report and discuss error—compared with teams with poor relationships among colleagues or with punitive leaders. Perhaps the good teams don’t make more mistakes; they just report more. If better led teams were more willing to discuss errors than were other teams, they would also be in a better position to learn from errors and prevent their future occurrence. This capability is profoundly important in organisations that wish to engage in continuous improvement of work processes. When small failures are neither identified widely nor discussed and analysed, it is very difficult for larger failures to be prevented.7
Further, this interpretation of the data suggested that the larger error study might not be finding the definitive error rate, as it was intended to do, and that error might be systematically under-reported in those units in which the interpersonal climate was most oppressive. It was also possible that those units were the most vulnerable to adverse events, given the inherent interdependence of the medication delivery process. Drug administration in a modern hospital involves multiple hand offs in the journey from physician decision making through to the receipt of a medication by a patient. Bates and Leape17 have identified 10 points at which an error can occur (or be caught): (1) physician prescription, (2) initial delivery to a unit secretary who (3) transcribes the order, which then (4) must be picked up by a nurse who (5) verifies and transcribes again and (6) hands off to the pharmacist who (7) dispenses the medication and (8) sends it back to a nurse who (9) administers to a patient who (10) receives the drug.
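A back of the envelope calculation shows why a chain of this length matters; the per-step figure here is purely illustrative and not from the study. If each of the 10 points independently introduced an error with probability p = 0.01, the chance of at least one error entering the chain would be 1 − (1 − 0.01)^10 ≈ 9.6% per medication order. With so many hand offs, catching errors downstream, which depends on people feeling able to question one another, matters as much as avoiding them upstream.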
If people were uncomfortable speaking up with questions and challenges, patients might be more vulnerable. Certainly, these units would be less able to learn from those mistakes that did occur than units with more open climates.
Subsequent analyses supported my interpretation of the unexpected result. An independent researcher, blind to both the survey data and the error data, interviewed nurses and quietly observed the way things worked in each unit, spending a couple of days with each group. He found that the social environment was noticeably different across the units, in ways that were critically influenced by nurse managers, whose leadership styles varied widely. He identified behavioural patterns related to mistakes, as well as other ways the units differed from each other. At the end he rated each unit on willingness to speak up about difficult issues, especially medication errors. The original table (with an error identified by an observant reader of the 1996 article now corrected) is reproduced to show these patterns (see tables 1 and 2).18
Altogether, the results suggested that people at work tacitly assess the interpersonal climate in which they work and that these assessments profoundly affect behaviour such as the discussion and analysis of mistakes and problems, which is so integral to organisational learning.19
A key insight from this study, since replicated in other contexts,5,6,20 was the palpable difference across work groups (within the same strong organisational contexts) in shared beliefs about the social consequences of speaking up about sensitive topics like error. In some teams, people openly acknowledged errors and discussed ways to avoid their recurrence; in others, they kept their knowledge of an error to themselves. These beliefs about the interpersonal context could be characterised as tacit; they were automatic, taken for granted assessments of the “way things are around here”. They constitute a kind of organisational microculture—strong interpersonal climates that characterise a work group or microsystem of interdependent healthcare workers.21 In several studies, I have shown that these differences in climate across groups within organisations are statistically significant.2,6,20 Hospital cultures, in short, are patchwork quilts rather than uniform, smooth fabrics where learning culture, or what some have called patient safety culture, is concerned.22 This variation is primarily driven by local leadership behaviour, which in both overt and subtle ways shapes the climate for learning.2,23
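To make the statistical claim concrete, the sketch below shows one generic way such within-organisation variation can be tested; it is not the analysis from the cited studies, and the ratings are invented. A one way analysis of variance asks whether climate ratings differ more between units than within them.

```python
# Generic illustration: do work groups in one hospital differ in climate
# more than chance would allow? All ratings below are invented
# psychological safety scores (1-5 scale) from members of three units.
from scipy.stats import f_oneway

unit_a = [4.5, 4.2, 4.8, 4.4, 4.6]  # open climate
unit_b = [2.1, 2.5, 1.9, 2.4, 2.2]  # punitive climate
unit_c = [3.4, 3.1, 3.6, 3.0, 3.3]  # middling climate

f_stat, p_value = f_oneway(unit_a, unit_b, unit_c)
print(f"F = {f_stat:.1f}, p = {p_value:.4f}")
# A very small p value says the units' climates differ far more than
# sampling noise would produce: a patchwork quilt, not a uniform fabric.
```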
A key implication of the localness of psychological safety in hospitals is that senior managers often do not know which group has which culture, making it difficult to ascertain whether and when they are getting the true data on errors. Often, those groups that look worse (report more errors) are in fact better than those from whom no error data are forthcoming.2,24 The local nature of a learning culture presents a conundrum for the goal of organisational learning from failure.
These results matter for discussions of patient safety.25 Reporting is unlikely when a climate of fear dominates a given hospital unit or other relevant group of caregivers.24,26 Thus, to understand and promote patient safety—centrally a collective learning process—considerable attention must be given to the organisational and psychological issues that inhibit error reporting, rather than just to the technical and medical factors, such as sleep deprivation, lack of expertise, or loss of information at change of shift. This brings us to the broader issue of the design and flow of work processes in most hospitals, and ways in which this may inadvertently inhibit learning from failure.
WORK DESIGN AND PRODUCTION PRESSURE AS BARRIERS TO LEARNING FROM FAILURE
From a detailed ethnographic study of hospital nursing care processes in nine hospitals selected for an explicit emphasis on quality improvement, Anita Tucker, an operations management expert, and I found that the hospitals were not, in fact, learning from the problems and errors encountered by their workers, despite an increased emphasis on quality improvement and system causes of accidents in health care in general and in the hospitals we studied in particular. (This section draws on our collaborative work.)
Process failures in healthcare delivery
The observed process failures included both problems and errors, but the majority (86%) were problems—by definition far more readily observed than errors, such that this result should not be taken as a meaningful index of the true proportions. An error was defined as an unnecessary or incorrectly executed action that would have been avoided with appropriate distribution of pre-existing information. For example, a patient was prepared for colonoscopy at significant expense to the hospital and discomfort to the patient before the specialist reviewed her case, revealed that the patient was not an appropriate candidate for the procedure, and cancelled it. As noted above, an emphasis on errors that lead to severe consequences can obscure the subtler phenomenon of errors that take place within the care delivery process every day, such as an unnecessary pre-operative preparation, as well as innumerable errors that lack apparent negative consequences. Indeed, most errors are caught and corrected before patients are harmed; however, we found that they are rarely learned from.
Problems, the second type of failure, were defined as disruptions in a caregiver’s ability to execute a prescribed task, occurring because either something he or she needed was unavailable in the time, location, condition, or quantity desired, such that the task could not be executed as planned, or else something interfered with the designated task.7,8 Examples of problems included missing supplies, information, or medications. Unlike errors, the problems faced by healthcare employees had received little attention in the literature or press, but they present a valuable source of information about ways in which the system is not working.
Workers are aware of the problems they encounter; they are obvious, disruptive, and frustrating, preventing workers from smoothly continuing their tasks. In contrast, people are unaware of their own errors while making them. Moreover, discussion of problems encountered is less interpersonally threatening than discussion of errors committed or encountered.2,6 Tucker and I thus argued that problems constitute an important, largely untapped learning opportunity in most hospitals.7 Similarly, in ethnographic research, Orr found that Xerox repair technicians, armed with inadequate repair manuals, often kept knowledge of problems and fixes to themselves despite awareness of its relevance and value, because channels for communicating such knowledge were cumbersome.27
Initially, Tucker, as principal investigator in the ethnographic study, saw nothing unrealistic in expecting nurses to respond to problems in ways that would allow hospitals to learn from them. With substantial work experience as a manufacturing engineer before becoming an organisational researcher, she had successfully engaged plant operators in responding to problems using root cause analysis and other systematic problem solving techniques. Nurses have far greater levels of education and intrinsic motivation than the plant workers with whom Tucker worked and are also experienced and capable problem solvers, comfortable with discretionary decision making by virtue of the work they do.28–30 Moreover, nurses must continually evaluate what needs to be done and reprioritise their tasks accordingly as the situation changes throughout the workday. Finally, the sample for this study, deliberately selected to include only hospitals recommended for nursing excellence, further reinforced the expectation of seeing active root cause problem solving and organisational learning. This expectation stood in quite stark contrast to what we found.
To begin with, the incidence of process failures was close to one per nurse per hour, making these dedicated caregivers’ days a constant navigation through small (and sometimes large) obstacles simply to get the job done. This was quite unlike the manufacturing plants in which we had worked or studied, where problems tend to be infrequent and, with proper training and motivation, are considered worthy of serious attention. Given the stunning frequency of work process problems in hospitals, however, it is unsurprising that nurses were unable to track each of these many problems down to its root cause. Although a fuller description of the nature of the failures is beyond the scope of this article, most of them did qualify as system problems; that is, they were likely to have originated in locations other than where they showed up to disrupt the work process.
The vast majority of work process failures in this study elicited quick fixes and workarounds, or what quality improvement experts call first order problem solving, rather than system based learning or second order problem solving.31,32 First order problem solving removes the immediate obstacle to patient care (for example by getting the supplies or information needed to finish a task), but does nothing to alter the chances of problem recurrence. Worse, this response frequently creates problems elsewhere, such as when missing supplies are remedied by simply taking some from the next unit over.
Nurses took the quick fix route for the overwhelming majority of the failures observed (93%); although these responses allowed patient care to continue, neither the hospital nor the other employees or departments who may have contributed to the problem were able to learn from these small process failures. First order problem solving, natural human and organisational behaviour that it is, serves to keep communication of problems isolated so that they do not surface as collective learning opportunities. Although most of the problems observed were small, requiring only a few minutes to resolve, the cumulative impact of these workarounds can be substantial. In this study, on average 15% of the time of the 26 nurses observed was spent coping with system failures; over an eight hour shift, for example, that amounts to more than an hour per nurse per day. Not only is this a poor use of well paid professionals’ time, but the constant struggle against a tide of small annoying problems also takes a toll on the nurses over time, leading to frustration and burnout.
The study took a lenient approach to classifying a nurse’s response to a system failure as second order problem solving behaviour. Technically defined as occurring when a worker takes action to address underlying causes (in addition to the quick fix that enables the immediate task to be completed), second order problem solving involves serious effort to figure out what went wrong and why and, if possible, to do something about it. This almost invariably requires the cooperation of others, often from other departments or professions in the hospital. Because of this collaborative nature, nurses received credit for second order problem solving in this study if they brought a problem to the attention of anyone remotely in a position to do anything about it. None the less, only 7% of nurse responses met even these lenient criteria.
Second order problem solving can have positive consequences for workers as well as for the organisation. If a worker’s action is successful and the problem does not recur, he or she will not have to face similar obstacles in the future. Second order problem solving is thus a way that real change is achieved. The organisation can benefit from higher productivity, customer satisfaction (because service is not interrupted), and worker satisfaction (feelings of gratification for successfully overcoming an obstacle and providing good care to a patient nonetheless).
Barriers to second order problem solving and organisational learning
The nurses studied were extremely dedicated and capable, often possessing advanced degrees, and all had worked for more than three years on their unit. In this study, and more generally, a lack of organisational learning in hospitals cannot be attributed to a shortage of experience, motivation, or intelligence on the part of the workforce. Instead, subtler, even counterintuitive, factors are at work.7
Firstly, health care’s emphasis on individual vigilance encourages nurses and other healthcare professionals to take personal responsibility to solve problems as they arise. This creates barriers to organisational learning because it is considered a weakness to seek help and rude to bother other busy people to let them know something has gone wrong and that their group might have contributed to the problem! Ironically, however, these behavioural norms encourage independence at the expense of system learning. Nurses are allowed, and even encouraged, to resolve problems alone without having to consider the impact on the system. The chances of organisational improvement and change being catalysed through such efforts are slim.
Secondly, efficiency is seen as critical in the increasingly cost conscious world of health care. Nursing labour is expensive and constrained. Understandably, hospitals can ill afford to have nurses routinely working with slack resources. This staffing model leads to an organisational design in which workers do not have time to resolve the underlying causes of problems that arise in daily activities. Instead, nurses are barely able to keep up with their required responsibilities and are in essence forced to quickly patch problems so they can complete their immediate tasks. Thus, in this situation it is possible for an individual worker to be working non-stop while the content of the work adds little value to the customer’s experience because of the amount of rework and unnecessary steps. Nurses, sadly, have little time to ask themselves whether a stitch in time might save nine—or the hassle of multiple future workarounds.
Thirdly, empowerment of workers is often seen as a solution for quality and productivity problems.33 When empowerment means the removal of managers from daily work activities, however, nurses and other healthcare professionals are on their own to resolve problems that may stem from parts of the organisation over which they have no control and with which they have only limited interaction. Reducing the degree to which managers are available to frontline staff can thus be a loss for improvement efforts, especially when frontline staff are already overburdened by existing duties. Managers tend to have a broader perspective than frontline workers, possess the status necessary to resolve problems that span organisational boundaries, and are capable of implementing solutions on a wider basis. This is not to say that nurses are incapable of engaging in such activities, but rather that the immediate nature of their duties precludes them from spending large amounts of time away from patient care. Without a readily available nurse manager, they are left without anyone to assist in making these connections.
This detailed field research provided a realistic portrait of work on the frontlines of care delivery. Motivated, dedicated, caring people who want to help their organisations learn from the small and large process failures they experience every day find themselves unable to do so, with one hand tied behind their backs by organisational norms and production pressures.
WHAT CAN BE DONE? LEADING ORGANISATIONAL LEARNING
This section offers recommendations for how leaders in health care—both at the top of organisations and at the frontlines of patient care—can inspire and encourage organisational learning from failure. Organisational learning is a journey: circuitous, uncertain, and characterised by trial and error. Fuel for the journey is found in three interrelated factors: a compelling vision of the destination; a learning environment or culture; and a team based learning infrastructure.24 All are centrally dependent on leadership.
A compelling vision
A compelling vision engages people in the effortful process of departing from their routine.34 For example, Julie Morath, chief operating officer (COO) of Children’s Hospitals and Clinics, Minneapolis, put forward this vision for patient safety:
“The culture of health care must be one of everyone working together to understand safety, identify risks, and report them without fear of blame. We must look at ways of changing the whole system when we manage to zero defects.”35,36
The result of this compelling image of a desired future state is shared purpose.
Interestingly, this vision was not imposed on the hospital. Instead, Morath organically generated commitment to improving patient safety—despite an initial, deeply held reluctance on the part of employees to believe or accept that medical errors might be a problem. Rather than seeking to convince people forcefully that she was right, Morath invited employees to reflect on their own experiences in the previous week. “Was everything as safe as it could have been,” she inquired gently, “in your unit this week? For your patients?” Quietly, nurses, physicians, and pharmacists started showing up in her office asking what they could do.37
A learning environment
A learning environment, as described above, is one in which people feel comfortable and capable of speaking up with interpersonally difficult observations and questions. This environment is not, however, created by top down mandate, but rather is created locally, one clinical area or patient care unit at a time.5,38 To help this local climate of psychological safety come about, Morath instituted a formal blame free reporting policy and also worked hard to change the language used in the organisation, from threatening terms such as errors and investigations to more psychologically palatable and productive terms such as accidents and analysis.
A learning environment is also one in which bearers of bad news are embraced rather than shunned. Although most managers and physicians prefer not to hear bad news, they and others who influence an organisation’s culture must learn to value its learning content. Tucker and I proposed the chart shown in table 3 as a partly tongue in cheek portrayal of the ideal employee in a learning organisation. The adaptive conformer in table 3 is inspired by the capable and dedicated nurses we observed, and, when managers are honest, they will admit that the adaptive conformer is the employee they prefer to have on staff. Without the disruptive questioner, however, organisations cannot learn.
Paradoxically at first glance, a critical aspect of Morath’s attempt to create a culture of psychological safety was being clear about what constituted punishable misconduct. If people do not know where the boundaries are, they do not feel psychologically safe. We all know that there is a line we cannot cross, but in many organisations we don’t know where that line is. To help staff at the hospital feel free to report error, therefore, Morath also publicised what counted as punishable misconduct, including such obvious elements as reckless behaviour or the use of alcohol, and such subtle elements as knowingly working beyond one’s boundaries; in other words, failing to ask for help when one is uncertain is itself unacceptable behaviour.
A team based learning infrastructure
Finally, organisational learning from failure occurs through a team based learning infrastructure. Organisational learning is the cumulative product of the learning of small groups or teams.5 Often, this starts with a change leadership team, with appropriate cross sectional and cross level representation, to ensure buy in and relevance of the organisational learning efforts.
Equally important, teams throughout the organisation must engage in local learning processes. Specific activities through which these local entities learn include identifying and analysing potential hazards as well as trying new actions and reflecting on the results.39 Like all cooperative activity, learning in teams involves “the wilful contribution of personal effort to the completion of interdependent tasks”.40 As this observation suggests, collective learning generally must be inspired and organised. That is, it must be led by dedicated, learning oriented, frontline leaders such as nurse managers, attending physicians, and even senior residents and fellows.
At Children’s Hospital, Morath developed a Patient Safety Steering Committee (PSSC). Not only was the PSSC proactive in seeking to identify failures, it ensured that all failures were subject to analysis so that learning could take place. For example, the PSSC determined that focused event studies would be conducted not only after serious medical accidents but after much smaller scale errors or near misses. Focused event studies were forums designed explicitly for the purpose of learning from mistakes by probing deeply into their causes.
Next, team based learning at the frontlines happened spontaneously in the form of safety action teams that sprang up in oncology and elsewhere and were then encouraged and supported by the organisation, enabling other teams to follow the example.35 One clinical group developed what they called a good catch log to record information that might be useful in better understanding and reducing medical errors. Other teams in the hospital quickly followed their example, finding the idea compelling and practical. Although leadership at the top is not sufficient to ensure organisational learning, this case does suggest that it is essential. If a climate of learning is not supported at the top, local team efforts can be subject to the “Cinderella” syndrome, in which other groups are jealous and seek to undo the good work carried out by the high performing team (RM Westrum, personal communication, April 2004).
Key messages
- Process failures (errors and problems that occur in care delivery processes) present learning opportunities for healthcare organisations.
- Process failures in hospitals have systemic causes, often originating in different groups or departments from where the failure is experienced, and so learning from them requires cross departmental communication and collaboration.
- Production pressures, organisational structure, and the culture of healthcare support quick fixes to problems, which hide information and allow the underlying causes to persist.
- Employee psychological safety allows collaborative problem solving and organisational learning.
- Leadership is essential for creating a climate characterised by psychological safety and for promoting collaborative problem solving focused on patient safety.
Similarly, it was not high level management support that allowed successful implementation of a new technology for minimally invasive cardiac surgery, but instead the way the surgeons fostered an atmosphere of learning, including acknowledgment of doubt, encouragement of communication, and real time team learning.34 Any experiment carries the risk of failure, but every failure is a learning opportunity, especially small failures.9 Local leaders are critical for encouraging and supporting local reflection and communication of lessons learned from failure.
CONCLUSION
This article describes two powerful organisational factors that inhibit collective, shared, systematic learning from failure in health care. Organisational cultures lacking psychological safety for speaking up about ambiguous, small issues of potential concern (as opposed to large issues of obvious concern) and an overarching work design that emphasises production pressure and worker independence inhibit organisational learning from failure. Yet healthcare organisations can undertake a learning journey to begin to shift some of these deep rooted barriers.
An organisation learns when its teams learn. This learning process often, but not always, starts at the top with a compelling vision and the creation of a change or learning leadership team, followed by the encouragement and support of local learning initiatives. Essential to this process is a learning environment characterised by psychological safety.
The Children’s Hospital example illustrates one such journey, and clearly, despite its compelling features, the journey is not an overnight success. To a manager seeking to get the job done, this process of enrolling employees in a learning journey might at first seem laborious and slow. This effort to engage people as active thinkers and learners, however, pays off. At Children’s Hospital, employees’ attention and interest turned quickly into willingness to listen, which later took shape in the form of independent initiatives at the frontlines of patient care.
Learning from failure in healthcare requires substantial effort to create a foundation for new beliefs and behaviours throughout the organisation, particularly where patients are being treated. Efforts to establish an environment of psychological safety are critical and must be ongoing and tireless. Implementation of a process of organisational learning from failure also includes providing support and guidance for early and continued learning activities and helping to spread the new ideas and practices that emerge out of the work of the dedicated professionals on the frontlines of patient care.