Abstract
Background The Veterans Health Administration has had a comprehensive patient safety program since 1999 that includes conducting root cause analysis (RCA) of adverse medical events. Improving the quality and timeliness of the RCAs at the local level has been a continual challenge.
Methods We introduced a non-monetary award, the Cornerstone Award, into our patient safety reporting system to recognise facilities conducting high-quality and timely RCAs containing deterministic corrective actions that are implemented and evaluated for effectiveness.
Results Since the Cornerstone Program began in 2008, the per cent of RCAs completed in a time-critical manner (≤45 days) has increased from an average of 52% pre-Cornerstone to an average of 94% post-Cornerstone. The per cent of action plans with stronger deterministic actions and outcomes has increased from an average of 34% pre-Cornerstone to an average of 70% post-Cornerstone.
Discussion Implementing a non-monetary recognition award that was tied to specific improvement goals greatly improved the timeliness and quality of the RCA reports in the Veterans Health Administration System.
- Root cause analysis
- patient safety
- quality improvement
- compliance
The Veterans Health Administration (VA) is a national healthcare organisation comprising 153 medical centres and serving over 5.5 million veterans per year.1 In VA, all medical adverse events that cause harm to patients or have the potential to cause significant harm are investigated using root cause analysis (RCA). RCA is an effective method for examining the underlying causes of an adverse event such as a hospital-related death, surgical error or suicide. The focus of RCA is on the systemic and organisational factors that may have contributed to an adverse event, including environmental factors, breakdowns in communication, non-standardised processes for assessing or treating patients, training and fatigue.2 3 RCA is a central feature of a robust patient safety program.2 4 Originally developed in high-hazard industries, RCA has been mandated for investigation of all Sentinel Events in Joint Commission accredited hospitals since 1997.5
The VA approach to RCA, developed by the VA National Center for Patient Safety (NCPS), has been in use since 1999 and has been adopted by a number of foreign countries and facilities throughout the USA.6 NCPS was selected by the Agency for Healthcare Research and Quality to disseminate this RCA approach to all 50 states' departments of health. While RCAs have been used widely throughout the USA and the world to investigate healthcare-related adverse events for well over a decade, completing an RCA in VA in a timely and effective manner, with corrective actions implemented and evaluated, has been a challenge. In VA, an extensive effort has gone into developing a comprehensive RCA system to achieve these ends. This system encompasses a prioritisation method to determine which actual events or close calls require an RCA, Human Factors Engineering (HFE) tools to perform an RCA7 8 and a computerised national system for tracking the effectiveness of the actions resulting from the RCAs. The process is informed by principles of systems analysis and human factors analysis9–11; nevertheless, there remained extensive variability among the 153 VA medical centres in their ability to complete a high-quality RCA in a timely manner.
To improve the quality, timeliness and ultimate effectiveness of RCA reports nationally, VA tracks the time-to-completion of RCAs, the number of RCAs done per year, and the implementation and follow-up of corrective actions specified in the RCAs. These data have been provided to all VA staff on a website since 2004 and discussed regularly with the regional-based and facility-based patient safety officers and managers. Additionally, senior leadership at the national, regional and facility levels have been regularly apprised of these data. Individual facilities' data are compared openly with those of their peers, as well as with their own past performance and future expectations. In addition, the Chief Network Officer of the VA incorporated these performance results into quarterly reviews of individual facilities' performance. These reviews included top facility management, whose annual performance evaluations factored in the RCA performance results. During 2007, regional and facility patient safety professionals were organisationally aligned to report directly to top management, and in 2007/2008, NCPS provided several 1-day sessions on patient safety to VA regional and facility leadership. Despite this multipronged approach, overall performance did not markedly improve. Consequently, we decided to explore other approaches for implementing change.
To begin the new process, the Chief Patient Safety Officer in VA discussed directly with facility directors what they would like to see as an inducement for improving performance. Facility director feedback indicated that an award given strictly for recognition, with no monetary component, for satisfying RCA-related requirements would be helpful. Based on this feedback, we decided to institute a non-monetary award system that would not only recognise minimum performance but also differentiate it from exemplary performance, with the overall goal of improving patient safety. VA Patient Safety Officers and Managers throughout the country also supported the concept of a recognition program and indicated they thought it would advance the good work done for patient safety at facilities. The challenge was to create criteria that would be meaningful and would promote improvements in RCA timeliness, quality and the implementation and evaluation of corrective actions, while avoiding unintended consequences such as teams producing simpler and fewer RCAs.
The award itself was, by design, to be purely for recognition with no monetary or other material components. In this way, we hoped to reinforce the culture of safety where the responsibility of improving patient safety is fundamental to all employees' performance. We chose to call it the Cornerstone Award. This paper describes the development and implementation of the Cornerstone Program throughout the VA system.
Measuring Cornerstone impact
The impact of the Cornerstone Award was measured in two ways: timeliness and the presence within the action plan of a “strong string” (defined below). Timeliness is determined by the number of days required to complete an RCA, from the date the RCA was determined to be needed to the date the facility director formally accepted the RCA report and agreed to implement the recommended corrective actions. Timeliness is critical because the longer it takes for an RCA to be completed and its actions implemented, the longer patients are exposed to continued unmitigated hazard. In addition, each root cause/contributing factor found in the RCA analysis was checked to ensure it had at least one corrective action connected with it. Further, the strengths of corrective actions were assessed based on HFE principles. Past work has shown that stronger or intermediate actions, which are systems based or based on HFE principles, are more likely to be implemented, be effective and be sustained.12–15 To measure whether the RCA was well directed towards reducing the risk factors identified in the analysis, we developed the concept of the “strong string”: a causal statement coupled with an intermediate or stronger action, an associated quantifiable outcome measure and management concurrence. Without all of these elements an RCA may be ineffective: a well-written causal statement may be paired only with weaker actions; an action may lack a quantifiable outcome measure, making its effectiveness impossible to judge; or a potentially outstanding and innovative action may never be implemented because it lacked management concurrence. The label “strong string” was coined to capture the importance of linking these elements. Tables 1 and 2 show the bases for action strengths and how actions are evaluated through their outcome measures.15 Promoting the use of “strong strings” through Cornerstone was also supported by the NCPS database, which showed that, prior to Cornerstone, “strong strings” were associated with better or much better outcomes 72% of the time, compared with 53% for strings not considered strong (p<0.05). Outcomes of “much better” alone were reported for 25% of “strong strings” and only 15% of non-strong strings (p<0.05).
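As an illustration only, the “strong string” criteria can be expressed as a simple check over an RCA action record. The record fields, the three-level ActionStrength scale and the function names below are assumptions made for this sketch, not the NCPS database schema.

```python
# Illustrative sketch: field names and the ActionStrength ordering are
# assumptions for demonstration, not the actual NCPS data model.
from dataclasses import dataclass
from enum import IntEnum
from typing import List, Optional

class ActionStrength(IntEnum):
    WEAKER = 1
    INTERMEDIATE = 2
    STRONGER = 3

@dataclass
class ActionString:
    causal_statement: str            # root cause/contributing factor statement
    action_description: str          # corrective action tied to the causal statement
    strength: ActionStrength         # rated per HFE-based criteria (cf. table 1)
    outcome_measure: Optional[str]   # quantifiable measure of effectiveness (cf. table 2)
    management_concurrence: bool     # facility leadership agreed to implement the action

def is_strong_string(s: ActionString) -> bool:
    """A 'strong string' couples a causal statement with an intermediate or stronger
    action, a quantifiable outcome measure and management concurrence."""
    return (
        bool(s.causal_statement.strip())
        and s.strength >= ActionStrength.INTERMEDIATE
        and s.outcome_measure is not None
        and s.management_concurrence
    )

def rca_has_strong_string(strings: List[ActionString]) -> bool:
    """An RCA report counts towards the metric if at least one of its strings is strong."""
    return any(is_strong_string(s) for s in strings)
```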
Finally, but not least important, we assessed whether the actions specified in the RCA were actually implemented and evaluated for their effectiveness, referred to as “action-close-out”. A key part of the value of an RCA lies in follow-up: if actions are not implemented and their effectiveness evaluated, the rest of the RCA process is a waste of time. Facilities report how the actions affected the identified risks and score the effectiveness of each action on a five-level scale from much better to much worse. This also creates a feedback loop so that additional improvement efforts can be pursued as needed.
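Continuing the illustrative sketch above, the five-level effectiveness scale and an “action-close-out” summary might look like the following; the names and summary fields are again assumptions, not the reporting system's actual interface.

```python
# Continuation of the illustrative sketch; names and structures are assumptions.
from collections import Counter
from enum import IntEnum
from typing import Dict, Iterable, Optional

class Effectiveness(IntEnum):
    MUCH_WORSE = 1
    WORSE = 2
    NO_CHANGE = 3
    BETTER = 4
    MUCH_BETTER = 5

def close_out_summary(ratings: Iterable[Optional[Effectiveness]]) -> Dict[str, float]:
    """Summarise action-close-out: an action is closed out once it has been implemented
    and its effectiveness rated on the five-level scale (None = not yet rated)."""
    ratings = list(ratings)
    closed = [r for r in ratings if r is not None]
    counts = Counter(closed)
    return {
        "close_out_rate": len(closed) / len(ratings) if ratings else 0.0,
        "much_better_share": counts[Effectiveness.MUCH_BETTER] / len(closed) if closed else 0.0,
    }
```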
It should be noted that a year before the Cornerstone Program was implemented, VA established a minimum of eight RCAs (including those of the aggregate type) to be completed annually at each VA hospital.16 Each VA hospital is required to have an individual whose full-time focus is on patient safety. We instituted a minimum requirement because it had become evident that in some cases events were being under-reported. We could say this with virtual certainty because the VA system requires that an RCA be performed for those events, including close calls, that are rated as having a significant risk potential when scored according to the risk matrix defined by the SAC criteria.2 7 16 This requirement is incorporated into the Cornerstone criteria, as the minimum award level, to reinforce VA patient safety program requirements. The minimum requirement and the Cornerstone implementation raised concerns about possible unintended consequences: RCA counts might regress towards the Cornerstone-induced minimum, or too much focus on the “strong string” requirement might produce more simplistic RCAs with fewer root causes/contributing factors that, in total, do not fully address all the root causes/contributing factors warranted by the event. These issues are addressed further in the analysis.
The Cornerstone Program established three award levels: gold, silver and bronze, in decreasing order of merit (see table 3). At its kickoff, all regional and facility-level patient safety managers and management were educated about the Cornerstone Program. Training sessions and one-to-one consultations for individuals at VA facilities have been provided since the program began.
Statistical methods
Interrupted time series regression models17 are used to analyse the impact of Cornerstone on the metrics described above across quarterly time periods, beginning in 2001 and extending to the most current complete data period for a given metric (third or fourth quarter of 2010). A four-parameter model is used to estimate the data series' intercept, the pre-Cornerstone slope, the immediate change in level at the Cornerstone intervention and the change in slope after the Cornerstone intervention. Because this is a time-series analysis, the data are examined for autocorrelation via the Durbin–Watson statistic and, when it is significant (p<0.05), a Cochrane–Orcutt type differencing is performed on the data series using an estimate of the autocorrelation parameter. Once the autocorrelation is removed, regression parameters are re-estimated on the differenced data series. Regression diagnostics of the residuals are used to confirm that the assumption of constant variance of the model error terms is valid. Hypothesis tests for non-zero regression parameters are performed at a p=0.05 significance level.
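To make the model concrete, the following is a minimal sketch of the four-parameter interrupted time series fit described above, using Python and statsmodels. The input data, variable names and the crude Durbin–Watson threshold used here in place of a formal significance test are illustrative assumptions, not the authors' actual analysis code.

```python
# Minimal sketch of the four-parameter interrupted time series model; variable
# names, the input data and the Durbin-Watson threshold are illustrative.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

def fit_its(y, intervention_index):
    """Fit y = b0 + b1*time + b2*post + b3*time_since_intervention (quarterly data)."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y), dtype=float)
    post = (t >= intervention_index).astype(float)           # immediate change in level
    t_post = np.where(post == 1, t - intervention_index, 0)  # change in slope after intervention
    X = sm.add_constant(np.column_stack([t, post, t_post]))
    model = sm.OLS(y, X).fit()

    # Check residuals for first-order autocorrelation (DW near 2 means little).
    dw = durbin_watson(model.resid)
    if abs(dw - 2) > 0.5:  # crude proxy for the p<0.05 test described in the text
        # Cochrane-Orcutt type differencing with rho estimated from the residuals.
        rho = sm.OLS(model.resid[1:], model.resid[:-1]).fit().params[0]
        y_star = y[1:] - rho * y[:-1]
        X_star = X[1:] - rho * X[:-1]
        model = sm.OLS(y_star, X_star).fit()
    return model

# Example usage (hypothetical data): quarterly per cent of RCAs with a strong
# string, with the Cornerstone intervention at the 29th quarter (index 28).
# results = fit_its(pct_strong_string_by_quarter, intervention_index=28)
# print(results.summary())
```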
Results
Figure 1 displays the RCA count data from 2001 to the present. The plotted data and regression line visually suggest an increase in RCA count around the time of the Cornerstone intervention. Large variation among the data points prevents the model from revealing any statistically significant immediate change in level or slope in RCA count resulting from the Cornerstone intervention. An increase in RCA count was not specifically expected; more importantly, the concern that the intervention might produce an unintended decrease in RCA count is allayed.
Figure 2 displays the average number of root causes/contributing factors per RCA from 2001 to the present. The plotted data points and regression line show a statistically significant decrease in the average count across time. Most importantly, there is no statistical evidence from this analysis that the minimum RCA requirement in 2007, or the Cornerstone intervention in 2008, had any impact on the average count per RCA.
Figure 3 shows no statistically significant trend in average days to RCA completion in the pre-Cornerstone period. There is a statistically significant (p<0.05) immediate change in level at the Cornerstone intervention, with the model showing a decrease of 20 days. The change appears stable after the intervention, with no statistically significant difference between the pre-intervention and post-intervention slopes. Variation within the quarterly time periods is illustrated by the dotted bracketed lines stretching ±3 SEs around the quarterly averages. Also of note is the decrease post-Cornerstone in the variation around the average days to completion, as illustrated by the lengths of the dotted bracketed lines.
Figure 4 shows a statistically significant increasing trend (p<0.05) in the per cent of RCAs with strong strings across both the pre-Cornerstone and post-Cornerstone periods. At the Cornerstone intervention there was no significant immediate change in level, but the rate of increase rose significantly post-Cornerstone (p<0.05). Extending the pre-Cornerstone regression line in figure 4 suggests that, without the Cornerstone intervention, the current per cent of RCAs with at least one strong string would have taken more than twice as long to achieve. As noted above, this result also significantly affects the likelihood that RCAs completed after Cornerstone will have improved outcomes. The impact is illustrated in table 4, which lists the RCA report types comprising 80% of all RCAs received at NCPS, along with the percentage showing “much better” improvement from pre-Cornerstone to post-Cornerstone, as reported by facilities in their action close-out reports. Viewed across all RCA types, the proportion of actions resulting in “much better” outcomes increased from 19% pre-Cornerstone to 34% post-Cornerstone. The increase in “much better” outcomes supports the effect Cornerstone had on improving the quality of RCAs. Also shown in table 4 are the types of actions found most often to produce “much improved” outcomes. While we cannot yet draw definitive conclusions about whether Cornerstone changed the types of RCA reports submitted, no such change is apparent in the data or in the day-to-day observations of the NCPS analysts who review and categorise the reports.
Discussion
As evidenced by the above analysis, Cornerstone improved the timeliness and quality of RCAs to a greater degree than any previous effort. Since the Cornerstone Program began in 2008, the per cent of RCAs completed in a time-critical manner (≤45 days) increased from an average of 52% pre-Cornerstone to an average of 94% post-Cornerstone. The per cent of action plans with stronger deterministic actions and outcomes increased from an average of 34% pre-Cornerstone to an average of 70% post-Cornerstone.
It was very interesting to see that actions constituting a “standardisation of processes” were the top category of actions rated by the local patient safety managers as producing “much better” results. A standardisation usually results when the RCA team discovers that some part of a clinical process is conducted differently by different clinicians, leaving greater room for human error to adversely affect patient care. For example, a facility may have a policy that all abnormal results are communicated to the patient within a specific time frame, but the exact process for who communicates the results and how they are communicated (eg, letter, phone call, in person) may be left to the clinician. Standardising the final process adds a mechanism that enforces a regular, consistent and routine action beyond written policy/procedure, which in turn increases the likelihood that the most reliable method of communication is used by all clinicians. Other actions rated as leading to “much better” results included improving the communication process between clinicians and improving the computerised medical record through software upgrades.
Concerns about Cornerstone reducing RCA counts and encouraging more simplistic RCAs were not borne out in the analysis. The trend revealed in figure 2 towards fewer root causes/contributing factors per RCA may or may not be a concern, but it was not magnified by Cornerstone. One possible explanation for the observed trend is our continued teaching that RCAs should not be too ambitious. We would frequently caution RCA teams and their Patient Safety Manager advisors to be judicious and focused in their selection of topics for intervention and not try to “solve world hunger”. Our RCA training teaches that overly complex RCAs can be as ineffective as RCAs that are too simplistic, especially when their complexity inhibits implementation of the actions and follow-up.
A shortcoming of our analysis was the inability to compare the pre-Cornerstone and post-Cornerstone rates of RCA action implementation and effectiveness measurement, referred to by us as “action-close-out”. This was a result of a database design that recorded only when close-out was due, not when it was actually completed, so we were unable to separate pre-Cornerstone from post-Cornerstone “action-close-out”. A recent change to our database now allows us to measure the post-Cornerstone rate of “action-close-out”, which stands at 91%.
It is impossible to say with certainty that the Cornerstone Program was the only influence responsible for the dramatic increase in performance. However, with no additional financial incentives and no inducement other than the motivation to achieve the Cornerstone Award, it seems reasonable to conclude that the Cornerstone Program played a strong role in the improvement. The impact of positive recognition by way of an award demonstrates the strong value of positive reinforcement and recognition as a motivator to accomplish objectives.
Conclusion
Most organisations provide financial incentives as a way to achieve organisational performance goals. While these incentives certainly can have an impact, as shown here they are not the only available tool, nor the only one that can have an impact. Selection of appropriate criteria and positive reinforcement through recognition can be extremely effective, and possibly even more effective than financial incentives alone. This is not a new concept: it has been used routinely by the military in the form of medals and has been discussed in management texts,18 19 but it is often forgotten. Simply put, you catch more flies with honey than with vinegar, and the “honey” does not have to be money.
Footnotes
Funding This material is the result of work supported with resources and the use of facilities at the Department of Veterans Affairs National Center for Patient Safety at Ann Arbor, Michigan, and the Veterans Affairs Medical Centers, White River Junction. While NCPS completes the data analysis and sponsors the award program, the effort and recognition to improve RCAs resides with the staff at VA facilities and Veterans Integrated Service Networks (VISNs). It is through the accomplishments at each individual facility that the nationwide RCA program is enhanced and the VA is able to claim overall improvements. The Research and Development Committee, White River Junction, VA Medical Center approved this project and the Committee for the Protection of Human Subjects, Dartmouth College considered this project exempt. The views expressed in this article do not necessarily represent the views of the Department of Veterans Affairs or of the United States government.
Competing interests None to declare.
Ethics approval This study was conducted with the approval of the VAMC White River Junction Vermont R and D committee.
Provenance and peer review Not commissioned; externally peer reviewed.