Abstract
Background For patients with critical laboratory abnormalities, timely clinical alerts with decision support could improve management and reduce adverse events.
Methods The authors developed a real-time clinical alerting system for critical laboratory abnormalities. The system sent alerts to physicians as text messages to a smartphone or alphanumeric pager. Decision support was available via smartphone or hospital intranet. The authors evaluated the system in a prospective controlled stepped-wedge study with blinded outcome assessment in general internal medicine units at two academic hospitals. The outcomes were the proportion of potential clinical actions that were actually completed in response to the alert, and adverse events (worsening of condition or complications related to treatment of the condition).
Results The authors evaluated 498 laboratory conditions on 271 patients. Overall, only 50% of potential clinical actions were carried out, and there were adverse clinical events within 48 h for 36% of the laboratory conditions. The median (IQR) proportion of potential clinical actions that were actually completed was 50% (33–75%) with the alerting system on and 50% (33–100%) with the alerting system off (p=0.94, Wilcoxon rank sum test). When the alerting system was on (n=164 alerts) there were 67 adverse events within 48 h of the alerts (42%). When the alerting system was off (n=334 alerts), there were 112 adverse events within 48 h (33%; difference: 9% higher with alerting system on, p=0.06).
Conclusions The provision of real-time clinical alerts and decision support for critical laboratory abnormalities did not improve clinical management or decrease adverse events.
- Decision support systems, clinical
- Therapy, computer assisted
- Designs, experimental
- Decision-making
- Effectiveness
- Information technology
Background
Gaps in the management of patients with serious laboratory abnormalities are common.1–5 Physicians' lack of timely information to support their clinical management contributes to adverse events.6 Automated systems that communicate critical laboratory values may reduce the time to treatment.7 8 There are no published evaluations of the combined delivery of critical alerts and decision support to physicians' pagers or smartphones.
We previously developed a system for automatically sending critical laboratory values to physicians' pagers or smartphones, which led to a statistically non-significant reduction in response time from 38 to 15 min.9 We observed substantial variation and numerous gaps in the clinical response to the laboratory abnormalities in that study. We hypothesised that providing both an automated alert and relevant decision support to physicians would improve clinical care and reduce adverse events.10 11 Therefore, our objectives were to link the automated critical laboratory alerts with decision support, deliver this decision support to the clinician responsible for the patient at the point of care and evaluate the impact of this system on clinical management and adverse events.
Methods
Setting
We conducted this controlled stepped-wedge study on the general internal medicine inpatient teaching units at University Health Network (UHN) and Sunnybrook Health Sciences Centre (SHSC) from 7 January to 30 June 2008. These teaching units comprise four medical teams. Each medical team includes a staff physician, one senior resident, and two or three junior residents. All patients were considered for the study unless they were receiving exclusively palliative care.
At UHN, Misys CPR (now acquired by QuadraMed12) was the electronic patient record system, while Misys Insight13 (based on Sunquest Clinical Event Manager) served as the automated real-time alerting system. At SHSC, the corresponding systems were Oacis EHR Solution14 and New Age Systems.15 Teams at both hospitals developed common alert rules and decision-support guidelines.
Development of alerting system and decision support
Each alert contained information about the specific abnormality and patient, as well as a URL to a web page with decision support specific to that alert (figure 1). During normal working hours, alerts were sent to an internet-enabled smartphone held by the senior resident. The senior resident could review the decision support by clicking on the web link, and forward the alert to the appropriate junior resident. At UHN, junior residents carried smartphones and could review alerts and decision support on their device. At SHSC, junior residents carried alphanumeric pagers, and could review the decision support at any hospital computer workstation (figure 1).
During weekends and nights, there were differences in process between study sites. At UHN, the senior resident handed their smartphone to the on-call physician, usually a junior resident from another team. The on-call physician would receive the alerts and access decision support in the usual way. At SHSC, the senior residents did not hand over their smartphones to the on-call physician. Therefore, on-call physicians learnt of critical laboratory abnormalities in the usual fashion by a telephone call from the lab and could access decision support from a hospital desktop as described. At SHSC, the senior resident was on call one night out of four, and would receive the alerts on their smartphone on those occasions.
Alerting conditions and decision support
We developed alerting rules for six common critical laboratory values: low serum potassium (3 mEq/l or less), low serum sodium (125 mEq/l or less), high serum sodium (160 mEq/l or more), high international normalised ratio (INR, five or more), low haemoglobin (70 g/l or less) and high serum potassium (6 mEq/l or more), based on our prior study.9 We also developed an alert for patients with low serum potassium (3.0–3.5 mEq/l) with an active drug order for a potassium-losing diuretic (furosemide, metolazone, hydrochlorothiazide or indapamide) (online appendix).
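The alerting rules above amount to simple threshold checks. The following Python sketch illustrates the logic using the thresholds reported in the text; the function name, data structures and alert labels are illustrative assumptions, not the study's actual alerting software.

```python
# Threshold checks for the six critical values plus the diuretic rule.
# Thresholds are taken from the text; names and structure are
# illustrative assumptions only.

POTASSIUM_LOSING_DIURETICS = {
    "furosemide", "metolazone", "hydrochlorothiazide", "indapamide",
}

def check_alerts(labs, active_drugs=()):
    """Return the alert labels triggered by a dict of lab results.

    labs: e.g. {"K": 2.9, "Na": 138, "INR": 1.1, "Hb": 120}
          (K and Na in mEq/l, Hb in g/l).
    active_drugs: names of active drug orders, in lower case.
    """
    alerts = []
    k, na = labs.get("K"), labs.get("Na")
    inr, hb = labs.get("INR"), labs.get("Hb")

    if k is not None:
        if k <= 3.0:
            alerts.append("critical low potassium")
        elif k <= 3.5 and POTASSIUM_LOSING_DIURETICS & set(active_drugs):
            # 3.0-3.5 mEq/l only alerts with an active potassium-losing diuretic
            alerts.append("low potassium with potassium-losing diuretic")
        if k >= 6.0:
            alerts.append("critical high potassium")
    if na is not None:
        if na <= 125:
            alerts.append("critical low sodium")
        if na >= 160:
            alerts.append("critical high sodium")
    if inr is not None and inr >= 5.0:
        alerts.append("critical high INR")
    if hb is not None and hb <= 70:
        alerts.append("critical low haemoglobin")
    return alerts
```

For example, a potassium of 3.3 mEq/l triggers an alert only when a potassium-losing diuretic is on the active drug list, mirroring the rule described above.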
Study physicians drafted decision-support recommendations after reviewing the available literature and existing hospital policies. Local specialists reviewed the recommendations. The final decision-support content is provided in the online appendix.
Usual clinical care
The standard practice of a telephone call from the lab to the hospital ward for all critical laboratory values continued throughout the study period at both study sites. After this telephone call, ward staff located the responsible physician. This telephone process applied to all alerting conditions except low serum potassium (3.0–3.5 mEq/l) with an active order for a potassium-losing diuretic, which was not considered a critical value at the time of the study.
Evaluation of real-time clinical alerting with decision support
Participating resident physicians were unaware of the study design or measures. The first study period at SHSC, from 7 January to 2 March 2008, served as a control period. We then implemented the alerting system for physician teams using a stepped-wedge method at SHSC from 3 March to 5 May 2008 (intervention period 1), and at UHN from 5 May to 30 June 2008 (intervention period 2).16 The order of implementation to physician teams at the two hospitals was randomly determined (figure 2).
Data collection
The research nurse reviewed the admission lists for the general internal medicine service, monitored laboratory results for every patient during the entire study period, and received alerts through email from the alerting system. She confirmed the laboratory abnormality, then reviewed the electronic and paper chart for demographics and comorbidities, potential clinical actions, completed clinical actions and adverse events. The research nurse was unaware of the study design and was blinded to the on/off status of the automated alerting system.
Main outcome measures
In advance, we defined clinical actions (processes of care) that could be carried out in response to each alerting condition. For each alert, we determined how many of these actions could have been carried out. For example, we defined four clinical actions in response to critical low serum potassium. However, for a particular patient with critical low serum potassium, two actions may have already been implemented prior to the alert, leaving two potential clinical actions that could be taken in response to the alert. We then determined how many potential clinical actions were actually completed for each alert. Finally, we calculated a completion ratio (potential clinical actions completed/potential clinical actions) for each alert (tables 1, 2).
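The completion-ratio calculation described above can be sketched as follows; the action names in the example are hypothetical illustrations, not the study's actual predefined clinical actions.

```python
# Completion ratio = potential clinical actions completed / potential
# clinical actions, where potential actions are those not already
# implemented before the alert. Action names below are hypothetical.

def completion_ratio(defined_actions, done_before_alert, done_after_alert):
    """Compute the completion ratio for one alert.

    defined_actions: all clinical actions defined for the alerting condition
    done_before_alert: actions already implemented before the alert fired
    done_after_alert: actions completed in response to the alert
    """
    potential = set(defined_actions) - set(done_before_alert)
    if not potential:
        # No actions could be taken in response to the alert
        # (such alerts were excluded from the analysis).
        return None
    completed = set(done_after_alert) & potential
    return len(completed) / len(potential)

# Four actions defined for critical low potassium; two already done
# before the alert, one of the remaining two completed afterwards.
actions = ["repeat K", "order K supplement", "review diuretics", "ECG"]
ratio = completion_ratio(actions, ["repeat K", "ECG"], ["order K supplement"])
# ratio == 0.5
```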
In advance, we specified adverse events for each alerting condition. These events reflect worsening of the alerting condition, or complications related to the treatment of the alerting condition. All data were independently reviewed by two members of the research team, with disagreements resolved by discussion. All reviewers were blinded to the on/off status of the automated alerting system. We are missing adverse event data for one patient.
Ethics
The research ethics board at each study hospital approved the study.
Analysis
We summarised continuous data using the mean (SD) or median (IQR). We used χ2 tests (for categorical variables) and t tests and Wilcoxon rank sum tests (for normally and non-normally distributed continuous data, respectively) to compare characteristics of patients and alerting conditions when the alerting systems were on and off.
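The Wilcoxon rank sum comparison of continuous data between the on and off periods can be sketched in a self-contained form as follows. The data shown are invented illustrative values, not study data, and the normal approximation (without a tie correction) is a simplification of the exact test.

```python
# Self-contained Wilcoxon rank sum test sketch (normal approximation,
# midranks for ties, no tie correction). Illustrative only.
from statistics import NormalDist

def midranks(values):
    """Assign 1-based ranks, averaging ranks within tied groups."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # midrank for the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def rank_sum_test(a, b):
    """Two-sided Wilcoxon rank sum test via the normal approximation."""
    n1, n2 = len(a), len(b)
    ranks = midranks(list(a) + list(b))
    w = sum(ranks[:n1])                     # rank sum of group a
    mean = n1 * (n1 + n2 + 1) / 2
    var = n1 * n2 * (n1 + n2 + 1) / 12      # ignores tie correction
    z = (w - mean) / var ** 0.5
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p

# Hypothetical completion ratios with the alerting system on vs off.
ratios_on = [0.50, 0.33, 0.75, 0.50, 1.00, 0.25, 0.50]
ratios_off = [0.50, 0.33, 1.00, 0.67, 0.50, 0.75, 0.00]
z, p = rank_sum_test(ratios_on, ratios_off)
```

In practice a statistical package (the study used SAS and R) would apply the exact distribution and tie correction, but the rank-sum logic is as above.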
Our primary outcome measure was the completion ratio (potential clinical actions completed/potential clinical actions); our secondary outcome measure was adverse events. We evaluated the impact of the alerting system with a linear mixed model for completion ratio, and with a generalised linear mixed model for adverse events. Each model included the physician team and the patient. These models are necessary to account for clustering of data, because the intervention is aimed at the physician team, whereas the primary outcome is assessed at the level of a critical laboratory abnormality for an individual patient. We conducted analyses with and without the data from the baseline period at Sunnybrook. The results were similar, so we chose to report the analysis that includes data from the baseline period. We evaluated interobserver reliability of the completion ratios by calculating intraclass correlation coefficients.
We used a two-sided p value of 0.05 as the threshold for statistical significance. We used SAS version 9.1 (SAS Institute, Cary, North Carolina) and R version 2.7.2 for all analyses.
Results
Characteristics of alerts and patients
There were 1731 admissions to the study units during the evaluation period (1203 at Sunnybrook, and 528 at UHN), for which there were 577 alerting conditions. We excluded 79 alerting conditions because of missing data on the process of care (29); duplicated laboratory values and alerts (27); erroneous laboratory values (11); patient determined not to be under care of a study general medicine team (6); patient receiving exclusively palliative care (5); or no clinical actions could be taken in response to the alert (1). We are missing adverse event data for one alert (tables 3, 4).
There were 498 alerting conditions on 271 patients remaining for analysis (table 3). The most common alerts were critical low serum potassium (53%) and critical low haemoglobin (13%). The distribution of alerting conditions was similar when the alerting system was on or off. Patients were predominantly female (56%), with a mean age of 73, a mean Charlson comorbidity index of 4.4 and a median hospital length of stay of 16 days (table 4). Overall, 13% of patients died in hospital. Patients triggering alerts when the alerting system was on had a higher mean Charlson comorbidity index (5.5 vs 3.8, p<0.0001).
Clinical actions and adverse events
Overall, the completion ratio was 50%, meaning that only half of clinical actions that could have been taken were actually taken. There were adverse events within 48 h for 36% of the alerting conditions. The unadjusted median (IQR) completion ratios were 50% (33–75%) with the alerting system on and 50% (33–100%) with the alerting system off (p=0.94, Wilcoxon rank sum test). The effect of the alerting system was not significant in the linear mixed model that accounted for clustering of data (OR 1.1, 95% CI 0.82 to 1.51, p=0.45). The interobserver reliability for completion ratios was good to excellent, with intraclass correlation coefficients ranging from 0.51 (for high serum sodium) to 0.88 (for low serum potassium) (table 5).
When the alerting system was off (n=334 alerts), there were 111 adverse events within 48 h of the alerts (33% of alerts). When the alerting system was on (n=164 alerts), there were 68 adverse events within 48 h of the alerts (42% of alerts) (difference 9% higher with alerts on, p=0.06). The effect of the alerting system was not significant in the generalised linear mixed model that accounted for clustering of the data (OR 1.9, 95% CI 0.78 to 4.82, p=0.15).
In a post-hoc analysis, the completion ratios and adverse events appeared similar for each alert. We did not identify a qualitative positive impact of the alerting system on the completion ratios for any of the alerts. We did not attempt any statistical subgroup analyses given the small sample sizes for many of the subgroups.
Discussion
Our major findings are that only 50% of potential clinical actions in response to a serious laboratory abnormality were actually completed, and that adverse events (worsening of condition or complication of its treatment) occurred after 36% of these abnormalities. We also developed and successfully implemented a real-time alerting system and simultaneous decision support, but found that the system had no impact on completion of potential clinical actions or adverse events.
There are several reports of systems that send critical laboratory data without decision support to physician pagers or personal digital assistants,17 18 but no reported evaluations of such alerts bundled with decision support. In our previous study of automated paging of critical laboratory values without decision support, the time to treatment was reduced non-significantly, but the impact on clinical care was not evaluated.9 Prior studies of laboratory abnormalities in acute hospital settings have found variable rates of adherence (40–76%)1–5 to clinical protocols, but few have explicitly sought evidence of a worsening clinical condition or complications of treatment. In one study of automated paging of critical laboratory values, the adverse event rate was 25%, but the critical values and the adverse event definitions differed significantly from our study.7
We believe that the lack of effect of our alerting system and decision support is primarily explained by four factors. First, the additional benefit of reminder systems is likely smaller than generally appreciated. A recent Cochrane review of 28 studies found that computerised point-of-care reminders achieved a median absolute improvement in processes of care of about 4%.19 Second, our alerting system was not fully integrated into clinician workflow. It could not reliably deliver an alert to a responsible junior resident on nights and weekends, and residents at one hospital who used text pagers needed a desktop computer to review decision support. Third, some residents commented that the usability of the alerts would be improved if the link to the decision support were placed at the top of the email message to obviate the need for scrolling. Finally, some residents felt that they did not need to review decision support for abnormalities where they were already confident in management, such as low serum potassium, although our results suggest that this confidence is misplaced. Further study of the role of overconfidence is warranted.
Our difficulty in seamlessly integrating advanced clinical decision support is not unique. Few hospitals in either the USA or Canada presently meet advanced clinical-decision-support criteria as defined by Healthcare Information and Management Systems Society (HIMSS) Stage 4 or higher.20 Since this evaluation was completed, we have made several improvements to the alerting system, including a fully integrated schedule that routes the alert to the responsible resident at all times.21 We will deploy smartphones to all residents, to allow seamless access to decision support at the time the alert is received. All these enhancements are elements of an ideal critical laboratory notification system.22
Our project had several strengths. We conducted the study at two sites using two different electronic patient record systems. We developed and applied a case-review methodology that identified potential and completed clinical actions. The case-review methodology had good to excellent inter-rater reliability and will be useful for future studies. We had a very high rate of follow-up for the study outcomes. Finally, we used an outcomes assessor blinded to the status of the alerting system.
Our study had several important limitations. The alerting system requires enhancements to optimally integrate into clinician workflow, and should be re-evaluated after these enhancements. Our sample size was not large enough to detect smaller, but potentially clinically important effects of the alerting system. Finally, we developed our definitions for processes of care and adverse events based on a local expert review, existing policies and prior similar studies, but other clinicians may not agree with our defined clinical actions.
In summary, we developed and implemented a real-time clinical alerting system with decision support. We found that this system had no effect on process of care or adverse events. The low rates of appropriate completed clinical actions and high rates of adverse events justify ongoing efforts to improve care of patients with critical laboratory abnormalities. Further research should focus on improving integration of the alerting system into clinician workflow.
Acknowledgments
This manuscript is dedicated to our late co-investigator, W Sibbald. We thank N Nikaido for her support with data collection. In-kind support for the project was provided by Sunnybrook Health Sciences Centre, the University Health Network and New Age Systems.
References
Supplementary materials
Web Only Data bmjqs.2010.051110
Footnotes
Funding This study was supported by a research grant from the Canadian Patient Safety Institute. In-kind support for the project was provided by Sunnybrook Health Sciences Centre and the University Health Network.
Competing interests None.
Ethics approval Ethics approval was provided by the research ethics boards of Sunnybrook Health Sciences Centre and University Health Network.
Provenance and peer review Not commissioned; externally peer reviewed.