Abstract
Objective Reference tests, also known as send-out tests, are commonly ordered laboratory tests with variable costs and turn-around times. We aim to examine the effects of displaying reference laboratory costs and turn-around times during computerised physician order entry (CPOE) on inpatient physician ordering behaviour.
Design We conducted a prospective observational study at a tertiary care hospital involving inpatient attending physicians and residents. Physician ordering behaviour was prospectively observed between September 2010 and December 2012. An intervention was implemented to display cost and turn-around time for reference tests within our CPOE. We examined changes in the mean number of monthly physician orders per inpatient day at risk, the mean cost per order, and the average turn-around time per order.
Results After our intervention, the mean number of monthly physician orders per inpatient day at risk decreased by 26% (51 vs 38, p<0.0001) with a decrease in mean cost per order (US$146.50 vs US$134.20, p=0.0004). There were no significant differences in mean turn-around time per order (5.6 vs 5.7 days, p=0.057). A stratified analysis of both cost and turn-around time showed significant decreases in physician ordering. The intervention projected a mean annual savings of US$330 439. Reference test cost and turn-around time variables were poorly correlated (r=0.2). These findings occurred in the setting of non-significant change to physician ordering in a control cohort of non-reference laboratory tests.
Conclusions Display of reference laboratory cost and turn-around time data during real-time ordering may result in significant decreases in ordering of reference laboratory tests with subsequent cost savings.
- Hospital medicine
- Cost-effectiveness
- Laboratory medicine
- Quality improvement
- Decision support, computerized
Introduction
Rising healthcare expenditures in the USA over the past two decades have highlighted a need for physicians to provide quality patient care while being mindful of the high costs associated with healthcare delivery. The development of new strategies for healthcare cost containment that eliminate wasteful spending is therefore crucial.1 One particular area of interest is the appropriateness of laboratory test ordering.2 Laboratory tests can be instrumental in diagnosis, monitoring, screening, prognosis and confirmation of a clinical suspicion, but unfortunately many physicians order laboratory tests inappropriately. Reasons may include hospital or legal requirements, defensive medicine or medicolegal protection, academic curiosity, pressure from the patient or family, and/or personal reassurance.3–5
To rectify this problem, interventions such as physician education and cost feedback have been proposed. Physician education, initially thought of as a possible mechanism to decrease inappropriate laboratory usage, has yielded mixed results. Some studies show that it is insufficient as a single intervention to change physician ordering behaviour,6 while others report that it may be an avenue to curb inappropriate ordering,7–9 especially when combined with additional interventions, such as restricting available emergency laboratory tests and the frequency of repeated orders.10 In addition to these interventions, awareness of the cost of laboratory tests at the time of clinician ordering has shown promise in its ability to decrease physician ordering without compromising the quality of patient care.11–13
One proposed intervention to decrease inappropriate laboratory usage and cost is the computerised physician order entry (CPOE) system, an electronic system that allows direct provider input of diagnostic testing or medication orders. Adoption of these systems may offer many institutional benefits, including reduced test turn-around time (TAT), improved test usage, and better adherence to practice guidelines.14–19 Specifically, these systems may alert physicians to test-specific inappropriate ordering, provide alternative tests to consider, list cost information, and suggest order sets for bundled ordering.15 ,20 Use of CPOE alert systems has been shown to improve ordering of certain tests, including cardiac markers in certain appropriate situations,21 as well as to improve compliance with the American Heart Association guidelines for cardiac monitoring.22 The data on implementation of a CPOE system to decrease laboratory cost and usage are mixed. One study on the cost of enoxaparin before and after initiation of a CPOE system showed no reduction in daily cost of therapy.23 Another study showed that a CPOE system could decrease usage by electronically limiting the number of times a common laboratory test could be ordered within a 24 h period.24
In addition to cost, TAT may also have an effect on physician ordering. TAT is defined as the average amount of time between ordering a test and receiving its results, and its effects on physician ordering have not been well studied. For example, would knowledge of a long TAT for a particular test encourage physicians to curb its ordering if the patient's expected length of stay is significantly shorter than the time it takes for this test to be resulted? Much of the current research on TAT involves the implementation of a CPOE system with subsequent decreases in TAT for laboratory and radiology results as well as medication administration.25–31 Currently, there are no published studies examining TAT feedback and its effects on physician ordering.
In our study, we examine the ordering of reference laboratory tests. Reference laboratory tests, also known as send-out tests, comprise a major component of hospital clinical laboratory services. Although reference test volumes represent a small percentage of a hospital's total test volume, reference tests account for the majority of a hospital laboratory test menu and a disproportionate percentage of laboratory costs.32 ,33 Additionally, reference tests are processed by an outside laboratory, and the actual cost of each test is explicitly set. Stated another way, the charge, price and cost are all identical, so the financial impact on the medical institution is clear. Furthermore, many of these outside laboratories (eg, Quest, Labcorp) are also used by many other ordering institutions across the nation, making results generalisable.
In sum, we aim to further explore the cost saving opportunities of a CPOE system through display of both cost and turn-around information for reference laboratory tests in an inpatient teaching hospital. We hypothesise that display of reference test cost and TAT data to physicians during real-time ordering will lead to decreased ordering of these tests with subsequent cost savings.
Methods
Setting
This study was conducted at Stanford Hospital and Clinics in Stanford, California. The inpatient hospital is a 613-bed tertiary care facility with over 25 000 inpatient admissions annually. The outpatient clinics include primary care and various outpatient specialty services with over 643 000 clinic visits annually. The electronic health record is Epic (v. 2012 IU 2), fully integrated electronic medical record software that spans clinical, access and revenue services across both the inpatient and outpatient settings, and functions as the primary CPOE system for the inpatient hospital services.
Data acquisition, control cohort and intervention
Through collaboration with our Department of Pathology and our Clinical Laboratory, we used our Laboratory Information System to collect reference laboratory ordering data for physicians over the period of September 2010 through December 2012. Physician ordering information was obtained for reference laboratory test orders, including: the name of the reference lab, the name of the reference test, the authorising physician, whether the test was ordered in the outpatient or inpatient setting, and the date and time the order was placed. Information on costs and TATs for reference tests was also obtained from existing reference laboratory information databases. Data on average length of stay, inpatient days at risk, and case mix index were obtained from the Department of Quality, Effectiveness, and Patient Safety. A control cohort of non-reference inpatient laboratory tests was obtained using similar methods over the same time period, including information on laboratory test name and monthly ordering volume.
This study was submitted to the Institutional Review Board and determined to be exempt from review according to the policy of our institution. During February and March of 2012, in collaboration with our Epic service team, our CPOE system was programmed to display cost and TAT information to all inpatient providers when placing orders for reference laboratory tests. Specifically, the Epic cost and TAT display were implemented in the following manner: a list of all orderable reference labs was created through information from our laboratory directory and our Epic service team. TAT and cost data were obtained from the laboratory directory. Missing data was obtained manually from the reference laboratory websites. Tests were grouped into categories and organised into a master spreadsheet which was sent to our Epic service team to populate the ‘process instructions’ section of the test order, a free text field where cost and TAT information could be displayed alongside the Epic orderable. Maintenance of this intervention was done through periodic review of the master spreadsheet against the existing laboratory directory, which was revised continually. New test cost and TAT information was updated and merged with current Epic orderables. The initial implementation was straightforward, requiring <20 h of labour spread across several weeks with each maintenance update requiring a few hours for evaluation and revision of the master spreadsheet.
As seen in figure 1, the information was displayed after the provider searched for and selected the desired test, but before the order was signed. Cost and TAT information were displayed to ordering providers as general ranges because laboratory information at the time of implementation was inadequate to show exact values. The displayed cost and TAT ranges can be found in the online supplementary appendix. Implementation of the intervention was completed by April 2012. This modification of the CPOE was advertised at the Internal Medicine Housestaff Noon Conference and at a laboratory and clinical services collaborative meeting in March 2012.
Analysis of data
For the purposes of data analysis, the displayed cost and TAT ranges were collapsed to mean values. For example, an order with a cost range of US$100–300 was assigned the cost of US$200. The preintervention control period was defined as September 2010 through January 2012 (17 months total). The implementation ‘buffer’ period was defined as February 2012 through March 2012 and was censored from analysis. The intervention period was April 2012 through December 2012 (9 months total). Data from the intervention period of September 2012 could not be obtained and therefore was excluded from the final analysis.
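The period assignment and range-collapsing steps described above can be sketched as follows. This is a minimal illustration under the stated study dates, not the study's actual analysis code; the function names are hypothetical.

```python
from datetime import date

def collapse_range(low, high):
    """Collapse a displayed range (eg, US$100-300) to its midpoint for analysis."""
    return (low + high) / 2

def study_period(order_date):
    """Assign an order to the pre-intervention, buffer, or intervention period."""
    if date(2010, 9, 1) <= order_date <= date(2012, 1, 31):
        return "pre"           # September 2010 - January 2012 (17 months)
    if date(2012, 2, 1) <= order_date <= date(2012, 3, 31):
        return "buffer"        # implementation period, censored from analysis
    if date(2012, 4, 1) <= order_date <= date(2012, 12, 31):
        return "intervention"  # April - December 2012
    return "excluded"

# An order with a displayed cost range of US$100-300 is assigned US$200.
assigned_cost = collapse_range(100, 300)
```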
There were a total of 17 107 reference laboratory test orders during the study period, of which 13 570 orders could be matched to a cost or TAT. The remaining 3537 orders were excluded from analysis as they did not have associated cost and/or TAT information due to incomplete reference laboratory information. Another 1064 orders were excluded from the buffer period, leaving 12 506 orders over our study period, for which 68 had missing TAT values and 88 had missing cost values.
After buffer period censoring, the control cohort consisted of 3 310 803 non-reference laboratory test orders from the same study period and accounted for an average of 132 432 orders monthly. These orders represented inpatient, on-site laboratory tests which spanned all departments and specialties.
Two-sample t tests were used to test whether there was a significant change in ordering behaviour between the control and treatment periods. The family-wise error rate was adjusted using the Bonferroni–Holm method. Correlations were done using Spearman's Rank-Order, given the presence of outliers. Results are expressed in mean number of monthly orders per patient day at risk. Patient day at risk is defined as the number of occupied bed-days accounting for a patient's length of stay. The following measures were tested: cumulative patient days at risk, cumulative orders per month, cumulative cost per month, and cumulative TAT per month.
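The Bonferroni–Holm (step-down) adjustment used above to control the family-wise error rate can be sketched in pure Python. This is a generic illustration of the method, not the study's own code, and the raw p values in the example are hypothetical.

```python
def holm_adjust(p_values):
    """Bonferroni-Holm step-down adjustment of a family of raw p values.

    Returns adjusted p values in the original order, enforcing monotonicity
    so a smaller raw p value never receives a larger adjusted value.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        # The k-th smallest raw p value is multiplied by (m - k),
        # the number of hypotheses not yet rejected at that step.
        candidate = min(1.0, (m - rank) * p_values[i])
        running_max = max(running_max, candidate)
        adjusted[i] = running_max
    return adjusted

# Hypothetical raw p values for a family of four tested measures
adjusted = holm_adjust([0.0001, 0.0004, 0.02, 0.057])
```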
Using the same method, a standard stratified analysis of cost and TAT was conducted to see if there was a significant change in physician ordering when cost or TAT was low versus high. Cost was separated into two categories, <US$100 and US$100–300, to reflect representative ranges displayed to ordering physicians; 99.3% of laboratory test costs fell into one of these two buckets, and 185 orders with a cost >US$300 were excluded from analysis. TAT was also stratified into two categories, ≤5 and >5 days, as 5 days was the mean length of stay at Stanford Hospital over the duration of our study period. Five laboratory tests were excluded due to a long tail, defined as having a TAT greater than 40 days.
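The stratification rules above can be expressed directly in code. This is an illustrative sketch of the bucket definitions stated in the text; the function names are hypothetical.

```python
def cost_stratum(cost):
    """Assign an order to the cost strata used in the stratified analysis."""
    if cost < 100:
        return "<US$100"
    if cost <= 300:
        return "US$100-300"
    return None  # >US$300: excluded (185 orders in the study)

def tat_stratum(tat_days):
    """Stratify TAT around the 5-day mean length of stay.

    Tests with TAT > 40 days (the long tail) were excluded.
    """
    if tat_days > 40:
        return None
    return "<=5 days" if tat_days <= 5 else ">5 days"
```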
Two-sample t tests were also used in an ancillary analysis to evaluate number of monthly orders placed, mean cost per order, and mean TAT per order between all physicians versus overlapping physicians between control and treatment periods.
Results
Overall, there was a significant decrease in average number of monthly physician orders and average reference test cost with our intervention. No differences were seen with average reference TAT per order. These findings occurred despite non-significant change to physician ordering in our control cohort of non-reference laboratory tests.
A summary table is presented in table 1. As seen in figure 2, the mean number of monthly physician orders per patient day decreased by 26% (51 vs 38, p<0.0001). In figure 3, we noticed a decrease in mean monthly test cost per order from US$146.50 to US$134.20 (p=0.0004), with a mean savings of US$12.30 per test. Overall, there were no significant differences in TAT per order over our intervention period (5.6 vs 5.7 days, p=0.059).
To better understand the effect of test cost and TAT on physician ordering, we performed a stratified analysis in table 2. We showed that there are significant decreases in mean monthly orders per patient day at risk with our intervention in both the <US$100 bucket (38 vs 29, p≤0.0001) and US$100–300 bucket (13 vs 9, p≤0.0001). A similar effect was seen with TAT ≤5 days (26 vs 21, p<0.0001) and >5 days (24 vs 17, p<0.0001). Reference test cost and TAT variables were poorly correlated (r=0.2).
No significant changes in mean monthly physician orders per patient day were seen in the control cohort of non-reference laboratory tests (8583 vs 8475, p=0.343). A summary table is found in online supplementary appendix 1.
Overall with this intervention, we saw a 23% decrease in mean monthly orders for inpatient reference tests from 744 to 607 orders (p=0.002), and a mean decrease in test cost from US$146.50 to US$134.20. Using mean costs, this projects a savings of US$27 537 per month, resulting in an annual savings rate of US$330 439.
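The projected savings figure follows directly from the monthly order volumes and mean costs reported above. The following is a sketch of the arithmetic, not the study's own code:

```python
# Mean monthly inpatient reference orders and mean cost per order,
# before and after the intervention (values reported in the Results).
pre_orders, pre_cost = 744, 146.50
post_orders, post_cost = 607, 134.20

# Difference in total monthly spend, pre versus post intervention
monthly_savings = pre_orders * pre_cost - post_orders * post_cost
annual_savings = 12 * monthly_savings

print(round(monthly_savings))  # ~US$27,537 per month
print(round(annual_savings))   # ~US$330,439 per year
```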
To better understand physician and interdepartmental ordering practices during this study, we performed an ancillary analysis. The tests were authorised by 553 attending physicians before our intervention, and 375 attending physicians after our intervention, with an overlap of 322 physicians in both groups. Results were similar when comparing monthly orders placed, mean cost per order, and mean TAT per order between all physicians versus only overlapping physicians, as seen in online supplementary appendix 2.
Finally, in online supplementary appendix 3, we observed a decrease in mean ordering across most departments at Stanford University Medical Center, with a significant decrease in mean cost per test (p=0.007) across all departments, though no significant decrease in mean TAT (p=0.28). A list of the top 10 most commonly ordered laboratory tests is available in online supplementary appendix 4.
Discussion
These results support our hypothesis, that display of cost and TAT data for reference laboratory tests during real-time ordering leads to decreased ordering of these tests with subsequent cost savings. Key findings include a decrease in monthly physician orders per patient day by 26%, and a 9% decrease in average test cost with a savings rate of US$12.30 per test. Overall, we calculate a modest annual savings rate of US$330 439 for our hospital attributable to our intervention.
While several prior studies have evaluated the effects of cost display on provider ordering behaviour, to the best of our knowledge, this study is the first to focus on the ordering of reference laboratory tests based on display of cost and TAT information to physicians. We add to a growing body of literature supporting that physician behaviour may be influenced by a CPOE system to optimise ordering of laboratory tests.14 ,16 ,17 ,34 We show that TAT is poorly correlated with cost and may be an independent factor influencing physician ordering. In our stratified analysis, there are significant decreases in physician ordering regardless of whether the cost data were stratified to low versus high cost, and whether TAT was stratified by less than or greater than 5 days. No ‘dose-related response’ was seen in the stratified analysis. That is, higher cost and TAT laboratory tests had similar decreases in ordering postintervention as lower cost and TAT laboratory tests. One possibility may be that display of cost and TAT data indiscriminately reduces physician ordering behaviour. Said another way, it may be that physicians will forgo ordering a laboratory test when confronted with the cost and TAT information, but may not be more or less likely to order the test based on the actual content of that information.
Compared with the existing literature, we find similar results that display of laboratory cost data during real-time ordering has a modest impact on physician ordering. For example, a randomised study by Feldman et al35 showed a modest 8% decrease in commonly ordered tests per patient day in 61 diagnostic laboratory tests with display of Medicare fee data over a 6-month intervention period. An interrupted time series analysis by Horn et al36 on the effects of showing Medicare reimbursement data revealed decreased rates of physician ordering, ranging from 17% to 19%, for 10 of 27 selected laboratory tests. A literature review failed to reveal any currently published literature on the effects of TAT on physician ordering. We suspect the lack of significance in mean TAT with our intervention is due to the large number of TAT ranges in our dataset, with poor representation of a test's true TAT using our analysis method of collapsing these values to the mean.
Of note, the group of reference tests with TAT greater than 5 days is an especially important area for future intervention, as these tests would not be resulted by the time of hospital discharge at Stanford and could arguably be ordered as an outpatient. This has the potential for cost shifting from an inpatient to an outpatient setting, which would be preferable as inpatient reference labs are often not reimbursable under the Medicare diagnosis-related group billing model. Additionally, allowing outpatient physicians to order these tests provides easier follow-up of the results, as many reference tests that are not resulted by the time of discharge may not be followed up by the inpatient physician.
There are several limitations to consider in this study. First, we recognise that the absence of a control group in our study limits the ability to make causal inferences, though we have attempted to provide some level of control by following a cohort of non-reference labs. Second, we could not obtain associated cost and/or TAT information for 21% of the tests, and these records were censored from the analysis. Third, our study is unable to show whether our intervention led to the withholding of otherwise medically indicated laboratory testing, and whether such withholding impacted patient outcomes. Furthermore, while mean length of stay remained the same throughout our study period, any financial benefit from a particular hospitalisation would be quickly attenuated by a potential increase in length of stay from delays in reference test ordering. Fourth, this study was conducted at a major academic centre with a tertiary care hospital where resident house staff enter the majority of orders in consultation with attending physicians who authorise these orders. Therefore, the results may not be generalisable to other healthcare institutions. Fifth, there may be limitations to the long-term durability of these changes, and it remains to be seen whether ordering behaviour extends beyond the study period. Finally, this study was only conducted in the inpatient setting and does not evaluate physician ordering habits or financial impact in the outpatient clinics.
These limitations aside, our study has several strengths. In contrast to the complex accounting for equipment, supplies, capital depreciation of instruments, and personnel overhead that is required in the calculation of costs for in-house laboratory testing, reference laboratory costs are direct, unambiguous costs billed to our hospital by outside reference laboratories. These costs are poorly reimbursed by our payers, and the State of California does not allow mark-up of reference laboratory costs. Thus, the cost savings shown here are close to true costs for our institution, and the calculated yearly savings attributable to our intervention reflect true financial savings. Additionally, if this intervention encourages shifting the ordering of reference laboratory tests from the inpatient to the outpatient setting, this is more desirable for both follow-up and reimbursement. Furthermore, reference laboratory costs are relatively standardised, with many institutions using the same reference laboratories across the USA and incurring the same unambiguous costs. Finally, our intervention to display test cost and TAT at the time of ordering is easy to implement, non-burdensome to view, and does not require additional education or incentivisation of the physician.
Conclusion
Physicians and other healthcare providers are faced with the challenge of providing high-quality medical care while also being aware of the rising costs of healthcare in the USA. Cost consciousness and elimination of wasteful spending are key components to meet this challenge. Through our study, we observe that display of reference laboratory cost and TAT data to physicians during real-time ordering may result in significant decreases in ordering of reference laboratory tests with subsequent cost savings. Meaningful next steps include conducting a larger randomised study, possibly at several academic centres, to confirm these results. Ultimately, we need more research to develop effective, easy to implement, and widely generalisable interventions which aim to decrease costs and eliminate wasteful spending in healthcare.
References
Supplementary materials
Supplementary Data
This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.
Files in this Data Supplement:
- Data supplement 1 - Online supplement
- Data supplement 2 - Online supplement
Footnotes
- Contributors PDL, DZF, GS and LS had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis. Study concept: GS and LS. Acquisition of data: DZF, GS, AF, JYC. Data analysis and interpretation: DZF, GS, DG, PDL and LS. Drafting of manuscript: DZF, GS. Manuscript revision: all authors. Study supervision: LS.
- Competing interests None.
- Provenance and peer review Not commissioned; externally peer reviewed.