Background
In the USA, healthcare costs are far greater than those in any other industrialised country.1 Currently, they comprise almost 18% of the gross domestic product and 30% of government expenditures.2 US healthcare costs encroach on all other areas of spending, public and private. Cost drivers are multifactorial, but up to 30% of Medicare spending is potentially avoidable without worsening health outcomes,3,4 with overuse and misuse of tests and treatments accounting for approximately 10%. Physicians have a responsibility to ensure that the diagnostic tests and treatments they order are safe, effective and provide value.5,6 Hence, they can lead the effort to preserve scarce resources while promoting quality, reducing harm and controlling cost.7
Initial calls for action from the medical profession came following an Institute of Medicine report describing the need to reduce costs and improve outcomes.3,6 Initiatives were launched in the form of the National Physicians Alliance ‘Top 5’ in primary care8 and the American College of Physicians High-Value Cost-Conscious Care Initiative.9,10 National attention was focused on this issue in April 2012, when the American Board of Internal Medicine Foundation convened a group of nine national medical specialty organisations and Consumer Reports to mobilise the effort to reduce overuse or misuse of tests and procedures that provide little benefit or can cause harm.10 This ‘Choosing Wisely’ campaign has now been embraced by more than 70 specialty societies and 18 consumer-oriented organisations.10 However, processes to implement and sustain such programmes in local, state and national settings are rarely described.11 We report here on our initial 2 years’ experience of taking a grass-roots approach to developing a high value care (HVC) programme. The goals of this project were to improve the care and experience of patients, reduce harm and cost, and educate our resident trainees. We describe the process, projects and lessons learnt.
Assessment of problems
Programme development
In June 2012 the chair of the Department of Medicine asked the faculty to submit ideas for HVC projects using the American Board of Internal Medicine (ABIM) ‘Choosing Wisely’ lists as examples. The stated goal was to identify and ultimately reduce harmful, unnecessary or low value care under physicians’ control. Any programme had to be integrated into educational and research efforts, create a process which could be applied in other departments or other academic medical centres, preserve finite medical resources and improve care provided to patients.
Project selection
All project proposals were evaluated by a Department of Medicine Operational Efficiency Committee, which is composed of faculty from all medicine divisions. The committee used the following Guiding Principles for selection:
Non-controversial and evidence-based change
Measure available electronically
Meaningful outcome that would add value (reduce harm, reduce cost, improve patient outcomes or experience)
Intervention that would not increase physician workload
Selected proposals (table 1) became projects and were submitted to the institutional review board, designated as non-research and approved for implementation as quality improvement efforts.
Project team
Each team included a faculty member as the project champion, a resident trainee, a quality consultant, and other faculty or staff. The project team membership was dynamic, depending on the needs of individual projects.
Project structure
Workflow for each project generally followed four phases, with a goal of completing each project in 4–6 months.
Phase 1 involved collecting baseline data, holding the initial team meeting and developing specific measurements. The goals were completion of a data request form, which improved the efficiency of the baseline data abstraction process, and a first draft of the project charter.
Phase 2 assessed the results of the baseline data abstraction. The team met and asked: “Is there opportunity for improvement based on the initial baseline data?” The team discussed the data and the project champion decided whether the project should proceed. Next, team members decided which other departments should be involved in the project, for example, Pathology, Laboratory, Radiology, Nursing or Electronic Health Record (EHR). The team and the project manager also finalised the project charter.
Phase 3 comprised a meeting of core project members and the additional members identified in phase 2. There were three goals: first, to review, amend (if necessary) and accept the project charter; second, to establish and define the current process as it relates to the project; third, to develop potential interventions to enact system change.
Phase 4 focused on implementation, which typically involved two processes: education of ordering providers and system-based change. System-based change centred on our EHR but also included workflow changes, depending on the requirements of the individual project.
Once the implementation was completed, projects were monitored for a change in core measurements and results. These were reported at prespecified time intervals selected by the project champion and HVC coordinators based on the frequency of the test or intervention. If the intervention was successful, projects were moved to a completed status in which the measure was reported quarterly. After 2 years of tracking, projects are reviewed by the HVC coordinators to determine whether the change is acceptable and being sustained, whether further data reporting is required or whether the project should be retired.
Cost analysis
The Business and Quality departments performed a cost analysis. This analysis strived for conservative estimates and did not use charge master unit costs.12 Instead, we evaluated decreased revenue and decreased direct costs to the system for each project individually. This accounting calculated costs from the medical centre’s and payers’ perspectives and represents an estimate of funds saved by either the health network or the payers. For example, if the patient was an inpatient covered under a Diagnosis Related Group (DRG) payment, the total payment would not change; however, the system would save the incremental cost of the test that was not performed. If the procedure or service was performed on an outpatient basis, we calculated the lost revenue as the cost to the system. Some projects had both a revenue and a cost component. Once the cost per procedure was calculated, it was multiplied by the number of tests, procedures or services avoided relative to our baseline data to give an estimate of cumulative cost savings.
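To make the arithmetic concrete, the sketch below shows the general shape of this calculation. It is a minimal illustration only: the function names, dollar amounts and test counts are hypothetical and are not drawn from our analysis.

```python
# Illustrative sketch of the cumulative savings arithmetic described above.
# All names, costs and counts are hypothetical, not figures from our analysis.

def per_unit_saving(setting: str, incremental_cost: float, revenue: float) -> float:
    """Per-test saving from the medical centre/payer perspective."""
    if setting == "inpatient":
        # Under a fixed DRG payment, avoiding a test saves its incremental cost.
        return incremental_cost
    # For an outpatient test or service, the forgone revenue is counted as the cost.
    return revenue

def cumulative_savings(saving: float, baseline: int, observed_by_quarter: list) -> float:
    """Per-test saving multiplied by the reduction from baseline in each quarter."""
    return sum(saving * (baseline - observed) for observed in observed_by_quarter)

# Hypothetical inpatient lab test: $12 incremental cost, baseline of 500 tests/quarter.
saving = per_unit_saving("inpatient", incremental_cost=12.0, revenue=45.0)
print(cumulative_savings(saving, baseline=500, observed_by_quarter=[420, 300, 180, 150]))
# -> 12 * (80 + 200 + 320 + 350) = $11,400 over four quarters
```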
Results of assessment and measurement
Twenty-four suggestions from 11 subspecialty areas were submitted over a 2-week period. Eight were selected, and we completed interventions on seven projects. One project, reducing screening colonoscopy in patients aged >75 years, was found to have a low baseline rate, and the project champion elected not to begin a formal project. However, an informal process was implemented to educate ordering physicians about current guidelines. Five projects have been moved to a completed and monitoring status. Summaries of each project, baseline data, implementation strategy and results are outlined in table 1, with cost savings estimates in table 2. Highlights of successful projects included:
72% reduction in the use of blood urea nitrogen (BUN) and creatinine lab testing in patients with end stage renal disease who are on haemodialysis and hospitalised
90% reduction in dual-energy X-ray absorptiometry (DXA) in women <65 years who did not have documented clinical risk factors for early osteoporosis
71% reduction in the use of portable chest X-rays in mechanically ventilated patients who had not been intubated that day and had not undergone a procedure.
Nine faculty members were project champions and 10 resident trainees participated in the eight HVC projects. Individual projects and their outcomes have been presented internally at resident research day, as well as externally by faculty and resident trainees at regional and national conferences.
The time to implement change varied significantly, from a low of 3 months to over 1 year for a system-based intervention. In one project, aimed at avoiding staging imaging in asymptomatic women with low stage breast cancer, we were unable to implement a system change because of limitations in our ability to electronically identify the patient population. Because the oncologists dictated the oncological problem list into the EHR, staging did not reside in discrete data fields within the problem list. The lack of discrete staging data limited our ability to abstract, to monitor or to provide targeted electronic interventions using the EHR. In this project we therefore focused on education, including a letter from the project champion and resident trainee outlining recommended criteria for staging imaging. However, we were unable to monitor the response to this education because doing so would have required manual chart review, for which we did not have the resources.
Intervention: education and system change: developing an electronic intervention tool kit
Education took multiple forms depending on the project (table 1). For example, in the project attempting to reduce BUN and creatinine testing in haemodialysis patients, the resident trainee shared reports on the use of this testing during internal medicine morning report. For the DXA utilisation project, the director of the osteoporosis centre provided education by sending a letter to all ordering providers outlining evidence-based risk factors for early osteoporosis.
Education was accompanied by a change to the current process or electronic health system. Each intervention was built according to the following principles:
Targets a specific patient population
Does not increase physician workload
Has a low false-positive rate
Presents information at the time the order is placed.
Our first system change intervention used an electronic best practice advisory (BPA). This alert appeared when a BUN or serum creatinine was ordered for a patient who had end stage renal disease (International Classification of Diseases (ICD)-9 code 585.6) on their problem list. This allowed the BPA to activate only when the target labs were ordered for the target patient population. Since that time, we have also implemented BPAs that look at prior lab results and activate if a prior test was positive (eg, hepatitis) or was ordered within the past year (eg, echocardiogram). The BPA provides the date of the prior test and the result or a link to the result. We developed a partnership with our EHR department that made it possible to reduce the timeline for developing and implementing electronic interventions. We also reuse interventions that have already been developed. This ‘intervention tool kit’ can be leveraged in future projects.
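To make the trigger rules concrete, the following sketch summarises the kind of decision logic described above. It is a simplified illustration, not our actual EHR build; the `Patient` structure, field names and result encoding are hypothetical assumptions.

```python
# Simplified sketch of the BPA trigger rules described above. This illustrates
# the logic only; it is not the actual EHR configuration, and all data
# structures and field names are hypothetical.
from dataclasses import dataclass, field
from datetime import date, timedelta

ESRD_ICD9 = "585.6"  # end stage renal disease

@dataclass
class Patient:
    problem_list: set                                   # ICD-9 codes on the problem list
    prior_results: dict = field(default_factory=dict)   # test code -> (date, result)

def fire_bpa(order_code: str, patient: Patient, today: date):
    """Return an advisory message if the order matches a BPA rule, else None."""
    # Rule 1: BUN/creatinine ordered for a patient with ESRD on the problem list.
    if order_code in {"BUN", "CREATININE"} and ESRD_ICD9 in patient.problem_list:
        return "Patient has ESRD and is on haemodialysis: is this test necessary?"
    prior = patient.prior_results.get(order_code)
    if prior:
        prior_date, result = prior
        # Rule 2: a prior test was already positive (eg, hepatitis serology).
        if result == "positive":
            return f"Prior {order_code} was positive on {prior_date}."
        # Rule 3: the same test was already ordered within the past year (eg, echo).
        if today - prior_date < timedelta(days=365):
            return f"{order_code} already performed on {prior_date}."
    return None

# Hypothetical usage
pt = Patient(problem_list={ESRD_ICD9},
             prior_results={"HEP_C_AB": (date(2014, 3, 2), "positive")})
print(fire_bpa("BUN", pt, date(2015, 1, 15)))
```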
In addition, we communicated through multiple media, including presenting updates and results of these projects to the department of medicine, administrators, medical students, resident trainees and the public. Since 2012 we have delivered 10 internal presentations and seven external presentations, and published four blog posts. Eleven posters were generated for internal, regional and national conferences by our resident trainees. We are now sharing the concepts of HVC and the projects with third-year medical students before they begin their medicine rotation.
Cost analysis
We have completed cost analyses for the three projects for which we have sufficient time and data post intervention to report (table 2). Over eight quarters we estimate that we saved $326 974. This estimate does not include additional costs of downstream testing or potential future savings from changed practices. Nor does it quantify the reduction in harm or patient inconvenience.
Lessons and messages
We developed an HVC programme that improves the value our medical system delivers while providing real-life examples to resident trainees who are developing their own approaches to the practice of medicine.
Bottom-up versus top-down
The HVC programme used a bottom-up approach in which project ideas were generated by clinical champions drawing on their experience of delivering care to patients. Some proposed projects matched the ABIM Choosing Wisely lists10 while others were based on direct observations. We believe this method had advantages, as it allowed those who proposed projects to participate as clinical champions, which minimised controversy. This framework improved the ‘Diplomacy of Cost-Reduction’13 because these projects were generated by clinicians who had a stake in the test being ordered. For example, the director of our osteoporosis centre evaluated DXA scans and our chief of cardiology evaluated echocardiography utilisation, even though reductions could have affected their departmental budgets. Having projects generated and supported by clinical champions changed how their colleagues viewed these efforts and allowed for a greater degree of support than if the projects had been generated by a central committee. We did not experience any resistance based upon potential impact on revenue or departmental budgets. We attribute this to the clear goals of the project champions to reduce waste and improve patient experience, as well as to institutional senior leadership support.14
Senior leadership support
Senior leadership of our institution was fully supportive of these projects, which was key to their success. There was no conflict over how such projects could affect revenue in a fee-for-service payment model. Senior leaders who were aware of and supportive of the programme included:
President and Chief Executive Officer (CEO)
VP of Quality (who provided analytical and project management support)
Chief Medical Officer
Chair of the Medicine Department (who initiated the project).
At our institution all leaders are focused on improving the quality and efficiency of the care we deliver. They are also aware that payment models are changing and that in the coming years an increasing percentage of revenue will be either at risk or fully capitated. To prepare for this new payment system they are trialling multiple projects aimed at improving quality, eliminating waste and improving the efficiency of care delivery.14 Finally, the state of Vermont’s regulatory structure provides a financial incentive by capping annual revenue expansion of health systems at less than 3%.15
Leveraging project management and an EHR intervention toolkit
Projects were supported by project management, data analysis, the HVC steering committee and EHR resources. This infusion of resources, as well as our expanding toolkit of EHR-based interventions, allowed project champions to quickly gather data which they were then able to present to their colleagues.
Resident trainees as participants
Resident trainees provided important insights into workflows and the current state of the system. They educated other resident trainees and generated workflow and system intervention ideas. Resident trainees involved in projects also generated their own ideas for future projects. HVC projects changed the conversation within our internal medicine residency programme and the Department of Medicine. Since the initiation of this programme, the notion that more care is better care13 has come under increasing scrutiny by resident trainees and faculty. Resident trainees are now asked to justify both why a test was not ordered and why one was ordered, and to explain how it will change the diagnosis or management for the patient.
Setbacks and lessons learnt
Through the course of implementing and managing this new HVC programme we have learnt and adapted based on successes and setbacks of individual projects.
Setbacks
Not all projects went smoothly. For example, the project focused on reducing repeat antinuclear antibody (ANA) testing after a positive result initially did not have an acceptable EHR intervention. Because repeat positive ANAs occur infrequently and with significant time between tests, we did not feel education alone would be successful. However, the following year we were able to develop an EHR intervention for the project aimed at reducing repeat testing after positive hepatitis A and C antibodies. This new functionality was then applied to the prior project.
Chart review performed by residents, while helpful initially, also had downsides. In the project on reducing staging imaging in low stage breast cancer, there was no specific field in our EHR that allowed us to digitally identify the stage of breast cancer. We relied on chart review; however, the resident who performed the focused chart review left the institution for a fellowship, and we were not able to perform a postimplementation analysis. As a result of this project and other research priorities, the Oncology group has begun a project to digitally code the stage of all new cancer diagnoses.
Lessons learnt
We have learnt from each project and hope to decrease the time required to reach future system interventions. We have focused on project selection, developing an EHR intervention tool kit and maximising resident trainee involvement at the onset of each project, with the goal of moving quickly to automated data reporting.
In this model for establishing an HVC programme, each project had a unique set of factors that affected its success or failure. We have begun to group these factors under the categories of complexity, value and controversy, and plan to use this framework to guide future project selection. While we tried to use as much objective data as possible, the scoring for the projects presented here was subjective, based on our experience with education, the EHR, data abstraction requirements and departmental politics.
Complexity
The first factor is the overall complexity of a project. This may include the workflows involved, the number of departments affected and our ability to electronically and accurately abstract and analyse the data as they relate to a target population. Manual chart review was initially required for a number of projects, making them time-intensive and labour-intensive. An electronic measure is therefore preferable. Overall, as project complexity increases and the timeframe lengthens, additional resources are required and there is a higher probability of not attaining the desired change.
Value
Each project has a value as it relates to our medical system. Value includes monetary savings, reduction in patient harm or inconvenience and improved patient experience. These latter factors are difficult to quantify. One example of this concept was the project that reduced daily chest X-rays in the intensive care units. In addition to monetary savings, we added value to the system of care by reducing patient discomfort. A chest X-ray in the intensive care unit required intubated, lightly sedated patients to be woken at 05:00 and positioned for an X-ray plate; this caused discomfort and disturbed sleep. It is difficult to place a monetary value on reducing this practice, but lessening discomfort is a worthy goal and may promote faster recovery.
Controversy
Each project sits on a spectrum of potential controversy that may affect its success. Our initial HVC project, reducing BUN/creatinine testing in haemodialysis patients, was without controversy, as no clinician could justify why these tests should be routinely required. At the other end of the spectrum, our project evaluating staging imaging studies in patients with asymptomatic, early stage breast cancer revealed variation in practice among providers and resulted in controversy within the oncology department over how best to care for this patient population.
Using these three subjective measures, we developed a framework to visualise where projects fall on these spectra. We used two x–y plots, in which the x axis was either value or controversy and the y axis was complexity (figure 1). In the value plot (figure 1A) we sought projects that were high value but low complexity (in the right lower quadrant); for projects that had high value and some level of controversy (figure 1B), we needed a strong clinical champion and senior leader support. We also anticipated that these projects would require additional time and resources to enact change and move to a completed status.
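As a rough illustration of how this framework could be operationalised, the sketch below assigns hypothetical 1–10 scores to two projects and reports the quadrant each occupies. The scores and the midpoint cut-off are assumptions for illustration only, not our actual ratings.

```python
# Illustrative sketch of the project-selection framework: each project receives
# subjective 1-10 scores for value, complexity and controversy, and we report
# which quadrant of the value-complexity plot it occupies. All scores and the
# midpoint cut-off are hypothetical assumptions.

projects = {
    # name: (value, complexity, controversy) -- hypothetical scores
    "BUN/creatinine in haemodialysis": (8, 2, 1),
    "Breast cancer staging imaging": (7, 8, 7),
}

MID = 5  # cut-off separating "low" from "high" on each axis

for name, (value, complexity, controversy) in projects.items():
    quadrant = (("high" if value > MID else "low") + " value / "
                + ("high" if complexity > MID else "low") + " complexity")
    flag = " (needs strong champion and leadership support)" if controversy > MID else ""
    print(f"{name}: {quadrant}{flag}")
```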
Generalisability and limitations
There are unique attributes of our state and institution that may not be directly transferable to other settings. First, our state has enacted a regulatory process that seeks to cap expansion of hospital revenue at 3% annually.15 This incentivises our medical system to find and eliminate low value care. Second, we had support from senior leadership at our institution. Third, faculty participated in and promoted the projects to their colleagues. This combination of factors contributed to the success of our HVC programme.
A limitation of a bottom-up approach is that project submissions and selections may not align with the most cost-effective projects or with the goals of a department, division or institution.
Conclusions
We have created an HVC programme that uses a physician-initiated approach to generating project ideas and implementing educational and system-based change. We involved internal medicine resident trainees in the process, providing them with practical experience and education in systems-based practice and quality improvement. Overall, our programme has changed the conversation about testing for resident trainees and faculty, provided a model for ongoing HVC activities and is becoming institutionalised at our medical centre.
Acknowledgments
Many individuals have contributed their time and expertise to these projects. The authors thank Dr Polly Parsons, the Chair of the Department of Medicine, for her leadership and support, and Allen Mead for his administrative support and membership of the programme steering committee. The clinical champions for each project: Dr David Schneider, Dr Gill Allen, Dr Ryan Clouser, Dr James Vecchio, Dr Steven Lidofsky, Dr Edward Leib, Dr Bonita Libman and Dr Marie Wood. Resident trainees: Dr Elizabeth Hall, Dr Maria Burnett, Dr Sadi Raza, Dr Samreen Raza, Dr Benjamin Keveson, Dr Heather Shank, Dr Adedayo Fashoyin, Dr Tim Leclair and Dr Sam Merril. The authors also received support from Laboratory and Pathology: Dr Mark Fung, Dr Greg Sharp, Jocelyne Stocker, Michelle Baker and Luke Purvis. The James Jeffords Institute for Quality supported us through ongoing data analysis and analytics: Anna Noonan, Jason Minor, Patricia Bouchard, Deirdre LaFrance, Mike Nix, Mike Gianni, Cynthia Gagnon, Heidi Guevin and Melissa Holman. The electronic health record team: Dr Doug Gentile, Debra Dulac, William Eaton and Randy Ensley.
Footnotes
Contributors JMS-D and VLH are the coordinators of the HVC project and collectively wrote this paper and performed data analysis. PGS performed data analysis, reviewed the paper and participated in all the outlined projects.
Competing interests None declared.
Provenance and peer review Not commissioned; externally peer reviewed.