Application of quality improvement strategies in 389 European hospitals: results of the MARQuIS project
- 1Academic Medical Center, Department of Social Medicine, University of Amsterdam, Amsterdam, the Netherlands
- 2Avedis Donabedian Institute, Autonomous University of Barcelona, and CIBER Epidemiology and Public Health (CIBERESP), Barcelona, Spain
- Dr M J M H (Kiki) Lombarts, Academic Medical Center, Department of Social Medicine, University of Amsterdam, Meibergdreef 9, PO Box 22700, 1100 DE Amsterdam, the Netherlands;
- Accepted 12 November 2008
Context: This study was part of the Methods of Assessing Response to Quality Improvement Strategies (MARQuIS) research project investigating the impact of quality improvement strategies on hospital care in various countries of the European Union (EU), in relation to specific needs of cross-border patients.
Aim: This paper describes how EU hospitals have applied seven quality improvement strategies previously defined by the MARQuIS study: organisational quality management programmes; systems for obtaining patients’ views; patient safety systems; audit and internal assessment of clinical standards; clinical and practice guidelines; performance indicators; and external assessment.
Methods: A web-based questionnaire was used to survey acute care hospitals in eight EU countries. The reported findings were later validated via on-site survey and site visits in a sample of the participating hospitals. Data collection took place from April to August 2006.
Results: 389 hospitals participated in the survey; response rates varied per country. All seven quality improvement strategies were widely used in European countries. Activities related to external assessment were the most broadly applied across Europe, and activities related to patient involvement were the least widely implemented. No one country implemented all quality strategies at all hospitals. There were no differences between participating hospitals in western and eastern European countries regarding the application of quality improvement strategies.
Conclusions: Implementation varied per country and per quality improvement strategy, leaving considerable scope for progress in quality improvements. The results may contribute to benchmarking activities in European countries, and point to further areas of research to explore the relationship between the application of quality improvement strategies and actual hospital performance.
Quality and safety of patient care are high on the European policy agenda, as evidenced by various commitments by European health ministries. Patient mobility has clearly been a triggering factor. Governments may fear that differences in the perceived quality or costs of health services may encourage patients to cross borders to obtain healthcare.1–3 The fact that member states are now discussing what remains a national responsibility has increased the need for information about cross-border movement. The Methods of Assessing Response to Quality Improvement Strategies (MARQuIS) research project aims to provide a better understanding of this movement, by investigating and comparing different quality improvement (QI) strategies in healthcare systems across the European Union.
In this article we focus on the degree to which QI strategies are applied at European hospitals, by their own report. As presented in the Health Care Quality Strategies in Europe study, we identified seven national QI strategies.4 5 Our primary focus is on implementation of the strategies at the EU level; data at the country level are reported for reference purposes. The QI strategies we investigated were:
organisational quality management programmes;
systems for obtaining patients’ views;
patient safety systems;
audit and internal assessment of clinical standards;
clinical and practice guidelines;
performance indicators and measurements; and
external assessment.
The countries participating in this study were Spain, France, Poland, Czech Republic, the UK, Ireland, Belgium and the Netherlands.
MATERIAL AND METHODS
We conducted a web-based questionnaire survey. The questionnaire was developed to measure QI, defined as the application of quality policies and procedures, quality governance structures, and quality activities used to close the gap between current and expected levels of quality.4 To determine the distinctive aspects of QI we used several sources, such as existing QI questionnaires,6–9 a review of the quality literature,10–12 an analysis of accreditation manuals,11 13 14 and the results of previous MARQuIS studies including a literature review covering QI strategies in member states of the EU, and an analytical framework defining areas of QI policies and strategies. A glossary of quality concepts and tools was made available to participants.
The questionnaire consisted of four sections: one section focused on QI at the hospital level, the other three on quality management for specific medical conditions. The three medical conditions for focused data collection were selected based on two criteria: the condition had to represent a significant volume of cross-border patient care,15 and the combination of conditions was intended to cover the most relevant services offered by a hospital—that is, emergency surgical and medical services, and maternal and neonatal services. The three conditions selected were acute myocardial infarctions (AMI), acute appendicitis and deliveries. For each condition selected, the literature was searched for specific QI strategies. Search terms used included “quality assurance”, “quality improvement”, “quality assessment”, and “performance measurement”.16–31 We stopped searching when additional publications no longer resulted in new relevant QI strategies, activities or measures. Practising medical specialists were consulted for their comments and suggestions on the specific QI strategies, activities, and measures (see Acknowledgements).
Both members of the MARQuIS team and the nine country coordinators reviewed the draft questionnaire. (For Belgium two country coordinators were appointed, one for Flemish-speaking and one for French-speaking hospitals.) The questionnaire was then pilot tested in two hospitals in Ireland and the UK (these countries were chosen for language-related reasons), and a few amendments were made as a result. The questionnaire was translated into five languages (Spanish, French, Polish, Czech, and Dutch); the country coordinators were responsible for translation. A forward–backward translation protocol was used. Figure 1 shows the final structure of the questionnaire, which totalled 199 questions. For each of the four sections a preferred respondent (at the senior level) was suggested.
Various scoring scales were used depending on the type of question. Items were scored on a two-point scale (yes/no), a four-point scale (see tables 2 to 8 in Results), or a five-point Likert scale (1 = strongly agree, 5 = strongly disagree).
Sampling and recruitment
Our survey focused on European hospitals with a minimum of 100 acute care beds, and offering care for at least two out of the three conditions selected for study (AMI, appendicitis and deliveries). Two additional criteria defined the hospital sample: ownership status (public, private not-for-profit and private for-profit), and actual or potential cross-border care delivery.31 The target was to include a total of 600 hospitals covering eight countries. For smaller countries (Belgium, the Netherlands, Ireland, Czech Republic) this meant that all hospitals meeting the inclusion criteria were invited to participate. In the remaining countries, hospitals were randomly sampled from a list of hospitals that met the sampling criteria. Hospital recruitment was done by the country coordinators, who used different strategies. To make participation more appealing, hospitals were offered a package of benefits, including membership in the MARQuIS network, a certificate of participation, and free subscription to the project’s newsletter. Hospitals that agreed to participate in the survey received an e-mail inviting them to enter the MARQuIS website (http://www.marquis.be) and fill out the web-based questionnaire. On request, hospitals could instead complete a paper version of the questionnaire and return it to the researchers (MJMHL or IR). Data were collected from the beginning of April to the end of August 2006. Hospitals received up to three reminders. Again, country coordinators used different approaches to raise the response rates.
Validation of the data
To validate the questionnaire data, two analyses were performed by using data collected during on-site hospital visits in a selected sample of 89 hospitals that had previously completed the questionnaire. Visits were performed by independent external auditors. All aspects of on-site visits are described in detail elsewhere.32
During the visits, the hospitals’ key informants were asked to answer 25 questions that had been previously asked in the questionnaire. The reliability of the questionnaire was assessed by the level of agreement between the responses to these 25 questions as given in the questionnaire and during the on-site visits. In addition, for 14 of these 25 questions the external auditors requested evidence to check the answers given during the audit. Criterion validity was then assessed as the degree of agreement between the information provided by key informants and the evidence found to underpin this information. Reliability and validity were assessed as the index of expected agreement, which is the proportion of cases in which the results of both assessments matched.33
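The index of expected agreement described above is simply the proportion of paired assessments that gave the same answer. A minimal illustrative sketch of this computation (the function name and data are hypothetical, not drawn from the MARQuIS dataset):

```python
def agreement_index(questionnaire_answers, on_site_answers):
    """Proportion of cases in which both assessments gave the same answer."""
    pairs = list(zip(questionnaire_answers, on_site_answers))
    matches = sum(1 for q, v in pairs if q == v)
    return matches / len(pairs)

# Example: one questionnaire item answered by 8 hospitals, compared with
# the answer given to the on-site auditors for the same item.
questionnaire = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
on_site       = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes"]

print(agreement_index(questionnaire, on_site))  # 6 of 8 answers match -> 0.75
```

An item scoring above 0.90 on this index would fall in the highest agreement band reported in the Results.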
RESULTS
Response rates and study population
The country coordinators approached a total of 1396 hospitals, of which 483 accessed the web-based questionnaire and ultimately 389 submitted the completed questionnaire (table 1). Response rates varied per country. The study population consisted of public (80%) as well as private (20%) hospitals, and included university (23.5%), teaching (48.9%) and non-teaching hospitals (27.6%). The mean number of sites per hospital was 2.46. Almost a quarter of all hospitals were considering collaboration to deliver cross-border patient care, but few hospitals were doing so at the time of the study.15
Results of the validation process
Regarding reliability, comparison of the information obtained for 25 items from questionnaires and on-site visits resulted in the following ranges of agreement: for five items agreement was over 90%, for 12 items the level of agreement was >70% and <90%, and for eight items agreement was <70%. Given the period of 7–9 months between the questionnaire and the on-site visits, it is likely that at least some of the items studied had actually changed. In addition, the 14 items analysed to assess criterion validity resulted in the following levels of agreement: seven items had an agreement index of >90%, six items rated >70% and <90%, and one item scored an index of <70%. Based on these results, we considered the information collected with the questionnaire a fair approximation of the actual situation at participating hospitals.
A previous MARQuIS study34 identified seven mandatory QI strategies for hospitals. The extent to which these strategies are applied at European hospitals is reported below.
QI strategy 1: Organisational quality management programmes
Hospitals reported using quality management programmes in developing and implementing QI (table 2). Overall, the International Organization for Standardization (ISO) 9000 management system standards were used most often, and the European Foundation for Quality Management (EFQM) model was used least often. However, there were wide variations between countries. Belgium was the only country where the EFQM model was preferred by most hospitals (60.9%); in all other countries the ISO system was the dominant scheme. In Poland (50.0%), the Czech Republic (59.5%), and Spain (70.3%) the use of ISO was widespread. In Spain some hospitals seemed to use both schemes. Irish hospitals reported moderate use of a quality management programme (ISO 31.6%, EFQM 20.0%), but stressed the use of specific QI teams in most of their hospital departments, whether systematically (47.8%) or unsystematically (17.4%). French hospitals relied least on the programmatic approach with the ISO (28.6%) or EFQM system (3.4%), and the availability of QI teams was limited (27.7%).
Responsibilities for, and policies on, blood transfusion, the use of antibiotics, and hospital infection control were clearly assigned to a committee or person in almost all hospitals across Europe, with the exception of Czech hospitals, which reported significantly less clear organisation for all three hospital-wide functions. Spanish hospitals showed a gap in organising responsibilities for blood transfusion (87.5%), and Irish hospitals in the use of antibiotics (87%). Responsibility for the prevention of bed sores seemed less structured within European hospitals, varying from a reported 66.7% in Ireland to 95.8% in Belgium.
QI strategy 2: Systems for obtaining patients’ views
Monitoring the views of patients by systematically conducting patient surveys was a common practice in 64.5% of the participating European hospitals. The Czech Republic stood out, reporting that 91.9% of their hospitals systematically monitored patient views. These numbers refer to hospital-wide systems for collecting patients’ views on the care they received. At the department level, patients were asked at discharge for their opinion on the quality of care delivered by the hospital staff. In France this strategy was widely implemented, with approximately 65% of the hospitals reporting a policy to measure patients’ opinions at discharge. In Poland, this was a common practice in less than 14% of the hospitals. The rates varied significantly for the three medical conditions included in our study, and variation between countries seemed greater than within countries (table 3).
Across participating European hospitals, patient involvement seemed to be little developed. Hospitals were asked to identify the activities in which individual patients or patient organisations were always or almost always involved. Participation in the design of protocols or the development of standards was reported by 3% to 4% of all hospitals; participation in improvement projects or in quality committees was reported by less than 10%. Patient involvement was best implemented in France, with almost 40% of the hospitals reporting that they involved patients in the discussion of the results of patient surveys or complaints, and 32% stating that patients participated in quality committees.
QI strategy 3: Patient safety systems
Hospitals were asked how patient safety was organised and managed, whether the results were reported, and if so, how they were reported (table 4). Responsibility for patient safety was assigned to a committee or person in approximately 75% of the hospitals. At 39.1% of the hospitals a risk management programme or system was in place; 50% of the hospitals systematically reported and analysed adverse events, and 55.6% also reported complications to the medical staff. These are average numbers for Europe; variation between countries was substantial. Irish hospitals scored consistently high (>90%), and Belgian and Spanish hospitals relatively low on the availability of safety systems.
Specific safety questions addressed drug safety management and patient identification. In general, drug safety seemed to be assured in all participating hospitals: the use of drugs was standardised and controlled, and systems for storing, checking and preventing unauthorised access to drugs seemed well implemented. Electronic drug prescriptions were used widely only in Czech hospitals (86.8%). By comparison, in Ireland a mere 13% of the hospitals reported use of electronic prescriptions. For patient identification systems the findings were the opposite, with 100% of Irish versus 29.4% of Czech hospitals using bracelets to identify admitted patients.
QI strategy 4: Clinical guidelines
Clinical guidelines were widely used at participating European hospitals. Hospital-wide guidelines for preoperative assessment and prophylactic antibiotic use were in place in the vast majority (75–90%) of hospitals. In the Czech Republic and Ireland, guidelines for prophylactic antibiotic use were least widely used, versus 100% coverage in Belgian hospitals.
Laboratory work seemed to be highly standardised across the various types of laboratories across Europe. On average, standard operating procedures (SOPs) were available in approximately 90% of all hospitals. At the department level, availability of clinical guidelines or protocols was significantly less common. Specifically, guidelines were available for the management of patients with AMI (mean = 86.7%), appendicitis (mean = 54.3%), or obstetrical problems such as breech presentation (71.5%) and vaginal birth after caesarean delivery (64.5%). Between-country variation for the availability of clinical or practice guidelines was limited for hospital-wide guidelines and SOPs, but substantial for condition-related, department-level guidelines (table 5).
QI strategy 5: Performance indicators or measures
For the three medical conditions included in this study, hospitals were asked to report the availability of performance data for a selection of clinical indicators. Table 6 shows the findings. In summary, the availability of AMI performance data was most complete, averaging approximately 70% for the seven selected indicators. Poland reported the highest percentages, France the lowest. Performance data on the management of appendicitis were being collected for approximately 50% of the five indicators, varying from 42% for perforated appendicitis treated surgically 24 h after admission to 68.2% for wound infections. The Czech Republic and Poland performed best in this area. Lastly, the statistics for obstetrical indicators varied from 54.0% for the rate of vaginal birth after caesarean delivery to 85.3% for the percentage of caesarean deliveries. Obstetrical data were most complete in Poland and Belgium, and least complete in Ireland (table 6).
QI strategy 6: Internal audit and assessment of clinical standards
Medical staff performance was systematically reviewed at 50% of the participating hospitals, and peer review (site visits) was conducted at approximately 25%. Between-country variations were considerable. Belgium, Poland and the Czech Republic reported that over 60% of the hospitals performed medical staff performance reviews, versus 26.1% of Irish hospitals. However, Irish hospitals made more use of peer review (site visits) than any other European country (39.1%).
On average, 50% of the laboratories at European hospitals were periodically surveyed by an internal audit team. Percentages varied according to the type of laboratory and between countries. France reported generally low rates; in Poland internal auditing seemed broadly implemented. However, only a third of the Polish hospitals reported the results of internal audits to their governing boards, versus approximately 90% of the hospitals in the Czech Republic and Ireland. Polish hospitals shared results more openly with their medical staff (59.2%) than with their boards, although most other countries reported even higher staff-disclosure percentages. Belgium was the exception: only 40% of hospitals disclosed their results to the medical staff (table 7).
QI strategy 7: External assessment
Most hospitals (88%) had been assessed (at least in part) by an external organisation such as an accreditation (59.4%) or certification (49.4%) institute, a patient organisation (18.5%), or a government inspection body (66%). Some hospitals were audited by more than one organisation. In Spain, for instance, 64.8% of all hospitals (n = 88) reported being evaluated by an accreditation body, and 63.6% by a certification institute. In France (n = 63), 93.7% of all hospitals had been accredited, and in Ireland (n = 22) 90.9%. In Poland (n = 75), government inspections were the most frequently reported type of external evaluation (76%).
French hospitals reported being most open (92.3%), and Spanish hospitals the least open (19.8%) about their assessment results. On average, 52.9% of the hospitals in our sample publicly disclosed their assessment results (table 8). Most participating hospitals (84.3%) reported plans for re-evaluation within the next 3 years (not shown). Accreditation bodies were listed most frequently as the future assessors; in the Czech Republic, Ireland, the UK, and the Netherlands, more than 85% of the hospitals expressed this intention, and the figures were 78.1% for France and 77.3% for Spain.
Quality improvement (QI) strategies are widely used in European hospitals. The most widely applied QI strategy is external assessment of hospitals, whereas patient involvement in QI activities is the least widely applied
Reported implementation varies per country. This leaves considerable room for progress in making QI in hospitals a reality
Differences also suggest that, for various reasons, countries may prefer some QI strategies over others
Contribution to better patient care
International comparisons of the use of QI strategies can promote learning and the spread of good practice
The results of this study may be useful to national policy makers in monitoring the attainment of healthcare policy goals
Points for further research
Further research should focus on exploring the relationship between the use of QI strategies and the actual performance of hospitals, including the relative contribution of each of the seven QI strategies to performance
This study has some limitations. The response rates varied per country, and were particularly low in the UK and the Netherlands. This may be explained by the various approaches used to recruit hospitals for the MARQuIS project, and by questionnaire fatigue resulting from the widespread use of questionnaire surveys to evaluate healthcare performance. We therefore cannot rule out participation bias. Also, accuracy of the information is always a limitation when using self-reported data. However, the results of our validation process strongly suggest that the reported results are fairly accurate. Further, translation of the questionnaire, the use of jargon, and the involvement of people from various healthcare systems may have caused differences in how the items were interpreted. Lastly, hospitals may use local QI approaches or tools not included in this questionnaire, in which case the application of QI strategies, as described in this article, may misrepresent the “maturity” of hospitals’ quality management systems. These limitations should be taken into account when interpreting the results.
International comparisons can promote learning and the spread of good practice, and are one of the ways in which the European Community is expected to raise healthcare quality. This study of how European hospitals apply seven common QI strategies found considerable variation between the level of implementation of the different strategies—a finding that leaves considerable scope for progress in making QI a reality.
The use of QI strategies at the European level was determined or at least influenced by national and international policy making and regulation, as well as by national and local bottom-up actions initiated by professionals or others.35 In our study 88% of all hospitals reported having been externally assessed; the widespread application of the “external assessment” QI strategy can be ascribed to the fact that most countries have adopted one or more models of external assessment (ie, accreditation, certification or licensure) to ensure and improve hospital performance, which in turn has been related to financing healthcare delivery.
However, policies and regulations may not always be effective, as shown by the fact that in most hospitals (>90%), patient involvement in QI activities was lacking. This was despite the various legal and other efforts undertaken by the European Commission over the past decades to increase citizens’ participation in QI, and in the organisation and structure of health services in general.4 5 Future research should focus on detecting barriers to the implementation of these QI strategies. In this regard, efforts by the EU to facilitate improvements and foster European collaboration may help to further increase implementation.36 37
Legislation recently proposed by the European Commission stresses the values and principles of safe, high-quality health services that underpin European health systems. However, the question arises as to how these agreed-upon values and principles can be applied by member states.36 We believe our results may help national policy makers to monitor the attainment of healthcare policy goals. The application of more QI strategies, however, may not necessarily imply more positive effects on performance. Our findings would be even more valuable if the demonstrated use of QI strategies could be related to actual performance in hospitals. This would give EU policy makers direct input for monitoring the development of healthcare policies and regulations. Elsewhere in this supplement this relationship is explored in greater depth.
The authors wish to thank the European Commission for funding this research, and all those who supported and guided this work both within the MARQuIS project team and as external associates. In particular we would like to thank all the country coordinators for their valuable comments: B Kutryba (Poland), J Bañeres (Spain), P Doets and H Beaard (the Netherlands), C Bruneau and F Bousquet (France), A Jacquerye and A Vleugels (Belgium), I Stanek, S Zamarian and V Typovska (Czech Republic), H Crisp and A Cassell (UK), and E O’Connor (Ireland). We also express special thanks to the medical specialists who provided knowledgeable comments and suggestions on condition-specific questions: FJL Reijnders (Slingeland Hospital, Doetinchem, the Netherlands), GP Gerritsen (Tweesteden Ziekenhuis, Tilburg, the Netherlands) and AAM Wilde (Academic Medical Center, Amsterdam, the Netherlands). Lastly, we thank all the MARQuIS research and project partners for their continuing collaboration in this questionnaire survey: B van Beek, P Garel, O Groene, K Hanslik, P Poletti, C Shaw, E Spencer and K Walshe.
Funding: Research done for the “Methods of Assessing Response to Quality Improvement Strategies (MARQuIS)” project (SP21-CT-2004-513712) was funded by the European Commission through its Scientific Support to Policies action under the Sixth Framework Programme for Research.
Competing interests: None declared.
This is an open-access article distributed under the terms of the Creative Commons Attribution Non-commercial License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.