Abstract
Background Several countries have national policies and programmes requiring hospitals to use quality and safety (QS) indicators. To present an overview of these indicators, hospital-wide QS (HWQS) dashboards are designed. There is little evidence on how these dashboards are developed. The challenges faced in developing these dashboards in Dutch hospitals were retrospectively studied.
Methods 24 focus group interviews were conducted: 12 with hospital managers (n=25; 39.7%) and 12 with support staff (n=38; 60.3%) in 12 of the largest Dutch hospitals. Open and axial coding were applied consecutively to analyse the collected data.
Results A heuristic tool for the general development process of HWQS dashboards, containing five phases, was identified. In phase 1, hospitals make inventories to determine the available data and tend to focus too much on quantitative data relevant for accountability. In phase 2, hospitals develop dashboard content by translating data into meaningful indicators for different users, which is not easy given their differing demands. In phase 3, hospitals search for layouts that depict the dashboard content in a way suited to users with different cognitive abilities and analytical skills. In phase 4, hospitals try to integrate dashboards into organisational structures to ensure that data are systematically reviewed and acted on. In phase 5, hospitals want to improve the flexibility of their dashboards to make them adaptable to changing circumstances.
Conclusion The literature on dashboards addresses their technical and content aspects but overlooks the organisational development process. This study shows how both technical and organisational aspects are relevant in development processes.
- governance
- healthcare quality improvement
- performance measures
- quality measurement
- report cards
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
Introduction
Improving the quality and safety (QS) performance of hospitals has become increasingly important in recent years.1 2 Many countries, including the USA,3 Canada,4 Australia,5 Great Britain6 and the Netherlands,7 have established national QS policies, programmes and frameworks introducing QS indicators to measure performance in healthcare. These policies, programmes and frameworks require hospitals to use QS indicators to monitor and improve performance.8 9 To create an overview of QS indicators, most hospital boards commissioned the development of hospital-wide QS (HWQS) dashboards.10–12 HWQS dashboards are information delivery systems that present a clear overview of the QS indicators needed to achieve the desired objectives, thus enabling users to manage the QS performance of hospitals.13 Usage of these dashboards is likely to improve a hospital’s QS performance,13–15 as several studies show that deploying HWQS dashboards reduces, for example, infection rates,16 medication errors17 and fall incidents18 and improves air quality.19
However, several requirements should be met before these dashboards can be used as QS improvement tools.13 14 Research shows that these dashboards should: (1) provide content that aligns with the needs of users; (2) be designed in such a way that the content is easily comprehensible to a variety of users; (3) offer various functionalities that users can deploy to customise dashboard content and (4) display timely, complete and correct data so as to be perceived as valid and reliable by users.13–15 Unfortunately, research shows that hospitals often fail to develop HWQS dashboards that adhere to these requirements. Hospitals lack experience and expertise regarding dashboard development, making it unlikely that users will use these dashboards in their daily decision-making processes. This renders such dashboards ineffective as tools for QS improvement.20
As hospitals are clearly encountering problems in developing and working with these dashboards, it is important to examine the challenges they face and learn how to overcome them. To our knowledge, virtually no research has been conducted on these development and implementation challenges; what limited research there is originates in business studies that do not consider the complexities of hospital settings and care delivery processes.21 The studies that do consider hospital settings often do not focus on HWQS dashboards, but look instead at single-disease or single-treatment,22 financial23 or logistical dashboards.24 The few studies that do focus on QS dashboards concentrate on departmental QS dashboards, for example, radiology,25–27 nursing,28 neonatology29 or emergency room dashboards.30 31
Bridging this gap, we evaluated the development, implementation and work processes of HWQS dashboards in Dutch hospitals to answer the following research question: What challenges do hospitals face in the development, implementation and refinement processes of HWQS dashboards and how do they overcome them?
Methods
Design
To retrospectively examine the development process of HWQS dashboards, a qualitative study using semistructured focus group interviews was deemed most appropriate.32 Focus groups provide an opportunity for collective reflection on the process as they ensure that different viewpoints are represented, encourage the sharing of experiences and promote discussion.33 34 The interaction among respondents confirms, reinforces or contradicts the contributions of individual respondents, giving this method a high level of validity.33 34
Data collection and analysis
Since larger hospitals contain more departments and cover more specialties than smaller hospitals, it is harder for them to keep track of all QS indicators. This invokes the need for oversight, resulting in the development of HWQS dashboards.35 Therefore, the 12 largest hospitals (>500 beds) were included (9% of all Dutch hospitals). Eight were academic hospitals, two were teaching hospitals and two were general hospitals.
Hospital boards provided the researchers with a list of possible respondents, selected on the basis of the researchers’ predefined list of jobs/roles. Respondents were invited by email to participate in the study and, following their consent, took part in the focus group interviews conducted at their hospital. In total, 12 interviews were held with groups of hospital managers (n=25; 39.7%) from various managerial levels (board, division, department) and 12 interviews were held with support staff (n=38; 60.3%) from various departments (quality management, information technology, business intelligence). Each focus group interview lasted about 90 min and was moderated by one experienced senior academic staff member in the research team (in casu: AMW and MdB). The topic list for the semistructured focus group interviews was derived from the literature and included questions on the definition and appearance of the dashboard, the development and implementation process, the challenges encountered, respondents’ opinions on what was successful or hindering, ways of dealing with challenges and their views on the current and future use of the dashboard. During the focus group interviews, the history of the development process in the respondents’ organisation was used to steer the discussion. The moderator made summaries of the discussion and asked the respondents to reflect on these summaries and adjust or amend them. Examples of the dashboards (pictures) and documents (policy reports) were collected to interpret and contextualise the interview data.
The focus groups were transcribed verbatim and anonymity was ensured by withholding names and organisations from the transcripts. A member check was conducted by asking respondents to assess the transcript of their interview. Transcripts and examples were analysed by first deploying open coding to develop a coding framework. Next, axial coding was used to assemble related codes into overarching categories (definition, types, purpose, development stages, roles in development stages, challenges, validity and reliability).36
Results
All the studied hospitals have some sort of HWQS dashboard, differing in content (eg, type of indicators, visual presentation) and serving a variety of purposes. The national safety programme that started in 2008 marks the starting point for most Dutch hospitals in developing HWQS dashboards, because this programme focused on the measurement of QS indicators.37 The ‘maturity’ of the IT systems (eg, electronic patient records, financial systems) and the attention of the hospital board to dashboard development influenced the speed at which HWQS dashboards were developed in a hospital. Despite differences in the development process, all hospitals (n=12) display the safety indicators of the Dutch national safety programme (eg, pressure ulcers, medication verification, infections, pain scores and mortality rates) and most hospitals (n=8) report patient indicators (eg, patient complaints, patient satisfaction research). A few (n=3) depict improvement indicators (eg, safety rounds, improvement actions and safety culture measures).
The way hospitals prioritise and combine QS indicators depends on the purpose they ascribe to HWQS dashboards. The purposes range from providing external accountability (n=12), monitoring internal and/or strategic policies (n=12), improvement and learning (n=12), signalling new developments (n=10), encouraging creative dialogue (n=8) and internal and external benchmarking (n=5) to initiating improvement initiatives (n=3). Although hospitals differ in terms of content and purpose, they agree that the dashboard should display crucial QS indicators that enable them to monitor QS objectives and detect improvement opportunities at a glance.
‘It’s like a dashboard in a car that tells you about your speed, fuel, lights, and the weather. It lets you know if you’re on the right track regarding the defined objectives, no matter whether these come from outside [the hospital] or not…. In general, that’s what a dashboard means to me. Only thing is, a car dashboard is simple; our dashboard is complex.’ (H1—Management—P4)
Respondents argue that using HWQS dashboards can improve QS performance if they are designed properly. Based on our data, we constructed a heuristic model of the development process of these dashboards, broken down into five phases. This model is based on the lessons learnt expressed by the respondents, either as ‘good practices’ (what would we do the same way again) or as problems they encountered and how they solved them. Each phase contains its own challenges that developers should overcome before progressing to the next phase. It should be noted that hospitals occasionally deviated from this development process due to internal circumstances, or did not fully address the challenges of a previous phase, which resulted in their moving back and forth between the different phases. Our model is based on the respondents’ explanations and assessments of their development process. It should not be regarded as an exact blueprint of the data, but rather as an overview of possible challenges associated with the various phases in the development process of HWQS dashboards.
Phase 1: data inventory
The development process starts by making an inventory of QS data usable for constructing QS indicators. Respondents state that it is challenging for hospitals to make a comprehensive inventory because of the one-sided nature of their QS data.
All hospitals collected QS data for accountability to external stakeholders (eg, healthcare inspectorate, patient associations, external registries). This results in dashboards that depict QS indicators relevant to external stakeholders and not necessarily relevant to internal purposes (n=12). This limits the effectiveness of HWQS dashboards according to our respondents, since users believe that externally driven QS indicators fail to represent their personal QS performance.
When making inventories, most hospitals focus on easily usable quantitative QS data and disregard qualitative QS data (eg, patient complaints, audit results), as the latter is not easily converted for display on dashboard templates (n=10). Although some respondents dismiss qualitative data as inconclusive, most argue that it provides necessary context and insights into phenomena that are not quantifiable.
‘Yes, but our dashboard needs to go far beyond numbers only, because not everything can be presented in numbers. For instance, complaints, you can’t just say: “Patients are filing so-and-so many more or so many fewer complaints”, because that’s meaningless. You have to discuss the nature and content of the complaints; that gives far more valuable information.’ (H3—Management—P1)
Although most hospitals would rather not burden their employees with more administrative tasks, some allow current data collection to expand slightly by institutionalising new data collection sources (eg, discharge interviews, safety rounds) (n=3).
Thus, in this first phase the challenges are to balance the needs of external and internal stakeholders and to combine qualitative and quantitative QS data, without increasing the administrative burden.
Phase 2: dashboard content
In this phase, hospitals deploy the QS data derived from their data inventories to develop dashboard content. This consists of QS indicators (measures representing aspects of QS performance), the appurtenant numerators and denominators, and the connected norms by which QS performance is assessed. Respondents state that it is challenging for hospitals to develop and prioritise useful dashboard content due to differing needs for specification. This results in ongoing discussions about the validity and reliability of the data presented on their dashboards.
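To make this structure concrete, the following minimal sketch (in Python) expresses a QS indicator as a numerator, a denominator and a connected norm; the example figures, field names and the assumption that higher rates are better are illustrative only and do not reconstruct any studied hospital’s actual system.

```python
from dataclasses import dataclass

@dataclass
class QSIndicator:
    """A QS indicator as described above: a numerator and a
    denominator yield a rate, assessed against a connected norm."""
    name: str
    numerator: int    # eg, admissions with a registered pain score
    denominator: int  # eg, all admissions in the period
    norm: float       # target rate agreed within the hospital

    @property
    def rate(self) -> float:
        return self.numerator / self.denominator if self.denominator else 0.0

    def meets_norm(self) -> bool:
        # Assumes higher is better; some indicators would invert this.
        return self.rate >= self.norm

# Hypothetical example: pain score registration against a 90% norm.
pain = QSIndicator("pain score registered", numerator=830,
                   denominator=1000, norm=0.90)
print(f"{pain.name}: {pain.rate:.0%} (norm met: {pain.meets_norm()})")
```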
To develop content that improves QS performance, hospitals deployed three methods: (1) some hospitals adopted evidence-based frameworks recommended by external stakeholders (n=3); (2) some developed content based on the opinion of specific medical experts within the hospital, who defined the relevant QS indicators (n=6) and (3) others established a process for developing dashboard content in which managers (board members, middle managers) collaborated with support staff (QS staff, business intelligence staff, information technology staff) and healthcare professionals (physicians, nurses) (n=3). Respondents emphasise that involving healthcare professionals in developing content is important, believing that this makes these stakeholders more likely to endorse and identify with the content.
‘QS indicators are deemed more credible if people are involved in the development process. That is what we notice. And the more reliable and valid QS indicators are, the more people want to deploy them to manage QS performance.’ (H10—Support staff—R6)
The dashboard content often reflects a compromise between the needs of users from different organisational units, hierarchical levels and professional fields. Ideally, according to our respondents, HWQS dashboards display a combination of generic process-oriented QS indicators, used by managers to compare and assess QS performance at the hospital or departmental level, and specific outcome-oriented QS indicators, used by healthcare professionals to evaluate personal or patient results. Additionally, hospitals consider adding action indicators to this mix, which are useful for assessing the actions taken after certain clinical outcomes occur.
‘We’re just starting the process. For instance, we only administrate when something gets done, which is a challenge in itself. For example, measuring the pain score. However, we can’t assess if that action is appropriate. For example, if the pain is treated with medication. We can only assume that the action was okay, we can’t show that on our dashboard since we don’t measure it—it’s not mandatory in our national indicator system.’ (H8—Support staff—R3)
Valid and reliable measurement of dashboard content is further complicated by the storage of QS data in fragmented and incompatible source systems (eg, patient health record systems, human resource systems, accounting systems), making it difficult to extract the dashboard content. Nevertheless, some hospitals manage to extract content directly from the underlying source systems (n=4), while most maintain data warehousing systems to store and organise the QS data derived from these systems (n=8).
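As a rough illustration of the warehousing approach most hospitals chose, the sketch below harmonises extracts from three incompatible source systems into one shared warehouse schema; the file names, column names and schema are hypothetical assumptions for demonstration, not the systems the respondents described.

```python
import pandas as pd

# Hypothetical extracts from three incompatible source systems.
ehr = pd.read_csv("ehr_pressure_ulcers.csv")         # patient record system
hr = pd.read_csv("hr_staffing.csv")                  # human resource system
complaints = pd.read_csv("complaints_registry.csv")  # complaints registry

def to_warehouse_rows(df: pd.DataFrame, indicator: str,
                      value_col: str) -> pd.DataFrame:
    """Map one source extract onto the shared warehouse schema:
    one row per (department, month, indicator, value)."""
    out = df.rename(columns={value_col: "value"})
    out["indicator"] = indicator
    return out[["department", "month", "indicator", "value"]]

# A single long table from which dashboard content can be extracted.
warehouse = pd.concat([
    to_warehouse_rows(ehr, "pressure_ulcers", "ulcer_count"),
    to_warehouse_rows(hr, "staffing_level", "fte"),
    to_warehouse_rows(complaints, "patient_complaints", "complaint_count"),
], ignore_index=True)
```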
The main challenges in this second phase are to overcome the discussions about the validity and reliability of the indicators and to extract relevant content from existing IT systems.
Phase 3: dashboard design
In this phase, hospitals design a dashboard layout that supports the visualisation of content. Respondents say that it is challenging for hospitals to design an inclusive layout that is comprehensible to users with differing executive duties, cognitive abilities and analytical skills.
The graphical presentation should fit the purpose. To achieve this congruence, hospitals often rely on the following graphics: bar and column charts to display comparisons; scatter and bubble charts to demonstrate relationships; line and column histograms to present distributions; donut, pie and waterfall charts to show composition; and run and area charts to depict progress. Respondents add that these charts are only effective if they are continuously updated and show real-time QS data, which increases the likelihood of users identifying with and acting on the QS data.
‘I find it important to always be aware of how fast I’m driving, so that I can adjust my speed and avoid getting a fine. For me, a quality dashboard is a tool that helps me manage my performance operationally and in real-time.’ (H6—Management—R1)
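The purpose-to-chart pairings reported above lend themselves to a simple lookup table, sketched below; the pairings come from the respondents, but the table and selector function are illustrative assumptions of ours.

```python
# Chart types the hospitals pair with each analytic purpose.
CHART_FOR_PURPOSE = {
    "comparison":   ["bar chart", "column chart"],
    "relationship": ["scatter chart", "bubble chart"],
    "distribution": ["line histogram", "column histogram"],
    "composition":  ["donut chart", "pie chart", "waterfall chart"],
    "progress":     ["run chart", "area chart"],
}

def default_chart(purpose: str) -> str:
    """Propose a default chart type for a given analytic purpose."""
    try:
        return CHART_FOR_PURPOSE[purpose][0]
    except KeyError:
        raise ValueError(f"no chart mapping for purpose: {purpose!r}")

print(default_chart("progress"))  # -> run chart
```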
The respondents suggested that, to be broadly comprehensible, HWQS dashboards should also use colour to clarify content. Almost all hospitals use some sort of ‘traffic light’ colour coding: red for bad performance, orange for mediocre performance and green for good performance, relative to the norm (n=11). However, respondents indicate that users do not always respond well to traffic light coding, as they often feel ashamed if their performance lingers in the red too long, which discourages them from acting on the dashboard. Therefore, respondents recommend using neutral colour coding, which only one hospital is actually doing.
‘It is often demotivating for departments if they hang about in red for a long time. That is why we decided not to use red or orange in the new dashboard design, but focus instead on the green area to show improvement in performance and emphasize progress.’ (H5—Support staff—R5)
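A minimal sketch of both colour schemes follows, assuming an indicator rate where higher is better; the 5-percentage-point ‘orange’ band is an assumption of ours, as the respondents did not specify thresholds.

```python
def traffic_light(rate: float, norm: float, margin: float = 0.05) -> str:
    """The 'traffic light' coding most hospitals use, relative to the norm."""
    if rate >= norm:
        return "green"   # good performance
    if rate >= norm - margin:
        return "orange"  # mediocre performance
    return "red"         # bad performance

def neutral_coding(rate: float, norm: float) -> str:
    """The neutral alternative one hospital adopted: emphasise progress
    in the green area rather than flagging poor scores in red."""
    return "green" if rate >= norm else "neutral"

print(traffic_light(0.87, norm=0.90))   # -> orange
print(neutral_coding(0.87, norm=0.90))  # -> neutral
```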
Most respondents argue that clarifying text also contributes to comprehensibility. Therefore, all hospitals provide textual clarification in their HWQS dashboards (n=12). These texts explain possible causes and the meaning of results (n=10) and suggest possible actions for QS improvement (n=4). Most hospitals provide extensive text, as they assume that users do not have time to interpret dashboard content themselves. However, some hospitals provide minimal textual clarification to stimulate reflection. These clarifications are prepared by support staff (n=4) or by the organisational department responsible for the measurements on the dashboard (n=8).
‘Our intention is to provide minimal textual clarification on dashboards to stimulate users to find answers for themselves. That’s what we want, but we are aware of the threshold. Not everyone does it and so we also give some clarification.’ (H11—Support staff—R2)
Finally, for optimal comprehensibility, HWQS dashboards should be equipped with several functionalities that enable users to tailor the content to their specific needs, as especially the management respondents explained. Most of the current dashboards have drill-down functions that let users specify statistics down to the level of individual patients or employees (n=10). Furthermore, most also give users filtering functions that enable them to sort statistics by patient features (eg, age, gender, diagnosis) (n=9). Additionally, some dashboards are equipped with alerts that notify users of unusual deviations in QS performance (n=6). Although respondents often state that they would like to be able to save their personal settings for dashboard content, few hospitals provide this functionality (n=2).
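By way of illustration, the sketch below mimics these functionalities (filter, drill-down and alert) on a hypothetical record-level dataset; the column names, norm and thresholds are demonstration assumptions only.

```python
import pandas as pd

# Hypothetical record-level QS data underlying one dashboard tile.
records = pd.DataFrame({
    "department": ["surgery", "surgery", "cardiology", "cardiology"],
    "age": [71, 54, 80, 63],
    "gender": ["f", "m", "f", "m"],
    "pressure_ulcer": [1, 0, 1, 1],
})

# Filter function: sort statistics by patient features.
elderly = records[records["age"] >= 65]

# Drill-down function: from hospital-wide figures to department level.
by_department = records.groupby("department")["pressure_ulcer"].mean()

# Alert function: flag unusual deviations from an assumed norm.
NORM = 0.25
for dept, rate in by_department[by_department > NORM].items():
    print(f"ALERT: {dept} pressure ulcer rate {rate:.0%} exceeds norm")
```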
Thus, the challenge is to find a layout that suits the needs of different users and provides understandable charts (in the right form and colours), clarifying text and sorting functions.
Phase 4: integrating evaluation
In this phase, hospitals integrate evaluation of HWQS dashboards into organisational structures. Respondents state that hospitals find it challenging to motivate managers, support staff and healthcare professionals to review the dashboard systematically due to lack of time, high workload and irregular schedules. Therefore, to facilitate a structural review of indicator outcomes and of the dashboard design itself, hospitals embed HWQS dashboards in their quality management cycles (eg, the Plan-Do-Check-Act, Plan-Do-Study-Act, Define-Measure-Analyse-Improve-Control and Define-Measure-Analyse-Design-Verify cycles).
HWQS dashboards are used in these improvement cycles to measure and monitor the progress of quality initiatives and to provide the necessary input to structure the meetings dealing with quality management cycles. The structure of these meetings differs greatly: some hospitals have established elaborate forms to promote collective evaluation (n=7), while other hospitals direct their efforts at improving the technical aspects (n=5). Despite the differing approaches, all hospitals deploy HWQS dashboards as input for periodic meetings at which individual healthcare professionals and managers are held accountable for their QS performance (quality control) and new QS performance objectives (quality planning) are defined (n=12).
Most hospitals also deploy HWQS dashboards as input for periodic strategy meetings at which people from various disciplines and levels discuss the content and collectively develop initiatives for quality deficiency prevention and quality improvement (n=8).
‘They [managers and healthcare professionals] gather once a month. We present some content, on complaints or mortality rates, for instance. They consider the trends and compare current numbers to those of previous years. They look at important items: “What do we see and how does it compare to other departments?” These coalitions of leaders play a key role in analyzing information and communicating with departments. It’s really starting to work well.’ (H3—Support staff—R4)
The main challenge of this fourth phase is to embed collective discussion of the content of HWQS dashboards in quality deficiency prevention (quality assurance) and continuous quality improvement processes.
Phase 5: improving dashboard flexibility and connectivity
At this point in the development process, the dashboards are an integral part of QS improvement efforts. However, their static nature and predetermined content make them less flexibly connected to their ever-changing organisational environment, which decreases their relevance. In hospitals that have invested in rigid software tools, flexible connection of HWQS dashboards is even less likely (n=5). Changing the dashboard requires knowledge of IT systems, the skills to extract data from these systems and sometimes also authorisation to access them. If hospitals cannot do this themselves, experts need to be hired, which is costly.
As hospitals are subject to continuously changing national QS regulations, legislation and policies, respondents argue that HWQS dashboards should be flexible enough to adjust to the requirements and priorities of external stakeholders. This can be done by connecting their dashboards to the data systems or dashboards operated by external stakeholders (eg, healthcare inspectorate, patient associations, external registries). Only a few hospitals have achieved this (n=3).
As the QS field is entangled with other disciplines (eg, human resources, marketing, finance), respondents argue that HWQS dashboards should be flexible enough to depict content from other internal dashboards. That would permit contextual analysis and multidisciplinary decision-making. Although respondents argue that HWQS dashboards should combine content from different fields, only two hospitals have actually achieved this.
‘We used to have all kinds of individual reports and now we are looking for ways to combine the fields of quality, finance and human resources to allow for contextual analysis. Ideally, we’d like a dashboard that brings together these different fields in a clear structure.’ (H1—Support staff—R4)
In this fifth phase, mutually intertwined dashboards become crowded with a variety of indicators so that they lack visual simplicity. Therefore, most dashboards have flexible content, consisting of partly exchangeable QS indicators, which can be added or removed if relevant to (departmental) context (n=9). Respondents argue that users should be able to adjust the content to match their specific context and highlight those QS indicators that need attention.
‘I can decide what gets displayed on the dashboard. If I think healthcare professionals will miss an indicator it is my responsibility to put the right one on … But if they do well for a long time on an indicator, it’s not useful to keep showing it and then I’d rather display some other indicator that I think is more relevant. It all depends on which indicators I want to highlight.’ (H2—Support Staff—R5)
Therefore, this final phase of the development process is focused on improving flexibility and enriching content by connecting HWQS dashboards to other internal and external measurement systems.
Discussion
We retrospectively examined the development process of HWQS dashboards in Dutch hospitals. We tried to find a typology of connected steps, intertwined problems or a general path in our data, but found no common ground for this; the relatively small sample size (only 12 hospitals) might be the cause. However, we did find common ground in the challenges that need to be addressed and overcome (see table 1 for an overview). Our study defines a heuristic model that organises these challenges into the five phases of the development process.
The literature suggests that data availability is a crucial precondition for the development of dashboards.13 In line with this, the present study confirms that hospitals consider data availability a priority, as they make data inventories beforehand to determine the available QS data (phase 1).13 21 38 However, this presents hospitals with a challenge, as the available QS data are often quantitative and summative in nature (used for external accountability38), while users also desire qualitative and formative QS data (used for internal quality improvement38), confirming earlier findings.13 21 Therefore, hospitals extend the scope of their data inventories with other types of data (audit results, safety culture assessments, patient feedback).39
Previous research shows that actual dashboard development often starts with the translation of available QS data into useful dashboard content.13 21 Accordingly, this study shows that hospitals proceed by developing useful dashboard content (phase 2), which is challenging as users have different needs. Corroborating earlier studies,13 some hospitals establish processes in which different users collaborate on developing dashboard content. Focus groups could be used to gain an understanding of the needs and wishes of the users. Two other methods are also deployed to develop dashboard content: (1) using external evidence-based frameworks and (2) deploying expert opinion. Ideally, these processes result in HWQS dashboards that display a combination of process, outcome and action indicators. To obtain this balance, fragmented source systems impeding varied data extraction should be eliminated; according to Kroch,15 data warehousing systems are the solution for this.
The literature shows that dashboard development often continues by designing a layout capable of depicting content conveniently.13 14 21 This study shows that hospitals continue the development process by designing broadly comprehensible dashboards (phase 3). This is challenging due to the varying tasks, skills and abilities of users. To achieve a broadly comprehensible layout, hospitals should ensure that the real-time graphic/visual presentation of content fits the purpose of the dashboard.13 14 Hospitals also employ colour coding to clarify content, corroborating previous research.14 However, the often-used ‘traffic light’ colour coding can be discouraging, as users do not like to linger in the red for too long. Surprisingly, no hospital taught healthcare professionals or managers to understand statistical measurements and the related graphics to help them understand the dashboard. This study also shows that hospitals provide textual explanations to clarify dashboard content. Finally, to enable users to customise dashboard content to their own needs (ie, learning and improving), hospitals add three main functionalities, namely drill-down,21 filter14 and alert functions.13 Other functionalities suggested in the literature (eg, forecasting,13 scenario analysis,21 bookmarking13) were not integrated into the examined dashboards.
While the first three phases focus on the technical aspects of dashboard development, the remaining two phases take into account organisational aspects, which also include the implementation and adjustment of dashboards. Several studies emphasise that dashboards become more effective when their content is frequently reviewed.10 15 40 This is challenging, as users often fail to review HWQS dashboards due to, for example, a lack of time or technical problems with IT systems that are not easy to change. Therefore, hospitals embed these dashboards in quality management cycles (phase 4) to monitor quality initiatives and provide input for two types of meetings dealing with the quality management cycles, namely accountability meetings and (strategy) policy meetings.
Although the studied dashboards provide input for meetings, their content remains static and predetermined, making them less flexibly connected to their changing organisational environment. This finding is consistent with the literature.13 Therefore, hospitals are challenged with improving flexibility and connectivity (phase 5). Our findings show that hospitals improve flexibility and connectivity by allowing for variation in dashboard content among users and departments, as stated in earlier studies,41 and by connecting HWQS dashboards to other internal and external data systems and dashboards to induce sensitivity to their environment and facilitate contextual analysis of QS performance.
Remarkably, the literature on dashboard development mostly addresses the technical aspects of development processes (eg, securing data quality, ensuring data availability, constructing dashboard content), while overlooking the organisational aspects (eg, ensuring frequent review, determining dashboard content, establishing situational adaptability). In contrast, this study acknowledges that HWQS dashboards can improve QS performance only when they are technically adequate and embedded in the organisation. Combining technical and organisational aspects into one comprehensive development process is the contribution of this study to the literature.
In this study, we found commonalities in the challenges faced by the 12 studied hospitals and in the ways of dealing with these challenges. One could ask whether it would be possible to speed up the development process by setting boundaries at the national level or by hospitals developing dashboards in collaboration. The context of our study is the Dutch market-based healthcare system, in which a central countrywide electronic patient record system is lacking and the government does not provide guidance, nor has the task to do so. Additionally, in some of the phases our study showed the importance of discussion within an organisation, both to develop the content of a dashboard (choice of indicators, making definitions, finding a useful layout) and to build support for, and therefore the use of, the HWQS dashboard.
Limitations
This study included only HWQS dashboards, which limits the generalisability of the results and makes further research into the development of other types of dashboards necessary. Although we retrieved valuable data by interviewing hospital managers and support staff, it could also be useful to examine the opinions of healthcare professionals in more depth. In this study, our focus groups stimulated interaction among respondents, thus enriching data collection. However, focus groups can also induce group pressure, resulting in socially desirable responses.
Given that this study examined the development process of HWQS dashboards retrospectively, based on respondents’ experiences, we were not present during the process itself; this invokes the need for research that studies such processes in real time using ethnographic methods.
Conclusion
This study retrospectively examined the development processes of HWQS dashboards in Dutch hospitals. It uncovered several challenges that need to be addressed to establish an HWQS dashboard. Our findings are relevant to hospitals looking to develop a new HWQS dashboard or improve an existing one, as the findings will enable them to deal with the challenges and learn from the practices of the studied hospitals. We found a common development process for HWQS dashboards that contains five phases (see table 1).
References
Footnotes
Contributors AMJWMW contributed to the study design, acquisition of data, analysis, interpretation of data and prepared the manuscript. DSEB contributed to the interpretation of the data and prepared the manuscript. MdB contributed to the study design, acquisition of the data and commented on manuscript drafts. All authors have approved the final manuscript.
Funding The research was funded by the Citrien Fonds of ZonMW (grant 8392010042) and conducted on behalf of the Dutch Federation of University Medical Centers’ Quality Steering programme.
Competing interests None declared.
Patient consent Not required.
Ethics approval Ethics approval for this study is not necessary under Dutch law as no patient data were collected. Written consent was obtained from the respondents.
Provenance and peer review Not commissioned; externally peer reviewed.
Data sharing statement The datasets supporting this article are available in the repository of the Erasmus University Rotterdam. Contact the authors for access.