‘Show me the data!’ Using time series to display performance data for hospital boards
  1. Christine Soong1,
  2. Chaim M. Bell2,3,
  3. Paula Blackstien-Hirsch4
  1. 1Department of Medicine, University of Toronto Temerty Faculty of Medicine, Toronto, Ontario, Canada
  2. 2Medicine, Sinai Health System, Toronto, Ontario, Canada
  3. 3Medicine and Health Policy, Management, and Evaluation, University of Toronto, Toronto, Ontario, Canada
  4. 4Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada
  1. Correspondence to Dr Chaim M. Bell, Medicine, Sinai Health System, Toronto, Canada; Chaim.Bell{at}


Core to the role of a hospital board is establishing organisational strategy and multi-year priorities, ensuring processes are in place for risk prevention and mitigation, and overseeing progress on strategic outcomes relevant to multiple quality domains. This role is distinct from that of the hospital management team (comprising managers and clinical and operational leaders), which is responsible for daily operations and for implementing the hospital’s strategic plan.1 While senior executives such as Chief Executive Officers may sit on both boards and management teams in some countries (such as the UK and Canada), the roles of boards and management teams differ: boards focus on long-term strategic planning, while management focuses on day-to-day operations. To support these different roles, hospital boards and management teams both need access to data in graphic formats that help differentiate random variation from significant change, although the required elements may differ. Hospital management should have access to, and be familiar with, performance metrics to make day-to-day decisions that ensure high quality of care, in addition to regular (eg, monthly) review of performance metrics linked to strategic priorities to understand their impact on outcomes over time. Examples of performance outcomes include rates of hospital-acquired infections, falls, medication adverse events and readmissions, as well as patient experience measures. In addition to outcome measures, hospital management teams also require access to process measures, which support problem solving during implementation of the quality and safety strategies linked to strategic outcomes.

Hospital boards generally focus attention on measures that answer questions about risk: How safe are we now? If we are striving to be safer in a particular area, such as sepsis or pressure injuries, are we moving in the right direction, and are we at target? If not, what are we doing to achieve target performance? Is the observed change significant? Hospital boards are ultimately accountable for the quality of care delivered in hospitals, and data review is therefore a key component of effective hospital board governance. The literature underscores the need for concise, standardised, timely information presented to board members in formats that are meaningful and easily interpretable.2–7 However, there is little description of, or guidance on, the most effective format for scorecards, and thus variability exists in how data are displayed.8 9 Data are often presented solely in dashboards with average indicator scores (usually by quarter) over the course of 1 year, colour-coded red, yellow or green depending on how close the averages are to the expected targets.10 However, these dashboards can be misleading: they include very few data points (four quarters at most, which lack detail on trends over time), and the colour-coding depends on how narrow the corridors are relative to target performance, meaning a measure can be ‘red’ even if it is improving over time and has only narrowly missed the target. Charts prepared for boards should include monthly data points in graphic format over a longer period of time so that trends can be appreciated, and should denote whether an observed change is significant, thus avoiding erroneous conclusions tied to random variation.11 12

Considering chance variation to inform hospital management teams and boards

Statistical process control (SPC) charts, a type of time series chart, are commonly used in quality improvement because they enable a clear distinction between statistically significant and random variation.13 14 An advantage of SPC charts is that they allow detection of whether a process is ‘in control’, that is, whether variations from the mean are part of a stable process occurring between upper and lower control limits. SPC charts can therefore distinguish variation secondary to an external force, or ‘special cause’, from change that occurs due to random chance. Such ‘special cause variation’ may be attributable to a quality improvement intervention and can provide convincing statistical evidence of change, provided the process was stable before introduction of the intervention.15 For example, an SPC chart of catheter-associated urinary tract infection rates on a medical ward showing special cause variation after the implementation of a quality improvement initiative aimed at early urinary catheter removal and catheter insertion avoidance would provide compelling evidence to the hospital management team and the board that the intervention was effective in improving care. Rather than tracking outcomes over time within a hospital or unit, funnel plots can be used for between-group comparisons, with units or hospitals outside the control limits identified as having outlier performance, either significantly worse or significantly better.16
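The logic of control limits and special cause detection can be made concrete with a small sketch. The code below is purely illustrative and not from the study: it computes limits for an individuals (XmR) chart using the standard Shewhart constant (mean ± 2.66 × average moving range) over a baseline period, then flags post-intervention months that fall outside those limits. The infection counts and function names are hypothetical.

```python
# Illustrative sketch only: XmR (individuals) control chart limits computed
# from a baseline period, then used to judge post-intervention months.
# Data values and helper names are hypothetical, not from the study.

def xmr_limits(values):
    """Centre line and control limits: mean +/- 2.66 * average moving range."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

def special_cause_points(values, lcl, ucl):
    """One simple special-cause rule: any point outside the control limits."""
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

baseline = [6, 8, 7, 9, 6, 8, 7, 8]  # monthly infection counts, pre-change
post = [2, 1, 2, 1]                  # counts after early catheter removal work
centre, lcl, ucl = xmr_limits(baseline)
print(f"centre={centre:.2f}, limits=({lcl:.2f}, {ucl:.2f})")
print("post-intervention months outside limits:",
      special_cause_points(post, lcl, ucl))
```

Because all four post-intervention months fall below the lower control limit derived from the stable baseline, this would count as special cause variation rather than chance, which is the statistical case an SPC chart lets a management team put before the board.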

In healthcare, the application of SPC charts has been encouraged among quality improvement teams, managers and executives for day-to-day decision-making.13 One example might be reviewing falls data and investigating data points outside the upper control limit to determine whether the higher number is cause for concern. Another might be deciding whether to commit additional funding to spread a falls prevention intervention that has clearly demonstrated impact beyond chance throughout the organisation. Collectively, the hospital management and operations teams work to operationalise strategic quality improvement plans and should use SPC charts to inform their work.
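Day-to-day review of this kind typically pairs control limits with run rules. As a hedged illustration (the rule threshold and data are hypothetical, though a run of eight same-side points is a commonly used convention), the sketch below flags a sustained shift when eight consecutive monthly values fall on the same side of the centre line:

```python
# Hypothetical sketch of one common SPC run rule: a shift is signalled when
# `run_length` consecutive points fall on the same side of the centre line.

def shift_detected(values, centre, run_length=8):
    """Return the index at which a same-side run completes, or None."""
    run, side = 0, 0  # side: +1 above centre, -1 below, 0 none yet
    for i, v in enumerate(values):
        if v == centre:        # a point on the centre line breaks the run
            run, side = 0, 0
            continue
        s = 1 if v > centre else -1
        run = run + 1 if s == side else 1
        side = s
        if run >= run_length:
            return i
    return None

falls_per_month = [5, 6, 4, 7, 5, 3, 4, 3, 2, 3, 2, 3, 2, 3, 2]
print(shift_detected(falls_per_month, centre=4.5))  # index of 8th low month
```

A signal from such a rule is what would prompt a manager to ask whether a falls prevention intervention has produced a genuine, sustained change rather than a lucky run of months.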

In contrast to management, hospital boards have a different role—one where they hold the hospital management team accountable for achieving results on outcomes linked to strategic priorities. To support this role, hospital boards should review performance data—but in what format and how? Should SPC charts, run charts or other formats be used for data display? There is evidence that regular board review of quality metrics is associated with strong hospital performance, although a causal link has yet to be established.17 However, specific recommendations on the use and impact of SPC charts at the level of hospital boards are lacking, and uptake at the board level appears limited. In a previous study of hospital board data reports among randomly selected English National Health Service (NHS) trusts, only 6% of charts submitted to the board depicted the role of chance.12 The authors postulated that board members may be unaware of control charts and/or support staff may lack the skills to create such charts.

In this issue of BMJ Quality & Safety, Kudrna and colleagues evaluated the impact of SPC training on the use of SPC charts in hospital board documentation in England.18 This observational study sampled 20 intervention acute care hospitals that received the training and matched them, on number of patient attendances, degree of specialisation and deprivation level, to 20 control hospitals. The intervention consisted of training on SPC chart creation and interpretation, when and how to use SPC charts, how SPC charts can inform decision-making about process variation and the limitations of other chart types. The training was delivered to board members (lasting 90 min) and quality improvement staff (over one working day) to improve knowledge about SPC charts and increase their uptake. The primary outcome was the proportion of all charts in board documents generated using SPC methods. Other outcomes included the quality of SPC charts (eg, whether control limits were set), the proportion of time series charts that used SPC methods and the proportion of time series and between-group charts (funnel plots) that used SPC methods. Qualitative data included thematic analysis of comments from participants about the training sessions. Nearly one-half of the total 6287 charts (encompassing all intervention and control hospitals) were quality and safety charts. Of these, 75% were time series charts, 10% were between-group comparisons, 14% were both time series and between-group comparisons, and only 7% were SPC charts. The authors found that intervention hospitals had a greater increase in SPC use from the pre-intervention to the post-intervention period (7% to 21%; relative risk ratio 9, 95% CI 3 to 32) compared with controls (0.6% to 1.2%). A majority of the qualitative comments were positive, citing better awareness of SPC charts, understanding of how poor data presentation can lead to poor decisions, and changes in how participants think about, report and interpret data.

In advocating for SPC use at the board level, Kudrna and colleagues hope to prevent over-reaction to normal variation in data. In their study, the authors found a ‘high proportion (of) charts with time series information in board papers (90%)’, suggesting that many boards in the NHS were using time series data, although specifics of the charts were not provided. We also lack information on how hospital management teams were displaying and interpreting data, given that only board reports were reviewed for the study. Since operations teams are responsible for carrying out quality improvement (QI) work, it would be important to understand SPC use at cascading levels within the organisation, from executives to front-line staff, and how it affects outcomes. It is possible that SPC charts were used by the operations teams but not shared with boards, or that the operations teams have little appreciation of SPC charts. Studies identifying board practices associated with high-performing organisations support regular review of data by board members but stop short of recommending the format most useful in supporting boards in their data review.

Additionally, although the intervention was statistically effective, uptake reached only 21% in the intervention arm of the study, which suggests an enduring low appetite for SPC use, although the optimal level is unknown. Another factor to consider, in relation to sustainability at the board level, is that members are appointed for fixed terms, necessitating ongoing orientation and education of new members in order to maintain proficiency in SPC use and interpretation. Longer board memberships, like those in the UK, could reduce the need for this ongoing maintenance.

The use of SPC charts and other time series data is a cornerstone of institutional quality improvement and should be used widely by improvement teams and managers, including experts in analytical and statistical tools, to evaluate change. Hospital boards have a different role, and the question is whether board members should become experts at interpreting statistical results, considering that some boards may be composed predominantly of lay members of the public rather than clinical experts. Advocating for the inclusion of SPC charts in hospital board reports raises the question: what is the problem we are trying to solve? If the problem is that boards are not provided with the tools needed to respond to questions stemming from their fiduciary role (ie, ‘is care changing in the right direction?’), then familiarity with time series formats such as run charts or SPC charts may be appropriate. If the goal is to achieve better clinical outcomes, SPC use by the operations team on a day-to-day basis is critical, but it may or may not be needed at the board level, given that the board’s role is to hold the management team accountable for achieving set targets.

Improving hospital board decision-making

Data that fail to distinguish between common and special cause variation can result in misinterpretation by boards and, subsequently, erroneous advice and decisions. If the role of hospital boards is to oversee the implementation of strategic priorities and ensure risk is mitigated, how can boards best review data to achieve these objectives? Through our experiences in quality improvement and serving on hospital boards, we have identified three common challenges board members experience in making sense of information provided in board packages: (1) large numbers of included measures and data are overwhelming and often obscure the summary message; (2) insufficient narrative leads to misinterpretation because board members cannot quickly make sense of the information provided; and (3) a lack of understanding of performance within the organisational context (table 1). These three challenges are in addition to the challenge, already described, facing boards that are provided with data only in colour-coded dashboards. Based on findings from the literature on governance and the characteristics of successful boards,19 supplemented by our own experience, we advocate a number of additional recommendations, of equal importance to the display of data, to consider when deciding what data and format to use when presenting to board members (table 1).

Table 1

Common challenges facing hospital boards and recommended mitigation strategies

This will ensure the focus is on presenting data in a format that is easily interpretable and informative, equipping board members to perform their role for organisational quality and safety effectively.

Ethics statements

Patient consent for publication

Ethics approval

Not applicable.


Supplementary materials

  • Supplementary Data

    This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.


  • Contributors All authors contributed to the commentary.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Provenance and peer review Commissioned; internally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.