Editorials

Hospital mortality league tables

BMJ 2003; 326 doi: https://doi.org/10.1136/bmj.326.7393.777 (Published 12 April 2003) Cite this as: BMJ 2003;326:777

Question what they tell you—and how useful they are

  1. Bobbie Jacobson, director (bobbie.jacobson{at}lho.org.uk),
  2. Jenny Mindell, deputy director (jenny.mindell{at}lho.org.uk),
  3. Martin McKee, professor of European public health (martin.mckee{at}lshtm.ac.uk)
  1. London Health Observatory, London W1G 0AN
  2. London School of Hygiene and Tropical Medicine, London WC1E 7HT

    Last week (6 April) the Sunday Times published the latest annual assessments of hospital performance compiled by the Dr Foster organisation. Dr Foster claims to provide the “only authoritative and independent guides to UK health services in the public and private sectors” and seeks to “empower consumers and their doctors to make the best possible choices.”1 Dr Foster has brought together a wealth of information, including equipment and services available at each hospital and how the hospital performs on waiting lists and complaints, but its hospital mortality figures will arouse the most interest. Many in the NHS and elsewhere will be asking themselves how they should respond to these data.

    Four main questions need a response. Firstly, what do the data actually mean? A hospital does much more than treat inpatients. Over the past decade the scope and nature of ambulatory care provided in hospitals have changed enormously, not only in surgery but also in other specialties such as oncology, where increasingly sophisticated treatments involve a complex mix of inpatient and outpatient episodes.w1 Moreover, there is good evidence that as the length of the average hospital inpatient episode falls, an increasing proportion of deaths occur outside the hospital.2 Consequently, a measure of outcome that looks only at inpatients gives a highly selective view of the overall picture.

    Secondly, are the results a valid measure of what they purport to be? Dr Foster has done much to enhance the quality of the data used since it published its first guide.w2 It has changed the way it deals with in-hospital transfers and now excludes duplicate records of the same death. Of course, this means that rankings this year are not comparable with those in previous years, so all changes in rankings need to be interpreted with caution. But the Dr Foster method cannot avoid the probably insoluble problem arising from the continuing use of finished consultant episodes, the NHS's measure of hospital activity.w3 Since a patient's stay in hospital might include several finished consultant episodes, these need conversion to hospital spells, and assumptions have to be made about which episode's main diagnosis to use. This method could be improved if supported by an audit of case notes, but that would need to be led by clinicians. In addition, the meaning of a hospital spell for someone with multiple complications of a chronic disease, possibly requiring several admissions over the course of a year, remains unclear.
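    The conversion described above can be illustrated with a minimal sketch. The record format, the field names, and the rule of taking the first episode's main diagnosis are illustrative assumptions only, not Dr Foster's actual method:

```python
from itertools import groupby
from operator import itemgetter

def episodes_to_spells(episodes):
    """Group finished consultant episodes (FCEs) into hospital spells.

    Each episode is a dict with patient_id, admit_date, discharge_date,
    and main_diagnosis (illustrative field names; dates as ISO strings).
    Consecutive episodes for the same patient are merged into one spell
    when the next episode begins on or before the day the previous one
    ends (i.e. an in-hospital transfer between consultants).
    """
    episodes = sorted(episodes, key=itemgetter("patient_id", "admit_date"))
    spells = []
    for _, group in groupby(episodes, key=itemgetter("patient_id")):
        group = list(group)
        current = dict(group[0])
        for ep in group[1:]:
            if ep["admit_date"] <= current["discharge_date"]:
                # Same spell: extend the stay. The spell keeps the first
                # episode's main diagnosis -- one possible assumption
                # among several, and precisely the choice at issue.
                current["discharge_date"] = max(current["discharge_date"],
                                                ep["discharge_date"])
            else:
                spells.append(current)
                current = dict(ep)
        spells.append(current)
    return spells
```

    Even in this toy form, the arbitrariness is visible: choosing the last episode's diagnosis, or the episode with the longest stay, would yield different casemix and therefore different adjusted mortality.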

    Thirdly, there is the question of primary diagnosis. Diagnostic criteria change, as illustrated by the 58% increase in the incidence of myocardial infarction as a result of using new, troponin-based investigations.3 As the additional patients have worse outcomes, there is an incentive for hospitals not to invest in the new diagnostic protocols.

    Fourthly, even if the data were accurate, what value would they add to our understanding of hospital performance? A hospital may have a high inpatient mortality rate because of factors related to circumstances before or during admission, to care provided during the stay itself, or to arrangements for discharge. In Scotland, inpatient mortality rates from myocardial infarction are influenced by the extent to which people die before reaching hospital.4 There are large variations in admission rates for many common conditions, not explained by differences in prevalence of disease,w4 but which seem to reflect differences in admission thresholds, and thus in severity. Ideally, Dr Foster should adjust for severity and comorbidity, perhaps using secondary diagnoses; although the variable quality of recording in the UK makes that impossible at present,5 adjustment for deprivation could be made. Hospitals also differ in the availability of places for people to be discharged to, such as nursing homes or hospices. Hospital death rates will be higher where these are less available.w5
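    The adjustment argued for above is, in essence, indirect standardisation: deaths expected from reference rates in each casemix stratum are compared with deaths observed. A minimal sketch follows; the strata, admission counts, and reference rates are invented for illustration:

```python
def standardised_mortality_ratio(observed_deaths, casemix, reference_rates):
    """Indirectly standardised mortality ratio, as a percentage.

    casemix: admissions per stratum (e.g. age band, or age band by
    diagnosis group and deprivation, if recording allowed it).
    reference_rates: deaths per admission in each stratum, taken from
    a reference population.  SMR = 100 * observed / expected, so 100
    means mortality exactly as expected given the casemix.
    """
    expected = sum(casemix[s] * reference_rates[s] for s in casemix)
    return 100.0 * observed_deaths / expected

# Hypothetical hospital: 30 deaths where the reference rates predict
# 25 (500 * 0.02 + 250 * 0.06), giving an SMR of 120.
casemix = {"age<65": 500, "age>=65": 250}
rates = {"age<65": 0.02, "age>=65": 0.06}
```

    The editorial's point is that the ratio is only as good as its strata: omit severity, comorbidity, or deprivation from the casemix and the "expected" figure, and hence the league-table position, is biased.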

    Assuming that the figures are an effective measure of overall hospital performance, what action should then follow? Hospitals are complex systems that are part of larger systems and also contain subsystems.w6 Where does a suspected failure lie and who should be called to account? Might failures in one system within the hospital be missed because they are compensated for by good performance in another? Then there is the matter of timeliness, with data relating to events up to three years previously. Finally, given the wide scope of the government's agenda for quality in the NHS,w7 what value does publication of these measures in a newspaper add?

    Since the key to improving performance lies in partnership between those who provide and monitor the services and those who use them, a start might be made in future by providing more than four working days for trusts to check mortality data before publication. This would avoid the anger the first Dr Foster report generated when some trusts found that their data were incorrect. There is no substitute, however, for involving clinicians and users in discussions of how their data are to be used and presented. Without this, the key to effective further action will be lost once the oxygen of publicity is cut off. The London Health Observatory has provided a briefing and commentary on the new Dr Foster's Guide to help trusts to interpret their own findings and decide whether further investigation is warranted.6

    Perhaps we should not worry, as the cost of the exercise is borne by Dr Foster and the Sunday Times. The cost of dealing with questions arising from their publications is, however, considerable. And will publication lead to genuine attempts to identify examples of poor practice and to address them? Evidence from the United States is not encouraging. In New York, after such information was made available, some surgeons with very low operating volumes and poor outcomes stopped operating, and death rates after cardiac surgery fell.7 But rates fell equally rapidly in states such as Massachusetts that did not publish death rates.8

    What is clear is that publication leads to unintended changes in behaviour: cardiac surgeons were reported to be less willing to operate on high risk cases, a finding supported by cardiologists, who had more difficulty getting such patients treated.9 Publication also led to changes in data recording: for example, almost threefold increases in recorded rates of chronic obstructive pulmonary disease and over fourfold rises in congestive heart failure served to reduce severity adjusted mortality rates.10 Apparent improvements in recorded performance may be equally illusory in Britain, as shown by the recent frenetic activity to meet targets for waits in emergency departments, which produced improvements lasting only for the week in which activity was recorded.w8

    Footnotes

    • Competing interests MM has undertaken research using NHS data for many years with the goal of finding a valid and robust way to assess performance. He has yet to succeed. He has also collaborated as a researcher with CHKS, a company undertaking benchmarking work, but has never derived financial gain from this relationship. The London Health Observatory receives core funding from the Department of Health and London's primary care trusts and has received specific funding from London's mental health trusts to develop a model for benchmarking indicators of mental health. It is also involved in a number of pieces of work developing and interpreting indicators for primary care trusts and local strategic partnerships.

    • Extra references appear on bmj.com
