Grand rounds in methodology: when are realist reviews useful, and what does a ‘good’ realist review look like?
  1. Claire Duddy,
  2. Geoff Wong
  1. Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
  1. Correspondence to Dr Geoff Wong, Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, OX2 6GG, UK; geoffrey.wong{at}phc.ox.ac.uk

Abstract

Research in the quality and safety field often necessitates an approach that supports the development of an in-depth understanding of how a complex phenomenon occurs, or how an intervention works. Realist review is an increasingly popular form of evidence synthesis that provides a theory-driven, interpretive approach to secondary research. Realist reviews offer quality and safety researchers the opportunity to draw on diverse types of evidence to develop explanatory theory about how, when and for whom interventions ‘work’ or outcomes occur. The approach is flexible, iterative and practical, typically drawing on the experience of policymakers, practitioners and patients throughout the review. With the increasing use of realist reviews, some common misconceptions about the approach have become evident in the literature. This paper introduces what is involved in planning and conducting a realist review and where the approach can offer most value, and outlines common challenges that researchers may face when adopting the approach, along with recommended solutions. Our aim is to support researchers who are considering conducting a realist review to understand the key principles and concepts involved, and how they can go about producing high-quality work.

  • Continuing education, continuing professional development
  • Evidence-based medicine
  • Graduate medical education

WHAT IS ALREADY KNOWN ON THIS TOPIC

  • Realist reviews are a form of theory-driven evidence synthesis and are growing in popularity across many disciplines, including healthcare. While many high-quality realist reviews are undertaken and published, some are of questionable methodological quality.

WHAT THIS STUDY ADDS

  • This paper highlights the value of realist reviews to develop an in-depth understanding of complex interventions and phenomena in the quality and safety field. It describes the key conceptual understanding of programme theory, context, mechanism, outcome and causation that researchers must appreciate before undertaking a realist review. Adequate training, guidance from an experienced methodologist and use of available quality and publication standards can support researchers to produce high-quality reviews that address complex quality and safety issues.

HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY

  • Researchers who are contemplating a realist review on a quality and safety topic can assess its suitability in relation to their research aims and questions. A clear understanding of the strengths and challenges involved in the approach can inform the decision to undertake a realist review.

The situation in practice

For researchers, practitioners and policy and decision-makers working on issues of quality and safety in healthcare, realist reviews hold promise as a way to make sense of existing evidence to answer research questions that demand explanations taking complexity into account. Imagine, for example, a scenario where the senior management team of a large hospital are struggling with staff recruitment and retention. This is a long-standing issue in the hospital, and multiple previous initiatives have met with varying success. Managers have come to realise that an evidence-based strategy to address this issue is needed, especially as concerns about quality of care and patient safety are being raised while staffing ratios come under pressure. The recruitment of ‘non-traditional’ health and social care professionals to undertake new roles has been suggested as a potential solution. Managers anticipate that this innovation could generate positive outcomes but are cognisant of potential challenges in integrating new professionals into an existing workforce. As with many interventions that aim to improve quality or safety in healthcare, the management team recognises that there is potential to learn from others’ experiences and from the success and failure of similar strategies. They consider commissioning an evidence synthesis project to help establish what is already known. For a complex problem like this, a realist approach offers distinctive strengths for identifying possible solutions. In this paper, we use this scenario and other examples drawn from published literature to illustrate where and how a realist approach can be useful and to provide examples of good practice.

Realist reviews focus on developing useful explanations for how and why things happen (or not). When we anticipate that the outcomes of an intervention are likely to be variable and context dependent, the realist approach provides a tool to allow researchers to answer questions about when and for whom important outcomes occur, and why and how they come about. In this case, the management team know that introducing new staff is not a simple change, but a complex intervention that would be introduced into an existing complex system. They can therefore anticipate that introducing new staff roles may ‘work’ better under certain conditions, such as in certain departments, or where certain groups of professionals work together. A realist review can be used to develop theoretical causal explanations to understand this kind of variation in outcomes, based on learning from what has already been done.

Realist reviews have been used to ‘diagnose’ problems and better understand a wide range of complex practices and interventions in the quality and safety field, from junior doctors’ antimicrobial prescribing1 to ‘safety-netting’ in primary care.2 The ultimate aim in a realist review project is typically focused on identifying solutions to complex problems, by developing theory-informed responses in the form of recommendations or new intervention designs.

Realist reviews: what makes them different?

Realist reviews (or ‘realist syntheses’—we use the terms synonymously) are sometimes mistaken for a type of qualitative evidence synthesis. This fundamental misconception restricts researchers’ use of diverse evidence and limits the scope and applicability of review findings. Three important distinctive features of realist reviews are summarised in table 1 and described in more detail below. Definitions of common realist terminology, including ‘context’, ‘mechanism’, ‘programme theory’ and ‘context–mechanism–outcome configuration (CMOC)’, are provided in table 2.

Table 1

Three key differences between realist reviews and qualitative evidence synthesis methods (eg, thematic synthesis, meta-ethnography)

Table 2

Definitions of context, mechanism, programme theory and CMOC

First, realist reviews are explicitly theory driven. Not all qualitative approaches to evidence synthesis share this goal. The purpose of a realist review is to develop and refine theory, and to use this theory to direct the processes and focus of the review project.3 Realist reviews aim to produce a type of theory known as ‘programme theory’—a theory that explains what a programme or intervention is expected to do, and how it is expected to work.4 For example, Friedemann Smith et al’s programme theory of safety-netting in primary care settings demonstrates the important contexts, mechanisms and range of outcomes involved when this intervention is successful in its aim to ensure patient safety.2 Recommendations for policy, practice and intervention design can be informed by the understanding captured in a programme theory.

Realist reviews are a form of configurational (as opposed to aggregative) evidence synthesis. This means that included data are organised and interpreted to produce explanations for why, how, when, for whom and to what extent outcomes have occurred.5 Although originally developed to examine the workings of programmes or interventions, realist reviews are flexible. They can be used to develop and refine explanations of phenomena beyond interventions, including as a means of exploring and diagnosing the underlying causes of problems. The management team in the struggling hospital could commission a realist review to better understand the causes of staff shortages, evaluate potential solutions or both. The problem is clearly complex, involving a range of professionals working across multiple settings, and so it is likely to require a complex intervention in response.

The second difference is that realist reviews can include many types of data, drawn from many types of documents. Qualitative, quantitative and mixed-methods studies, and grey literature are all valuable sources of data for theory building. Abrams et al’s realist review focused on the impact of ‘delegating’ general practitioner (GP) home visits to other staff.6 The newness of this intervention limited the availability of research literature on the topic, but this team was able to identify and use relevant evaluations, policy documents, commentary and news stories to build emerging theories that explain how home visit delegation produces outcomes related to staff and patient satisfaction.

Third, realist reviews are underpinned by a specific realist philosophy of science (or ontology). Not all approaches to evidence synthesis are explicit about their ontological position. This philosophy of science influences data analysis. Two important aspects of the ‘realist logic’ used to analyse data deserve attention here: the realist understanding of causation and the nature of mechanisms.

Realists employ the concept of generative ‘mechanisms’ to explain how outcomes occur, rather than assuming that interventions or programmes themselves are capable of causing outcomes. Explanations of why, how and when certain outcomes occur rest on another concept, namely ‘context’. A realist causal explanation for an outcome involves an understanding of the mechanism that produces it, and the identification of what is functioning as context to activate that mechanism. Realist researchers rely on a useful heuristic that encapsulates this form of causal explanation: context+mechanism=outcome (C+M=O). This prompt provides an easy way to remember the conceptual components demanded by a realist explanation of causation, and a useful shorthand for expressing such explanations. Realist causal explanations are configurations of the concepts of context, mechanism and outcome, typically presented as ‘CMOCs’.

In realist research, ‘mechanisms’ are understood as hidden, context-sensitive causal forces7 that produce outcomes. Realist analysis involves interpreting data to make inferences about mechanisms. The realist philosophy of science underpinning realist analysis posits that such mechanisms are inherent in social structures, objects and people, and take effect in certain ‘activating’ contexts. This underpinning assumption is important, as it provides a warrant for the transferability of realist causal explanations. This means that causal explanations that have been derived from one set of data may have explanatory value in other situations where it is reasonable to believe that the same mechanisms may be in operation. Rohrbasser et al developed 21 CMOCs (detailed in their supplemental material)8 explaining patterns of outcomes associated with the use of quality circles to improve the practice and well-being of GPs. The CMOCs identified conditions in which individuals are more likely to participate in quality circles, and a range of outcomes related to how such groups interact and function. The inferred mechanisms are ‘hidden’ (for example, feeling the ‘need for autonomy’) and potentially transferable to other settings and groups (for example, feeling ‘safe’).8

Realists have a broad and flexible understanding of ‘outcomes’ and ‘contexts’. Realist analysis is not confined to considering only the final intended outcomes of an intervention (or measurable proxies). CMOCs often relate to more proximal outcomes (sometimes understood as ‘process outcomes’) and include unintended, as well as desirable, outcomes. In Luetsch et al’s realist review of pharmacist-conducted medication reviews, ‘proximal’ outcomes (such as whether or not a medication review takes place) are demarcated from ‘distal’ outcomes (the ultimate intended outcomes of a reduction in healthcare use and hospital readmission).9 The distinction makes the scope of the review clear and identifies the proximal outcomes as intermediate steps on the way to the final desired endpoints.

Consideration of ‘context’ in realist research is not limited to identifying fixed demographic or setting-related characteristics (see table 2). Realist researchers have operationalised ‘context’ in myriad ways, but it is crucial that context is always understood as functioning to activate the mechanism within a CMOC10: the configuration is what matters. In a realist review exploring the contexts that affect access to primary care for socioeconomically disadvantaged older people in rural areas, some of the identified contexts were ‘fixed’ characteristics of individuals (for example, educational status), but others were socially constructed (for example, ‘expectations of ageing’).11 Some contexts are more amenable to change than others. Realists understand interventions as attempts to manipulate contexts, with the aim of creating conditions in which certain mechanisms will be triggered and so produce desired outcomes.

Realist reviewers must also account for how multiple causal explanations (CMOCs) inter-relate. The links between individual CMOCs are captured in a realist programme theory. This brings us back to the theory-driven nature of realist reviews. A realist review should begin with an ‘initial programme theory’ as its starting point. Initial programme theories represent a preliminary understanding of how an intervention works or how a phenomenon occurs. An initial programme theory may be based on a preliminary reading of the literature, or draw on the expertise of key stakeholders. Revisiting our hospital managers, an initial programme theory for their commissioned review might reflect their own understanding and expectations of how ‘non-traditional’ professionals could help to address the workforce crisis. These working assumptions can then be put to the test. During a realist review, secondary data are used to confirm, refute and refine the initial programme theory into one that is realist in nature, and underpinned by evidence. At the end of a realist review, the realist programme theory produced should include the CMOCs that provide realist causal explanations for outcomes.

When is a realist approach useful?

Realist reviews focus on understanding causation, aiming to build explanatory theory using the basic building blocks of CMOCs. A realist programme theory is potentially a vehicle to make sense of any intervention or phenomenon, but is most commonly applied to understand the workings of complex interventions. Interventions may be considered ‘complex’ in different ways: ‘the number of components involved; the range of behaviours targeted; expertise and skills required by those delivering and receiving the intervention; the number of groups, settings or levels targeted; or the permitted level of flexibility of the intervention or its components’.12 There is often an expectation that the outcomes that can be achieved will be context dependent. The C+M=O heuristic that guides data analysis in realist reviews provides a clear explanation of the relationship between what is understood to function as context and an outcome. This explicit explanatory link provides transparency and coherence in realist analysis and knowledge claims.

Many published realist reviews illustrate the value of developing CMOCs and programme theory, including reviews focused on understanding antimicrobial prescribing practice by doctors in training,1 13 14 causes of mental ill health in doctors15–17 and optimising delivery of remediation programmes for doctors.18–20 In these reviews, using the C+M=O heuristic deepened understanding of variation in outcome patterns, enabling identification of the important contexts in which desirable outcomes can emerge.

A realist review can also improve understanding of, or ‘diagnose’, the causes of an identified ‘problem’, with the aim of developing potential solutions. A programme theory that captures this understanding can be used for a number of purposes, including informing the design of interventions to modify specific contexts,2 considering how a programme might be improved8 or informing the development of new interventions.11 A realist review could enable the hospital managers in our scenario to assess local contexts to identify potential intervention points to improve staff retention, or to identify and create supportive conditions for the successful introduction of new professionals joining the workforce.

Realist reviews should not be used to estimate the efficacy or effectiveness of interventions, for which aggregative approaches (such as a Cochrane-style systematic review and meta-analysis) to evidence synthesis are more appropriate.5 The configurational and interpretive nature of realist reviews supports their aim to explain and understand causation within interventions, programmes or phenomena. However, different approaches may be combined. Nyssen et al’s review used two approaches to synthesise evidence on therapeutic writing for people with long-term conditions. An effectiveness review was used to estimate the effect size of therapeutic writing, and a realist review to understand and explain why, for whom and in what contexts it was likely to work.21

Challenges when undertaking realist reviews

As realist reviews have grown in popularity, familiarity with the approach and research capacity in the field have increased. Many highly rigorous realist reviews are undertaken and published each year. However, our experience of undertaking peer review and following the realist literature suggests that some researchers find the process of conducting realist reviews challenging. This section outlines some of the most common pitfalls that we have encountered, and the following section offers some potential solutions to help researchers who are considering adopting the approach.

As discussed above, one common misconception is that realist reviews can be conducted in a similar way to other forms of qualitative evidence synthesis. In these cases, the importance of generative causation, the purpose of CMOCs and important realist concepts (especially mechanisms, context and programme theory) are not fully appreciated. Failure to understand the importance of generative causation means that some realist reviews do not focus on identifying and understanding mechanisms in their analysis. As a result, they do not produce causal explanations that take the form of CMOCs, and do not explain the action of mechanisms in relation to specific contexts and outcomes. Instead, data are often analysed thematically, and identified themes are categorised and reported as lists of contexts, mechanisms and outcomes that have no explicit relationship with each other. This form of analysis misses the point and value of using CMOCs. The C+M=O heuristic is intended to be used as a device to support the development of causal explanations that specify not only how (via which mechanisms), but also when (under which contexts) outcomes occur. Identifying each as a separate component does not allow researchers to realise the benefits of realist configurational thinking.22 23 Luetsch et al extracted data to provide tables of outcomes, contexts and mechanisms related to pharmacist-conducted medication reviews (tables 2, 3 and 4, respectively, in their paper). If the authors had stopped at this point, the reader would be left wondering how these contexts, mechanisms and outcomes relate to each other. In this case, the authors go on to configure the contexts, mechanisms and outcomes into CMOCs (table 5 in their paper) and, by doing so, they produce explanations that further our understanding.9

A closely related challenge surrounds the understanding of mechanisms in realist reviews. The most common misunderstanding is that a mechanism is the same thing as an intervention, or an interventional component or strategy. The result of this confusion is that analysis focuses on the wrong concept, undermining the potential transferability of any learning from the literature. Occasionally, this misunderstanding is compounded by a narrow understanding of context. Some realist reviewers adopt a definition of ‘context’ that limits the scope of their analysis, defining contexts only as physical settings, or demographic or socioeconomic characteristics of individuals. Such a narrow understanding again limits the value of developing CMOCs to explain causation for outcomes. A restricted view of what can function as a context within a CMOC means that potentially important contexts can be ignored.10 When the ultimate aim of a realist review is to develop recommendations for intervention design, practice or policy, this approach can prevent researchers from identifying important contexts that could be manipulated to change outcomes.

The value of a correct understanding of mechanism and a broad conceptualisation of context is demonstrated in Friedemann Smith et al’s realist review of ‘safety-netting’ in primary care.2 One CMOC from this review states: ‘When information is personally relevant and tailored to the patient (C), the safety-netting advice is adhered to (O), because the patient has a sense of ownership, relevance, understanding, and credibility of the information they have been provided (M)’. This CMOC offers a causal explanation for an important desirable outcome of this intervention. The mechanisms inferred describe the ‘hidden’ processes involved in patients’ responses to the identified context, producing the outcome of adherence to advice. The CMOC provides one explanation of ‘when’ (in which context) this outcome occurs. This review team have adopted a broad conceptualisation of context, allowing them to articulate a crucial aspect of the delivery of safety-netting, with clear implications for intervention design.

The intended end product of a realist review should be a realist programme theory—an overall explanation of how an intervention is meant to ‘work’ and how it should be implemented. A realist programme theory synthesises the included data and acts as a device to organise the CMOCs devised during a realist analysis into a coherent whole. It is crucial to develop CMOCs, but there is added value if researchers can also demonstrate how individual CMOCs can be understood together. Realist reviews without a realist programme theory run the risk of providing CMOCs that are difficult to relate to each other, which can undermine their explanatory value. Friedemann Smith et al’s realist review illustrates the value of a programme theory. Their final programme theory (figure 3 in their paper) sequentially organises 22 CMOCs into a coherent explanation of safety-netting, identifying outcomes that occur during patient consultations and afterwards. The reader is not left guessing what applies where, to whom and when.2 Another example can be found in Price et al’s realist review of remediation programmes for doctors. In this review, 29 CMOCs are organised to explain how remediation of doctors produces its effects (figure 3 in their project report).19

Recommendations

There are some straightforward solutions to the challenges outlined above. We strongly recommend that researchers who are new to realist reviews undertake training, with the aim of developing a sufficient understanding of (a) the philosophy of science that underpins realist reviews, and specifically an appreciation of generative causation and the implications for realist analysis; and (b) the important realist concepts used during analysis and reporting (ie, contexts, mechanisms, outcomes and programme theory). Time and effort devoted to learning about realist reviews is time well spent. While training can support researchers to grasp the fundamentals, we also suggest that ongoing methodological support is needed to translate an abstract understanding into actual practice.

A realist methodologist or experienced realist reviewer can add value by sharing practical knowledge of how to execute the approach. They can advise on accepted practice in conduct and reporting (including the coherence, appropriateness and benefits of potential methodological innovations). A word of caution is needed in relation to the choice of person providing methodological support: gaining experience and achieving competence in realist methods take time and repeated practice. The hospital management team described at the beginning of this paper should therefore prioritise identifying a realist review team with sufficient, proven expertise and experience to deliver high-quality work.

Flexibility of purpose and process in realist reviews is a double-edged sword. While there are processes to follow,3 4 these projects are iterative in nature and can only be protocolised to a certain extent. In practice, changes to the protocol are allowed (and encouraged) within a realist review, as long as they are transparently reported. Changes could include progressively narrowing the focus of a review as the topic under study becomes clearer, and undertaking additional searches to meet emerging needs for data. This flexibility means that review projects can respond to identified knowledge gaps or emerging priorities. Review teams can draw on stakeholder expertise to help focus and prioritise within a review topic. Projects can be focused on developing in-depth explanations of a specific aspect of a problem, rather than providing superficial analysis of a broader topic. Duddy et al’s realist review of the National Health Service Health Check initially considered the programme as a whole but shifted focus in response to discussions with key stakeholders and gaps identified in exploratory literature searches. The project ultimately focused on variation in attendees’ experiences after their health check.24

Quality standards4 and publication standards25 have been produced to guide the conduct and reporting of realist reviews. These are open access and can be downloaded from the RAMESES Project website (www.ramesesproject.org); reporting standards are also available via the EQUATOR network (https://www.equator-network.org/). When funders, researchers or peer reviewers commission, conduct or assess realist reviews, they should use (and cite) the appropriate RAMESES standards. There has been confusion between the two sets of standards; some researchers claim to have followed RAMESES quality standards, but cite the publication standards. The publication standards should guide the reporting of a realist review, not its execution. In addition, some researchers who claim to have followed RAMESES publication standards for realist reviews do not always do so. Provision exists within the publication standards to omit reporting on particular items, but the standards are very clear that an explanation should be provided for any such omission.

Occasionally, realist reviews are unfairly judged against inappropriate quality or reporting standards. Applying standards developed for other forms of evidence synthesis does a disservice to the hard work that researchers put into realist reviews. Two common and related misunderstandings concern the purpose of searching in a realist review26 and the assessment of rigour of included documents.27 In relation to searching, quality appraisal tools for systematic reviews usually focus on comprehensive approaches to minimise the risk of missing potentially relevant studies. However, the RAMESES quality criterion asks researchers to ‘identify data to enable the review team to develop, refine and test programme theory’. The development of plausible, coherent theories does not rely on the exhaustive identification of all potentially relevant evidence. Theories are not more plausible or coherent because of the volume of data identified, retrieved and analysed.

Assessment of ‘rigour’ in realist reviews does not typically involve the systematic application of a critical appraisal tool or checklist. Wong has argued that the assessment of rigour in a realist review should be predominantly undertaken at the level of the programme theory.27 Data of questionable methodological provenance can still contribute to theory development, and excluding such data could affect the overall plausibility and coherence of a programme theory.28 Focusing on the assessment of the explanatory plausibility and coherence of a programme theory may be more appropriate. The RAMESES quality standards do not specify how this should be done, but stipulate that refined programme theory should be consistent with the evidence and may also be supported by relevant substantive (formal) theory.4 We advocate an approach that applies three criteria: consilience, simplicity and analogy. The consilience of a theory refers to its ability to account for as much relevant data as possible; simplicity expresses the expectation that a theory should be as parsimonious as possible, avoiding the need for ad hoc exceptions; and analogy considers how well a theory ‘fits’ with what we already know. Any assessment of a theory using these criteria must consider all three principles together.27

Ideally, anyone charged with judging the quality of realist reviews should be familiar with the approach, current accepted practice and the RAMESES quality standards.4 In box 1, we have provided some helpful tips for reviewers to evaluate the quality of a realist review protocol or findings paper. These are not a substitute for the RAMESES standards, but can be used as prompts to support a quick assessment of quality and help reviewers identify areas of concern.

Box 1

Tips for peer reviewers of realist reviews

Read a realist review with caution if:

  1. It has no programme theory.

  2. It does not provide detailed context–mechanism–outcome configurations (CMOCs) to underpin the knowledge claims made within the Results/Findings section (for example, it only provides tables or lists of unconnected contexts, mechanisms and outcomes).

  3. Mechanisms have been conceptualised as intervention strategies, and not as context-sensitive hidden causal forces.

  4. CMOCs do not make sense as causal statements or claims—try reading them out loud to see if they sound plausible.

Conclusion

There is considerable potential for the realist review approach to help improve the understanding of many quality and safety topics. In choosing a realist review, the hospital managers introduced at the beginning of this paper are adopting an approach to evidence synthesis that could help them to understand both the causes of their staffing problems, and how their preferred intervention might work, for whom, in what circumstances, why and how. Realist reviews can be challenging, but with the right team in place, have the potential to produce an in-depth understanding of complex problems and to develop theory-informed, practical solutions.

Ethics statements

Patient consent for publication

Ethics approval

Not applicable.

Acknowledgments

The authors would like to thank the peer reviewers and editors for their helpful and constructive feedback and advice, which have greatly helped to improve this manuscript.

References

Footnotes

  • Contributors Both authors were involved in writing and revising the manuscript. GW is the guarantor and accepts full responsibility for the finished work and controlled the decision to publish.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests CD and GW both teach the Realist Review and Realist Evaluation Module at the University of Oxford (https://www.conted.ox.ac.uk/courses/realist-reviews-and-realist-evaluation).

  • Provenance and peer review Commissioned; externally peer reviewed.