
Grand rounds in methodology: designing for integration in mixed methods research
Timothy C Guetterman1, Milisa Manojlovich2

1 University of Michigan, Ann Arbor, Michigan, USA
2 School of Nursing, University of Michigan, Ann Arbor, Michigan, USA

Correspondence to Dr Timothy C Guetterman, University of Michigan, Ann Arbor, Michigan, USA; tguetter@med.umich.edu

Abstract

Mixed methods research is a popular approach used to understand persistent and complex problems related to quality and safety, such as reasons why interventions are not implemented as intended or explaining differential outcomes. However, mixed methods research proposals and publications often miss opportunities for integration, which is the core of mixed methods. Achieving integration remains challenging, and failing to integrate reduces the benefits of a mixed methods approach. Therefore, the purpose of this article is to guide quality and safety researchers in planning and designing a mixed methods study that facilitates integration. We highlight how meaningful integration in mixed methods research can be achieved by centring integration at the following levels: research question, design, methods, results, and interpretation and reporting. A holistic view of integration through all these levels will enable researchers to provide better answers to complex problems and thereby contribute to improvement of safety and quality of care.

  • Health services research
  • Qualitative research
  • Implementation science
  • Statistics
  • Evaluation methodology

Data availability statement

There are no data in this work.


WHAT IS ALREADY KNOWN ON THIS TOPIC

  • Mixed methods research integrates qualitative and quantitative approaches to more comprehensively address research questions. Despite its growing popularity, integration remains challenging to implement in practice.

WHAT THIS STUDY ADDS

  • This article bridges the levels of integration with the design stage of a mixed methods research project. It presents major decision points that span the research question, design, methods, results, and interpretation and reporting levels to achieve integration.

HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY

  • Researchers using mixed methods research to understand a quality and safety-related situation of practice should consider the fit of a mixed methods approach and the major levels of integration when designing their studies.

The situation in practice

Mixed methods study designs in the health sciences have been popular for over two decades,1 2 yet researchers continue to miss opportunities to systematically integrate qualitative and quantitative approaches. Integration is the most important characteristic of mixed methods research and refers to the intentional combining of qualitative and quantitative data, methods, results and interpretation such that the two forms of research become interdependent to address research questions.3 Too many studies purport to use a mixed methods design when, in reality, the design consists of multiple methods that are not integrated, or of sequential qualitative methods alone. Even when both quantitative and qualitative data are collected, many researchers analyse each type of data separately and fail to integrate them to yield results that are more than just the sum of each type of analysis. No single type of data can fully explain phenomena that exist within a physical environment and affect the people and their interactions within it.

For instance, consider the following example. Barcode medication administration (BCMA) is a technology that, when used appropriately, prevents medication errors and promotes patient safety. The medication administration process consists of nurses first using a barcode scanner to scan a patient’s identification (ID) wristband, then scanning the medication, which is dispensed in a single-unit package, and finally administering the medication and documenting it in the electronic medical record. As with many interventions, the BCMA implementation process is important for its success, but deviations from the process are common and include workarounds that can lead to medication errors and patient harm. Few studies have identified why deviations occur or how the context surrounding medication administration influences them. To address this gap in the literature, Mulac and colleagues conducted a convergent mixed methods study to explore how nurses use BCMA and to identify policy deviations associated with unsafe practice.4 Researchers used structured observation (eg, a digital observation tool) to collect numerical data and quantify policy deviations. Field notes and nurse comments comprised the qualitative data. Quantitative results revealed that 140 patients were affected by task-related policy deviations during medication dispensing and 152 patients were affected by task-related policy deviations during medication administration. Qualitative results showed, for example, that nurses did not always scan the patient’s ID band prior to medication administration, creating a potential patient safety issue. Had the research team relied only on the quantitative data, results would have been limited to identifying the frequency of medication error, while the qualitative data alone highlighted problems in the process. Integration revealed additional insights by linking the qualitative and quantitative results: nurses did not always scan the patient’s ID band because the medication did not have a barcode or the barcode was unreadable by the scanner.

In general, quality and safety research suffers from variations on this problem of examining an issue from a single perspective, which a well-integrated mixed methods approach might solve. Healthcare delivery is complex and can be considered from multiple vantage points using a variety of methods that generate different kinds of knowledge. Integration in a mixed methods approach advances knowledge as a whole because we learn more about why patient safety and quality problems exist and what can be done to fix them. Mixed methods research leverages the complementary strengths of qualitative and quantitative methods while ideally minimising overlapping weaknesses. For instance, a survey with both quantitative and open-ended qualitative items would share the weaknesses inherent in any survey, such as same-source bias. Five common reasons for using a mixed methods approach are listed in table 1, although it is worth noting that there are as many reasons as there are ways to mix methods.

Table 1

Reasons for using a mixed methods approach and possible applications

Despite the growth in the use of mixed methods and the many tutorials on how to perform mixed methods studies,5–7 relatively little attention has been paid in quality and safety research to integrating quantitative and qualitative components. Furthermore, the existing literature on integration typically focuses on the procedures employed at a particular phase of the study rather than taking a holistic approach that encompasses the entire research process. While helpful, these integration procedures do not occur in isolation and are most effective when considered across all phases of research, from planning and designing through analysis and reporting. Although new strategies and methods of integration continue to be developed,3 8 9 many research proposals and empirical articles reporting mixed methods results give little attention to integration, fail to describe integration strategies or omit the results of integration altogether. Researchers continue to experience challenges in achieving meaningful integration, which is essential if the methodology is to yield credible results. Integration is particularly important to mixed methods research in quality and safety to develop a comprehensive picture of the key constituents involved in a particular problem. Therefore, the purpose of this article is to guide quality and safety researchers in designing mixed methods research by centring integration through levels of the design process.

Major levels of integration when designing mixed methods research

At its core, integration is synergistic, generating something more than simply the cumulative effects of qualitative or quantitative research alone. The results of quantitative and qualitative analysis yield separate findings that lead to separate conclusions or claims.10 Integrative analysis in a mixed methods study results in a ‘meta-inference,’ defined as a conclusion that connects claims from both quantitative and qualitative analysis or results.10 A common misconception is that integration occurs only at the level of data. Rather, integration should occur at many levels of a mixed methods study, including the philosophical level,11 the research question level, the design level, the methods level, the results level and during interpretation and reporting.12 Although this article does not address integration at the philosophical level, interested readers should review the literature on integrating philosophical perspectives.13–15 Being aware of these levels is essential for designing for integration in quality and safety studies. In addition, integration can inform follow-up phases, their sampling plans and data collection methods. For example, qualitative results could be used to develop or adapt a quantitative survey, as in the work of researchers who developed a tool to facilitate individualised self-management interventions for adolescents with asthma. Researchers first had teens create photo diaries and then participate in semistructured interviews to develop a pool of candidate items for the tool.16 Numerous authors have discussed concepts of rigour, quality and validity as they apply to mixed methods research, which we will not repeat here.5 17–20

Integration at the research question level

A close relationship between research design and research question is especially important in mixed methods studies. The aims or research questions drive the decision of which type of mixed methods design is most appropriate, based on the intent of integration. Integration of qualitative and quantitative research questions can serve different functional needs (box 1). Importantly, designing a mixed methods research study involves writing research questions or aims that reflect the need for integration. There are three broad types of mixed methods research questions: those that focus on the method (eg, how do quantitative results contribute to our understanding of qualitative findings?), those that focus on content (eg, how can barriers and facilitators to a specific intervention best be explained?) and combinations of the two.5 Our ‘Situation in Practice’ is an example of the third type of question because the authors wanted to gain an in-depth understanding of how nurses use BCMA technology (qualitative methods) and to record the number and types of BCMA policy deviations (quantitative methods). It is important to think critically about how and why various methods can integrate to address aims and research questions. For example, a trial of an intervention likely already has a strong conceptual framework for why it would or would not work. A focus group may not contribute more explanation for why the intervention was effective or not, but it might address other questions about the experiences of participants or implementation issues. When writing aims or research questions, we suggest including at least one that is mixed methods focused and can only be addressed through integration.

Box 1

Four common functions of integrating qualitative and quantitative research47

Address the same research questions.

Address related questions for the purpose of elaboration, or questions raised by the other approach (ie, qualitative or quantitative).

Use qualitative methods to develop a quantitative measure or intervention.

Use one form of research to inform sampling of the other.

Integration at the design level

Mixed methods designs provide a map for when qualitative and quantitative data collection and analysis occur and where points of integration occur. What is most important is designing the study to address aims or research questions through integration, which can be done in a variety of ways. Onwuegbuzie and Leech have stressed that timing, sequence and priority are common factors across design types.21 Authors have described different types of mixed methods designs that follow these principles.5 22 Creswell and Plano Clark have advanced a parsimonious set of three core mixed methods designs: convergent, explanatory sequential and exploratory sequential.5 Convergent designs involve collecting and analysing quantitative and qualitative data and typically merging the results together to more completely understand an issue. Explanatory sequential designs use a phased approach and begin with quantitative data collection and analysis, which is followed by a second phase where qualitative data are collected and analysed to explain quantitative results. Exploratory sequential designs are also phased but begin with a qualitative phase and use results to develop an instrument or intervention that is then tested with quantitative data collection and analysis.5 These designs are distinguished by their intent, which also guides the sequence of qualitative and quantitative procedures and the points at which integration occurs.

The core mixed method designs provide a foundation for identifying points of integration to address research questions and can be combined to create variations that would more thoroughly address unique research questions. Mixed methods researchers often intersect core designs with other approaches such as randomised controlled trials (RCT) or case studies, which are examples of complex designs. In implementation science, a mixed methods intervention design might involve implementing an RCT of an intervention to deliver self-management support to patients with cancer in three regional cancer centres, for example, and also integrating qualitative methods to understand barriers and facilitators to the implementation.23 Understanding participant experiences within a trial can also yield insight into mechanisms that might explain how or why an intervention was or was not effective.

Implementation science questions that focus on understanding implementation factors often require integrating methods.6 Our ‘Situation in Practice’ example, by including nurses’ comments related to medication administration, uncovered how poor implementation of the BCMA technology contributed to workflow disruptions, increased workload for nurses and possible medication errors. RCTs, case studies and other approaches can further intersect with implementation science with varying degrees of focus on assessing implementation strategies or the intervention effectiveness as determined by the research questions.24 Implementation research may involve quantitative measures of effectiveness and implementation variables, informed by a framework such as Reach Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM)25 or the Consolidated Framework for Implementation Research (CFIR).26 Mixing qualitative with quantitative methods can also help to unpack reasons for successful implementation or illuminate how implementation characteristics relate to outcomes and experiences of participants. For example, examining a paediatric patient safety intervention in Guatemala, researchers integrated qualitative and quantitative results, guided by the CFIR, to understand barriers and facilitators to implementation.27 They gathered surveys about attitudes towards patient safety programme adoption and interviews with staff focused on their views of implementation early in the programme followed by focus group interviews a year later with the goal of identifying opportunities for improvement.

A challenge of designing mixed methods research is that the procedures are not necessarily fixed but may intentionally change. For example, depending on the results of an initial quantitative phase in an explanatory sequential design, researchers decide which specific sampling plans or interview guides in the qualitative phase will best explain those results. This decision is an example of the integration threaded throughout a mixed methods design. Naming a design is helpful, but more important is describing the design and providing relevant citations. We recommend articulating a rationale at the outset for why a mixed methods design is needed to justify the added complexity, resources and cost. Drawing a procedural diagram is recommended because it can summarise detailed information in a small space and clarify the points of integration in the design.28 Figure 1 displays a procedural diagram for a convergent design in a proposed mixed methods study whose purpose is to develop interventions to improve cancer patients’ medication adherence. By convention, boxes are used to represent data collection and analysis, whereas circles represent mixed methods integration and interpretation. Arrows indicate the direction and flow of activities. The amount and type of information within the boxes can vary. Although Creswell argues for simplicity,28 we provide more information in both data collection boxes and the interpretation circle for illustrative purposes.

Figure 1

Procedural diagram for a convergent mixed methods design.

Integration at the methods level

One of the factors that contributes to the versatility of mixed methods is the countless ways in which both types of data can be collected. Alternatively, a single data set can be analysed both qualitatively and quantitatively. The qualitative and quantitative components need to be rigorous to ensure that the results of integration generate valid insights. For example, a mixed methods study in which the only qualitative data collected are open-ended survey responses may be insufficiently rigorous if the response rate is low. Similarly, an inadequately powered quantitative sample could make it difficult to discriminate between high- and low-performing hospitals on some aspect of patient safety (eg, hospital-acquired conditions).

Major procedures for integration at the methods level include connecting, building and merging.12 Connecting is a procedure for using the results of one form of research to inform the sampling of the other. For example, in a study exploring causes of variation in in-hospital cardiac arrest survival, researchers stratified hospitals based on survival rates and used connecting to sample hospitals that represented high, middle and lower performers.29 30 Another procedure, building, involves using the results of one form of research to inform data collection of the other. In the classic example, qualitative data gathered from interviews or focus groups on a specific topic (eg, pain management) are used to develop themes that are converted into items for a quantitative instrument, which is then tested in the population of interest (eg, patients with postoperative pain).31

Merging is another common integration procedure, discussed in the next section, that typically involves integrating results to develop meta-inferences. However, designing for merging necessitates attention at the methods level. Quantitative and qualitative data fall on a continuum, with quantitative data tending to be more closed ended and qualitative data more open ended. To design for integration, it may help to brainstorm all possible data sources that could address the research questions, because the phenomena being studied are not inherently quantitative or qualitative. Quantitative data might consist of lab results, registry data, test results, psychological assessments and educational measures. Because of how they are generated, qualitative data are open ended and come from semistructured individual interviews, focus groups, open-ended survey questions, written participant journals, field observations and photographs. Thinking in an integrative manner while planning data collection helps to prepare for integrative analysis. Matching is a preintegration procedure for planning data collection about related concepts or domains both quantitatively and qualitatively.32 For example, in a study of quality of life during cancer diagnosis, researchers matched, at the design stage, items on the European Organisation for Research and Treatment of Cancer quality of life questionnaire to questions in individual interviews to prepare for merging as an integration strategy.33 The matching does not have to be 1:1, yet it promotes thinking systematically about how to collect quantitative and qualitative data for the purpose of integration on common domains.
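As a concrete illustration, a matching plan can be drafted as a simple data structure before any data are collected. The sketch below is hypothetical: the domains, survey subscales and interview prompts are invented for illustration and are not drawn from the studies cited above.

```python
# Hypothetical "matching" plan: pairing quantitative and qualitative data
# sources on common domains at the design stage, so that results can later
# be merged domain by domain. All names below are invented for illustration.

matching_plan = {
    "physical functioning": {
        "quantitative": "Quality-of-life survey, functioning subscale (0-100)",
        "qualitative": "Interview prompt: 'Tell me about managing your daily activities.'",
    },
    "fatigue": {
        "quantitative": "Quality-of-life survey, fatigue subscale (0-100)",
        "qualitative": "Interview prompt: 'Describe your energy levels in a typical week.'",
    },
}

def unmatched_domains(plan):
    """Return domains that lack either a quantitative or a qualitative source."""
    return [
        domain
        for domain, sources in plan.items()
        if not ("quantitative" in sources and "qualitative" in sources)
    ]

# An empty list means every domain will have both forms of data for merging.
print(unmatched_domains(matching_plan))
```

A check like this makes it easy to confirm, before data collection begins, that every domain of interest will have both forms of data available for merging at the results level.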

Integration at the results level

Mixed methods studies involve three types of analysis—qualitative, quantitative and integrative mixed methods—to produce results. Planning the qualitative and quantitative analysis has implications for integration. If the intent is to compare qualitative and quantitative results, the analysis should yield results that are comparable on the same level of analysis. For example, statistical analysis may produce an aggregated result such as descriptive statistics for variables or domains or the results of hypothesis testing. These types of results should be compared with an aggregated or grouped qualitative result, such as a theme, rather than with unanalysed raw qualitative data such as quotes.

Merging integration is a procedure most often used for integrating at the results level. When merging, researchers might compare quantitative and qualitative results on common domains to develop meta-inferences. Alternatively, they may examine patterns, such as whether themes differ across statistical results or categorical levels of a variable (eg, good vs poor outcomes), to develop meta-inferences. For instance, Hall and colleagues merged qualitative and quantitative implementation results to identify patient centredness, staff receptivity and desire for protocols as facilitators, in addition to numerous barriers, to implementing a paediatric safety programme.27 In our ‘Situation in Practice’, the five domains from the Systems Engineering Initiative for Patient Safety model (ie, tasks, organisational factors, technology, physical environment and individuals) were used to develop meta-inferences.34 In the environmental domain, for example, quantitative results from the observation tool revealed that the medication room was far away from the nurse station. Qualitative results from nurses’ comments revealed that nurses stored medication at their station to avoid going back and forth to the medication room. The meta-inference here was that the location of the medication room affected task efficiency and time spent administering medications.

Integration during interpretation and reporting

Integration in mixed methods research leads to more than the sum of its parts.35 In our ‘Situation in Practice’ example, one might initially be appalled to learn of the numerous medication dispensing and administration policy deviations. Through integration, researchers developed a new interpretation of why and how such deviations occurred: patient medication drawers were too small to hold all patient medications (leading to dispensing omissions), a slow BCMA process led nurses to skip using the BCMA, and scanners were attached to the computer-on-wheels (COW) instead of being wireless. The COW therefore had to be brought into a patient’s room to use the scanner, but its large size made it unwieldy to move, so nurses left the COW and attached scanner outside the patient’s room and administered medication without scanning the patient’s ID band.

Several tools, such as joint displays, can help quality and safety researchers develop meta-inferences (ie, conclusions that connect quantitative and qualitative claims) from the integration of results. Joint displays integrate quantitative and qualitative data, methods or results visually through matrices and figures.9 36 They represent the integration procedures as well as, where applicable, the results of integration and meta-inferences. Software programs, such as Dedoose, HyperResearch, MAXQDA, NVivo and QDAMiner, can be helpful in developing joint displays.

The construction of a joint display encourages researchers to interrogate the entirety of results systematically in order to identify meta-inferences. Two popular types of joint displays of results are side-by-side joint displays, which juxtapose quantitative and qualitative results on common domains along with resultant meta-inferences, and themes-by-statistics joint displays, which array qualitative themes against quantitative statistics, similar to a crosstab.
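To make the side-by-side arrangement concrete, the sketch below assembles a minimal joint display in Python. All domains, results and meta-inferences are invented for illustration; a real display would hold a study's own aggregated findings, typically built in one of the software programs mentioned above or in a word-processing table.

```python
# Sketch of a side-by-side joint display built from already-analysed
# results: quantitative and qualitative results juxtaposed on common
# domains, with a meta-inference column. All content here is invented.

rows = [
    {
        "domain": "Relative advantage",
        "quantitative": "Mean attitude score 4.2/5 (n=40)",
        "qualitative": "Theme: staff saw the programme as clearly better than current practice",
        "meta_inference": "High scores are explained by perceived benefit over current practice",
    },
    {
        "domain": "Complexity",
        "quantitative": "Mean attitude score 2.9/5 (n=40)",
        "qualitative": "Theme: protocol steps felt burdensome during busy shifts",
        "meta_inference": "Lower scores reflect perceived workload burden, a barrier to adoption",
    },
]

def render_side_by_side(rows):
    """Render the joint display as plain text, one domain per block."""
    lines = []
    for row in rows:
        lines.append(f"Domain: {row['domain']}")
        lines.append(f"  Quantitative result: {row['quantitative']}")
        lines.append(f"  Qualitative result:  {row['qualitative']}")
        lines.append(f"  Meta-inference:      {row['meta_inference']}")
    return "\n".join(lines)

print(render_side_by_side(rows))
```

The point of the structure is that each row forces an explicit comparison of the two forms of results on a common domain before a meta-inference is written.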

For example, Hall et al27 examined implementation of safety programmes at a hospital in Guatemala by comparing CFIR constructs from qualitative and quantitative data sets. Using a side-by-side joint display, researchers first conducted qualitative (ie, semistructured interviews and focus groups) and quantitative (ie, Evidence-Based Practice Attitude Scale) analyses and summarised the results in the cells, as in table 2, a joint display adapted from Hall et al.27 The next step was to compare quantitative and qualitative results on a similar domain, which was based on CFIR. This process facilitated systematic comparison of quantitative and qualitative results on each domain and led to new insights into their integration.

Table 2

An example side-by-side joint display to develop meta-inferences

A themes-by-statistics joint display (table 3) was used to array medical students’ communication performance in a simulation, measured by an objective structured clinical examination (OSCE), against qualitative themes from those participating in the simulation.37 In this example, researchers also began with qualitative and quantitative analysis but continued by examining relationships, such as how themes differed across levels of performance on the OSCE. The cells of the joint display could contain themes, subthemes, codes and illustrative quotes in addition to statistics. In table 3, the rows are themes of participant experiences in the communication simulation and the columns represent quantitative levels of performance based on an OSCE score. The cells contain exemplar quotes for each theme to facilitate comparison by performance level. This cross-tabulation of results lends itself well to examining how a theme or category may differ across quantitative levels. In addition, it may be helpful to focus on one quantitative level and look across themes to generate meta-inferences.

Table 3

A joint display of qualitative themes by quantitative performance level37

In another example, to understand variation in nursing roles for in-hospital cardiac arrest response among hospitals with higher versus lower survival rates, researchers developed a themes-by-statistics joint display.30 Using the MAXQDA segment matrix (formerly Interactive Quote Matrix),38 researchers reviewed snippets of coded text for each major theme to examine whether there was variation between higher and lower performing hospitals. Although we have discussed side-by-side and themes-by-statistics joint displays, numerous other types appear in published studies.9 36 39–41
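The themes-by-statistics arrangement can likewise be sketched as a simple cross-tabulation. The themes, performance levels and quotes below are invented for illustration; in practice, the cells would be populated from coded qualitative data linked to each participant's quantitative score.

```python
# Sketch of a themes-by-statistics joint display as a cross-tabulation:
# qualitative themes (rows) arrayed against quantitative performance
# levels (columns), with exemplar quotes in the cells. All content is
# invented for illustration.

# Each record links a theme, a performance band (eg, from an OSCE score),
# and an exemplar quote from a participant in that band.
records = [
    ("managing uncertainty", "high", "I paused to summarise before moving on."),
    ("managing uncertainty", "low", "I was not sure what to say, so I rushed."),
    ("building rapport", "high", "I asked about her family before the exam."),
]

# Nested mapping: theme -> performance level -> list of exemplar quotes.
crosstab = {}
for theme, level, quote in records:
    crosstab.setdefault(theme, {}).setdefault(level, []).append(quote)

# Reading across a row shows how a theme varies by performance level;
# reading down a column shows what characterises one level across themes.
for theme, levels in crosstab.items():
    print(theme)
    for level in ("high", "low"):
        for quote in levels.get(level, []):
            print(f"  [{level}] {quote}")
```

An empty cell (here, no "low" quote for building rapport) is itself informative: it prompts the researcher to ask whether the theme genuinely did not arise at that performance level or whether sampling was insufficient.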

Reporting integration is one of the most important aspects of writing mixed methods research proposals or publications. Keeping this end product in mind is essential when designing for integration. Reporting is achieved through written results sections in narrative form, weaving12 together quantitative and qualitative results along with meta-inferences, and through joint displays. Meta-inferences often appear within the text of the results section of a manuscript or as the final column or row of a joint display. Writing mixed methods manuscripts can be very complicated, but providing greater detail about the reporting process is beyond the scope of this paper; it is discussed in other publications.32 42 43

Conclusion

Mixed methods research studies have the potential to provide more insight and a more comprehensive understanding of many complex phenomena that quality and safety science addresses. Considering integration as an afterthought at the end of a mixed methods study fails to capitalise on its strengths and limits its impact. Advances in patient safety may depend on answers to research questions that incorporate integration at multiple levels into mixed methods approaches. A holistic view of integration across levels of the research design process helps to take full advantage of the strengths of mixed methods. We encourage quality and safety researchers to continue to innovate methods, create new integration procedures and consider integration holistically.


Ethics statements

Patient consent for publication

Ethics approval

Not applicable.

Acknowledgments

We are very grateful for comments on earlier versions of the manuscript by reviewers and editors, in addition to two of our colleagues, Dr Melissa DeJonckheere and Dr Sergi Fàbregues. Their thoughtful and critical feedback was very helpful in shaping this manuscript.

References

Footnotes

  • X @tc_g, @mmanojlo

  • Contributors Both authors conceived this study. TG wrote the first draft and revision and is guarantor. Both authors critically reviewed the manuscript and approved the final version.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Provenance and peer review Commissioned; externally peer-reviewed.