1. Strength of association
What is the size of the effect, that is, what is the relative risk or OR? What is the size of the effect, that is, what is the efficacy of the intervention on outcomes of interest?
| Statistical process control (SPC) charts enable identification of special cause variation that is unlikely to be due to chance alone, thereby providing statistical evidence of an effect and its magnitude (measures of relative risk, number needed to treat and estimates of the magnitude of attributable effect are useful measures of effect size).
| - State the magnitude of change and its clinical or systems meaning.
| - Use SPC charts, with a clear rule set and control limits determined in advance, to detect changes in the process, to maintain objectivity and to avoid fishing for a result.
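The effect-size measures named above (relative risk, OR, absolute risk reduction and number needed to treat) all derive from a standard 2×2 table of events by group. A minimal Python sketch, using invented illustrative figures rather than data from any study:

```python
# Effect-size measures from a 2x2 table (illustrative figures only).
def effect_sizes(a, b, c, d):
    """a: events in intervention group, b: non-events in intervention group,
    c: events in control group, d: non-events in control group."""
    risk_int = a / (a + b)          # event risk with the intervention
    risk_ctl = c / (c + d)          # event risk without it
    rr = risk_int / risk_ctl        # relative risk
    odds_ratio = (a * d) / (b * c)  # odds ratio (OR)
    arr = risk_ctl - risk_int       # absolute risk reduction
    nnt = 1 / arr                   # number needed to treat
    return rr, odds_ratio, arr, nnt

# Hypothetical example: 10/100 events with the intervention vs 20/100 without.
rr, or_, arr, nnt = effect_sizes(10, 90, 20, 80)
print(round(rr, 2), round(or_, 2), round(arr, 2), round(nnt, 1))
# -> 0.5 0.44 0.1 10.0
```

Note that the OR (0.44) and the RR (0.5) diverge even here; they approximate each other only when the outcome is rare.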
|
2. Consistency of association
Are repeated observations from different places, at different times, with differing methods, by different researchers, under different circumstances in agreement? Does repeated application of the intervention provide similar results in different contexts?
| Existing evidence contributes to the programme theory and implementation plan, which can be used to demonstrate consistent impact, for example, through scaling up.
| - Keep track of intervention–outcome data as scale-up occurs to increase knowledge about causal consistency between intervention and outcome in different settings.
| - Analyse contextual barriers and enablers; make and note amendments to the implementation plan.
|
3. Specificity of association
| A combination of implementation design (eg, step-wedge), SPC charts and other analyses of change can inform the specificity of outcomes in relation to the intervention and planned implementation activities.
| - Establish a comparison or control group, where possible, to identify secular trends (ie, explore the counterfactual: what might have occurred without the intervention?).
| - Ensure that the design and evaluation plan mitigate potential bias and confounding.
| - Explore what alternative mechanisms exist that might produce the effect.
|
4. Temporality
| SPC charts determine relationships between the timing of the intervention and observed special cause, and Plan-Do-Study-Act (PDSA) cycles document QI&I activity.
| - Annotate SPC charts with intervention events; include annotations of relevant external events, apart from the planned intervention, that could have influenced the outcome; make clear how and when special cause is detected and handled.
| - Ensure sufficient baseline data points to understand the variation inherent in the system.
| - Specify the predicted time period necessary to implement the intervention before improvement is expected to occur.
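To illustrate pre-specified limits and baseline data, the following is a minimal individuals (XmR) chart sketch in Python. It computes control limits from baseline data only and applies just one detection rule (a point beyond a control limit); published SPC rule sets add run and trend rules, and the data here are invented:

```python
# Minimal XmR (individuals) chart sketch: control limits from baseline
# data only, then post-intervention points checked for special cause.
def xmr_limits(baseline):
    mean = sum(baseline) / len(baseline)
    moving_ranges = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    # 2.66 = 3 / d2, with d2 = 1.128 for subgroups of size 2 (standard XmR constant)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def special_cause(points, lcl, ucl):
    # Only the "point beyond a control limit" rule; real rule sets add
    # run, trend and zone rules agreed in advance.
    return [i for i, x in enumerate(points) if x < lcl or x > ucl]

# Illustrative data: a stable baseline, then a shift after the intervention.
baseline = [12, 14, 13, 15, 12, 14, 13, 15, 14, 13]
post = [13, 8, 7, 6, 7]  # measurements after the intervention is introduced
lcl, ucl = xmr_limits(baseline)
print(special_cause(post, lcl, ucl))
# -> [1, 2, 3, 4]
```

Fixing the limits and rule set before looking at post-intervention data is what keeps the analysis objective and avoids fishing for a result.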
|
5. Biological gradient
As more of the stimulus is added, is the response increased? Is more effect observed with more intervention, or higher fidelity of intervention?
| A combination of programme theory, implementation design and plan (eg, step-wedge), SPC charts, and other analyses of change can examine the extent to which outcomes improve in relation to the intervention ‘dose’ in planned programme activities.
| - Demonstrate the relationship between dose of intervention and outcome, using SPC charts or other analyses to display effect size.
| - In the implementation plan design, consider the activities needed to deliver the ‘dose’.
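One simple way to display a dose–response relationship is the least-squares slope of outcome on intervention ‘dose’ (eg, fidelity, or the proportion of eligible patients receiving a bundle). A Python sketch with invented per-site figures:

```python
# Dose-response sketch: least-squares slope of outcome on intervention
# 'dose'. A clearly positive slope is consistent with a gradient;
# the figures below are illustrative, not from any study.
def slope(dose, outcome):
    n = len(dose)
    mean_x = sum(dose) / n
    mean_y = sum(outcome) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(dose, outcome))
    var = sum((x - mean_x) ** 2 for x in dose)
    return cov / var

dose = [0.2, 0.4, 0.6, 0.8, 1.0]   # hypothetical fidelity per site
outcome = [55, 60, 64, 70, 74]     # hypothetical outcome measure per site
print(round(slope(dose, outcome), 1))
# -> 24.0
```

In practice this would sit alongside SPC charts per site, and a formal analysis would also report uncertainty around the slope.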
|
6. Plausibility
| Programme theory and process maps should incorporate the plausibility that the intervention is likely to impact the outcome of interest. The implementation plan should consider the amount of intervention required to obtain a response, and statistical evaluations should reflect the degree of confidence in cause and effect.
| - Draw on existing theories and models (eg, behavioural sciences, implementation research) to determine the plausibility of the postulated QI&I initiative.
| - Observe how an intervention works in practice and link to PDSA cycles for testing theories; update programme logic in light of learning.
|
7. Coherence
| Existing literature that demonstrates evidence for the case (using knowledge from across disciplines) builds coherence.
|
8. Analogy
| Learning from other improvers and researchers builds analogy; for instance, a similar intervention (eg, a ‘care bundle’) in one setting has analogy to another.
|
9. Experiment
| Programme theory highlights areas for implementation activity. The implementation design should mitigate confounding and bias where possible. PDSA cycles can be used to experiment, recognising that multiple changes may be required.
| - Test changes using iterative PDSA cycles along the theoretical causal pathway to build confidence in cause–effect.
| - Document predictions, what changes were made and why; reflect on the accuracy of predictions and determine new information gained; update programme theory.
|