- Health professions education
- Educational outreach, academic detailing
- Medication safety
- Audit and feedback
Many of us will be familiar with educational outreach visits (EVs), which according to the Effective Practice and Organisation of Care taxonomy are defined as ‘personal visits by a trained person to health workers in their own settings, to provide information with the aim of changing practice’. But is merely providing information enough to achieve a change in practice, and through which mechanisms? In this issue of BMJ Quality & Safety, Luetsch and colleagues1 undertook a rigorous, transparent realist synthesis exploring how EVs work to influence prescribing behaviour in ambulatory care settings. Based on a synthesis of 43 papers, Luetsch and colleagues explain how the quality of clinician–education visitor interactions is pivotal, with sustained relationships and evidence-based open dialogue leading to reflection on practice and changes to prescribing. In this editorial, we explore the strong parallels between this study and wider health professions education (HPE) research, and then make the case that existing definitions may need to evolve beyond a conception of ‘education as information provision’, to ‘education as relational’.
Education is a particularly complex intervention to research because it has multiple interacting components (eg, clinician and education visitor), multiple possible outcomes (eg, increased prescribing knowledge, changed professional behaviour, ongoing collaboration) and an ‘intervention’ (eg, educational visits) that is typically tailored to the specific setting.2 All educational interventions are complex (not just EVs), including staff development initiatives and education within quality improvement (QI) initiatives, because the process inevitably depends on the emergent properties of the interactions that develop between the components (typically people). This complexity means that the ‘outputs’ of educational interventions are not always predictable from the ‘inputs’. Thus, Luetsch and colleagues’ observation that EVs varied by implementation1 did not surprise us. Indeed, from our perspective, a real strength of the realist synthesis approach adopted was that it enabled this complexity to be acknowledged, analysed and understood, rather than ‘controlled for’ and thereby neutralised.
As HPE researchers with a special interest in feedback interventions that improve prescribing practices, we were keen to connect Luetsch and colleagues’ findings to our knowledge of the HPE literature. Our feedback research has mainly involved early career doctors (eg, Coombes and colleagues3 and Parker and colleagues4), for whom feedback conversations typically occur in busy workplace environments. In comparison, we suspect there may be less of a power imbalance within the sense-making dialogues of EVs,1 and that EVs may take place in quieter settings and be more ‘formal’, involving scheduled, protected time (whereas feedback on hospital wards is typically more informal or ‘in passing’). Despite these differences, we recognised many of the key elements identified in Luetsch and colleagues’ context–mechanism–outcome configurations as reflective of good educational practice. For example, in feedback conversations done well, the distinction between educator and learner roles similarly becomes blurred, and the focus becomes a learning conversation taking place in a psychologically safe context. Like EVs, feedback tends to have a variable impact on learning, since it too is a complex intervention. Luetsch and colleagues demonstrated that, to maximise the effects of EVs, what really matters is not the provision of information (because information is easy to access) but rather the sense-making that occurs through dialogue within professional relationships. We were struck by the strong alignment between these findings and our understanding of effective feedback processes.5 6
Audit and feedback has been defined as ‘a summary of health workers’ performance over a specified period of time, given to them in a written, electronic or verbal format’.7 Despite this emphasis on the format in which information is provided, there is increasing evidence that the nature and quality of relationships are essential prerequisites for effective feedback conversations, including those concerning healthcare quality and patient safety. For example, Telio and colleagues8 drew on the psychotherapeutic concept of the ‘therapeutic alliance’ to provide insights into the relational construction of feedback in medical education. By reorganising constructions of feedback around an ‘educational alliance’ framework, the authors reconceptualised the feedback process as a negotiation conducted within a supportive educational relationship.8 Similarly, in the context of pharmacists working with junior doctors, Noble and colleagues9 found that the ways in which each professional negotiated a mode of engagement and communication (conceptually defined as ontogenetic ritualisation)10 had the potential to shape safe prescribing practices.
One realisation for us in writing this editorial is that definitions of education in the wider healthcare literature may need updating in light of the evolving educational evidence base. It seems to us, for example, that traditional conceptions of professional education may be holding back the full potential of feedback conversations. The term ‘feedback’ within healthcare settings often calls to mind a unidirectional flow of information (eg, the well-known ‘feedback sandwich’11), in which a more experienced clinician instructs a novice learner, together with quite simple notions of error.8 12 However, power imbalances can reduce psychological safety within feedback conversations13 and increase the likelihood of a monologue (or ‘a telling’),12 and simple notions of error can undermine a collective understanding of any structural barriers to improved practice.12
As mentioned previously, EVs are currently defined as ‘personal visits by a trained person to health workers in their own settings, to provide information with the aim of changing practice’ and audit and feedback is defined as ‘a summary of health workers’ performance over a specified period of time, given to them in a written, electronic or verbal format’ (our emphasis, to demonstrate the focus of current definitions on information provision).7 Yet educators have long known that provision of information alone does not change practice: individuals must engage with, and make sense of, the information and then respond to it cognitively, emotionally and/or behaviourally. Traditional notions of education probably limit its relevance and potential,14 whereas pedagogic practices that enable the articulation of dispositions, values, goals and procedures hold much greater promise.14 Unfortunately, it follows that if we think of education as information provision, then the educational interventions we design will also fall short of their full potential, and this has important repercussions.
For example, in their article about improvement interventions, Soong and Shojania15 conclude that ‘As a sole strategy, education rarely results in sustained behaviour change, earning it a “necessary but insufficient” status’ (p354). This may be true when education is understood as a unidirectional flow of information, rather than as a complex intervention in which relationships form the ‘active ingredient’. Yet, if we reconceptualise the nature and purpose of education as relational, then its potential to improve healthcare outcomes is substantial. Importantly, the unintended positive consequences go far beyond what most QI initiatives measure, since improved relationships also have positive effects on organisational culture, workforce well-being and staff retention. Soong and Shojania also discuss the cost of educational interventions, which they see as expensive because they involve people’s time. Yet we know that time invested in relationships and in creating connectedness within supportive workplace cultures produces positive ‘ripple’ effects in organisations (for example, Carrieri and colleagues16). After all, if a single senior health professional were retained in the workforce for 5 more years because of an educational intervention, the financial savings would be substantial.
New definitions of education also need to be flexible enough to accommodate a range of ‘units of analysis’, since HPE research increasingly proposes an interdependent way of thinking about learning,17 in which both individual and workplace affordances are important. In other words, the unit of analysis for education research might be an individual learner or educator, but it might also be a dyad or triad, an interaction, a learning moment or learning event, a healthcare team participating in a QI project, a medical school cohort or an organisation. HPE researchers are increasingly drawing on sociocultural theories to supplement the psychological theories that previously predominated. As Billett puts it: ‘through engaging in work activities, individuals come to change what they know and do. This is called learning’ (p208).10 He goes on to recommend practice pedagogies that promote learning in and through workplace activities and interactions, such as storytelling, verbalisation and guided learning,14 which go well beyond information provision. This renewed focus on professional relationships and on the contexts that support learning holds significant potential to build on existing, more individualistic, research studies.
These developments have exciting implications! If education were defined as relational, rather than as information provision, in light of developing theory and evidence, then future HPE interventions would likely adapt to meet the new definition. We predict that educational interventions meeting this new definition would have a significantly greater impact on educational and healthcare outcomes than currently reported. For example, a Cochrane review concluded that audit and feedback generally leads to small but potentially important improvements in professional practice,18 yet this conclusion is based on a synthesis of all studies meeting current definitions (ie, including interventions that HPE researchers might predict to have limited, null or even negative effects). If only those studies involving dialogue and relationships were included in the review, we predict the impact would be greater.
To conclude, in this editorial we have explored the strong parallels and alignment between this study and the wider HPE research field. Like the work of Luetsch and colleagues,1 our HPE research has highlighted the importance of dialogue, conducted within psychologically safe relationships, that takes account of the complexity of practice. We have also made the case that existing definitions need to evolve beyond a conception of ‘education as information provision’, to ‘education as relational’. The earlier focus on information provision and individual learners can now give way to a focus on learners interdependent with their learning environments, within teams, organisations and health systems. With this understanding of education, its potential to transform individuals, organisations and society is huge.
Ethics statements
Patient consent for publication
Ethics approval
Not applicable.
Footnotes
Contributors KM and CN shared ideas and discussed the focus of the editorial. KM created the first draft, which CN developed further, and then the two authors exchanged subsequent iterations. Both authors agreed the final version.
Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests None declared.
Provenance and peer review Commissioned; internally peer reviewed.