Nothing soft about ‘soft skills’: core competencies in quality improvement and patient safety education and practice
  1. Joanne Goldman1,
  2. Brian M Wong1,2
  1. 1 Centre for Quality Improvement and Patient Safety, University of Toronto, Toronto, Ontario, Canada
  2. 2 Department of Medicine, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Ontario, Canada
  1. Correspondence to Dr Joanne Goldman, University of Toronto, Toronto, ON M5G 2L3, Canada; joanne.goldman{at}utoronto.ca

Quality improvement and patient safety (QIPS) education programmes have proliferated in the past decade given the rising demand for healthcare professionals to develop the knowledge, skills and attitudes required to make improvements in healthcare.1–4 On the one hand, this proliferation is a positive sign of the institutionalisation of QIPS within our educational, practice, professional and regulatory spheres. On the other hand, while numerous QIPS education programmes are up and running, our understanding of key educational processes and how to optimise outcomes is still evolving. For instance, it remains unclear how to simultaneously optimise learning and project outcomes in quality improvement (QI) project-based learning or how to facilitate interprofessional learning in QIPS education.

In this issue of BMJ Quality and Safety, Myers and colleagues5 studied the influence of two postgraduate QIPS fellowship training programmes for physicians on graduates’ career outcomes and on the institutions in which they work. The two programmes, The Center for Healthcare Improvement and Patient Safety Fellowship at the University of Pennsylvania and The Harvard Fellowship in Patient Safety and Quality, provide coursework, access to a master’s degree and opportunities to complete projects within the health system. The study involved interviews with 28 graduates and 16 of their mentors.

The study findings are encouraging in that they demonstrate that these two education programmes had positive effects at different levels. Fellows reported improvements in QIPS knowledge, skills and attitudes, and all pursued and completed a master’s degree. A very high percentage of graduates, on completion of their training, held academic and/or organisational leadership positions involving QIPS administration, research or education. The programmes also attained positive health system outcomes through fellow-led QIPS capstone projects at the divisional, departmental and/or institutional levels. In some cases, the fellows’ impact went beyond their projects and contributed to changes in their institutional cultures through creating dialogue, providing a sense of urgency and momentum for action, shifting conversations from blame to curiosity and raising awareness of improvement science. Mentors described mentees as role models who were paving a career pathway in specialties where QIPS work was less familiar and increasing the acceptability of the field.

An underlying thread that piqued our interest relates to the question of ‘what skills should QIPS learners gain through an advanced longitudinal QIPS education programme?’ Readers can see that the two education programmes covered a range of topics, such as how healthcare organisations develop QIPS priorities, quality measurement, how to build an accountable team, barriers to improvement work and so on. Our attention was further drawn to the authors’ conceptualisation of ‘hard’ and ‘soft’ skills in QIPS. Myers and colleagues note that ‘graduates consistently differentiated between “soft” and “hard” skills learnt in their fellowship and found both to be transferrable to their current workplace’. Hard skills (eg, quality and safety skills such as performance gap specification and stakeholder analysis creation, and research skills such as statistical analysis and qualitative data analysis) were defined as those skills amenable to being taught and learnt in traditional educational settings. Soft skills (eg, change management, leadership, project reflection and reassessment) were defined as those skills necessary for working with people and teams, usually learnt over time in experiential settings.

To compare and contrast what might be viewed as ‘hard’ versus ‘soft’ QIPS skills, consider the abilities needed to implement a QI initiative such as reducing central line-associated bloodstream infections (CLABSIs) in intensive care units. ‘Hard’ QIPS skills might include collecting and analysing data to track bloodstream infection rates, creating a current-state process map to better understand the local problem as it relates to central line use and management, or conducting plan-do-study-act (PDSA) cycles to implement a standardised care bundle adapted to the local context. Yet what often differentiates how effectively one enacts these ‘hard’ skills (and, in turn, influences overall QI success) is one’s ability to apply ‘soft’ skills. These might include skilfully addressing critiques from ‘resistors’,6 negotiating with the decision support group to mobilise resources to make data available in a timely manner, and fostering multiprofessional understandings and interprofessional collaborations throughout the QI initiative. Authentic conduct of PDSA cycles7 depends on a plethora of change management and leadership skills. In other words, ‘hard’ and ‘soft’ QI skills are interlinked.
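To make the ‘hard’ data-tracking skill concrete, the brief sketch below shows how a fellow might calculate a monthly CLABSI rate per 1000 central-line days, the conventional measure used to follow performance across successive PDSA cycles. It is a minimal illustration only; it is not drawn from the Myers et al study, and all figures are hypothetical.

  # A minimal sketch of the 'hard' skill of tracking CLABSI rates over time.
  # All figures below are hypothetical illustration data, not study results.

  monthly_data = {
      # month: (number of CLABSIs, central-line days)
      "2019-01": (4, 2500),
      "2019-02": (3, 2400),
      "2019-03": (1, 2600),  # after a PDSA cycle introducing a standardised care bundle
  }

  for month, (infections, line_days) in monthly_data.items():
      rate = infections / line_days * 1000  # infections per 1000 central-line days
      print(f"{month}: {rate:.2f} CLABSIs per 1000 central-line days")

Plotting such rates on a run chart over time is what allows a team to judge whether a change tested in a PDSA cycle represents genuine improvement; securing timely access to the underlying data is, of course, where the ‘soft’ skills described above come in.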

These results about ‘hard’ and ‘soft’ skills raise a number of issues for us: How do we identify the range of skills required? How do we teach and assess the development and acquisition of these skills? How do we label these skills? How do we ensure that ‘value’ is attached to the full range of skills?

Organisations in the UK, the USA and Canada have developed quality and safety competency frameworks to outline the knowledge, skills and attitudes expected across the learning continuum and into practice.8–11 There is overlap between these frameworks, which generally cover the broad domains of patient safety and informatics, QI, health equity, patients and families as QIPS partners, and teamwork and collaboration. Within each domain, there is a mix of ‘hard’ and ‘soft’ skills; for example, within QI, competencies range from the ability to use common tools (eg, flow charts, process maps, fishbone diagrams) to inform QI efforts (ie, ‘hard’ skills) to the ability to use change principles to implement and evaluate tests of change (ie, ‘soft’ skills). These competency frameworks list abilities reminiscent of those identified by Myers and colleagues and provide a helpful first step in defining the range of knowledge, skills and attitudes that current healthcare systems require of their practitioners. However, many of these frameworks define competencies expected of all healthcare practitioners at the entry-to-practice stage, and stop short of elaborating on the advanced competencies expected of QIPS experts. Given that mastery of these ‘soft’ skills is one important way in which QIPS experts differentiate themselves, an important next step would be to clarify and gain consensus on these higher-level QIPS competencies.

An additional issue to consider is how we teach and assess the range of QIPS skills in advanced QIPS training, and in particular those ‘soft’ skills that warrant special consideration. We offer the following suggestions:

  1. Multidisciplinary teaching: Scholars from diverse areas (eg, sociology, business and management, bioethics) make important contributions to the field and practice of QIPS.12 The planning of QIPS education programmes should incorporate faculty from disciplines outside of healthcare or more traditionally defined QI, particularly given the nature of the ‘soft’ skills that are pertinent to leading system-level changes. This need parallels other professional competency efforts, where there is increasing recognition that the knowledge and skills required extend beyond the narrow confines of bioscientific domains and medical educator expertise.13 Interdisciplinary discussions within the classroom would not only enrich and deepen the learning experience but also set the stage for longer-term collaborations.

  2. Theoretically informed education: QIPS education planning should draw on the range of bioscientific, learning and sociocultural theories to provide a stronger rationale for how we teach QIPS knowledge and skills.14 These theories can help, for example, promote individual-level processes such as reflection in patient safety learning,15 recognise interpersonal processes such as power and hierarchy in interprofessional QIPS learning,16 and address organisational issues such as supportive environments in project-based learning.4

  3. Integrate teaching of ‘soft’ and ‘hard’ skills: It makes intuitive sense to teach ‘soft’ skills in the context of ‘hard’ skills in QIPS education. This suggestion reflects a broader theme of integrated learning that is pervasive in medical education.17 Originally focused on the integration of clinical learning and basic science, the more contemporary view has expanded to reflect the broader array of sciences that impact clinical knowledge and experiences, as well as a more strategic focus on cognitive integration.17 Further attention to integration in QIPS education would help build the evidence base underlying this recommendation.

  4. Adopt meaningful assessments: We need to expand the array of approaches used to assess whether learners have acquired the range of QIPS knowledge and skills. Current assessment approaches have largely focused on ‘hard’ skills. For example, the QI Knowledge Application Tool,18 one of the most widely referenced assessment tools in QI,19 20 asks learners to create an aim statement, define a measure and propose a change for a given QI problem. However, evaluating a broader set of QIPS abilities will likely require other assessment approaches, such as reflective practice, portfolios and multisource feedback.

Organising and labelling the knowledge and skills that characterise expertise is central in any professional domain. We draw attention, though, to how we label and characterise such skills, including the use of the terms ‘hard’ and ‘soft’. A quick exploration of the literature provides other examples of the use of these terms in reference to healthcare and health professions education. In a study of the challenges of voicing safety concerns, Martin et al 21 22 differentiate between ‘hard’ metrics grounded in evidentiary facts and ‘soft’ intelligence based on feelings or intuitions. Bringing attention to the importance of trust in medicine, Bleakley23 describes ‘trust’ in this context as a ‘soft’ principle situated in the midst of a ‘hard’ enterprise of modern, managed healthcare that is ‘historically patriarchal, authoritative and controlling…’.

In these examples, ‘hard’ refers to dominant, entrenched and valued principles and activities, whereas ‘soft’ refers to those that may be less visible and less valued, although no less important. We urge mindfulness in how we categorise skills in QIPS education, for what purposes, the meanings we ascribe to them and the unintended consequences of such labels. In fact, the labels ‘hard’ and ‘soft’ might unintentionally perpetuate the preferential emphasis placed on learning the ‘hard’ skills (eg, incident analysis, process mapping), despite the fact that the social science domains of QIPS clearly recognise QIPS as not merely a technical issue but also a site of social processes that influence behaviours.24 25 It may be that a reframing of the terms ‘hard’ and ‘soft’ skills is needed to ensure that the full range of skills receives equal attention in QIPS education. An illustrative example of how others have addressed a similar concern is the adoption of the term ‘intrinsic roles’ in place of the problematic label ‘non-medical expert roles’ (ie, communicator, collaborator, advocate) in the CanMEDS physician competency framework in Canada.26

The paper by Myers et al provides evidence of the outcomes of advanced QIPS training, and contributes to an important and ongoing discussion about our role as QIPS educators and leaders in defining what knowledge is valued in the classroom, how that knowledge gets labelled, and the implications for the practice of QIPS. Particularly encouraging is the fact that graduates of these two advanced QIPS programmes recognised the value of enacting a broader range of skills in their pursuit of QIPS activities. Ultimately, expertise in QIPS requires the integration of a wide range of skills, marrying the ability to reflect, persuade, negotiate, collaborate and lead with the activities traditionally associated with the conduct of QIPS.

References

Footnotes

  • Twitter @Brian_M_Wong

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Provenance and peer review Commissioned; internally peer reviewed.
