BMJ Qual Saf 21:439-442 doi:10.1136/bmjqs-2011-000540
  • Viewpoint

Knowledge implementation in healthcare practice: a view from The Netherlands

  1. Michel Wensing1,
  2. Roland Bal2,
  3. Roland Friele3,4
  1. 1Radboud University Nijmegen Medical Centre, Scientific Institute for Quality of Healthcare, Nijmegen, The Netherlands
  2. 2Department of Health Policy and Management, Erasmus University, Rotterdam, The Netherlands
  3. 3NIVEL, Netherlands Institute for Health Services Research, Utrecht, The Netherlands
  4. 4Tranzo, Tilburg University, Tilburg, The Netherlands
  1. Correspondence to Professor M Wensing, Radboud University Nijmegen Medical Centre, Scientific Institute for Quality of Healthcare, P.O. Box 9101, 6500 HB, Nijmegen, The Netherlands; m.wensing{at}
  1. Contributors Each author led one component of the analytical report underlying this paper. MW drafted the paper and prepared the revised version. RB and RF provided substantial input.

  • Accepted 18 November 2011
  • Published Online First 23 December 2011


In this contribution we discuss some pertinent issues regarding knowledge implementation in The Netherlands, focusing on the largest public funding agency for health research in the country (ZonMw). The commentary is based on a report that includes a structured analysis of 79 projects funded by ZonMw, a survey of published implementation research covering 141 systematic reviews, and a qualitative study of the implementation infrastructure in The Netherlands. Five themes were identified. First, the term ‘knowledge implementation’ may be better replaced by more specific terms in some situations. Second, contextual factors need to be taken into account more systematically when planning and evaluating implementation programmes. Third, knowledge may change when implemented, and this needs to be considered in projects. Fourth, we observed that implementation has developed into a specific world, separated from both healthcare practice and scientific research. It is important to guard against the risk of isolation from the practical and societal needs that the field is meant to address. Finally, we suggest that the strong focus on ‘doing projects’ and limited opportunities for structural funding may hamper substantial improvement in the field. Many good activities are underway, but the policies regarding knowledge implementation appear to need some adjustment. In its policy plan for the coming years, ZonMw has partly taken up the lessons from our advisory report.


In 2010, ZonMw—the largest public funding agency for health research in The Netherlands—commissioned an analytical report to inform its policies regarding the implementation of research findings in healthcare practice. ZonMw has employed an active policy regarding knowledge implementation since its start in the 1990s.1 2 In this contribution we discuss some pertinent issues that emerged from the analyses, after first elaborating on policy regarding knowledge implementation in the Dutch healthcare system over the previous two decades.

Nationwide policies on knowledge implementation

The Dutch healthcare system is characterised by distributed decision-making, implying that many stakeholders are involved in the implementation of knowledge in healthcare practice.3 ZonMw works at arm's length from the Ministry of Health and the Dutch Organisation for Scientific Research. It aims to stimulate the uptake of research findings in several ways, including obligatory dissemination activities in each funded project, focused studies of aspects of knowledge implementation, syntheses of available knowledge and the organisation of stakeholder meetings. ZonMw has coordinated many large-scale programmes targeted at the implementation of research evidence and best practices.

ZonMw is the result of a merger of two organisations: ZON and NWO-Medical Sciences. ZON (‘Zorg Onderzoek Nederland’: Care Research Netherlands) was established by law in 1998 to coordinate programmes of innovation and applied research in healthcare for the Ministry of Health. It was the result of lobbying by a group of officials in the Ministry of Health who wanted to improve the transparency, coherence, quality and impact of funded projects. The law on ZON explicitly stated that one of its tasks was to promote the uptake of research findings in healthcare practice, because the feeling was that many projects funded by the Ministry had too little impact on healthcare practice. The group of officials had the support of a few academics, but many biomedical researchers and coordinators of practice-based projects were sceptical. In 2001 ZON merged with NWO-Medical Sciences (NWO stands for the Dutch Organisation for Scientific Research). The underlying idea was to cover the complete spectrum from discovery-orientated biomedical research to practice-related innovations and evaluations.

Description of the report

Our report comprised three components,4 which are briefly described here. The first was a systematic analysis of documentation on 79 completed studies to identify factors associated with the success of implementation programmes. All studies were funded by ZonMw and purposefully spread across healthcare sectors and types of research. Documentation on project plans and results was analysed using a structured data-abstraction form. Some support was found for the positive impact of two factors. First, the use of multiple strategies, including multi-channel communication, appears to be more effective than the use of single strategies. We noticed that a majority of the projects used such a complex, multifaceted strategy. Second, we found that projects that also focused on contextual factors (such as aspects of the healthcare organisation) seemed to be most effective.

The second component of our report was a comprehensive search of the international research evidence in PubMed and the Cochrane Library up to 2009, resulting in a structured description of 141 systematic reviews of implementation interventions. The vast majority (n=110) referred to somatic healthcare (both primary care and hospital care). Around half of the reviews (n=55) examined organisational strategies; the other types of strategies were more or less equally represented in the reviews found. Overall, the implementation strategies had a beneficial effect on professional practice. Effects on patient outcomes were less obvious.

The third component of the report was a qualitative study of the implementation infrastructure in The Netherlands. Infrastructure was defined in three ways: social (ie, the organisations involved in supporting or enhancing knowledge implementation in Dutch healthcare), instrumental (the kinds of methods used) and conceptual (the concepts used). The researchers conducted interviews with 28 respondents from various types of organisations, convened two expert meetings and performed a document analysis. They also built on their own experience as evaluators of quality and knowledge translation programmes. Each subsector (eg, acute care, long-term care, public health) appears to have its own infrastructure, with a large number of organisations that develop knowledge, organisations that disseminate and implement knowledge (acting as brokers or links) and organisations that use new knowledge. The roles, strategies and instruments of different organisations (professional associations, centres of excellence, research institutes) were found to be converging. Typical of The Netherlands is that quality, dissemination and implementation work has become a market in its own right. Conceptually, the language of diffusion has become dominant, with an emphasis on a phased approach of knowledge development, implementation and spread.

Themes emerging from the report

Conceptual confusion

In this contribution, we use the term ‘implementation’ broadly to refer to ‘improvements in clinical practice based on a practical recommendation that is underpinned by research as much as possible’.5 However, we noticed that some people associate the term ‘implementation’ with the imposition of scientific knowledge on individuals from outside. This specific interpretation can prompt resistance, so it may be better to avoid the term where this is the case. Perhaps the terms ‘improvement’ or ‘knowledge transfer’ could be used instead, although these may also carry specific meanings. Some authors prefer the term ‘co-production’ to emphasise the active participation of knowledge users in the construction of knowledge.6 Currently, different labels, including ‘improvement science’ and ‘implementation science’, are used in many countries.

Context of implementation

The outcome of many projects in The Netherlands was influenced by contextual factors, yet these factors were scarcely considered in the planning phase of the projects. Increased attention to the relevance of contextual factors casts a different light on taxonomies of implementation strategies, which tend to emphasise the intervention methods used (educational, organisational, financial, etc). Our review of implementation strategies showed that all strategies can have a positive effect on the performance of professionals. However, it is impossible to say which strategies or combinations of strategies will be successful in general; the effects of implementation seem to depend strongly on contextual factors. These findings call into question the value of taxonomies of implementation strategies that emphasise intervention methods. Such taxonomies need to be complemented by a taxonomy or checklist of contextual factors. Moreover, contextual factors need to be taken into account when describing and evaluating interventions. Examples of such factors are organisational culture, teamwork, leadership, structural organisational characteristics, external incentives and regulations, and the availability of management tools.7

Changes in knowledge when implemented

One assumption underlying many implementation projects was that knowledge is static: developed in one place, applied in another, and thought not to change when transferred. However, we found in our analysis of Dutch projects that in many cases the knowledge was adjusted to local circumstances as part of the ‘implementation’. Even if a particular intervention has proved effective, it will be adapted to the local situation when implemented elsewhere. The provision of care is a complex activity for which scientific knowledge, in the form of instruments such as clinical guidelines and quality indicators, can provide guidance only to some extent. Knowledge may be better regarded as a semi-finished product that is adjusted in practice to make it work. Implementation research should thus also address how knowledge gets translated into practice and what outcomes such translations are likely to achieve (eg, will the expected positive effects endure?). The ‘not invented here’ syndrome, which is often considered detrimental to effective improvement, may instead be harnessed as a force to modify semi-finished knowledge and adjust it to a specific situation.

Implementation as a separate world

Implementation is usually directly associated with a question from practitioners, managers or policymakers. In The Netherlands, however, ‘implementation’ (and quality improvement) has become a separate social world, almost an industry, more or less independent of research, practice and policy. This world encompasses intermediary organisations (centres of expertise, professional associations, consultancy firms and, to some degree, universities) as well as senior-level healthcare managers. This process is associated with a certain degree of professionalisation, in which the growing profession has tried to carve out a specific knowledge domain vis-à-vis other domains and has organised itself in specific networks. This may have beneficial effects, such as raising standards and making it easier to attract talented people. However, it is also important to guard against possible drawbacks, such as growing isolation from the practical and social needs that the discipline is meant to address. Isolation could also mean a lack of substantive experience with the practices in which knowledge is to be implemented, making implementation efforts less effective. Similarly, isolation of this practical implementation world from the scientific evaluation of instruments and interventions (‘implementation science’) may reduce its effectiveness.

Dominance of project-based approach

One basic feature of the infrastructure of implementation and implementation research is its culture of ‘doing projects’. Projects are integral to the market-type organisation of implementation work and are seen to have value in creating transparency (eg, being ‘SMART’). However, healthcare practitioners are growing tired of the large number of projects being carried out, which appear to have little connection with each other and leave little room for sustained change. Researchers are growing tired of accountability obligations, which leave them less time to devote to substantive issues and make it difficult to develop longer-term research agendas. Moreover, a great deal of time goes into acquiring projects, reducing the time available for actually performing research. All in all, it would be worth reconsidering the optimal mix of competitive project funding and long-term programme funding. Another possibility might be, for example, to have a group of experts familiar with the field guide innovation and implementation processes in similar organisations. This would allow knowledge of the field to be deployed more broadly. When such partnerships emerge, interested and motivated groups are created.

Advancing knowledge of implementation

The past few years have seen exponential growth in the number of implementation projects and studies in The Netherlands, as in other countries. New studies and projects would do well to use the results of earlier ones where possible. Unfortunately, some projects still lacked any systematic survey of previous implementation research. A new project for the implementation of a particular clinical guideline should not, for example, overlook the results of earlier projects in the same field. We found that vested interests (such as the desire for a proposal to be awarded funding) can be a factor in the failure to take account of previous results. This does not help the accumulation of knowledge, and it increases the likelihood that mistakes will be repeated. The recruitment and training of researchers for implementation research poses another challenge.8 For instance, implementation research deals with the analysis of the impact of multiple interventions, which change over time and are handled in different ways in different settings. This requires research methodologies adapted to these features.


Although many good activities are currently underway in The Netherlands, the policies appear to need some adjustment. New questions for research concern issues such as how knowledge is adapted in practice, what role contextual factors play and how these can be exploited. Our recommendations also include working more with long-term consortia organised around specific themes. Above all, more opportunity is needed for analysis and reflection to expedite progress with the implementation of knowledge.

In its policy plan for the coming years,9 ZonMw stated that it has learnt four lessons from our evaluation: to use specific terminology rather than the blanket term ‘implementation’; to learn in more detail about different users and contexts; to continue innovation and research on implementation methods; and to strengthen the knowledge infrastructure per healthcare sector. The first three items relate to themes in this commentary, while the last is not necessarily consistent with our analysis of implementation as a separate world. The remaining two themes in this commentary (the proposition that knowledge may change during implementation and the limitations of project-based funding) were not included in the list of lessons in the ZonMw report. Overall, this suggests that ZonMw has partly taken up the lessons from our advisory report.


  • Funding ZonMw funded the report underlying this paper, but had no role in the writing of the manuscript or in the decision to submit for publication.

  • Competing interests None.

  • Provenance and peer review Not commissioned; externally peer reviewed.

