
A scoping review of theories, models and frameworks used or proposed to evaluate knowledge mobilization strategies

Abstract

Background

Evaluating knowledge mobilization (KMb) strategies presents challenges for organizations seeking to understand their impact and improve their effectiveness. Moreover, the large number of theories, models, and frameworks (TMFs) available can be confusing for users. Therefore, the purpose of this scoping review was to identify and describe the characteristics of TMFs that have been used or proposed in the literature to evaluate KMb strategies.

Methods

A scoping review methodology was used. Articles were identified through searches in electronic databases, previous reviews and reference lists of included articles. Titles, abstracts and full texts were screened in duplicate. Data were charted using a piloted data charting form. Data extracted included study characteristics, KMb characteristics, and TMFs used or proposed for KMb evaluation. An adapted version of Nilsen's taxonomy (Implement Sci 10:53, 2015) and the Expert Recommendations for Implementing Change (ERIC) taxonomy (Powell et al., Implement Sci 10:21, 2015) guided data synthesis.

Results

Of the 4763 search results, 505 full-text articles were retrieved and 88 were eligible for review. These consisted of 40 theoretical articles (45.5%), 44 empirical studies (50.0%) and four protocols (4.5%). The majority were published after 2010 (n = 70, 79.5%) and were health related (n = 71, 80.7%). Almost half of the studied KMb strategies were implemented in only four countries: Canada, Australia, the United States and the United Kingdom (n = 42, 47.7%). One-third used existing TMFs (n = 28, 31.8%). According to the adapted Nilsen taxonomy, process models (n = 34, 38.6%) and evaluation frameworks (n = 28, 31.8%) were the two most frequent types of TMFs used or proposed to evaluate KMb. According to the ERIC taxonomy, activities to “train and educate stakeholders” (n = 46, 52.3%) were the most common, followed by activities to “develop stakeholder interrelationships” (n = 23, 26.1%). Analysis of the identified TMFs revealed relevant factors of interest for the evaluation of KMb strategies, classified into four dimensions: context, process, effects and impacts.

Conclusions

This scoping review provides an overview of the many KMb TMFs used or proposed. The results provide insight into potential dimensions and components to be considered when assessing KMb strategies.


Contribution to the literature

  • The evaluation of KMb strategies is a critical dimension of the KMb process that is still poorly documented and warrants researchers’ attention.

  • Our review identified the most common theories, models and frameworks (TMFs) proposed or used to assess KMb strategies and the main components to consider when evaluating a KMb strategy.

  • By developing an integrative reference framework, this work contributes to improving organizations’ capacity to evaluate their KMb initiatives.

Background

It is widely recognized that research evidence has the potential to inform, guide, and improve practices, decisions, and policies [1]. Unfortunately, for diverse reasons, the best available evidence is still too seldom taken into account and used [2,3,4,5,6,7]. The field of research on knowledge mobilization (KMb) has been growing rapidly since the early 2000s [2, 3, 8,9,10,11]. Its purpose is to better understand how to effectively promote and support evidence use.

Knowledge mobilization is one of many terms and concepts developed over recent decades to describe processes, strategies, and actions to bridge the gap between research and practice. Other common terms often paired interchangeably with the term “knowledge” are “translation”, “transfer”, “exchange”, “sharing” and “dissemination”, among others [12, 13]. Some are more closely linked than others to specific fields or jurisdictions. For this study, we adopted the term knowledge mobilization (KMb) because it conveys the notions of complexity and multidirectional exchanges that characterize research-to-action processes. We used it as an umbrella concept that encompasses the efforts made to translate knowledge into concrete actions and beneficial impacts on populations [1]. Moreover, the term KMb is also used by research funding agencies in Canada to emphasize the medium- and long-term effects that research knowledge or research results can have on potential users [1, 14].

KMb represents all processes from knowledge creation to action and includes all strategies implemented to facilitate these processes [14]. A KMb strategy is understood as a coordinated set of activities to support evidence use, such as dissemination activities to reach target audiences (for example, educational materials, practical guides, decision support tools) or activities to facilitate knowledge application in a specific context and support professional behaviour change (for example, community of practice, educational meetings, audits and feedback, reminders, deliberative dialogues) [15]. A KMb process may vary in intensity, complexity or actor engagement depending on the nature of the research knowledge and the needs and preferences of evidence users [7].

KMb is considered a complex process, in that numerous factors can facilitate or hinder its implementation and subsequent evidence use. The past two decades have seen the emergence of a deeper understanding of these factors [2, 3, 16]. These may be related to the knowledge mobilized (for example, relevance, reliability, clarity, costs), the individuals involved in the KMb process (for example, openness to change, values, time available, resources), the KMb strategies (for example, fit with stakeholder needs and preferences, regular interactions, trust relationships, timing), and organizational and political contexts (for example, culture of evidence use, leadership, resources) [2, 6, 17, 18]. However, more studies are needed to understand which factors are more important in which contexts, and to evaluate the effects of KMb strategies.

On this last point, studying KMb impacts empirically to demonstrate the effectiveness of KMb strategies, while essential, is often very complex [19,20,21]. Partly for this reason, high-quality studies that evaluate the processes, mechanisms and effects of KMb strategies are still relatively rare [2, 22,23,24,25]. As a result, knowledge about the effectiveness of different KMb strategies remains limited [10, 17, 19, 23, 26,27,28] and their development cannot be totally evidence informed [3, 19, 20, 23, 29, 30], which may seem incompatible with the core values and principles of KMb.

The growing interest in KMb has led to an impressive proliferation of conceptual propositions, such as theories, models and frameworks (TMFs) [2, 3, 9, 11, 12, 31, 32]. Many deplore the fact that these are poorly used [11, 30, 33] and insist on the need to test, refine and integrate existing ones [3, 31, 34]. Indeed, the conceptual and theoretical development of the field has outpaced its empirical development. This proliferation appears to have created confusion among certain users, such as organizations that need to evaluate their KMb strategies. Besides implementing and funding KMb strategies, knowledge organizations such as granting agencies, governments and public organizations, universities and health authorities are often required to demonstrate the impact of their strategies [21, 35, 36]. Yet this can be a significant challenge [20, 23, 29]. They may have difficulty knowing which TMFs to choose, in what context and how to use them effectively in their evaluation process [12, 37].

Indeed, the evaluation of KMb strategies remains relatively poorly documented compared with the phases of their development and implementation. Our aim in this scoping review is to clarify, conceptually and methodologically, this crucial dimension of the KMb process. Doing so would help organizations gain access to evidence-based, operational and easy-to-use evaluation toolkits for assessing the impacts of their KMb strategies.

Objectives

To survey the available knowledge on evaluation practices for KMb strategies, we conducted a scoping review. According to Munn et al. [38], a scoping review is indicated to identify the types of available evidence and knowledge gaps, to clarify concepts in the literature and to identify key characteristics or factors related to a concept. This review methodology also allows for the inclusion of a diversity of publications, regardless of their nature or research design, to produce the most comprehensive evidence mapping possible [39]. The objective of the scoping review was to identify and describe the characteristics of theories, models and frameworks (TMFs) used or proposed to evaluate KMb strategies. The specific research questions were:

  1. What TMFs for evaluating KMb strategies exist in the literature?

  2. What KMb strategies do they evaluate (that is, types of KMb objectives, activities and target audiences)?

  3. What dimensions and components are included in these TMFs?

Methods

This scoping review was conducted based on the five steps outlined by Arksey and O’Malley [39]: (1) formulating the research questions; (2) identifying relevant studies; (3) selecting relevant studies; (4) extracting and charting data; and (5) analysing, collating, summarizing and presenting the data. Throughout the process, researchers and knowledge users (KMb practitioners) were involved in decisions regarding the research question, search strategy, selection criteria for studies and categories for data charting. We followed the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) guidelines [40]. No protocol was registered for this review.

Search strategy and information sources

The search strategy was developed, piloted and refined in consultation with our team’s librarian. Search terms included controlled vocabulary and keywords related to three main concepts: (1) knowledge mobilization (for example [knowledge or evidence or research] and transfer, translation, diffusion, dissemination, mobilization, implementation science, exchange, sharing, use, uptake, evidence-based practice, research-based evidence), (2) evaluation (for example, evaluat*, measur*, impact, outcome, assess, apprais*, indicator) and (3) TMF (for example, framework*, model*, method*, guide*, theor*). See Additional file 1 for the search terms and strategies used in the electronic searches.

The following databases were searched from January 2000 to August 2023: MEDLINE (Ovid), PsycInfo (Ovid), ERIC (ProQuest), Sociological Abstracts (ProQuest), Dissertations & Theses (Proquest), Érudit and Cairn. These databases were chosen to identify relevant references in the health, education and social fields. Several search strategies were tested by the librarian to optimize the retrieval of citations known to the investigators and to increase the likelihood that all relevant studies would be retrieved. We also searched reference lists of included articles and previous systematic reviews [11, 12, 15, 41].

Eligibility criteria

A publication was considered eligible if it (1) presented or used a theory, model, or framework (TMF), (2) described dimensions or specific components to consider in the evaluation of KMb strategies, (3) presented or discussed KMb strategies or activities (any initiatives to improve evidence use), and (4) proposed outcomes that might result directly or indirectly from the KMb strategies. Studies were excluded from analysis if they (1) presented a TMF for assessing research impact without mentioning KMb strategies, or presented an intervention not related to KMb, or (2) presented evaluation dimensions or components that could not be generalized. We considered publications in English or French. All types of articles and study designs were eligible, including study protocols.

Study selection

The results of the literature search were imported into Covidence, which the review team used for screening. After duplicate articles were removed, the titles and abstracts were screened independently by two of the three reviewers (EMC, MJG, GL). Publications identified as potentially relevant were retrieved in full text and screened independently by three reviewers (EMC, MJG, GL). Discrepancies regarding the inclusion of any publication were resolved through discussion and consensus among reviewers. The principal investigator (SZ) validated the final selection of articles.

Data synthesis

A data charting form was developed in Microsoft Excel and piloted by the research team. Data extracted included study characteristics (authors, authors’ country of affiliation, year, journal, discipline, article type, study setting, study aim), KMb strategies of interest, KMb objectives, KMb target audiences and TMFs used or proposed for KMb evaluation (existing or new TMF, specific dimensions or components of TMF and so on). Data were extracted by a single reviewer (SL, JC or OP) and validated by a second reviewer (SZ). Disagreements were discussed between reviewers and resolved by consensus. No quality appraisal of included studies was conducted, as this is optional in scoping reviews and the purpose was only to describe the content of identified TMFs [42].

Data analysis and presentation of results

Data were summarized according to study characteristics, KMb strategy characteristics (activities, objectives, target audiences), types of TMFs, and dimensions or components to consider for KMb evaluation. Disagreements during the process were discussed and resolved through consensus (SL, DG, SZ). A KMb strategy might have one or more objectives and include one or more activities. Thus, the objectives and activities of the KMb strategies extracted from the selected studies were summarized based on existing categorizations. The categorization of KMb objectives was inspired by Gervais et al. [15] and Farkas et al. [43] (Table 1).

Table 1 Categories of KMb strategy objectives

The KMb activities were categorized according to the Expert Recommendations for Implementing Change (ERIC) taxonomy [44]. The activities were first classified according to the full taxonomy and then grouped into the nine categories proposed by Waltz et al. [45] (Table 2).

Table 2 Categories used to classify KMb activities

The TMFs were categorized according to the categories of theoretical approaches described by Nilsen [32]: process models, evaluation frameworks, determinant frameworks and classic theories (Table 3). The category “implementation theories” originally described by Nilsen [32] was not used because we did not identify any article that fit this category. We also added a category named “logic models” due to the nature of the identified TMFs. Logic models are often used in theory-driven evaluation approaches and are usually developed to show the links among inputs (resources), activities and outputs (outcomes and short-, medium- and long-term effects) [46].

Table 3 Categories used to classify the identified TMFs

Finally, the content extracted from the TMFs was analysed using a mainly inductive method. Among other things, this method makes it possible to develop a reference framework or model from the categories that emerge from the text data [50].

The classification of concepts is the result of multiple readings and interpretations. The concepts associated with each dimension of the framework were classified according to their meaning. Similar concepts were grouped together to form components. These grouped components were then associated with the subdimensions and main dimensions of the framework.

Results

Search results

The searches yielded 4763 articles. Of those, 4258 were excluded during the title and abstract screening. Of the 505 full-text articles, we retained 88 in our final sample. The results of the search and selection processes (PRISMA flowchart) are summarized in Fig. 1.

Fig. 1 PRISMA flowchart summarizing search strategy and selection results [40]

Publication characteristics

Most articles were published after 2010 (n = 70, 79.5%), with an average of 5 articles per year between 2010 and 2023 compared with an average of 2.1 articles per year between 2001 and 2009; there were no eligible articles from 2000. The search was conducted in August 2023; only five of the included articles had been published in the first 7 months of that year. Table 4 presents the main characteristics of the selected articles. A full list of the included articles with their main characteristics is presented in Additional file 2.

Table 4 Characteristics of included articles (n = 88)

The number of theoretical and empirical articles was relatively similar. Among the theoretical articles, 19 descriptive articles (21.6%) were aimed at describing a KMb strategy, a KMb infrastructure or a TMF related to a specific programme or context; 18 articles (20.5%) synthesized knowledge to propose a TMF (new or revised); and three (3.4%) were systematic reviews.

The empirical articles category included studies with different methodological approaches (quantitative, qualitative, mixed methods). We will not report the details of the methodologies used, as this would result in a long list with few occurrences. The empirical articles can be divided into three categories: (1) studies that evaluated a TMF related to KMb (n = 16, 18.2%), (2) studies that evaluated a KMb strategy (n = 21, 23.9%) and (3) studies that evaluated both a KMb strategy and a TMF (n = 7, 8.0%).

Most articles were related to healthcare (n = 71, 80.7%). This field of study was divided into three subdomains. The healthcare and social services articles usually described or assessed a KMb strategy targeting health professionals’ practices in a variety of fields (for example, occupational therapy, dentistry, mental health, pharmacology, gerontology, nursing and so on). The health policy and systems articles usually described or assessed KMb strategies targeting decision-making processes, decision-makers or public health interventions and policies. The continuing education articles assessed training programmes for health professionals aimed at increasing knowledge and skills in a specific field. The articles in the general field described or discussed TMFs and KMb strategies that could be applied to multiple disciplines or contexts. Finally, the articles in the education field described or assessed a KMb strategy targeting education professionals.

Almost half of the articles (n = 42, 47.7%) studied KMb strategies implemented in only four countries: Canada, Australia, the United States and the United Kingdom. Countries in South America, the Caribbean, Africa, Asia, the Middle East, China and Europe were underrepresented (n = 8, 9.1%). The remaining 34 articles (38.6%) did not specify an implementation context and were mostly theoretical articles. Regarding the authors’ countries of affiliation, Canada, the United States, Australia and the United Kingdom were again the most represented countries, featuring in 85% of the articles (n = 75).

What theories, models or frameworks exist in the literature to evaluate KMb strategies?

Several articles proposed a new TMF (n = 37, 42.0%), and some articles proposed a logic model specifically developed to evaluate their KMb strategy (n = 17, 19.3%). One-third of the articles used existing TMFs (n = 28, 31.8%). A few articles only referred to existing TMFs but did not use them to guide a KMb strategy evaluation (n = 6, 6.8%).

The identified TMFs were then categorized according to their theoretical approaches (adapted from Nilsen [32]) (Table 5). Five articles used or proposed more than one TMF, and three TMFs could be classified in two categories. Several articles proposed or used a process model (n = 34, 38.6%) or an evaluation framework (n = 28, 31.8%); these were the two most frequently identified types of TMFs. Fewer articles proposed or used a logic model (n = 17, 19.3%), a determinant framework (n = 12, 13.6%) or a classic theory (n = 7, 8.0%). The TMFs most often identified in the articles were the RE-AIM framework (n = 5, 5.7%), the Knowledge-to-Action framework [9] (n = 4, 4.5%), the Theory of Planned Behavior [51] (n = 3, 3.4%) and the Expanded Outcomes framework for planning and assessing continuing medical education [52] (n = 3, 3.4%). In total, we identified 87 different TMFs in the 88 articles. Only nine TMFs were retrieved in more than one article.

Table 5 List of theories, models and frameworks identified in the selected articles

What KMb strategies do the TMFs evaluate (activities, objectives, target audience)?

Thirty-eight articles reported using more than one activity in their KMb strategy. According to the ERIC compilation, “Train and educate stakeholders” activities were the most common, followed by “Develop stakeholder interrelationships” and “Use evaluative and iterative strategies”. Table 6 presents the various types of activities and the number of articles that referred to each.

Table 6 Types of KMb activities identified in the articles

Of the 88 articles analysed, 18 (20.4%) did not specify a KMb objective. The remaining articles proposed one or more KMb strategy objectives. Specifically, 39 (44.3%) articles had one objective, 15 (17.0%) had two, three (3.4%) had three, and 13 (14.8%) had four or five. Table 7 presents the different types of objectives and the number of times they were identified.

Table 7 Types of KMb objectives identified in the articles

The target audiences for KMb strategies were clearly specified in half of the articles (n = 44, 50.0%). Generally, these were empirical articles that targeted specific professionals (n = 36, 40.9%) or decision-makers (n = 8, 9.1%). Just under one-third of the articles identified a broad target audience (for example, professionals and managers in the health system, a health organization) (n = 26, 29.5%). Finally, 18 articles (20.4%) did not specify a target audience for KMb; these were most often theoretical articles.

What are the dimensions and components included in TMFs for evaluating KMb strategies?

The analysis of the identified TMFs revealed many factors of interest relevant for the evaluation of KMb strategies. These specific components were inductively classified into four main dimensions: context, process, effects and impacts (Fig. 2). The context dimension refers to the assessment of the conditions in place when the KMb strategy is implemented. These include both the external (that is, sociopolitical, economic, environmental and cultural characteristics) and internal environments (that is, characteristics of organizations, individuals and stakeholder partnerships). These factors are understood to influence the selection and tailoring of a KMb strategy. The process dimension refers to the assessment of the planning, levels and mechanisms of implementation, as well as to the characteristics of the KMb strategy implemented. The effects dimension refers to the assessment of outcomes following the KMb strategy implementation. The potential effects vary depending on the strategy’s objectives and can be either the immediate results of the KMb strategy or short-, medium- and long-term outcomes. The conceptual gradation of effects was generally represented in a similar way in the TMFs analysed, but the temporality of effects could vary: a medium-term outcome in one study could be understood as a long-term outcome in another. However, the majority of authors group these effects into three categories (Gervais et al. 2016, p. 6): (1) short-term effects, measured by the success of the KMb strategy (number of people reached, satisfaction, participation and so on); (2) medium-term effects, linked to changes in individual attitudes and the use of knowledge; and (3) long-term effects, which result from achieving the KMb objective (for example, improved practices and services, changed collective behaviour, sustainable use of knowledge).

Fig. 2 The main evaluation dimensions that emerged from the TMFs analysed

Finally, the impacts dimension refers to the ultimate effects of KMb products or interventions on end users, as measured by the organization (Phipps et al. [36], p. 34). The evaluation of these ultimate effects can be measured by the integration of a promising practice into organizational routines, by the effects on service users or by the effects on the health and well-being of communities and society in general.

This gradation shows the importance of measuring effects at different points in time, to take account of the time they take to appear and their evolving nature (Gervais et al. 2016, p. 6).

Most of the articles presented the dimensions that should be evaluated, whereas the empirical articles not only presented the dimensions but also used them in practice to evaluate a KMb strategy. Only five articles (5.7%) did not mention specific dimensions that could be classified.

Table 8 presents both the number of articles that presented dimensions to be evaluated and the number of articles that evaluated them in practice. These results showed that the effects dimension was both the most often named and the most evaluated in practice. The other three dimensions (context, process, impacts), while quite often mentioned as relevant to assess, were less often evaluated in practice. For example, only five articles (5.7%) reported having assessed the impacts dimension.

Table 8 Number of articles that mentioned or evaluated the different dimensions

As previously mentioned, the components relevant for the evaluation of KMb strategies were extracted from the identified TMFs. Table 9 presents these components, which represent the more specific factors of interest for assessing context, process, effects and impacts.

Table 9 Dimensions, subdimensions and components for evaluating KMb strategies

Discussion

Although often overlooked, the evaluation of KMb strategies is an essential step in guiding organizations seeking to determine whether the expected outcomes of their initiatives are being realized. Evaluation not only allows organizations to make adjustments if the initiatives are not producing the expected results, but also helps them to justify their funding of such initiatives. Evaluation is also essential if the KMb science is to truly inform KMb practice, such that the strategies developed are based on empirical data [30]. To make KMb evaluation more feasible, evaluation must be promoted and practices improved.

This scoping review meets the first objective of our project, which was to provide an overview of reference frameworks used or proposed for evaluating KMb strategies and to propose a preliminary version of a reference framework for this purpose. Several key findings emerged from this scoping review:

Proliferation of theories, models and frameworks, but few frequently used

We are seeing a proliferation of TMFs in KMb and closely related fields [132, 133]. Thus, the results of this scoping review support the argument that the conceptual and theoretical development of the field is outpacing its empirical development. The largest share of the reviewed articles (42.0%) proposed a new TMF rather than using existing ones. Furthermore, we identified relatively few empirical studies (50.0% of included articles) that focused on the evaluation of KMb strategies. Consequently, the TMFs used were poorly consolidated, which does not provide a solid empirical foundation to guide the evaluation of KMb strategies. Moreover, not all the TMFs proposed in the articles were specifically developed for evaluation; some focused on KMb implementation processes. These may still provide elements to consider for evaluation, although they were not designed to propose specific indicators.

A scoping review published in 2018 identified 596 studies using 159 different KMb TMFs, 95 of which had been used only once [11]. Many authors reported that these are rarely reused and validated [11, 30, 33] and that it is important to test, refine and integrate existing ones [3, 31, 34, 133]. A clear, collective and consistent use of existing TMFs is recommended and necessary to advance KMb science and closely related fields [12, 31]. The systematic review by Strifler et al. [11] highlights the diversity of available TMFs and the difficulty users may experience when choosing TMFs to guide their KMb initiatives or evaluation process. Future work should focus on the development of tools to better support users of TMFs, especially those working in organizations. By consolidating a large number of TMFs, the results of this scoping review contribute to these efforts.

The importance of improving evaluation practices for complex multifaceted KMb strategies

Another noteworthy finding was the emphasis on the evaluation of strategies focused on education and professional training for practice improvement (52.3%). Relatively few of the reviewed articles looked at, for example, the evaluation of KMb strategies aimed at informing or influencing decision-making (13.6%), or KMb strategies targeting decision-makers (9.1%). These results reaffirm the importance of conducting more large-scale evaluations of complex and multifaceted KMb strategies. These involve a greater degree of interaction and engagement, are composed of networks of multiple actors, mobilize diverse sources of knowledge and have simultaneous multilevel objectives [19, 134].

The fact that some KMb strategies are complex interventions implemented in complex contexts [134] presents a significant and recurring challenge to their evaluation. Methodological designs, approaches and tools are often ill-suited to capture the short-, medium- and long-term outcomes of KMb strategies, as well as to identify the mechanisms by which these outcomes were produced in a specific context. It is also difficult to link concrete changes in practice and decision-making to tangible longer-term impacts at the population level. Moreover, these impacts can take years to be achieved [36] and can be influenced by several other factors in addition to KMb efforts [2, 19, 24]. Comprehensive, dynamic and flexible evaluation approaches [135,136,137] using mixed methods [20] appear necessary to understand why, for whom, how, when and in what context KMb strategies achieve their objectives [2, 21, 25]. For instance, realist evaluation, which belongs to theory-based evaluation, may be an approach that addresses issues of causality without sacrificing complexity [134, 138, 139]. This evaluation approach aims to identify the underlying generative mechanisms that can explain how the outcomes were generated and what characteristics of the context affected, or not, those mechanisms. This approach is used to test and refine theory about how interventions with a similar logic of action actually work [139].

Large heterogeneity of methodologies used in empirical studies

Despite the growth of the KMb field, a recurring issue is the relatively limited number of high-quality studies that evaluate KMb outcomes and impacts. This observation is shared by many of the authors of the articles included in our scoping review [2, 22,23,24,25]. Only a limited number of empirical articles met the selection criteria in this scoping review (n = 44/88). Synthesizing these studies is challenging because of the diversity of research designs used and the large number of potential evaluation components identified. In addition, most of the identified studies used TMFs and measurement tools that were not validated [20, 29] and that were specifically developed for their study [16, 25, 140]. Moreover, these studies did not describe the methods used to justify their choice of evaluation dimensions and components [25], which greatly hinders the ability to draw inferences and develop generalizable theories through replication in similar studies [110, 140,141,142,143]. The lack of a widely used evaluation approach across the field is therefore an important issue [16, 20], also highlighted by this scoping review.

Our aim in this review was not to identify specific indicators or measurement tools (for example, questionnaires) for assessing KMb strategies, but rather to describe the dimensions and components of TMFs used for KMb evaluation. However, a recent scoping review [144] looked at measurement tools and revealed that only two general potential tools have been identified to assess KMb activities in any sector or organization: the Level of Knowledge Use Survey (LOKUS) [145] and the Knowledge Uptake and Utilization Tool (KUUT) [95]. The authors also assert the importance of developing standardized tools and evaluation processes to facilitate comparison of KMb activities’ outcomes across organizations [144].

Lack of description and reporting of KMb strategies and evaluation

Another important finding from this review was the sparsity of descriptions of KMb strategies in the published articles. In general, the authors provided little information on the operationalization of their KMb strategies (for example, objectives, target audiences, details of activities implemented, implementation context, expected effects). The KMb strategy objectives and the implemented activities should be carefully selected and empirically, theoretically or pragmatically justified before the evaluation components and specific indicators can be determined [146].

To improve consistency in the field and to contribute to the development of KMb science, many authors reported the need to better describe and report KMb strategies and their context [8, 54, 146,147,148,149,150]. KMb strategies are often inconsistently labelled across studies, poorly described and rarely justified theoretically [146, 150, 151]. It was not possible in this scoping review to associate the evaluation components to be used with the objectives and types of KMb strategies, as too much information was missing in the articles. Over the past 10 years, several guidelines have been proposed to improve the reporting of interventions such as KMb strategies: the “Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations checklist” [147], the “Standards for Reporting Implementation Studies (StaRI)” [150] and the “Template for Intervention Description and Replication (TIDieR)” [152]. These guidelines should be used more often to enhance the reporting of KMb strategies and help advance the field [153].

Implications for future research

This scoping review provides an overview of potential factors of interest for assessing the context, process, effects and impacts of a KMb strategy. It also proposes a preliminary inventory of potential dimensions and components to consider when planning the evaluation of a KMb strategy. Given the broad spectrum of factors of interest identified across studies, not all of them can be assessed in every context. Rather, they should be targeted according to the objectives of the evaluation, the nature of the KMb strategy and the resources available to conduct the evaluation. Thus, this inventory should not be understood as a prescriptive, normative and exhaustive framework, but rather as a toolbox to identify the most relevant factors to include in the evaluation of a given KMb strategy, and to address a need often expressed by organizations wishing to evaluate their KMb efforts.

Additional work is needed to validate and operationalize these dimensions, to identify relevant measurement tools related to the different components and to see how this inventory could support KMb evaluation practices in organizations.

This scoping review is the first stage of a larger research project aimed at improving organizations’ capacity to evaluate their KMb initiatives by developing an integrative, interdisciplinary and easy-to-use reference framework. In the second phase of the project, the relevance and clarity of the evaluation dimensions identified in the scoping review will be validated through a Delphi study with KMb specialists and researchers. In the third phase, the enriched framework will be pilot tested in two organizations carrying out and evaluating KMb strategies, to adapt the framework to their needs and to further clarify how the dimensions can be measured in practice; guidance will also be provided to help organizations adopt the framework and its support kit. The aim of the project is to go beyond proposing a theoretical framework, and to help build organizations’ capacity to evaluate KMb strategies by proposing tools adapted to their realities.

Review limitations

Some limitations of this scoping review should be acknowledged. First, given the numerous different terms used to describe and conceptualize the science of using evidence, it is possible that our search strategy did not capture all relevant publications. However, to limit this risk, we manually searched the reference lists of the selected articles. Second, the literature search was limited to articles published in English or French, and the articles were mostly from high-income countries (for example, in North America); therefore, the application of the concepts identified in this scoping review to other contexts should be further explored.

In addition, the search strategy focused on scientific publications to assess progress made in the field of knowledge mobilization strategy evaluation. The grey literature was not examined. It should be considered in future research to complete the overview of evaluation needs in the field of knowledge mobilization.

Finally, the paucity of information in the articles sometimes made it difficult to classify the TMFs according to the taxonomies [32, 44], which may have led to possible misinterpretation. However, to limit the risk of errors, the categorization was performed by two reviewers and validated by a third in cases of uncertainty.

Conclusions

Given the increasing demand from organizations for the evaluation of KMb strategies, along with the poorly consolidated KMb research field, a scoping review was needed to identify the range, nature and extent of the literature. This scoping review enabled us to synthesize the breadth of the literature, provide an overview of the many theories, models and frameworks used, and identify and categorize the potential dimensions and components to consider when evaluating KMb initiatives. This scoping review is part of a larger research project, in which the next steps will be to validate the integrative framework and develop a support kit to facilitate its use by organizations involved in KMb.

Availability of data and materials

The dataset supporting the conclusions of this article is included within the article and its additional files.

Abbreviations

KMb: Knowledge mobilization

TMFs: Theories, models, and frameworks

References

  1. Social Sciences and Humanities Research Council. Guidelines for Effective Knowledge Mobilization. 2019. https://www.sshrc-crsh.gc.ca/funding-financement/policies-politiques/knowledge_mobilisation-mobilisation_des_connaissances-eng.aspx Accessed 28 Dec 2022.

  2. Boaz A, Davies H, Fraser A, Nutley S. What works now? evidence-informed policy and practice. Bristol: Policy press; 2019.


  3. Curran JA, Grimshaw JM, Hayden JA, Campbell B. Knowledge translation research: the science of moving research into policy and practice. J Contin Educ Heal Prof. 2011;31(3):174–80.


  4. Global Commission on Evidence. The Evidence Commission report: A wake-up call and path forward for decision-makers, evidence intermediaries, and impact-oriented evidence producers. McMaster University; 2022 p. 144. https://www.mcmasterforum.org/networks/evidence-commission/report/english

  5. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104(12):510–20.


  6. Orton L, Lloyd-Williams F, Taylor-Robinson D, O’Flaherty M, Capewell S. The use of research evidence in public health decision making processes: systematic review. PLoS ONE. 2011;6(7): e21704.


  7. Straus SE, Tetroe J, Graham ID, editors. Knowledge translation in health care: moving from evidence to practice. 2nd ed. Chichester, West Sussex; Hoboken, NJ: Wiley/BMJ Books; 2013. 406 p.

  8. Barwick M, Dubrowski R, Petricca K. Knowledge translation: The rise of implementation. 2020; https://ktdrr.org/products/kt-implementation/KT-Implementation-508.pdf

  9. Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, Caswell W, et al. Lost in knowledge translation: time for a map? J Continuing Educ Health Professions. 2006;26(1):13–24.


  10. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7(1):50.


  11. Strifler L, Cardoso R, McGowan J, Cogo E, Nincic V, Khan PA, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol. 2018;100:92–102.


  12. Esmail R, Hanson HM, Holroyd-Leduc J, Brown S, Strifler L, Straus SE, et al. A scoping review of full-spectrum knowledge translation theories, models, and frameworks. Implement Sci. 2020;15(1):11.


  13. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implement Sci. 2010;5(1):16.


  14. Fonds de recherche du Québec. Stratégie de mobilisation des connaissances 2014–2017. 2014. https://frq.gouv.qc.ca/en/mobilization-of-knowledge/. Accessed 28 Dec 2022.

  15. Gervais MJ, Souffez K, Ziam S. Quel impact avons-nous ? Vers l’élaboration d’un cadre pour rendre visibles les retombées du transfert des connaissances. TUC Revue francophone de recherche sur le transfert et l’utilisation des connaissances. 2016;1(2):21.


  16. Williams NJ, Beidas RS. Annual research review: the state of implementation science in child psychology and psychiatry: a review and suggestions to advance the field. J Child Psychol Psychiatr. 2019;60(4):430–50.


  17. Mitton C, Adair CE, Mckenzie E, Patten SB, Perry BW. Knowledge transfer and exchange: review and synthesis of the literature. Milbank Q. 2007;85(4):729–68.


  18. Oliver K, Innvar S, Lorenc T, Woodman J, Thomas J. A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Serv Res. 2014;14(1):2.


  19. Fazey I, Bunse L, Msika J, Pinke M, Preedy K, Evely AC, et al. Evaluating knowledge exchange in interdisciplinary and multi-stakeholder research. Glob Environ Chang. 2014;25:204–20.


  20. Gervais MJ, Marion C, Dagenais C, Chiocchio F, Houlfort N. Dealing with the complexity of evaluating knowledge transfer strategies: guiding principles for developing valid instruments. Res Eval. 2016;25(1):62–9.


  21. Reed MS, Bryce R, Machen R. Pathways to policy impact: a new approach for planning and evidencing research impact. Evid policy. 2018;14(3):431–58.


  22. Kim C, Wilcher R, Petruney T, Krueger K, Wynne L, Zan T. A research utilisation framework for informing global health and development policies and programmes. Health Res Policy Sys. 2018;16(1):9.


  23. Langer L, Tripney J, Gough D. The science of using science: researching the use of research evidence in decision-making. Social Science Research Unit, Evidence for Policy and Practice Information and Co-ordinating Centre, University of London; 2016.

  24. Rajić A, Young I, McEwen SA. Improving the utilization of research knowledge in agri-food public health: a mixed-method review of knowledge translation and transfer. Foodborne Pathog Dis. 2013;10(5):397–412.


  25. Scarlett J, Forsberg BC, Biermann O, Kuchenmüller T, El-Khatib Z. Indicators to evaluate organisational knowledge brokers: a scoping review. Health Res Policy Syst. 2020;18(1):93.


  26. Bornbaum CC, Kornas K, Peirson L, Rosella LC. Exploring the function and effectiveness of knowledge brokers as facilitators of knowledge translation in health-related settings: a systematic review and thematic analysis. Implement Sci. 2015;10(1):162.


  27. Sarkies MN, Bowles KA, Skinner EH, Haas R, Lane H, Haines TP. The effectiveness of research implementation strategies for promoting evidence-informed policy and management decisions in healthcare: a systematic review. Implement Sci. 2017;12(1):132.


  28. Scott SD, Albrecht L, O’Leary K, Ball GD, Hartling L, Hofmeyer A, et al. Systematic review of knowledge translation strategies in the allied health professions. Implement Sci. 2012;7(1):70.


  29. Dagenais C, Malo M, Robert É, Ouimet M, Berthelette D, Ridde V. Knowledge transfer on complex social interventions in public health: a scoping study. PLoS ONE. 2013;8(12): e80233.


  30. Davies HT, Powell AE, Nutley SM. Mobilising knowledge to improve UK health care: learning from other countries and other sectors – a multimethod mapping study. Health Serv Deliv Res. 2015;3(27):1–190.


  31. Damschroder LJ. Clarity out of chaos: use of theory in implementation research. Psychiatry Res. 2020;283: 112461.


  32. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53.


  33. Ellen ME, Panisset U, Araujo de Carvalho I, Goodwin J, Beard J. A knowledge translation framework on ageing and health. Health Policy. 2017;121(3):282–91.


  34. Wensing M, Bosch M, Grol R. Developing and selecting interventions for translating knowledge to action. CMAJ. 2010;182(2):E85–8.


  35. Bennet A, Bennet D, Fafard K, Fonda M, Lomond T, Messier L, et al. Knowledge mobilization in the social sciences and humanities: moving from research to action. Frost: MQI Press; 2007.


  36. Phipps D, Cummins J, Pepler D, Craig W, Cardinal S. The Co-produced Pathway to Impact Describes Knowledge Mobilization Processes. JCES. 2016;9(1). https://jces.ua.edu/articles/258. Accessed 17 Nov 2022.

  37. Birken SA, Rohweder CL, Powell BJ, Shea CM, Scott J, Leeman J, et al. T-CaST: an implementation theory comparison and selection tool. Implement Sci. 2018;13(1):143.


  38. Munn Z, Peters MDJ, Stern C, Tufanaru C, McArthur A, Aromataris E. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Med Res Methodol. 2018;18(1):143.


  39. Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32.


  40. Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–73.


  41. Moullin JC, Sabater-Hernandez D, Fernandez-Llimos F, Benrimoj SI. A systematic review of implementation frameworks of innovations in healthcare and resulting generic implementation framework. Health Res Policy Syst. 2015;13:16.


  42. Pham MT, Rajić A, Greig JD, Sargeant JM, Papadopoulos A, McEwen SA. A scoping review of scoping reviews: advancing the approach and enhancing the consistency. Res Synthesis Methods. 2014;5(4):371–85.


  43. Farkas M, Jette AM, Tennstedt S, Haley SM, Quinn V. Knowledge dissemination and utilization in gerontology: an organizing framework. Gerontologist. 2003;43:47–56.


  44. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.


  45. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10(1):109.


  46. Smith JD, Li DH, Rafferty MR. The implementation research logic model: a method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci. 2020;15(1):84.


  47. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7.


  48. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7:149–58.


  49. Sketris IS, Carter N, Traynor RL, Watts D, Kelly K, and the following contributing members of the CNODES Knowledge Translation Team: Pierre Ernst, Brenda Hemmelgarn, Colleen Metge, Michael Paterson, Robert Platt and Gary Teare. Building a framework for the evaluation of knowledge translation for the Canadian Network for Observational Drug Effect Studies. Pharmacoepidemiol Drug Saf. 2020;29(Suppl 1):8–25.

  50. Thomas DR. A general inductive approach for analyzing qualitative evaluation data. Am J Eval. 2006;27(2):237–46.


  51. Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50(2):179–211.


  52. Moore DE, Green JS, Gallis HA. Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29(1):1–15.


  53. Tschida JE, Drahota A. Fidelity to the ACT SMART Toolkit: an instrumental case study of implementation strategy fidelity. Implement Sci Commun. 2023;4(1):52.


  54. Colquhoun H, Leeman J, Michie S, Lokker C, Bragge P, Hempel S, et al. Towards a common terminology: a simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implement Sci. 2014;9(1):781.


  55. Bertone MP, Meessen B, Clarysse G, Hercot D, Kelley A, Kafando Y, et al. Assessing communities of practice in health policy: a conceptual framework as a first step towards empirical research. Health Res Policy Sys. 2013;11(1):39.


  56. Gagliardi AR, Legare F, Brouwers MC, Webster F, Wiljer D, Badley E, et al. Protocol: developing a conceptual framework of patient mediated knowledge translation, systematic review using a realist approach. Implement Sci. 2011;6:25.


  57. Sargeant J, Borduas F, Sales A, Klein D, Lynn B, Stenerson H. CPD and KT: models used and opportunities for synergy. J Contin Educ Heal Prof. 2011;31(3):167–73.


  58. Stetler CB, Ritchie J, Rycroft-Malone J, Schultz A, Charns M. Improving quality of care through routine, successful implementation of evidence-based practice at the bedside: an organizational case study protocol using the Pettigrew and Whipp model of strategic change. Implement Sci. 2007;2:3.


  59. Kok MO, Schuit AJ. Contribution mapping: a method for mapping the contribution of research to enhance its impact. Health Res Policy Syst. 2012;10:21.


  60. Dadich A. From bench to bedside: methods that help clinicians use evidence-based practice. Aust Psychol. 2010;45(3):197–211.


  61. Brown P, Bahri P. Engagement’ of patients and healthcare professionals in regulatory pharmacovigilance: establishing a conceptual and methodological framework. Eur J Clin Pharmacol. 2019;75(9):1181–92.


  62. Dobbins M, Ciliska D, Cockerill R, Barnsley J, DiCenso A. A framework for the dissemination and utilization of research for health-care policy and practice. Worldviews Evid Based Nurs Presents Arch Online J Knowl Synthesis Nurs. 2002;9(1):149–60.


  63. Gagliardi AR, Brouwers MC, Bhattacharyya OK. The guideline implementability research and application network (GIRAnet): an international collaborative to support knowledge exchange: study protocol. Implement Sci. 2012;7:26.


  64. Brooks SP, Zimmermann GL, Lang M, Scott SD, Thomson D, Wilkes G, et al. A framework to guide storytelling as a knowledge translation intervention for health-promoting behaviour change. Implement sci commun. 2022;3(1):35.


  65. Cullen L, Hanrahan K, Edmonds SW, Reisinger HS, Wagner M. Iowa implementation for sustainability framework. Implement Sci. 2022;17(1):1.


  66. Labbé D, Mahmood A, Miller WC, Mortenson WB. Examining the impact of knowledge mobilization strategies to inform urban stakeholders on accessibility: a mixed-methods study. Int J Environ Res Public Health. 2020;17(5):1561.


  67. Straus SE, Tetroe J, Graham ID, Zwarenstein M, Bhattacharyya O, Shepperd S. Monitoring use of knowledge and evaluating outcomes. Can Med Assoc J. 2010;182(2):E94–8.


  68. Bennett S, Whitehead M, Eames S, Fleming J, Low S, Caldwell E. Building capacity for knowledge translation in occupational therapy: learning through participatory action research. BMC Med Educ. 2016;16(1):257.


  69. Brown C, Rogers S. Measuring the effectiveness of knowledge creation as a means of facilitating evidence-informed practice in early years settings in one London Borough. Lond Rev Educ. 2014;12(3):245–60.


  70. Talbott E, De Los RA, Kearns DM, Mancilla-Martinez J, Wang M. Evidence-based assessment in special education research: advancing the use of evidence in assessment tools and empirical processes. Except Child. 2023;89(4):467–87.


  71. Rosella LC, Bornbaum C, Kornas K, Lebenbaum M, Peirson L, Fransoo R, et al. Evaluating the process and outcomes of a knowledge translation approach to supporting use of the Diabetes Population Risk Tool (DPoRT) in public health practice. Canadian J Program Eval. 2018;33(1):21–48.


  72. Couineau AL, Forbes D. Using predictive models of behavior change to promote evidence-based treatment for PTSD. Psychol Trauma Theory Res Pract Policy. 2011;3(3):266–75.


  73. Dufault M. Testing a collaborative research utilization model to translate best practices in pain management. Worldviews Evid Based Nurs. 2004;1:S26-32.


  74. Beckett K, Farr M, Kothari A, Wye L, le May A. Embracing complexity and uncertainty to create impact: exploring the processes and transformative potential of co-produced research through development of a social impact model. Health Res Policy Syst. 2018;16(1):118.

    Article  PubMed  PubMed Central  Google Scholar 

  75. Kramer DM, Wells RP, Carlan N, Aversa T, Bigelow PP, Dixon SM, et al. Did you have an impact? A theory-based method for planning and evaluating knowledge-transfer and exchange activities in occupational health and safety. Int J Occup Saf Ergon. 2013;19(1):41–62.

    Article  PubMed  Google Scholar 

  76. Duhamel F, Dupuis F, Turcotte A, Martinez AM, Goudreau J. Integrating the illness beliefs model in clinical practice: a family systems nursing knowledge utilization model. J FAM NURS. 2015;21(2):322–48.

    Article  PubMed  Google Scholar 

  77. Wimpenny P, Johnson N, Walter I, Wilkinson JE. Tracing and identifying the impact of evidence-use of a modified pipeline model. Worldviews Evid Based Nurs. 2008;5(1):3–12.

    Article  PubMed  Google Scholar 

  78. Ward V, Smith S, House A, Hamer S. Exploring knowledge exchange: a useful framework for practice and policy. Soc Sci Med. 2012;74(3):297–304.

    Article  PubMed  Google Scholar 

  79. Grooten L, Vrijhoef HJM, Alhambra-Borras T, Whitehouse D, Devroey D. The transfer of knowledge on integrated care among five European regions: a qualitative multi-method study. BMC Health Serv Res. 2020;20(1):11.

    Article  PubMed  PubMed Central  Google Scholar 

  80. Stetler CB. Updating the Stetler Model of research utilization to facilitate evidence-based practice. Nurs Outlook. 2001;49(6):272–9.

    Article  CAS  PubMed  Google Scholar 

  81. Ward V. Why, whose, what and how? A framework for knowledge mobilisers. Evid Policy J Res Debate Pract. 2017;13(3):477–97.

    Article  Google Scholar 

  82. Levin RF, Fineout-Overholt E, Melnyk BM, Barnes M, Vetter MJ. Fostering evidence-based practice to improve nurse and cost outcomes in a community health setting: a pilot test of the advancing research and clinical practice through close collaboration model. Nurs Adm Q. 2011;35(1):21–33.

    Article  PubMed  Google Scholar 

  83. Currie M, King G, Rosenbaum P, Law M, Kertoy M, Specht J. A model of impacts of research partnerships in health and social services. Eval Program Plann. 2005;28(4):400–12.

    Article  Google Scholar 

  84. Richard L, Chiocchio F, Essiembre H, Tremblay MC, Lamy G, Champagne F, et al. Communities of practice as a professional and organizational development strategy in local public health organizations in Quebec, Canada: an evaluation model. Healthc Policy. 2014;9(3):26–39.

    PubMed  PubMed Central  Google Scholar 

  85. Rycroft-Malone J, Wilkinson J, Burton CR, Harvey G, McCormack B, Graham I, et al. Collaborative action around implementation in collaborations for leadership in applied health research and care: towards a programme theory. J Health Serv Res Policy. 2013;18(3 Suppl):13–26.

    Article  PubMed  Google Scholar 

  86. Gagliardi AR, Fraser N, Wright FC, Lemieux-Charles L, Davis D. Fostering knowledge exchange between researchers and decision-makers: exploring the effectiveness of a mixed-methods approach. Health Policy. 2008;86(1):53–63.

    Article  PubMed  Google Scholar 

  87. Paquette-Warren J, Harris SB, Naqshbandi Hayward M, Tompkins JW. Case study of evaluations that go beyond clinical outcomes to assess quality improvement diabetes programmes using the Diabetes Evaluation Framework for Innovative National Evaluations (DEFINE). J Eval Clin Pract. 2016;22(5):644–52.

    Article  PubMed  PubMed Central  Google Scholar 

  88. Paquette-Warren J, Tyler M, Fournie M, Harris SB. The diabetes evaluation framework for innovative national evaluations (DEFINE): construct and content validation using a modified Delphi method. Can J diabetes. 2017;41(3):281–96.

    Article  PubMed  Google Scholar 

  89. Abbot ML, Lee KK, Rossiter MJ. Evaluating the effectiveness and functionality of professional learning communities in adult ESL Programs. TESL Canada J. 2018;35(2):1–25.

    Article  Google Scholar 

  90. Ho K, Bloch R, Gondocz T, Laprise R, Perrier L, Ryan D, et al. Technology-enabled knowledge translation: frameworks to promote research and practice. J Contin Educ Heal Prof. 2004;24(2):90–9.

    Article  Google Scholar 

  91. Yu X, Hu D, Li N, Xiao Y. Comprehensive evaluation on teachers’ knowledge sharing behavior based on the improved TOPSIS method. Comput Intell Neurosci. 2022;2022(101279357):2563210.

    PubMed  PubMed Central  Google Scholar 

  92. Arora S, Kalishman SG, Thornton KA, Komaromy MS, Katzman JG, Struminger BB, et al. Project ECHO: a telementoring network model for continuing professional development. J Contin Educ Health Prof. 2017;37(4):239–44.

    Article  PubMed  Google Scholar 

  93. Smidt A, Balandin S, Sigafoos J, Reed VA. The Kirkpatrick model: a useful tool for evaluating training outcomes. J Intellect Dev Disabil. 2009;34(3):266–74.

    Article  PubMed  Google Scholar 

  94. Jeffs L, Sidani S, Rose D, Espin S, Smith O, Martin K, et al. Using theory and evidence to drive measurement of patient, nurse and organizational outcomes of professional nursing practice. Int J Nurs Pract. 2013;19(2):141–8.

    Article  PubMed  Google Scholar 

  95. Skinner K. Developing a tool to measure knowledge exchange outcomes. Can J Program Eval. 2007;22(1):49–75.

    Article  Google Scholar 

  96. Lavis J, Ross S, McLeod C, Gildiner A. Measuring the impact of health research. J Health Serv Res Policy. 2003;8(3):165–70.

    Article  PubMed  Google Scholar 

  97. Boyko JA, Lavis JN, Abelson J, Dobbins M, Carter N. Deliberative dialogues as a mechanism for knowledge translation and exchange in health systems decision-making. Soc Sci Med. 2012;75(11):1938–45.

    Article  PubMed  Google Scholar 

  98. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76.

    Article  PubMed  Google Scholar 

  99. Gainforth HL, Latimer-Cheung AE, Athanasopoulos P, Martin Ginis KA. Examining the feasibility and effectiveness of a community-based organization implementing an event-based knowledge mobilization initiative to promote physical activity guidelines for people with spinal cord injury among support personnel. Health Promot Pract. 2015;16(1):55–62.

    Article  PubMed  Google Scholar 

  100. Glasgow RE, Harden SM, Gaglio B, Rabin B, Smith ML, Porter GC, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019. https://doi.org/10.3389/fpubh.2019.00064.

    Article  PubMed  PubMed Central  Google Scholar 

  101. Shelton RC, Chambers DA, Glasgow RE. An extension of RE-AIM to enhance sustainability: addressing dynamic context and promoting health equity over time. Front Public Health. 2020;8(101616579):134.

    Article  PubMed  PubMed Central  Google Scholar 

  102. Bender BG, Simmons B, Konkoly N, Liu AH. The asthma toolkit bootcamp to improve rural primary care for pediatric asthma. J Allergy Clin Immunol Pract. 2021;9(8):3091-3097.e1.

    Article  PubMed  Google Scholar 

  103. de la Garza Iga FJ, Mejia Alvarez M, Cockroft JD, Rabin J, Cordon A, Elias Rodas DM, et al. Using the project ECHO TM model to teach mental health topics in rural Guatemala: an implementation science-guided evaluation. Int J Soc Psychiatry. 2023;69(8):2031–41.

    Article  PubMed  Google Scholar 

  104. Alkin M, Taut S. Unbundling evaluation use. Stud Educ Eval. 2003;29(1):1–12.

    Article  Google Scholar 

  105. Varallyay NI, Langlois EV, Tran N, Elias V, Reveiz L. Health system decision-makers at the helm of implementation research: development of a framework to evaluate the processes and effectiveness of embedded approaches. Health Res Policy Syst. 2020;18(1):64.

    Article  PubMed  PubMed Central  Google Scholar 

  106. McCabe KE, Wallace A, Crosland A. A model for collaborative working to facilitate knowledge mobilisation in public health. Evid Policy. 2015;11(4):559–76.

    Article  Google Scholar 

  107. Gonzales R, Handley MA, Ackerman S, O’sullivan PS. A framework for training health professionals in implementation and dissemination science. Acad Med. 2012;87(3):271–8.

    Article  PubMed  PubMed Central  Google Scholar 

  108. Edgar L, Herbert R, Lambert S, MacDonald JA, Dubois S, Latimer M. The joint venture model of knowledge utilization: a guide for change in nursing. Nurs Leadersh. 2006;9(2):41–55.

    Article  Google Scholar 

  109. Stetler CB, Damschroder LJ, Helfrich CD, Hagedorn HJ. A Guide for applying a revised version of the PARIHS framework for implementation. Implement Sci. 2011;6(101258411):99.

    Article  PubMed  PubMed Central  Google Scholar 

  110. Brennan SE, Cumpston M, Misso ML, McDonald S, Murphy MJ, Green SE. Design and formative evaluation of the policy liaison initiative: a long-term knowledge translation strategy to encourage and support the use of cochrane systematic reviews for informing. Evid Policy. 2016;12(1):25–52.

    Article  Google Scholar 

  111. Hinchcliff R, Senserrick T, Travaglia J, Greenfield D, Ivers R. The enhanced knowledge translation and exchange framework for road safety: a brief report on its development and potential impacts. Inj Prev. 2017;23(2):114–7.

    Article  PubMed  Google Scholar 

  112. Ye J, Woods D, Bannon J, Bilaver L, Kricke G, McHugh M, et al. Identifying contextual factors and strategies for practice facilitation in primary care quality improvement using an informatics-driven model: framework development and mixed methods case study. JMIR Hum Factors. 2022;9(2): e32174.

    Article  PubMed  PubMed Central  Google Scholar 

  113. Brangan J, Quinn S, Spirtos M. Impact of an evidence-based practice course on occupational therapist’s confidence levels and goals. Occup Ther Health Care. 2015;29(1):27–38.

    Article  PubMed  Google Scholar 

  114. Bonetti D, Johnston M, Pitts NB, Deery C, Ricketts I, Tilley C, et al. Knowledge may not be the best target for strategies to influence evidence-based practice: using psychological models to understand RCT effects. Int J Behav Med. 2009;16(3):287–93.

    Article  CAS  PubMed  Google Scholar 

  115. Buckley LL, Goering P, Parikh SV, Butterill D, Foo EKH. Applying a “stages of change” model to enhance a traditional evaluation of a research transfer course. J Eval Clin Pract. 2003;9(4):385–90.

    Article  PubMed  Google Scholar 

  116. Boyko JA, Lavis JN, Dobbins M, Souza NM. Reliability of a tool for measuring theory of planned behaviour constructs for use in evaluating research use in policymaking. Health Res Policy Syst. 2011;24(9):29.

    Article  Google Scholar 

  117. Imani-Nasab MH, Yazdizadeh B, Salehi M, Seyedin H, Majdzadeh R. Validity and reliability of the Evidence Utilisation in Policymaking Measurement Tool (EUPMT). Health Res Policy Syst. 2017;15(1):66.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  118. Dwan KM, McInnes P, Mazumdar S. Measuring the success of facilitated engagement between knowledge producers and users: a validated scale. Evid Policy. 2015;11(2):239–52.

    Article  Google Scholar 

  119. Haynes A, Rowbotham S, Grunseit A, Bohn-Goldbaum E, Slaytor E, Wilson A, et al. Knowledge mobilisation in practice: an evaluation of the Australian Prevention Partnership Centre. Health Res Policy Sys. 2020;18(1):13.

    Article  Google Scholar 

  120. Haines M, Brown B, Craig J, D’Este C, Elliott E, Klineberg E, et al. Determinants of successful clinical networks: the conceptual framework and study protocol. Implement Sci. 2012;7(101258411):16.

    Article  PubMed  PubMed Central  Google Scholar 

  121. Ko LK, Jang SH, Friedman DB, Glanz K, Leeman J, Hannon PA, et al. An application of the science impact framework to the cancer prevention and control research network from 2014–2018. Prev Med. 2019;12: 105821.

    Article  Google Scholar 

  122. Leeman J, Sommers J, Vu M, Jernigan J, Payne G, Thompson D, et al. An evaluation framework for obesity prevention policy interventions. Prev Chronic Dis. 2012;9(101205018):E120.

    PubMed  PubMed Central  Google Scholar 

  123. Pettman TL, Armstrong R, Waters E, Allender S, Love P, Gill T, et al. Evaluation of a knowledge translation and exchange platform to advance non-communicable disease prevention. Evid Policy. 2016;12(1):109–26.

    Article  Google Scholar 

  124. Yearwood AC. Applying a logical theory of change for strengthening research uptake in policy: a case study of the Evidence Informed Decision Making Network of the Caribbean. Rev Panam Salud Publica. 2018;42: e91.

    Article  PubMed  PubMed Central  Google Scholar 

  125. Thomson D, Brooks S, Nuspl M, Hartling L. Programme theory development and formative evaluation of a provincial knowledge translation unit. Health Res Policy Syst. 2019;17(1):40.

    Article  PubMed  PubMed Central  Google Scholar 

  126. Garad R, Kozica-Olenski S, Teede HJ. Evaluation of a center of research excellence in polycystic ovary syndrome as a large-scale collaborative research translation initiative, including evaluating translation of guideline impact. Semin Reprod Med. 2018;36(1):42–9.

    Article  PubMed  Google Scholar 

  127. Reddy S, Wakerman J, Westhorp G, Herring S. Evaluating impact of clinical guidelines using a realist evaluation framework. J Eval Clin Pract. 2015;21(6):1114–20.

    Article  PubMed  Google Scholar 

  128. Van Eerd D, Moser C, Saunders R. A research impact model for work and health. Am J Ind Med. 2021;64(1):3–12.

    Article  PubMed  Google Scholar 

  129. Yip O, Huber E, Stenz S, Zullig LL, Zeller A, De Geest SM, et al. A contextual analysis and logic model for integrated care for frail older adults living at home: The INSPIRE Project. Int J Integr Care. 2021;21(2):9.

    Article  PubMed  PubMed Central  Google Scholar 

  130. Guo R, Bain BA, Willer J. Application of a logic model to an evidence-based practice training program for speech-language pathologists and audiologists. J Allied Health. 2011;40(1):e23–8.

    PubMed  Google Scholar 

  131. McDonald S, Turner T, Chamberlain C, Lumbiganon P, Thinkhamrop J, Festin MR, et al. Building capacity for evidence generation, synthesis and implementation to improve the care of mothers and babies in South East Asia: methods and design of the SEA-ORCHID Project using a logical framework approach. BMC Med Res Methodol. 2010;10(100968545):61.

    Article  PubMed  PubMed Central  Google Scholar 

  132. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50.

    Article  PubMed  PubMed Central  Google Scholar 

  133. Wensing M, Grol R. Knowledge translation in health: how implementation science could contribute more. BMC Med. 2019;17(1):88.

    Article  PubMed  PubMed Central  Google Scholar 

  134. Kreindler SA. Advancing the evaluation of integrated knowledge translation. Health Res Policy Sys. 2018;16(1):104.

    Article  Google Scholar 

  135. Best A, Holmes B. Systems thinking, knowledge and action: towards better models and methods. Evid Policy. 2010;6(2):145–59.

    Article  Google Scholar 

  136. van Mil HGJ, Foegeding EA, Windhab EJ, Perrot N, van der Linden E. A complex system approach to address world challenges in food and agriculture. Trends Food Sci Technol. 2014;40(1):20–32.

    Article  Google Scholar 

  137. Wehrens R. Beyond two communities – from research utilization and knowledge translation to co-production? Public Health. 2014;128(6):545–51.

    Article  CAS  PubMed  Google Scholar 

  138. Ridde V, Pérez D, Robert E. Using implementation science theories and frameworks in global health. BMJ Glob Health. 2020;5(4): e002269.

    Article  PubMed  PubMed Central  Google Scholar 

  139. Salter KL, Kothari A. Using realist evaluation to open the black box of knowledge translation: a state-of-the-art review. Implement Sci. 2014;9(1):115.

    Article  PubMed  PubMed Central  Google Scholar 

  140. Van Eerd D, Cole D, Keown K, Irvin E, Kramer D, Gibson B, et al. Report on knowledge transfer and exchange practices: A systematic review of the quality and types of instruments used to assess KTE implementation and impact. Toronto: Institute for Work & Health; 2011 p. 130. https://www.iwh.on.ca/sites/iwh/files/iwh/reports/iwh_sys_review_kte_evaluation_tools_2011_rev.pdf

  141. Dobbins M, Robeson P, Ciliska D, Hanna S, Cameron R, O’Mara L, et al. A description of a knowledge broker role implemented as part of a randomized controlled trial evaluating three knowledge translation strategies. Implement Sci. 2009;4(1):23.

    Article  PubMed  PubMed Central  Google Scholar 

  142. Rychetnik L, Bauman A, Laws R, King L, Rissel C, Nutbeam D, et al. Translating research for evidence-based public health: key concepts and future directions. J Epidemiol Community Health. 2012;66(12):1187–92.

    Article  PubMed  Google Scholar 

  143. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12(1):108.

    Article  PubMed  PubMed Central  Google Scholar 

  144. Bhawra J, Skinner K. Examination of tools associated with the evaluation of knowledge uptake and utilization: a scoping review. Eval Program Plann. 2020;83: 101875.

    Article  PubMed  Google Scholar 

  145. Lane JP, Stone VI, Nobrega A, Tomita M. Level Of Knowledge Use Survey (LOKUS): a validated instrument for tracking knowledge uptake and use. Stud Health Technol Inform. 2015;217:106–10.

    PubMed  Google Scholar 

  146. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139.

    Article  PubMed  PubMed Central  Google Scholar 

  147. Albrecht L, Archibald M, Arseneau D, Scott SD. Development of a checklist to assess the quality of reporting of knowledge translation interventions using the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations. Implement Sci. 2013;8(1):52.

    Article  PubMed  PubMed Central  Google Scholar 

  148. Bragge P, Grimshaw JM, Lokker C, Colquhoun H, Albrecht L, Baron J, et al. AIMD - a validated, simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. BMC Med Res Methodol. 2017;17(1):38.

    Article  PubMed  PubMed Central  Google Scholar 

  149. Kastner M, Makarski J, Hayden L, Lai Y, Chan J, Treister V, et al. Improving KT tools and products: development and evaluation of a framework for creating optimized, Knowledge-activated Tools (KaT). Implement Sci Commun. 2020;1(1):47.

    Article  PubMed  PubMed Central  Google Scholar 

  150. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ. 2017;6(356): i6795.

    Article  Google Scholar 

  151. Lokker C, McKibbon KA, Colquhoun H, Hempel S. A scoping review of classification schemes of interventions to promote and integrate evidence into practice in healthcare. Implement Sci. 2015;10(1):27.

    Article  PubMed  PubMed Central  Google Scholar 

  152. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;7(348): g1687.

    Article  Google Scholar 

  153. Wilson PM, Sales A, Wensing M, Aarons GA, Flottorp S, Glidewell L, et al. Enhancing the reporting of implementation research. Implement Sci. 2017;12(1):13.

    Article  PubMed  PubMed Central  Google Scholar 


Acknowledgements

We wish to thank Julie Desnoyers for designing and implementing the search strategy, Gabrielle Legendre for her contribution to the screening phase, and Karine Souffez and Caroline Tessier for their input throughout the project.

Funding

This project was supported by an Insight Grant from the Social Sciences and Humanities Research Council of Canada (SSHRC) and by the Équipe RENARD (FRQ-SC). The funding bodies had no role in the conduct of this scoping review.

Author information

Contributions

SZ, MJG, EMC, JL, CD, EJ, KS, VR and CT were involved in developing and designing the scoping review. EMC, MJG and GL (collaborator) screened articles in duplicate. SL, DG, LJC and OP extracted data from the included articles. SL and DG synthesized the data. SL, SZ and EMC drafted the manuscript. SZ led the project, supervised and assisted the research team at every stage, and secured the funding. All authors provided substantive feedback and approved the manuscript prior to submission.

Corresponding author

Correspondence to Saliha Ziam.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Keywords and search strategy.

Additional file 2.

Summary of included articles.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Ziam, S., Lanoue, S., McSween-Cadieux, E. et al. A scoping review of theories, models and frameworks used or proposed to evaluate knowledge mobilization strategies. Health Res Policy Sys 22, 8 (2024). https://doi.org/10.1186/s12961-023-01090-7


Keywords