A process evaluation accompanying an attempted randomized controlled trial of an evidence service for health system policymakers

Abstract

Background

We developed an evidence service that draws inputs from Health Systems Evidence (HSE), which is a comprehensive database of research evidence about governance, financial and delivery arrangements within health systems and about implementation strategies relevant to health systems. Our goal was to evaluate whether, how and why a ‘full-serve’ evidence service increases the use of synthesized research evidence by policy analysts and advisors in the Ontario Ministry of Health and Long-Term Care as compared to a ‘self-serve’ evidence service.

Methods

We attempted to conduct a two-arm, 10-month randomized controlled trial (RCT), along with a follow-up qualitative process evaluation, but we terminated the RCT when we failed to reach our recruitment target. For the qualitative process evaluation we modified the original interview guide to allow us to explore (1) factors influencing participation in the trial; (2) usage of HSE, factors explaining usage patterns, and strategies to increase usage; (3) participation in training workshops and use of other supports; and (4) views about and experiences with key HSE features.

Results

We terminated the RCT given our 15% recruitment rate. Six factors were identified by those who had agreed to participate in the trial as encouraging their participation: relevance of the study to participants’ own work; familiarity with the researchers; personal view of the importance of using research evidence in policymaking; academic background; support from supervisors; and participation of colleagues. Most reported that they never, infrequently or inconsistently used HSE and suggested strategies to increase its use, including regular email reminders and employee training. However, only two participants indicated that employee training, in the form of a workshop about finding and using research evidence, had influenced their use of HSE. Most participants found HSE features to be intuitive and helpful, although registration/sign-in and some page formats (particularly the advanced search page and detailed search results page) discouraged their use or did not optimize the user experience.

Conclusions

The qualitative findings informed a re-design of HSE, which allows users to more efficiently find and use research evidence about how to strengthen or reform health systems or how to get cost-effective programs, services and drugs to those who need them. Our experience with RCT recruitment suggests the need to consider changing the unit of allocation to divisions instead of individuals within divisions, among other lessons.

Trial registration

The protocol for this study is published in Implementation Science and registered with ClinicalTrials.gov (HHS/FHS REB 10–267).

Background

Health system policymakers make important decisions every day about the governance, financial, and delivery arrangements within which programs, services, and drugs are provided and about implementation strategies [1]. For these decisions to be evidence-informed, policymakers need timely access to the best available research evidence in a way that makes it easy to retrieve (i.e. using terminology that is intuitive to them) and that allows them to rapidly scan for relevance and decision-relevant information [2,3]. For systematic reviews, which are increasingly seen as a key source of research evidence for informing the decisions made by health system policymakers [1], this means allowing for rapid scanning of how recently the search was conducted, the settings in which the included studies were conducted, and the quality of the review, as well as providing access to user-friendly summaries of the evidence whenever possible.

Policymakers also need to be alerted to the availability of new research evidence in their areas of interest. As outlined in the protocol for this study [4], until relatively recently no such evidence services had been developed for health system policymakers, unlike the situation for clinical and public health professionals [5,6]. To address this gap, we developed a full-serve evidence service comprising two types of activities (‘push’ efforts and efforts to facilitate ‘pull’) included in a framework for supporting the use of research evidence [7]. The full-serve evidence service included (1) access to Health Systems Evidence (HSE) as a ‘one-stop shop’ for research evidence addressing questions about governance, financial and delivery arrangements within which programs, services and drugs are provided and about implementation strategies [8] (an effort to facilitate policymakers’ efforts to ‘pull’ in research when they need it); (2) monthly email alerts about new additions to the database (a ‘push’ effort); and (3) full-text article availability (an additional effort to facilitate pull) [4]. The ‘self-serve’ evidence service consisted only of database access. Our objective was to evaluate whether (and how and why) this full-serve evidence service increased the use of synthesized research evidence by policy analysts and advisors in the Ontario Ministry of Health and Long-Term Care as compared to the ‘self-serve’ evidence service.

Methods

As detailed in our published study protocol [4], we planned a two-arm randomized controlled trial (RCT), along with a follow-up qualitative process evaluation to explore how and why the ‘full-serve’ evidence service did or did not work. We invited all policy analysts and advisors from a purposively selected division of the Ontario Ministry of Health and Long-Term Care that deals with health system issues on a regular basis. For those agreeing to participate, we planned to use a stratified randomized design to assign participants to receive either the ‘full-serve’ or ‘self-serve’ evidence service. The trial was to be conducted over a 10-month period consisting of three phases: (1) a 2-month baseline period where all participants would receive the ‘self-serve’ evidence service; (2) a 6-month intervention period where the intervention group would receive the ‘full-serve’ evidence service; and (3) a 2-month cross-over period where all participants would receive the ‘full-serve’ evidence service. We planned to measure the mean number of site visits/month/user between baseline and the end of the intervention period as the primary outcome and participants’ intention to use research evidence as the secondary outcome. For the qualitative process evaluation, our original plan was to conduct one-on-one semi-structured interviews with participants (15 from the intervention group and 15 from the control group) on their views about and experiences with the evidence service they received, how and why it was helpful (or not) in their work, what features were most and least helpful and why, and recommendations for next steps.
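To make the planned allocation concrete, the following minimal sketch illustrates stratified random assignment of consenting participants to the two arms. The stratification variable shown (seniority) is purely an illustrative assumption; this paper does not enumerate the actual strata used in the protocol.

```python
import random
from collections import defaultdict

def stratified_assign(participants, seed=42):
    """Randomly assign participants to the 'full-serve' or 'self-serve'
    arm within each stratum, balancing arm sizes per stratum.

    participants: list of (participant_id, stratum) pairs. The stratum
    variable here is illustrative only.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible allocation
    strata = defaultdict(list)
    for pid, stratum in participants:
        strata[stratum].append(pid)

    assignment = {}
    for ids in strata.values():
        rng.shuffle(ids)
        half = len(ids) // 2
        for pid in ids[:half]:
            assignment[pid] = "full-serve"
        for pid in ids[half:]:
            assignment[pid] = "self-serve"
    return assignment

# Example with an assumed seniority stratum:
print(stratified_assign([("a1", "senior"), ("a2", "senior"),
                         ("a3", "junior"), ("a4", "junior")]))
```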

However, as we report in the results section in more detail, we terminated the RCT when we failed to reach our recruitment target of 148, which was the number of policy analysts and advisors working in the purposively sampled division at the time of publishing the protocol (this changed to 138 by the time we started recruitment). After the termination of the trial, we revised the original interview guide (provided in Additional file 1) to explore actual and potential participants’ reasons for deciding to participate or not in the trial and their views about whether, how and why HSE has been helpful in their work, what features are most and least helpful and why, and recommendations for how to improve it. Ethics approvals for the study and its amended version were received from the HHS/FHS REB at McMaster University (protocol #10-267). Written consent was obtained from each study participant prior to participating in the interview.

Study population and recruitment

We derived a sampling frame for the process evaluation by reviewing the sample of 138 policy analysts and advisors who were originally invited to participate in the RCT and identifying those still in the same division (Health System Strategy and Policy Division) from which the sample was drawn. Using the list of 78 individuals who met these criteria, we selected a purposive sample of 30 policy analysts and advisors, of whom 15 had originally agreed to participate in the trial and 15 had not. In selecting each half of the sample, we included a mix of participants from different branches and units within the division and from different types of positions (e.g. senior and junior policy analysts and advisors). As reported in the results, all of those who participated in the qualitative interviews were those who had agreed to participate in the RCT. Based on this, we determined that additional sampling would be highly unlikely to yield additional participants given that, of those remaining in the sampling frame, most had not responded and some had declined the invitation for the RCT phase of the study.

Data collection

We invited participants by email to take part in semi-structured telephone interviews. MGW conducted the interviews using an interview guide and audio-taped them, and we transcribed the interviews verbatim. The first part of the interview guide explored the factors influencing participation in the trial. We included an initial set of probes for these factors, such as the time commitment involved, concerns about their HSE usage being monitored, and the trial’s perceived relevance to their work. The second part of the interview guide explored (1) participants’ usage of HSE, factors explaining usage patterns, and strategies to increase usage; (2) participation in training workshops and use of other supports; and (3) views about and experiences with key HSE features. We asked explicitly about eight of HSE’s key features: (1) registration and sign-in; (2) open search; (3) advanced search; (4) search overview page; (5) detailed search results page; (6) links to one-page summaries; (7) monthly evidence service; and (8) supplementary material/portals. We added the last probe to solicit feedback about the Evidence-Informed Healthcare Renewal Portal, which was added after the trial was terminated and which provides a set of policy-relevant documents related to healthcare renewal in Canada [9]. Prior to the interviews, we asked participants to use HSE and familiarize themselves with its features. In addition, during the interviews, the participant and the interviewer typically signed into the database at the same time so that ‘real-time’ feedback could be received.

Data management and analysis

We used a constant comparative approach [10] to data analysis, whereby we identified key themes and findings emerging from successive interviews and refined the interview guide as needed following each interview to address these topics in subsequent interviews (which included adjusting our initial list of probes where needed). This strategy allowed us to gather additional feedback on these themes from later interviewees. Upon completing the interviews, we summarized the findings along with illustrative quotes in four key domains: (1) factors influencing participation in the trial; (2) usage of HSE, factors explaining usage patterns, and strategies to increase usage; (3) participation in training workshops and use of other supports; and (4) views about and experiences with key HSE features. We did not use qualitative analysis software such as NVivo given the small number of interviews conducted and their brevity. Instead, we used the verbatim transcripts to develop structured summaries for each interview according to our areas of interest outlined above.

Results

Randomized controlled trial (RCT)

Of the 138 policy analysts and advisors who were working in the division at the time and hence invited to participate in the trial, only 59 responded to any of our three waves of study invitations (a 43% response rate). Of these, 21 agreed to participate (a 15% trial recruitment rate) and 38 declined. Of the 38 who declined to participate, 16 were still in the same ministry but no longer in the same division, nine did not provide a reason, seven were on short-term contracts, retiring or going on leave, and six had moved to a different ministry. Given our failure to reach our recruitment target in the division addressing health system issues most directly, we terminated the trial.
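As a quick check, the reported rates follow directly from the counts above (a trivial verification using only numbers given in this paper):

```python
invited, responded, agreed = 138, 59, 21

print(f"Response rate:    {responded / invited:.0%}")  # 43%
print(f"Recruitment rate: {agreed / invited:.0%}")     # 15%
print(f"Declined:         {responded - agreed}")       # 38
```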

Qualitative process evaluation

Of the 30 policy analysts we invited to participate in the qualitative process evaluation, nine agreed to participate, of whom seven had agreed to participate in the trial and two had not. While limited in size, the sample included participants from six different units and covered a range of sectors (including acute care, community care, and health protection and promotion), populations (e.g. people living with mental health and addiction issues) and areas of decision support (e.g. economic analysis and evidence synthesis). However, the sample was not as balanced in terms of level of position, with eight participants in senior analyst roles and one in a junior role.

Factors influencing participation in the trial

While two of the study participants had not agreed to participate in the trial according to our study records, during the interview all nine reported having agreed to participate in the trial. Participants cited a number of factors that encouraged their participation in the trial, which included the relevance of the trial to their own work; familiarity with the researchers; personal view of the importance of using research evidence in policymaking; academic background; support from supervisors and the Assistant Deputy Minister; and the participation of colleagues.

Three participants also reported that the time commitment had been a concern for them, and many noted that there are often a large number of tasks to accomplish and that non-prioritized tasks are often ignored. For example, one participant noted that they “…think people have a variety of different pressures on their time here and they don’t understand the importance of research so they put it down lower on the priority list.” The same participant indicated that, given these pressures, “to have the manager say this was actually something that is really a priority, helped us to prioritize that as something to do.”

One participant expressed concern about policy analysts and advisors having to be careful about what they say as representatives of the ministry, but none of the study participants were concerned about their HSE usage being monitored as part of the trial. In general, they suggested that trials such as this one are “a good thing for the policy and planning community” as they help to ensure that policymakers “get better access to evidence and tools.”

Usage of HSE, factors explaining usage patterns, and strategies to increase usage

Six of the participants had used HSE before the interview, but these same participants indicated that they used it infrequently or inconsistently. Despite this, almost all of the participants (n = 7) stated that they felt it would be useful for their work-related projects. Two participants indicated that HSE was a helpful resource because it helps policymakers meet their need for finding relevant research evidence to support and inform the development of policy initiatives. For example, one participant noted that “with any policy initiative you have, especially with the Ministry of Health, there’s a huge push to have evidence. So when you’re writing different types of documents to support initiatives, you need to have options and they need to be supported with evidence.”

When asked for reasons why HSE was used infrequently or inconsistently, all participants indicated that they already use multiple databases, and one participant specifically indicated that they often prefer to stick to the ones they know. In addition, three participants noted that they are typically not required to seek research evidence directly themselves because units in the ministry already do this for them. One participant said that, rather than having different databases addressing different healthcare topics (e.g. only health systems), it would be better to include evidence across healthcare boundaries more generally. Specifically, this participant indicated that:

It frustrates me to no end that we continue to have all of these different partitioning[s] of evidence. Which is both good, because it allows us to focus but also bad because there is […] much less cross-boundary searching that we can do. I always have to end up going back to the general databases to search […] being able to specifically say, for example, I want to look for something that crosses the boundaries of public health and primary care and community care, that would be fantastic.

When asked about what could be done to increase their use of HSE, four common themes emerged, namely reminders about the existence of the site, such as the monthly evidence service; encouragement of its use from management in general and supervisors in particular; incorporation into employee training; and promotion on the Ministry’s website.

One participant highlighted that:

if there’s buy in from senior management here to help promote a tool like this or to encourage people to use it, that’s also a helpful way in terms of getting the attention of people who are doing policy work because often, it takes people getting direction from someone above to motivate them sometimes

Participation in training workshops and use of other supports

Six of the participants had participated in workshops on finding and using research evidence provided to ministry staff in which HSE is profiled as a key source for research evidence to address questions related to health systems. Two others had never participated but had read the material, and one had not participated or read the material. Of those who had participated, two indicated that they thought it had an impact on their use of HSE. Another two participants said the workshop was helpful because it taught them about quality ratings (like the ones provided in HSE) and another participant noted that the workshop helped them learn about the different search tools. Others indicated that attending the workshop did not influence their use of the website. For example, one said: “I actually went to [the] workshop and he [the facilitator] probably talked about it and then just went, you know, out the other ear.” None of the participants had watched the tutorial videos provided through HSE, but three stated they thought they could be helpful. Two participants also said the tips on the site itself were useful.

Views about and experiences with key HSE features

In general, most participants (six out of nine) said they found the features of HSE to be intuitive. Some said they found the information it contains more relevant for policymakers than most other search engines. One participant indicated that the documents found in HSE “were all consistently very high quality compared to if I did my usual bibliography searches on the literature databases.”

Table 1 summarizes the feedback received regarding each of eight key features of HSE along with illustrative quotes. In general, participants indicated that searching is intuitive; having multiple options to conduct searches (e.g. by searching one or more topic categories) allows for greater flexibility and functionality; providing an overview of the number and types of documents retrieved by a search, and the ability to limit the results to specific types of documents on the search results overview page, is helpful; providing a monthly evidence service is important for providing alerts about new research evidence; including access to a diversity of document types is helpful; and having access to a database focused on policy documents related to the Canadian healthcare system (the Evidence-Informed Healthcare Renewal Portal) is an important source of additional evidence. However, several participants also indicated that the requirement to register and then sign in before searching HSE discouraged their use of the database and/or could discourage its use by others. Others found the layout of the advanced search and the results pages to be confusing and emphasized the need to more effectively highlight the most important information (e.g. the main search functionality on the advanced search page and the titles on the results page).

Table 1 Views about and experiences with key Health Systems Evidence (HSE) features

Discussion

Principal findings

We terminated the RCT – to our knowledge, the first attempt to evaluate the effects of an evidence service specifically designed to support health system policymakers in finding and using research evidence using such a design – given our 15% trial recruitment rate. Six factors were identified by those who had agreed to participate in the trial as encouraging their participation: relevance of the study to participants’ own work; familiarity with the researchers; personal view of the importance of using research evidence in policymaking; academic background; support from supervisors; and participation of colleagues. Most participants reported that they never, infrequently or inconsistently used HSE and suggested several strategies to increase its use, including regular email reminders and employee training. However, only two participants indicated that employee training, in the form of a workshop about finding and using research evidence, had influenced their use of HSE. Most participants found HSE features to be intuitive and helpful, although registration/sign-in and some page formats (particularly the advanced search page and detailed search results page) discouraged their use of HSE or did not optimize the user experience.

Strengths and limitations

The principal strength of our study is that we received in-depth feedback from a sample of policy analysts and advisors (a key target audience for us) about trial participation and HSE features, and we have acted on many of their suggestions for improving HSE. The principal limitations of our study relate to sample size, both in terms of the very low recruitment rate for the trial (which resulted in its termination) and in terms of the somewhat low recruitment rate for the qualitative process evaluation, particularly among those who did not volunteer for the trial. In addition, it is likely that many of those who agreed to participate in the qualitative interviews were already keen to use HSE in their work, which may have limited our ability to identify divergent views. However, we did observe the recurrence of common themes in each interview and with participants from a diverse set of units in the division, which leads us to believe that, even if data saturation was not reached in all domains (e.g. because our sample may not have been balanced enough to capture divergent views), we were likely close to achieving it.

Implications

We have identified implications for future trial recruitment and for efforts to support the use of research evidence by policymakers. For future trial recruitment, our study provides a helpful reflection on how best to recruit policy analysts and advisors into a trial within a particular division of a health ministry, even when a team has the explicit support of the Assistant Deputy Minister who heads the division, in the form of both a formal letter attached to the invitation and an internal email sent to divisional staff. One particular point for reflection is whether our ‘unit of allocation’ for the intervention should have been divisions rather than individuals within a division. Policy development is a complex process that depends on the efforts of many policy analysts, and the use of research evidence at the individual level is therefore likely to vary with policy priorities at a given point in time. As a result, the more meaningful unit of analysis could be the use of an evidence service within a division, which would more accurately capture this variability at the individual level. In other words, the focus would instead be on studying the effect of our evidence service on the use of research evidence by all policy analysts and advisors within a division, treating the division as the more meaningful level for outcomes.

In addition to modifying our approach to analysis, changing the unit of allocation has significant implications for the levels at which informed consent should be obtained for a cluster RCT [11]. Specifically, changing the unit of allocation in this way would mean seeking consent at the level of the division (i.e. from the Assistant Deputy Minister), plus having either a waiver of consent or additional informed consent at the level of individual participants. For example, one possible approach could be to include two divisions per Canadian province that are then randomly assigned to receive the ‘full-serve’ or ‘self-serve’ evidence service. A limitation of such a design would be its limited statistical power, but it would add to a larger body of knowledge if replicated and then included in meta-analyses.
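To illustrate why statistical power is the central concern with division-level allocation, the standard design-effect calculation for cluster trials can be sketched as follows. The cluster size and intra-cluster correlation (ICC) below are assumed values for illustration only, not estimates from this study.

```python
def design_effect(cluster_size, icc):
    """Design effect for a cluster RCT: DEFF = 1 + (m - 1) * ICC,
    where m is the average cluster size."""
    return 1 + (cluster_size - 1) * icc

# Assumed values for illustration: ~70 analysts per division and a
# modest ICC of 0.05 for evidence-use outcomes within a division.
m, icc = 70, 0.05
deff = design_effect(m, icc)

n_divisions = 20  # e.g. two divisions in each of ten provinces
effective_n = (n_divisions * m) / deff

print(f"Design effect: {deff:.2f}")                 # 4.45
print(f"Effective sample size: {effective_n:.0f}")  # ~315 of 1,400 individuals
```

Under these assumptions, 1,400 enrolled individuals would behave statistically like roughly 315 independent participants, which is why replication across multiple such trials and subsequent meta-analysis would be important.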

For efforts to support the use of research evidence, our study supports others’ findings that essential components of such efforts include the need to ensure timely, multi-faceted yet focused communication [2,12-14] and to enhance interaction between the producers and users of research evidence [2,12,13]. For example, a recent Cochrane review of interventions to improve the use of systematic reviews in decision-making by health system managers, policymakers and clinicians, indicated that simple and clear messages are best in communicating results from systematic reviews [14]. In addition, the same review indicated that multi-faceted approaches (e.g. ‘push’ efforts combined with efforts to facilitate ‘pull’ such as capacity building) are needed when the goal is not only to communicate results to inform policy and practice but also to more generally increase awareness and knowledge about the importance of using systematic reviews and to build skills for using them to inform policy and practice [14].

Study impact

Based on the feedback received about HSE, we made several modifications to enhance the functionality of the advanced search page and the detailed search results page. On the advanced search page, we modified the layout to better highlight the available search functionality. Specifically, we incorporated a short descriptive header for each of the three primary groupings of search functionalities (‘topics’, ‘open search’ and ‘limits’), followed by a one-line description of what each can be used to do. In addition, we now highlight that the Evidence-Informed Healthcare Renewal Portal contains policy documents related to healthcare renewal in Canada. We also incorporated two new search limits. The first allows users to restrict their search to documents with either a general focus (i.e. an intervention that is broadly applied across settings, populations and diseases, such as case management) or a specific focus (i.e. an intervention that is primarily focused on a single setting, a specific population or a specific disease, such as case management for teenagers with diabetes in the United Kingdom). The second allows users to search for documents that either focus on a particular country (or countries) or include at least one study that was conducted in a particular country (or countries).
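To clarify the intended semantics of these two limits, here is a hypothetical sketch of how they might filter a result set. All field names (focus, country_focus, study_countries) are invented for illustration and do not describe HSE’s actual implementation.

```python
def apply_limits(documents, focus=None, country=None, scope="focus"):
    """Hypothetical filter mirroring the two new HSE search limits.
    All field names are invented for illustration only."""
    results = documents
    if focus in ("general", "specific"):
        results = [d for d in results if d["focus"] == focus]
    if country is not None:
        if scope == "focus":
            # Documents that focus on the given country
            results = [d for d in results if country in d["country_focus"]]
        else:
            # Documents with at least one included study from the country
            results = [d for d in results if country in d["study_countries"]]
    return results

docs = [
    {"title": "Case management review", "focus": "general",
     "country_focus": [], "study_countries": ["UK", "Canada"]},
    {"title": "Case management for teenagers with diabetes in the UK",
     "focus": "specific", "country_focus": ["UK"], "study_countries": ["UK"]},
]
print([d["title"] for d in apply_limits(docs, focus="specific", country="UK")])
```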

On the detailed search results page, we removed the columns outlining the type of question and the health system topics addressed, both of which most participants identified as unhelpful when scanning results for relevant documents. We also now list the document title in the first column (instead of the type of document); the title is hyperlinked to the one-page summary, appears in bolded text in a different colour (to make it stand out on the page), and an icon below it notes whether the document has a general or specific focus.

Conclusion

The qualitative data we collected have informed the redesign of HSE, which will allow users to more efficiently find and use research evidence about how to strengthen or reform health systems or how to get cost-effective programs, services and drugs to those who need them. We plan to continue conducting periodic user testing to ensure that HSE is an optimal tool for identifying research evidence about health systems, as well as to explore ways to increase trial recruitment rates among policy analysts and, more generally, approaches to future evaluations.

Abbreviations

HSE: Health Systems Evidence

RCT: Randomized controlled trial

References

  1. Lavis JN. How can we support the use of systematic reviews in policymaking? PLoS Med. 2009;6(11):e1000141.

  2. Lavis JN, Davies HTO, Oxman AD, Denis J-L, Golden-Biddle K, Ferlie E. Towards systematic reviews that inform health care management and policy-making. J Health Serv Res Pol. 2005;10 Suppl 1:35–48.

  3. Lavis JN, Davies HTO, Gruen RL. Working within and beyond the Cochrane Collaboration to make systematic reviews more useful to healthcare managers and policy makers. Healthcare Policy. 2006;1(2):21–33.

  4. Lavis JN, Wilson MG, Grimshaw J, Haynes RB, Hanna S, Raina P, et al. Effects of an evidence service on healthcare policymakers’ use of research evidence: a protocol for a randomized controlled trial. Implement Sci. 2011;6:51.

  5. Haynes RB, Cotoi C, Holland J, Walters L, Wilczynski N, Jedraszewski D, et al. Second-order peer review of the medical literature for clinical practitioners. JAMA. 2006;295(15):1801–8.

  6. Dobbins M, DeCorby K, Robeson P, Husson H, Tirilis D, Greco L. A knowledge management tool for public health: health-evidence.ca. BMC Public Health. 2010;10:496.

  7. Lavis JN, Lomas J, Hamid M, Sewankambo NK. Assessing country-level efforts to link research to action. Bull World Health Organ. 2006;84(8):620–8.

  8. Wilson MG, Moat KA, Lavis JN. The global stock of research evidence relevant to health systems policymaking. Health Res Pol Syst. 2013;11:32.

  9. Kowalewski K, Lavis JN, Wilson MG, Carter N. Supporting evidence-informed health policymaking: the development and contents of an online repository of policy-relevant documents addressing healthcare renewal in Canada. Healthcare Policy. 2014;10(2):27–37.

  10. Boeije H. A purposeful approach to the constant comparative methods in the analysis of qualitative interviews. Quality Quantity. 2002;36:391–409.

  11. Weijer C, Grimshaw JM, Eccles MP, McRae AD, White A, Brehaut JC, et al. The Ottawa statement on the ethical design and conduct of cluster randomized trials. PLoS Med. 2012;9(11):e1001346.

  12. Oliver K, Innvar S, Lorenc T, Woodman J, Thomas J. A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Serv Res. 2014;14:2.

  13. Lavis JN, Catallo C. Bridging the worlds of research and policy in European health systems. Brussels: European Observatory on Health Systems and Policies; 2013.

  14. Murthy L, Shepperd S, Clarke MJ, Garner SE, Lavis JN, Perrier L, et al. Interventions to improve the use of systematic reviews in decision-making by health system managers, policy makers and clinicians. Cochrane Database Syst Rev. 2012;9:CD009401. doi:10.1002/14651858.CD009401.pub2.

Acknowledgements

Funding for this study was provided through an Applied Health Research Question (AHRQ) by the Ontario Ministry of Health and Long-Term Care to the Centre for Health Economics and Policy Analysis. We thank Nicole Archer for conducting the preliminary data analysis for this paper. We also thank Kaelan Moat along with the many staff and students who contributed to building the content for Health Systems Evidence (HSE). Lastly, we would like to thank the various organizations that produce the resources on which HSE depends.

Author information

Corresponding author

Correspondence to Michael G Wilson.

Additional information

Competing interests

John Lavis led the creation and oversees the continuous updating of Health Systems Evidence (www.healthsystemsevidence.org). Michael Wilson also helped lead the creation of Health Systems Evidence and contributes to its continuous updating.

Authors’ contributions

All authors contributed to the conception of the original and revised study design; MGW conducted the data analysis; JNL reviewed and checked the data analysis for consistency; MGW and JNL drafted the manuscript. All authors reviewed and approved the final version of the manuscript.

Additional file

Additional file 1:

Interview guide for qualitative process evaluation. (DOCX 17 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Wilson, M.G., Grimshaw, J.M., Haynes, R.B. et al. A process evaluation accompanying an attempted randomized controlled trial of an evidence service for health system policymakers. Health Res Policy Sys 13, 78 (2015). https://doi.org/10.1186/s12961-015-0066-z
