Qualitative Evidence Synthesis (QES) for Guidelines: Paper 3 – Using qualitative evidence syntheses to develop implementation considerations and inform implementation processes

Abstract

Background

This is the third in a series of three papers describing the use of qualitative evidence syntheses (QES) to inform the development of clinical and health systems guidelines. WHO has recognised the need to improve its guideline methodology to ensure that decision-making processes are transparent and evidence based, and that the resulting recommendations are relevant and applicable to end users. In addition to the standard data on effectiveness, WHO guidelines increasingly use evidence derived from QES to provide information on acceptability and feasibility and to develop important implementation considerations.

Methods

WHO convened a group drawn from the technical teams involved in formulating recent (2010–2018) guidelines employing QES. Using a pragmatic and iterative approach that included feedback from WHO staff and other stakeholders, the group reflected on, discussed and identified key methods and research implications from designing QES and using the resulting findings in guideline development. As members of WHO guideline technical teams, our aim in this paper is to explore how we have used findings from QES to develop implementation considerations for these guidelines.

Results

For each guideline, in addition to using systematic reviews of effectiveness, the technical teams used QES to gather evidence of the acceptability and feasibility of interventions and, in some cases, equity issues and the value people place on different outcomes. This evidence was synthesised using standardised processes. The teams then used the QES findings, combined with other sources of information and input from experts, to identify implementation considerations.

Conclusions

QES were useful sources of information for implementation considerations. However, several issues for further development remain, including whether researchers should use existing health systems frameworks when developing implementation considerations; whether researchers should take confidence in the evidence into account when developing implementation considerations; whether qualitative evidence that reveals implementation challenges should lead guideline panels to make conditional recommendations or only point to implementation considerations; and whether guideline users find it helpful to have challenges pointed out to them or whether they also need solutions. Finally, we need to explore how QES findings can be incorporated into derivative products to aid implementation.

Background

One of the main activities of WHO is to support decision-makers globally by producing guidance about best healthcare practice [1]. The development of a WHO guideline is a lengthy process involving a number of stages, including agreeing on the topic, identifying and assessing the best available evidence, and reaching consensus about recommendations. Throughout this process, the developers of the guideline are expected to ensure that the topic, the evidence used, and the recommendations that emerge are as relevant as possible to guideline implementers in the future. WHO therefore involves not only staff members and external methodologists in these processes, but also content experts and end-users such as programme managers and health professionals, as it is they who will ultimately adopt, adapt and implement these recommendations [2].

While WHO guidelines aim to be as relevant as possible to implementers, they are not intended to include specific plans for implementation at a national or subnational level [2]. Global guidelines need to be adapted to local circumstances before they can be implemented, and local implementers are far better placed to develop these plans [3]. Nevertheless, the global evidence that is gathered to inform WHO recommendations represents a source of information that can also have value for local use. This is particularly the case for WHO guideline processes that include qualitative evidence. This type of evidence often describes how healthcare services are operationalised, viewed and experienced by stakeholders across many settings. Qualitative findings can offer future implementers an opportunity to learn from the experiences of others.

Evidence-to-decision (EtD) frameworks and expanding the evidence base in WHO guidelines

WHO has traditionally focused on the effectiveness of an intervention in guideline processes, but this approach has begun to change in recent years. This is at least partly driven by the WHO’s increased use of EtD frameworks [2] such as the GRADE (Grading of Recommendations, Assessment, Development and Evaluation) EtD framework [4]. These frameworks help guideline panels and other decision-makers to think through additional factors more systematically before reaching their final recommendation. These factors include the cost of the intervention, its acceptability and feasibility, its impact on equity, and the value people place on different outcomes [4].

Systematic reviews of randomised trials are commonly acknowledged as the best source of evidence when assessing intervention effectiveness. Systematic reviews of qualitative research, also known as qualitative evidence syntheses (QES), are, however, better suited for questions of acceptability and feasibility. The WHO guidelines on task shifting for maternal and newborn health [5] were the first to include QES in the guideline process [6]. They used QES to explore people’s experiences of lay health worker programmes [7] and nurse-doctor substitution [8]. The reviews gave valuable information about how acceptable these programmes were to service users and health workers, shedding light on important feasibility and equity issues as well as enabling the development of considerations for the people who implement the recommendations.

Implementation considerations in EtD frameworks

Although the global guidelines of WHO generally do not include implementation plans, EtD frameworks encourage guideline panels to list broader implementation considerations [9]. These are not intended to serve as technical manuals or detailed implementation plans. Instead, they are probes, prompts, suggestions or requirements that implementers should consider when developing their local plans.

Some guidance is provided about topics that guideline teams and panel members could consider when developing implementation considerations [10]. However, there is less guidance about where this information should come from. Therefore, during our work on the technical teams of several WHO guidelines, we explored how we could use evidence from QES, combined with insights from the guideline panel, as a main source of information for the implementation considerations developed as part of the guideline.

This is the third paper in a series examining the use of QES in developing clinical and health systems guidelines (Fig. 1). The focus of the first paper is on how to adapt QES methods for the guideline context [11], whereas the second focuses on how to use findings from QES to populate EtD frameworks [12]. In this final paper, we describe how, as members of guideline technical teams, we have used the findings from QES to develop implementation considerations for WHO guidelines. We include lessons learnt and point to areas where there is a need for more research and development.

Fig. 1 Overview of the ‘Qualitative evidence synthesis in guidelines’ series of papers

Methods

The experiences, guidance and data presented in this series of papers are the result of processes that have evolved over a decade of engagement with qualitative research in the context of developing healthcare guidelines at WHO. To develop the ideas described in the series, we used a pragmatic and iterative approach that included the following steps:

  1. The WHO convened a core team of authors who had been involved in WHO guideline technical teams since 2010 and in developing QES to support these guidelines.

  2. The core author team reflected on the guideline development processes in which we had been involved (see list below), focusing on the role of QES findings in these processes. We also received informal feedback on these processes from other WHO staff involved in guideline development, and from participants in several guideline training workshops at WHO. These reflections and feedback led us to identify three key areas, each of which became a focus for one of the papers in the series.

  3. The lead author for each paper then drafted an outline for their paper, and these were discussed during a 4-day author workshop. In the workshop, authors discussed the most important factors in the use of qualitative evidence in this context to date and agreed on what worked and what could be improved in the future. The outlines were then developed into full papers, using an iterative process of sequential writing and discussion. We also identified relevant examples from the guidelines in which we had been involved. The core authors then reviewed the draft to clarify the ideas and processes described and to add further examples where needed.

  4. We then circulated the draft papers to key stakeholders to obtain their feedback on the ideas and processes described. These stakeholders included members of WHO guideline panels (sometimes called Guideline Development Groups), methodologists, guideline commissioners and implementation experts (see Acknowledgements).

We selected examples from the following WHO guidelines, in the compilation of which members of the core author team had been involved:

  1. Optimising health worker roles for maternal and newborn health through task-shifting (2012) [5].

  2. Expanding health worker roles to help improve access to safe abortion and post-abortion care (2015) [13].

  3. WHO recommendations on antenatal care for a positive pregnancy experience (2016) [14].

  4. WHO recommendations on intrapartum care for a positive childbirth experience (2018) [15].

  5. Guidance on communication interventions to inform and educate caregivers on routine childhood vaccination in the African Region (under review).

  6. WHO recommendations on digital interventions for health systems strengthening [16].

All of these guidelines were health systems focused or had a health systems component, and all used the GRADE EtD frameworks [9]. The latter are documents with a common structure that includes a question, an assessment of the evidence that addresses the question, and a conclusion, all of which facilitate explicit and transparent decision-making [4]. The examples presented here were selected to highlight how the technical teams used qualitative evidence in the guideline processes and the strategies they employed to interpret and apply this evidence when developing implementation considerations.
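
As a rough illustration of this common structure, the sketch below models an EtD framework as a simple data container. This is a minimal sketch only: the field names are our own shorthand for the criteria discussed in this series, not the official GRADE EtD specification, and real frameworks also contain further elements such as resource use and the panel’s judgements.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Finding:
        text: str          # summarised review finding
        confidence: str    # GRADE or GRADE-CERQual level, e.g. "low" or "moderate"

    @dataclass
    class EtDFramework:
        question: str                                   # the guideline question
        effectiveness: List[Finding] = field(default_factory=list)
        values: List[Finding] = field(default_factory=list)
        acceptability: List[Finding] = field(default_factory=list)
        feasibility: List[Finding] = field(default_factory=list)
        equity: List[Finding] = field(default_factory=list)
        implementation_considerations: List[str] = field(default_factory=list)
        recommendation: Optional[str] = None            # "recommend", "conditional" or "against"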

Results

Moving from qualitative evidence to implementation considerations

What was the work process?

For each guideline, a number of processes took place before the technical teams could begin developing the implementation considerations (Fig. 2).

Fig. 2 Moving from qualitative evidence to implementation considerations – the work process

First, WHO, with input from external stakeholders, agreed on the scope of the guideline and the questions it would cover. The technical team then created EtD frameworks for each of the guideline’s questions. To populate each framework’s acceptability, feasibility and equity sections, the team searched for existing QES that explored how stakeholders viewed and experienced the interventions in question. In at least two guidelines [14, 15], we also did this for the ‘values’ criterion. However, in most cases, WHO had to commission the syntheses needed to populate the EtD frameworks, as existing QES were often lacking, out of date or otherwise failed to meet the needs of the guideline. Most of these commissioned QES were registered as Cochrane reviews and therefore met the minimum quality standards set by Cochrane for these types of reviews [17]. For each commissioned QES, the technical team asked the authors to assess how much confidence they had in each of the findings, using the GRADE-CERQual (Confidence in the Evidence from Reviews of Qualitative research) approach [18].

As soon as draft versions of the QES were ready, the technical teams extracted those findings that they considered to be particularly relevant for the acceptability, feasibility or equity of one or more of the interventions. The teams then summarised the findings, often in dialogue with the review author team, and added these to the relevant frameworks.

The approach the technical teams used when populating the feasibility, acceptability and equity criteria followed the same process used for populating the section on intervention effectiveness. In each case, the team presented findings from each review separately, made it clear that these findings were based on research evidence, and presented each finding together with its GRADE or GRADE-CERQual assessment.

Once these sections of the framework were complete, the technical team turned its attention to how the QES findings could be translated into implementation considerations. Here, the team used an approach that allowed for more interpretation of the evidence and for the inclusion of other sources of information. The team went through each QES finding and assessed the extent to which it represented a factor that might affect the implementation of the intervention in question. These findings were then used to develop a list of implementation considerations. In some cases, the team combined this information with other sources, including research-based or non-research-based information from the EtD frameworks, information from external sources and input from invited stakeholders. In most cases, this additional information added little to the main text on implementation considerations, and its reliability was not formally assessed.

The technical team included the first drafts of the implementation considerations in the relevant EtD frameworks. These frameworks were the main documents used by the guideline panel at the final guideline meeting, when panel members were asked to assess the evidence regarding effectiveness, resource use, acceptability, feasibility and equity before choosing to (1) recommend the intervention, (2) give a conditional or context-specific recommendation, or (3) recommend against the intervention. Where the panel chose option (3), i.e. to recommend against the intervention, the technical team removed the drafted implementation considerations as these were no longer relevant. Where the panel chose option (1) or (2), panel members were asked to comment on the drafted implementation considerations and to suggest additional ones.
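
A minimal sketch of this drafting-and-review flow is given below, assuming a simple hypothetical representation of findings and panel decisions. All names and structures are ours, purely for illustration; in practice the relevance judgement and the panel decision were deliberative steps, not automated ones.

    # Minimal sketch of the flow from QES findings to final implementation
    # considerations; the data structures are hypothetical and the real process
    # was deliberative, not automated.

    def draft_considerations(qes_findings):
        """Keep findings judged relevant to implementation and rephrase them."""
        return [
            "Implementers should consider: " + f["summary"]
            for f in qes_findings
            if f["relevant_to_implementation"]   # an interpretive judgement in practice
        ]

    def finalise(panel_decision, drafted, panel_additions):
        """Apply the panel's decision to the drafted implementation considerations."""
        if panel_decision == "against":
            return []                            # drafts removed: no longer relevant
        return drafted + panel_additions         # "recommend" or "conditional"

    findings = [
        {"summary": "lay health workers were reluctant to make night visits because of safety concerns",
         "relevant_to_implementation": True},
        {"summary": "historical background of the programme",
         "relevant_to_implementation": False},
    ]

    drafts = draft_considerations(findings)
    print(finalise("conditional", drafts, ["Clarify referral pathways for complications."]))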

After this meeting, the technical team edited the implementation considerations in response to comments and suggestions and presented these in the final versions of the frameworks. Once the recommendations had been formally approved by WHO, they were disseminated together with the implementation considerations.

When does the qualitative evidence lead to a conditional recommendation and when does it become an implementation consideration?

In many cases, the evidence from the QES pointed to problems with the acceptability or feasibility of the intervention or problems tied to equity. In some cases, this evidence led the panel to give a conditional or context-specific recommendation. This type of recommendation generally includes a description of the conditions under which the end-user should or should not implement the recommendation [2]. In other cases, the panel decided to recommend the intervention without conditions, and to deal with these concerns as implementation considerations only. For instance, in the antenatal care guideline [14], the panel was asked to decide whether to recommend a midwife-led continuity-of-care model. One QES finding showed that women appreciated being seen by the same healthcare professional at each appointment because this gave them the opportunity to build caring, trusting relationships with these providers [19]. However, additional information from the same QES and input from the guideline panel suggested that the lack of trained midwives and the potential for burnout of health professionals from work overload often reduced the feasibility of this model of care. Taking these feasibility issues into account, the panel agreed to make a context-specific recommendation, recommending this model of care only in settings with well-functioning midwifery programmes. In addition, under ‘Implementation considerations’, the requirements and conditions that would need to be in place for this model of care to be implemented were reiterated (Example 1, Table 1).

Table 1 Examples of implementation considerations influenced by qualitative findings

In another example, from the digital health guidelines, the potential for harm from disclosing health information meant that ensuring the confidentiality of communications about sensitive health topics was perceived to be a prerequisite for implementing the recommendation on transmitting health information via mobile devices. The recommendation was therefore formulated as conditional. Evidence on other factors that could enhance the acceptability of the intervention, such as allowing users to unsubscribe or to determine the frequency of the communications they received, was translated into implementation considerations.

For other recommendations, the guideline panel decided to recommend the intervention without conditions, and to address these types of concerns only in the implementation considerations section. For instance, feasibility concerns about network connectivity and access to electricity highlighted by the QES in the digital health guidelines [25] were dealt with as implementation considerations and did not lead to conditional recommendations. Similarly, in the intrapartum care guideline, acceptability and feasibility concerns were addressed in the implementation considerations on epidural analgesia, which was unconditionally recommended by the panel (Example 2, Table 1).

Lessons learnt

It was not always clear why some concerns identified in the qualitative evidence led the panel to make a conditional recommendation while others were dealt with only as implementation considerations. In some cases, this may have been tied to the panel’s perceptions of how serious these concerns were and whether they were prerequisites to establishing the recommendation. In other cases, the panel’s decision may have been influenced by other factors, such as their assessment of the intervention’s effectiveness or its impact on equity. The panel was also mindful of not placing too many conditions on a recommendation, as this can make it difficult for stakeholders to determine whether it is appropriate to implement the intervention.

WHO guidelines have previously been criticised for making ‘strong recommendations’ despite only low or very low confidence evidence of effectiveness [26, 27]. Panels are now increasingly expected to consider evidence about factors such as acceptability, feasibility and equity, in addition to evidence about effectiveness. Future research should explore if and when concerns highlighted by qualitative evidence should lead to conditional recommendations rather than only to implementation considerations.

How was the evidence transformed?

The process from QES finding to implementation consideration involved varying degrees of transformation. In some cases, it involved little editing of the finding; in others, the information that emerged from the finding was combined with other evidence or information.

When a QES finding pointed to elements of an intervention that stakeholders, such as clients or health workers, particularly liked or wanted, the technical team often rephrased the finding as an implementation consideration with little editing. For instance, in the guidelines on communication interventions for childhood vaccination, the panel decided to recommend the use of face-to-face communication interventions directed at parents or caregivers (Meeting of the WHO African Regional Office (WHO-AFRO), 2018 [11]). One of the supporting QES showed that parents generally wanted more information than they were currently receiving and described the types of information they would like to receive [21]. In this case, the team simply listed the types of information that implementers should consider offering to parents under ‘Implementation considerations’ (Example 3, Table 1).

Similarly, when a QES finding pointed to elements of an intervention that stakeholders found less acceptable or challenging to implement, the team often rephrased the finding as an implementation consideration with little editing. For instance, in the guidelines on task-shifting for maternal and newborn health [5], the panel decided to recommend the use of lay health workers to provide continuous support to women during labour. One of the QES showed that some lay health workers were concerned about personal safety when working in the community and were reluctant to visit clients at night because of safety issues [7]. Here, under ‘Implementation considerations’, the technical team raised awareness of these problems and the need for implementers to address them (Example 3, Table 1).

In other cases, the team attempted to offer suggestions or solutions to problems identified in the QES, often drawing on other sources of information. For instance, in the guidelines on communication interventions for childhood vaccination (under review), the QES showed that people found it difficult to know which vaccination information sources to trust, and that some suspected information sources of being motivated by financial gain. In the implementation considerations, the technical team highlighted this issue, but also suggested that implementers should consider involving community-based health workers or other members of the community, including religious or political leaders, as people viewed these individuals as trustworthy (Example 4, Table 1). When making these suggestions, the team drew on other QES findings, but also on the knowledge and expertise of the panel and the technical team.

Lessons learnt

It is possible that simply drawing attention to what people want or need or to existing problems may be helpful to implementers. However, these types of implementation considerations may be less useful than those that also suggest solutions. Future research should explore the extent to which implementers experience the identification of problems alone as useful. The usefulness of suggested solutions may also depend on how evidence based and generalisable they are. Some of the solutions suggested were drawn from the QES and were gathered in a systematic and transparent way. Others were less systematically developed but were drawn from the knowledge of individuals such as members of the technical team or the guideline panel. The extent to which this is a problem is uncertain but leads us to the next issue, namely, the extent to which implementation considerations should reflect our confidence in the underlying qualitative evidence.

What factors influenced how the implementation considerations were phrased?

The team’s choice of language and degree of caution when formulating implementation considerations were sometimes influenced by our level of confidence in the underlying qualitative evidence, as assessed using GRADE-CERQual. For instance, the suggestion that implementers consider the use of political leaders as one way of addressing people’s trust issues surrounding sources of information on vaccination was based on a qualitative finding assessed as being of ‘low confidence’, in addition to expert opinion (Example 4, Table 1). In these cases, the technical team often tried to avoid terms such as “implementers should …” and used more cautious language, e.g. “…consider involving … members of the community, including religious or political leaders…”. However, the different teams did not use this approach consistently.

The team’s choice of language and degree of caution exercised were also influenced by the extent to which the findings from the qualitative evidence reflected the normative positions and principles held by WHO. For instance, a QES developed for the intrapartum care guideline [15] indicated that women from all countries and settings want and appreciate respectful maternity care (RMC) [22]. The principle of RMC is emphasised by WHO and underpinned by a human rights-based approach [23]. As disrespectful care remains prevalent in maternity care settings of all kinds, particularly for vulnerable or marginalised women, the technical team used the ‘Implementation considerations’ section to emphasise to stakeholders that mechanisms need to be put in place to ensure that all women are made aware of their right to RMC, and of how to raise complaints should they not receive it (Example 6, Table 1). In this case, when formulating the implementation consideration, the team deliberately used the word ‘should’ to stress this normative standpoint. Similarly, in the QES on parents’ perceptions of childhood vaccination information [21], developed for the guideline on the same topic (under review), parents called for far more information about vaccination. As information is considered by WHO to be a universal right, the language used when formulating this implementation consideration was deliberately emphatic: “Different people are likely to have different information needs. However, all people should have easy access to information about …” [16, 23] (Example 2, Table 1). Again, however, the different guideline teams did not use this approach consistently.

Lessons learnt

Ideally, the language used when presenting implementation considerations should reflect the level of confidence in the evidence on which it is based. In many cases, the underlying evidence might be downgraded to low or very low confidence, for instance, because of concerns about its relevance or because of serious methodological limitations. In these cases, technical teams should consider formulating implementation considerations more cautiously, i.e. as questions, prompts or suggestions. Alternatively, they should consider basing the implementation considerations primarily on those qualitative findings that have been assessed as being of moderate or high confidence.
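
One way of making this link between GRADE-CERQual assessments and wording explicit is sketched below. The phrase tiers are hypothetical examples of the kind of graded language a technical team might adopt, not an agreed WHO standard.

    # Hypothetical mapping from GRADE-CERQual confidence to the strength of wording
    # used when drafting an implementation consideration; illustrative only.
    PHRASING = {
        "high": "Implementers should",
        "moderate": "Implementers should consider whether to",
        "low": "Implementers may wish to consider whether to",
        "very low": "Implementers may wish to explore whether to",
    }

    def phrase_consideration(action, confidence):
        """Prefix an action with language matched to confidence in the evidence."""
        opener = PHRASING.get(confidence, PHRASING["very low"])  # default to the most cautious tier
        return opener + " " + action

    print(phrase_consideration(
        "involve trusted community members, such as religious or political leaders, "
        "in communication about vaccination",
        "low"))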

One circumstance that may override confidence in the QES evidence is the presence of overarching ideals and principles held by the body producing the guideline (in this case, WHO), including principles associated with human rights. In such cases, the team often phrased the implementation considerations in relatively strong terms that are closer to recommendations than to prompts or suggestions, and this was sometimes done irrespective of the level of confidence in the evidence. However, strong terms, including the use of the word ‘should’, need to be carefully considered. When using this approach, guideline commissioners, technical teams and panels should therefore consider adopting a more reflexive and transparent approach early in the guideline process, in which they explicitly identify the overarching principles, including normative values, driving the guideline. To achieve this, it may be helpful to borrow from the reflexive exercises recommended within qualitative research practice, as described in Paper 1 of this series [11], and to explore how these can be carried out at different stages of the guideline process.

Can existing health systems frameworks be used to organise implementation considerations and identify gaps?

For some of the guidelines, the technical team used existing health systems frameworks to help organise the implementation considerations into meaningful groups. For instance, the team used the WHO Building Blocks [28] and the SURE Framework [29] to help organise implementation considerations in the guidelines on task-shifting for maternal and newborn health [5], and the WHO/ITU National eHealth Strategy Toolkit [30] in the digital health guidelines [16, 31].

While these frameworks helped us group implementation considerations, they also highlighted gaps in the qualitative evidence. The qualitative evidence was often a good source of data for implementation considerations at the level of the facility, the community or the individual, but it was less rich for factors associated with higher levels of the health system. For instance, a QES [25] used in the digital health guidelines [16] described different challenges to the implementation of these interventions, including health worker motivation, workload, training and supervision issues; the impact on health worker relationships with clients, the community and other health workers; and access to supplies, electricity and network connectivity. Similarly, a QES used in the guidelines on childhood vaccination communication described parents’ needs and challenges regarding information content, format, source and timing, as well as issues tied to trust and decision-making [21]. However, financing, legal and political factors tended to be poorly represented in the data.
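
To illustrate how such a framework can both organise considerations and expose gaps, the sketch below groups draft considerations into domains that loosely follow the WHO health system building blocks. The domain names and keyword matching are only a stand-in for the team’s judgement; empty domains flag where the qualitative evidence was thin.

    # Illustrative only: domains loosely follow the WHO health system building
    # blocks, and keyword matching stands in for the team's judgement about where
    # each consideration belongs.
    DOMAINS = {
        "health workforce": ["training", "supervision", "workload", "motivation"],
        "service delivery": ["community", "facility", "clients"],
        "health information systems": ["data", "reporting"],
        "financing": ["cost", "payment", "salary"],
        "leadership and governance": ["policy", "regulation", "legal"],
    }

    def group_by_domain(considerations):
        grouped = {domain: [] for domain in DOMAINS}
        for text in considerations:
            for domain, keywords in DOMAINS.items():
                if any(word in text.lower() for word in keywords):
                    grouped[domain].append(text)
        return grouped

    grouped = group_by_domain([
        "Address health worker training and supervision needs before scaling up.",
        "Plan for reliable electricity and network connectivity at community level.",
    ])
    gaps = [domain for domain, items in grouped.items() if not items]
    print("Domains with no supporting qualitative evidence:", gaps)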

In some cases, the technical team addressed this problem by drawing on information from other sources to develop implementation considerations. For instance, in the guidelines on task-shifting for maternal and newborn health [5] and for abortion care [13], the team carried out case study syntheses to identify these more ‘upstream’ system-level factors in selected countries [32, 33]. Here, a variety of sources were used, including published research, programme reports, and interviews with key informants. However, in other cases, these domains remained underrepresented in the frameworks.

Lessons learnt

The teams did not use health systems frameworks to organise implementation considerations in all of the guidelines, and where they did use them, this was after the implementation considerations had been developed. GRADE recommends the use of these types of frameworks, such as the TICD (Tailored Implementation for Chronic Diseases) checklist [10], not only to organise implementation considerations but also to identify them. In future guideline processes, technical teams should consider making more active use of such frameworks at an earlier stage of the process.

As mentioned, these frameworks also helped us to identify gaps in the evidence. Although circumstances at higher levels of the health system are likely to be important for the successful implementation of most healthcare interventions, the qualitative evidence tended to focus on the level of the individual and on lower levels of the health system. This was probably partly due to how the QES were framed, and it likely also reflects traditions within qualitative research environments; it does not, however, reflect any inherent limitation of this research methodology. Authors of primary qualitative research should be encouraged to explore views, experiences and processes at higher levels of the system to a greater extent. Meanwhile, future technical teams should consider the scope of commissioned QES and how they can gather direct input from higher-level stakeholders, for instance, through greater use of key informant interviews.

When are overall implementation considerations made versus recommendation-specific implementation considerations?

In many cases, the QES findings were only relevant to specific recommendations or groups of recommendations. However, some QES findings raised issues that were relevant across the whole scope of the guideline and highlighted how broader health system barriers could prevent the successful implementation of all of the included interventions. For instance, in the WHO guideline on task-shifting for maternal and newborn health [5], the QES pointed to inadequate training among all relevant health worker cadres and for diverse tasks. In other cases, QES highlighted even broader barriers, such as infrastructure problems beyond the health system. For instance, in the digital health guidelines [16], the QES pointed to widespread problems with network connectivity and access to electricity that could ultimately prevent the implementation of all of the included interventions. The QES also identified healthcare users’ needs and preferences that cut across the individual guideline recommendations. For instance, in the intrapartum care guideline [15], the QES showed that most women preferred to have a normal birth without unnecessary interventions, although they appreciated that medical interventions are sometimes necessary [20]. In addition to recommendation-specific implementation considerations, the technical team therefore developed an overview of cross-cutting implementation considerations applicable to all of the recommendations. The team presented this information outside of the EtD framework format, as this format does not currently allow for this type of cross-cutting information.

Lessons learnt

The primary aim of the technical team was to use the QES findings to develop implementation considerations that were relevant to specific recommendations in the guideline. These recommendations are often relatively narrow, usually following the PICO (Population, Intervention, Comparison, Outcome) approach used for research on effectiveness. Qualitative research, on the other hand, is designed to be more explorative, and focuses on people’s own experiences and categorisations. In practice, this means that qualitative data are rarely confined to the boundaries of ‘PICO’, and are generally broader in scope. This has practical implications when using qualitative evidence, not only when developing implementation considerations, but also when gathering information about acceptability, feasibility and equity, as it may be difficult to categorise findings as belonging to specific EtD frameworks. While the technical team solved this problem by developing cross-cutting presentations of implementation considerations, future research should explore how the broader issues identified through qualitative evidence can be acknowledged and incorporated into the design of the EtD frameworks.

Discussion

In a series of WHO guideline processes where we worked on the technical teams, we used QES as our main source of information when developing implementation considerations. As we have gained experience, our processes for doing so have been fine-tuned and we have learnt a number of lessons, summarised in Box 1.

Nevertheless, the transition from qualitative evidence to implementation considerations has highlighted a number of issues that deserve further discussion, including whether researchers should use existing health systems frameworks when developing implementation considerations; whether researchers should take confidence in the underlying evidence into account when developing implementation considerations; whether qualitative evidence that reveals implementation challenges should lead guideline panels to make conditional recommendations or only point to implementation considerations; and whether guideline users find it helpful to have challenges pointed out to them or whether they also need solutions to be suggested. Finally, we need to explore how intervention-specific EtD frameworks can be adapted to allow for broader, cross-cutting issues identified through qualitative research.

Despite these issues, our experiences suggest that qualitative evidence represents a useful source of information for future implementers. By systematically gathering global evidence about people’s experiences of clinical and health systems interventions, guidelines can offer useful information that local implementers can learn from. However, these implementers still need to adapt the guidelines to their local settings. We are currently exploring different approaches to assist them in doing so, including how logic models based on qualitative evidence and local data can be used to inform derivative products (e.g. manuals, toolkits) for guideline adaptation and implementation.

Conclusions

As members of the guideline technical teams, we experienced QES as a useful source of information when developing implementation considerations. However, questions still remain about how researchers should prepare and present implementation considerations, how guideline panels should respond to implementation challenges when reaching recommendations, and how we can prepare information about implementation that guideline users find useful. The use of derivative products to further assist these end users also needs further exploration.

Abbreviations

CERQual: Confidence in the Evidence from Reviews of Qualitative research

EtD: evidence-to-decision (frameworks)

GRADE: Grading of Recommendations, Assessment, Development and Evaluation

QES: qualitative evidence synthesis/syntheses

RMC: respectful maternity care

WHO-AFRO: WHO African Regional Office

References

  1. World Health Organization. Constitution of the World Health Organization. http://apps.who.int/gb/bd/PDF/bd47/EN/constitution-en.pdf?ua=1. Accessed 21 Jun 2019.

  2. World Health Organization. WHO Handbook for Guideline Development. Geneva: WHO; 2014. http://www.who.int/publications/guidelines/handbook_2nd_ed.pdf. Accessed 10 Apr 2018

  3. Wang Z, Norris SL, Bero L. The advantages and limitations of guideline adaptation frameworks. Implement Sci. 2018;13:72.

  4. Alonso-Coello P, Schünemann HJ, Moberg J, Brignardello-Petersen R, Akl EA, Davoli M, et al. GRADE Evidence to Decision (EtD) frameworks: a systematic and transparent approach to making well informed healthcare choices. 1: Introduction. BMJ. 2016;353:i2016.

  5. World Health Organization. WHO Recommendations: Optimizing Health Worker Roles to Improve Access to Key Maternal and Newborn Health Interventions through Task Shifting. 2012. http://www.who.int/iris/handle/10665/77764. Accessed 21 Jun 2019.

  6. Glenton C, Lewin S, Gülmezoglu AM. Expanding the evidence base for global recommendations on health systems: strengths and challenges of the OptimizeMNH guidance process. Implement Sci. 2016;11:98.

  7. Glenton C, Colvin CJ, Carlsen B, Swartz A, Lewin S, Noyes J, et al. Barriers and facilitators to the implementation of lay health worker programmes to improve access to maternal and child health: qualitative evidence synthesis. Cochrane Database Syst Rev. 2013;10:CD010414.

  8. Rashidian A, Shakibazadeh E, Karimi-Shahanjarini A, Glenton C, Noyes J, Lewin S, et al. Barriers and facilitators to the implementation of doctor-nurse substitution strategies in primary care: qualitative evidence synthesis (protocol). Cochrane Database Syst Rev. 2013;10:CD010412.

  9. Moberg J, Oxman AD, Rosenbaum S, Schünemann HJ, Guyatt G, Flottorp S, et al. The GRADE Evidence to Decision (EtD) framework for health system and public health decisions. Health Res Policy Syst. 2018;16:45.

  10. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, et al. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013;8:35.

  11. Downe S, Finlayson K, Lawrie TA, Lewin S, Glenton C, Rosenbaum S, et al. Qualitative evidence synthesis (QES) for guidelines: Paper 1. Using qualitative evidence synthesis to inform guideline scope and develop qualitative findings statements. Health Res Policy Syst. 2019;17. https://doi.org/10.1186/s12961-019-0467-5.

  12. Lewin S, Glenton C, Lawrie TA, Downe S, Finlayson K, Rosenbaum S, et al. Qualitative evidence synthesis (QES) for guidelines: Paper 2. Using qualitative evidence synthesis findings to inform evidence-to-decision frameworks and recommendations. Health Res Policy Syst. 2019;17. https://doi.org/10.1186/s12961-019-0468-4.

  13. World Health Organization. Health Worker Roles in Providing Safe Abortion Care and Post-Abortion Contraception. Geneva: WHO; 2015.

  14. World Health Organization. WHO Recommendations on Antenatal Care for a Positive Pregnancy Experience. Geneva: WHO; 2016. http://www.who.int/reproductivehealth/publications/maternal_perinatal_health/anc-positive-pregnancy-experience/en/. Accessed 10 Apr 2018

  15. World Health Organization. WHO Recommendations: Intrapartum Care for a Positive Childbirth Experience. http://apps.who.int/iris/bitstream/handle/10665/260178/9789241550215-eng.pdf. Accessed 10 Apr 2018.

  16. World Health Organization. WHO guideline: Recommendations on Digital Interventions for Health Systems Strengthening. Geneva: WHO; 2019.

  17. Noyes J, Booth A, Cargo M, Flemming K, Garside R, Hannes K, et al. Cochrane Qualitative and Implementation Methods Group guidance series-paper 1: introduction. J Clin Epidemiol. 2018;97:35–8.

  18. Lewin S, Booth A, Glenton C, Munthe-Kaas H, Rashidian A, Wainwright M, et al. Applying GRADE-CERQual to qualitative evidence synthesis findings: introduction to the series. Implement Sci. 2018;13(Suppl 1):2.

  19. Downe S, Finlayson K, Tunçalp Ö, Gülmezoglu AM. Provision and uptake of routine antenatal services: a qualitative evidence synthesis. Cochrane Database Syst Rev. 2019;6:CD012392.

  20. Downe S, Finlayson K, Oladapo OT, Bonet M, Gülmezoglu AM. What matters to women during childbirth: a systematic qualitative review. PLoS One. 2018;13(4):e0194906.

  21. Ames HM, Glenton C, Lewin S. Parents’ and informal caregivers’ views and experiences of communication about routine childhood vaccination: a synthesis of qualitative evidence. Cochrane Database Syst Rev. 2017;2:CD011787.

  22. Shakibazadeh E, Namadian M, Bohren MA, Vogel JP, Rashidian A, Nogueira Pileggi V, et al. Respectful care during childbirth in health facilities globally: a qualitative evidence synthesis. BJOG. 2018;125(8):932–42.

  23. UNHRC. Technical Guidance on the Application of a Human Rights-Based Approach to the Implementation of Policies and Programmes to Reduce Preventable Maternal Morbidity and Mortality: Annual Report. Geneva: United Nations Human Rights Council; 2012.

  24. Bohren MA, Vogel JP, Tuncalp O, Fawole B, Titiloye MA, Olutayo AO, et al. “By slapping their laps, the patient will know that you truly care for her”. A qualitative study on social norms and acceptability of the mistreatment of women during childbirth in Abuja, Nigeria. SSM Popul Health. 2016;2:640–55.

  25. Odendaal W, Goudge J, Griffiths F, Tomlinson M, Leon N, Daniels K. Healthcare workers’ perceptions and experiences on using mHealth technologies to deliver primary healthcare services: a qualitative evidence synthesis. Cochrane Database Syst Rev. 2015;11:CD011942.

  26. Alexander PE, Bero L, Montori VM, Brito JP, Stoltzfus R, Djulbegovic B, et al. World Health Organization recommendations are often strong based on low confidence in effect estimates. J Clin Epidemiol. 2014;67(6):629–34.

  27. Alexander PE, Gionfriddo MR, Li SA, Bero L, Stoltzfus RJ, Neumann I, et al. A number of factors explain why WHO guideline developers make strong recommendations inconsistent with GRADE guidance. J Clin Epidemiol. 2016;70:111–22.

  28. De Savigny D, Adam T. Systems Thinking for Health System Strengthening. Geneva: WHO; 2009. http://apps.who.int/iris/bitstream/handle/10665/44204/9789241563895_eng.pdf. Accessed 21 Jun 2019.

  29. The SURE Collaboration. SURE Guides for Preparing and Using Evidence-Based Policy Briefs: Identifying and Addressing Barriers to Implementing Policy Options. Version 2.1, updated November 2011. http://www.who.int/evidence/sure/guides/en/. Accessed 21 Jun 2019.

  30. World Health Organization. National eHealth Strategy Toolkit: Overview. Geneva: WHO; 2012. https://www.who.int/ehealth/publications/overview.pdf. Accessed 21 Jun 2019.

  31. World Health Organization. WHO Recommendations on Digital Interventions for Health Systems Strengthening. Geneva: WHO; 2019. 

  32. Glenton C, Sorhaindo AM, Ganatra B, Lewin S. Implementation considerations when expanding health worker roles to include safe abortion care: a five-country case study synthesis. BMC Public Health. 2017;17:730.

  33. Gopinathan U, Lewin S, Glenton C. Implementing large-scale programmes to optimise the health workforce in low- and middle-income settings: a multicountry case study synthesis. Tropical Med Int Health. 2014;19(12):1437–56.

Acknowledgements

We thank The Norwegian Institute in Rome for hosting the writing group meeting in April 2018 and the stakeholder reviewers of this paper, Luc De Bernis and Ana Pilar Betrán, for their valuable comments and suggestions.

Funding

The work on this paper was supported through a grant from the Sanofi Espoir Foundation. The funding body had no role in the content of this manuscript and the findings and conclusions do not necessarily reflect their official position.

Availability of data and materials

Not applicable.

Author information

Contributions

CG, OT, SD, KF and SL devised the series concept. CG wrote the first draft of the manuscript and led the revision following stakeholder feedback with contributions from all authors. TL managed the stakeholder feedback and revision process. SR designed the figures. All authors undertook subsequent reviews and revisions of the manuscript and approved the final version. The authors alone are responsible for the views expressed in this article and they do not necessarily represent the views, decisions or policies of the institutions with which they are affiliated.

Corresponding author

Correspondence to Theresa A. Lawrie.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Glenton, C., Lewin, S., Lawrie, T.A. et al. Qualitative Evidence Synthesis (QES) for Guidelines: Paper 3 – Using qualitative evidence syntheses to develop implementation considerations and inform implementation processes. Health Res Policy Sys 17, 74 (2019). https://doi.org/10.1186/s12961-019-0450-1
