Approaches and impact of non-academic research capacity strengthening training models in sub-Saharan Africa: a systematic review

Background Research is essential to identify and prioritize health needs and to develop appropriate strategies to improve health outcomes. In the last decade, non-academic research capacity strengthening trainings in sub-Saharan Africa, coupled with the development of research infrastructure and the provision of individual mentorship support, have been used to build health worker skills. The objectives of this review are to describe the different training approaches to research capacity strengthening in sub-Saharan Africa outside academic programs, to assess the methods used to evaluate research capacity strengthening activities, and to learn about the challenges facing research capacity strengthening and the strategies/innovations required to overcome them.

Methodology The PubMed database was searched using nine search terms, and articles were included if 1) they explicitly described research capacity strengthening training activities, including information on program duration, target audience, and immediate program outputs and outcomes; 2) all or part of the training program took place in sub-Saharan African countries; 3) the training activities were not a formal academic program; 4) the papers were published between 2000 and 2013; and 5) both the abstract and full paper were available in English.

Results The search resulted in 495 articles, of which 450 were retained after removal of duplicates; 14 papers met all inclusion criteria and were analysed. In total, 4136 people were trained, of whom 2939 were from Africa. Of the 14 included papers, six fell into the category of short-term evaluation period and eight into that of long-term evaluation period. The conduct of evaluations and the use of evaluation frameworks varied between short- and long-term models, and some trainings were not evaluated. Evaluation methods included tests, surveys, interviews, and a systems approach matrix.
Conclusions Research capacity strengthening activities in sub-Saharan Africa outside of academic settings provide important contributions to developing in-country capacity to participate in and lead research. Institutional support, increased funds, and dedicated time for research activities are critical factors that lead to the development of successful programs. Further, knowledge sharing through scientific articles with sufficient detail is needed to enable replication of successful models in other settings.


Background
High quality research is essential to identify and prioritize health needs and to develop appropriate strategies to improve health outcomes [1]. However, despite the increase of publications from Africa during the past two decades [2], the representation of Africa in global research output is disproportionately low. For example, between 1997 and 2006, only 7 % of global tuberculosis research output came from Africa despite the region having the highest tuberculosis case rates in the world [3]. In 2004, research about Africa represented less than 1 % of scientific publications [4], growing gradually to 10 % as of 2011 [5].
In the last decade, the international call for developing research capacity in sub-Saharan Africa has grown [4,5]. Opportunities to support individuals pursuing academic studies and fellowships at academic institutions have increased [6,7]. However, there are several limitations to academic programs as the sole means of capacity strengthening in sub-Saharan Africa: they can be lengthy and expensive to complete, and they present a potential risk of drawing national researchers from program settings into academia, especially if no strong partnerships exist between academia and local programs [3]. Further, academic research tends to miss the operational perspectives of programs [3]. To overcome these limitations, and as a complement to academic programs, local organizations/institutions across Africa, often in partnership with institutions from developed countries, have implemented short trainings targeting specific research competencies of health program staff. The term 'non-academic' is used throughout this paper to refer to training programs that do not lead to formal academic qualifications, although they may use academic training staff and/or infrastructure.
Strengthening research capacity in non-academic settings encompasses a variety of activities, including trainings to support individuals to acquire research skills in addition to developing research infrastructure at an institutional level, creating research partnerships/networks, and providing individual support and mentorship [8]. In this paper, we focus specifically on the training activities in the research capacity strengthening programs. The goals, approaches, target audience, and effectiveness of the skill-specific research trainings in sub-Saharan Africa vary widely. However, there are few peer-reviewed published descriptions of these activities to support the replication or adaptation of such programs in other locations. The objectives of this systematic review are therefore 1) to describe the different approaches to research capacity strengthening in sub-Saharan Africa beyond academic programs, 2) to assess methods used to evaluate research capacity strengthening activities and summarize their results, and 3) to learn about challenges to research capacity strengthening and strategies/innovations to overcome those challenges. This review will contribute to research capacity strengthening efforts by providing insights from different approaches that could be applied to other locations and to encourage more complete reporting of such initiatives.

Methodology

Identification of data sources
The PubMed database was searched by the principal investigator (LM) for articles describing research capacity strengthening training activities in Africa. The following search terms were used (illustrated in Fig. 1): words that indicate an increase in competency ("building", "development", "strengthening", and "training") combined with "capacity" as well as the terms "Africa" and "health" and "research". Further search criteria were 1) papers published between 2000 and 2013 and 2) both abstract and full paper available in English. The results were saved into a Mendeley library.
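Taken together, these terms correspond approximately to a Boolean query of the following form. This is an illustrative reconstruction only; the authors' exact search string, and how the nine search terms were combined in practice, are not reported in the paper:

```
("capacity building" OR "capacity development" OR
 "capacity strengthening" OR "capacity training")
AND Africa AND health AND research
```

The publication-date restriction (2000-2013) and the English-language restriction described above would typically be applied as PubMed limits rather than within the query string itself.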

Study selection
The titles and abstracts were reviewed by the principal investigator (LM) to ensure they met the following inclusion criteria: 1) research capacity strengthening training activities are explicitly described, including information on program duration, target audience, and immediate program outputs and outcomes; 2) all or part of the training program took place in sub-Saharan African countries; and 3) the training activities are not a formal academic program. When all criteria were met, or when more information was needed, articles were retained for full text review (Fig. 2). Articles were reassessed after full text review and dropped if they did not meet all eligibility criteria. Articles not captured in the original search were added if they were known to the authors or were identified through a snowballing process of reviewing the reference lists of retained articles.

Data extraction and analysis
Two independent reviewers extracted data from the full text articles using a three-part data collection form. The form was developed based on the research team's experience in conducting research strengthening activities and adapted based on themes that emerged during the review of articles. The first part covered program description information, including the name of the program, program duration, target audience, objectives of the training, frequency of the training, qualifications of the trainers, resources required for the training, and where the training took place. The second part used Cooke's evaluation framework to assess the effectiveness of the trainings [9]. Cooke's framework was chosen because it comprehensively describes the indicators for individual training in research capacity building based on six principles: research skills, practice implications, partnerships, dissemination, infrastructure, and sustainability. We grouped the trainings based on their evaluation period: a short-term evaluation period was defined as an evaluation conducted up to 18 months after the training, and a long-term evaluation period as any period greater than 18 months.
Finally, data were extracted on challenges faced, innovations used, and recommendations proposed for future programs. Data extractions from both reviewers were entered into a Microsoft Access database and compared for consistency. When inter-reviewer discrepancies were found, they were resolved by a third party review of the paper.

Results
The search resulted in 495 articles, of which 450 were retained following the removal of duplicates (Fig. 2). Based on abstract review, 24 articles were classified as potentially relevant, 425 were dropped, and one abstract was not available. Full text review of the 24 articles yielded 11 relevant texts. The same training program was presented in two of these articles and so only the more recent and relevant publication was retained. Four additional articles were identified through the snowball process, resulting in 14 relevant articles in total.

Target audience and trainers
In total, 4136 people were trained, of whom at least 2939 were from Africa. Trainees in these programs had very different backgrounds and qualifications. Participants included clinical staff, health officers and managers working within health programs, university students and faculty, and experienced researchers. Generally, participants were selected based on their potential to influence health systems and management processes, their ability to conduct research activities, and their involvement or expertise in the field. Only two training programs had a rigorous selection process in which criteria such as years of experience in research ethics, number of publications, institutional support, and personal commitment were considered [15,23]. For 11 of the 14 programs, participants were from the country where the training took place; the other three programs required participants to travel out of country to the training site. Of these, one site was in sub-Saharan Africa [10] and two were in Europe or the USA [15,23].
Specific details about the qualifications of trainers were not reported. Most of them were individuals with experience and expertise in the area of interest, either based in the country or brought in through a partnership as an international expert. They included faculty from universities, researchers, and practitioners in a given field.

Structure, duration of the trainings and follow-up
The structure and duration of research strengthening activities outside academic settings vary widely. Five of the 14 training programs [10,14,17,19,20] featured only face-to-face sessions conducted over a short period of time, mostly less than a week. Seven training programs featured both face-to-face sessions and practicums. Of these, four had face-to-face sessions spread over a longer period, with intervals of field activities taking place in between [13,16,23]. Face-to-face sessions took at least 3 weeks, whilst one training program featured short classes of 2.5 days [21]. Three training programs mixing face-to-face sessions with practicums conducted the practicum after the face-to-face sessions; two had 6 days of classes [12,18], whilst another had a longer period of face-to-face classes [15], with practicums ranging from 1 month [18] to 12 months [15]. Further, one training program had five courses of 5 days each, structured as a ladder [11], where success at a lower level determined who moved up to the next. Finally, one training program was web-based [22], taking 100 days to complete. Seven training programs provided follow-up to their trainees [11,12,15,16,18,21,23] in the form of ongoing mentorship, onsite technical assistance, and supervisory visits.

Evaluation
Of the 14 training programs, six fell into the category of short-term evaluation period [14,17-19,21,22] and eight into that of long-term evaluation period [10-13,15,16,20,23]. Of the training programs with a short-term evaluation period, one was not evaluated [19] and only one (16.7 %) used a recognized framework for evaluation [18]. These training programs used quantitative evaluation methods, mainly surveys and tests. Of the training programs with long-term evaluation periods, 37.5 % (n = 3) used a framework [10,15,16]. All of these training programs were evaluated using quantitative or qualitative methods, including interviews, surveys, and a systems approach framework.

Training programs with short-term evaluation period
All training programs reported an increase in research knowledge and skills (100 %; Table 2). More than half of the training programs (50-67.7 %) reported the involvement of practitioner and program staff in the training, the relevance or use of training-related research in practice, and the existence of inter-professional linkages. None of these training programs, however, reported on or used impactful dissemination (publications, conferences, workshop presentations, changes in policy and practice) as a key principle in research capacity strengthening. Further, there was no information about the conduct of research after training or patient-centred outcomes.

Training programs with long-term evaluation period
All training programs reported an increase in research knowledge and skills and research undertaken after training (100 %; Table 3). More than half of these trainings (50-87.5 %) reported evidence of confidence building among trainees, the involvement of practitioner and program staff in the training, the relevance or use of training-related research in practice, the existence of inter-professional linkages, publications in peer-reviewed journals, evidence of applied research findings, continued mentorship and supervision, and enduring collaborations. None of these training programs reported on or used the availability of protected research time, budget lines, or the existence of mentorship and supervision structures.

Challenges, innovations and recommendations
This review identified major themes regarding challenges to research capacity strengthening activities and the corresponding innovations and recommendations suggested to address them (Table 4). Common challenges to capacity strengthening were lack of mentorship and institutional support [10,13,16,18,20,21,23]; insufficient time for research activities and drop out [10,16,18,20,21]; lack of sufficient budget for research activities [11,13,18,23]; poor research infrastructure [12,13,17,18,23]; and difficulty in publishing in international journals [11,21,23]; three papers did not report any challenges [16,18,21]. Challenges faced by participants can be distinguished from those faced by facilitators and organizers. On the one hand, participants who lack support and mentorship from supervisors and managers are more likely to drop out of the training, or their research projects are likely to be delayed. On the other hand, training organizers and facilitators struggle when participants are pulled out of the training because of other work responsibilities, particularly when the training organizers and the organization where a participant works do not have a memorandum of understanding. Infrastructural challenges, such as poor internet and inadequate space and equipment, affect the performance of both participants and facilitators. Further, when participants have heavy workloads they are likely to drop out of the training, thus affecting trainers and organizers. A lack of funding implies that any research requiring funds will not be performed and that training activities could be hampered, for example, when participants need transport but do not have money. For organizers, a lack of funding could mark the end of training activities, since they face shortages of materials and facilitators as well as poor infrastructure.
Various recommendations and innovations are proposed to address the challenges to research capacity strengthening. Institutional support and mentorship can be achieved in different ways, such as the provision of mentorship and supervision visits by programme managers [10,16], developing strong professional networks [15], and seeking commitment from stakeholders [13,16,18]. Increased time for research [18,23], a suitable training schedule [18], and web-based trainings help to tackle the challenge of insufficient time. Building more funding resources for research activities [11], embedding research into a health program [21], and integrating courses into existing curricula [16,20] are recommended as strategies to address the lack of funding. The challenge of publication could be addressed through the provision of mentorship on the publication process [11] and through means of dissemination other than international journals [21], for example, special meetings with stakeholders. Provision of further training to improve the writing skills of young researchers would increase the likelihood of having a manuscript accepted for publication.

Discussion
In this systematic review, we identified 14 papers that describe research capacity strengthening activities outside of formal academic programs in sub-Saharan Africa. We found that training programs generally fell into two categories: longer training programs covering multiple competencies and shorter training programs targeting a single research competency. Generally, shorter programs did not include practicum projects as part of the training, nor did they provide mentorship/support post-training. These two features make such programs less expensive and less time consuming, and therefore more feasible for many settings. However, though their contribution to the increase in research skills and knowledge is recognized, we found little evidence linking these programs to research projects subsequently conducted. Further, trainings that focus on narrow competencies would require multiple trainings to enable participants to take a research question through to publication, if this is the intended goal.
Alternatively, training programs which are more comprehensive yield better outcomes in terms of the number of research projects conducted and resulting publications. They are offered over a longer period and often require ongoing mentorship/support. The demands on both human and financial resources make such trainings more expensive and time consuming, and therefore less accessible to many organizations.

Table 4 Challenges, innovations and recommendations

Lack of mentorship and institutional support
Challenges: lack of strategies encouraging recent trainees to apply new learning within the services [10]; difficulty getting buy-in from institutions [16,23]; drop out from training program because of no mentorship [16,18,20]; delay in completing research projects because of no mentorship [16]; weak co-ordination due to incompetency of leaders [13]; lack of communication between participants and supervisors [21].
Recommendations/innovations: provide mentorship to participants by managers to enhance application of acquired skills on the job [10]; secure organizational commitment to in-service training for capacity development during application approvals [10]; support professional networks and alternative communication pathways to improve intra- and inter-program collaboration [15]; engage with institutions from the beginning and get commitment from program leadership [16,18]; sensitize policy-makers and health managers through special meetings [13].

Poor research infrastructure
Challenges: poor internet [17]; inadequate space and lack of equipment [18,23]; difficulty in securing adequate space for research activities [12,13,18,23].
Recommendations/innovations: improve internet access [17,23]; provide budget lines dedicated to improving research infrastructure [23].

Insufficient time for research and program dropouts
Challenges: trainees get absorbed into routine work and responsibilities [16]; loss of trainees through dropout [16,18,20]; trainees take jobs with other institutions [16]; trainers do not have the resources or authority to conduct effective follow-up within the workplace [10]; mismatches between participants' capabilities and training priorities [21].
Recommendations/innovations: conduct training activities at the workplace; increase time allocated to research activities [18,23]; adopt a suitable training schedule [18]; establish strong selection criteria to minimize dropouts [23]; add distance learning to face-to-face classes; provide support supervision to trainees by program staff and/or mentors [16].

Lack of funds for research activities
Challenges: lack of resources to conduct research activities [11,13,23]; dependence on external institutions or donors for funding [13]; difficulty in accessing training location [18].
Recommendations/innovations: build more resources for funding [11]; embed research agenda into health program [21]; develop strong institutional infrastructure (administrative leadership) [18]; integrate courses into existing curriculum [16,20].

Difficulty in publishing papers in international journals
Challenges: difficulties in publishing in international journals [11,21,23].
Recommendations/innovations: mentor on publication process [11]; strengthen selection criteria to get strong candidates; explore other opportunities such as publishing in local journals and presenting at local meetings [21]; provide further training [21].

Language barriers and differences in educational levels
Challenges: trainees face communication challenges [18]; difficulty managing a group with different levels of education [21] and/or speaking different languages [18].
Recommendations/innovations: group participants strategically by similar skill levels [21].

Mugabo et al. Health Research Policy and Systems (2015) 13:30

Most of the studies in our review did not report on program implementation costs. When reported, these costs varied widely, between $500 and $20,000 per project, depending on the scope of the project, location, and duration of training. While actual costing of programs is difficult, reporting estimated expenditure is important to others planning such training activities, particularly because resource allocation is among the major barriers to research capacity strengthening.
Except for one program that was a national program [21], programs relied primarily on North-South partnerships for funding, highlighting the need to strengthen partnerships with more focus on the Southern research agenda [24], as well as to galvanize national resources and increase South-South research collaboration.
In addition to variability in program approaches, there was large variability in evaluation approaches. Self-report surveys, pre-/post-tests, interviews, email questionnaires, and systems approaches were all used. Self-report surveys and pre-/post-tests were used by shorter training programs and administered during or shortly after the completion of training. That period was not long enough for such trainings to have had an impact on participants; rather, these evaluations captured participants' perceptions of the course and whether changes in knowledge had occurred. Longer trainings, on the other hand, were more likely to follow up participants through the implementation of research projects, over which additional technical assistance and mentorship were provided. Specific deliverables for most of these training programs, including writing a protocol and/or writing and publishing a manuscript, enabled them to determine their level of success. Understandably, the long period of implementation, in addition to both technical and financial support provided to complete research projects, is likely to increase the number of protocols written, research projects conducted and published, and the influence on policy and practice change, among others. However, much needs to be done to fully understand the impact of such capacity strengthening trainings. For instance, better baseline assessments using comprehensive tools, such as those employed by the systems approach [25], are needed, as well as better reporting on whether there were other outside enabling factors.
The evaluation metrics for research capacity strengthening programs are debated in the literature. Some suggest that success should be measured in terms of papers published [26]; however, this implies that writing a paper is the ultimate goal of the training or the target competency desired by the individual. Others advise that change in policy and practice should be the end goal of research capacity strengthening activities in order to improve the quality of service delivery [27]. Few training programs cover all the competencies necessary to write and publish a research paper as an indicator of success; this requires not only substantial resources in terms of trainers and mentors, time, and money, but also strong candidates, thus limiting the number of training participants. Further, using research to change policy is difficult, requiring ongoing engagement and co-operation between all stakeholders, and documenting such change in a concrete and objective way is even more challenging. Alternatively, Harries et al. [21] advocate embedding research training activities into existing health programs. This means that training is budgeted for like any other program activity and that participants are often staff who work within the health programs.
The challenges to research capacity strengthening identified in this review have been observed by others. Several studies report limited funding for research [6,8,26,28,29], no dedicated time for research [3,26,29], and a lack of mentorship and institutional support [8,13,27]. In addition, challenges identified but not discussed in papers in this review include difficulties in carrying out quality evaluation particularly for long term outcomes and the imbalanced focus on research methods and process at the expense of research advocacy, promotion, negotiation, and resource mobilization [30]. These challenges are complex and call for sustainable partnerships and commitment to the goals of research capacity strengthening in Africa.
While academic and non-academic training programs face similar challenges, some of these challenges, such as lack of institutional support or research leadership, are more pronounced in non-academic settings. Our review identified one program with institutional support [13], which also had the most significant and quantified impact on society through policy and practice changes. We believe that research developed as part of academic trainings is more likely to be published because of the existence of such support. Furthermore, trainees in academic programs tend to have time set aside for research and thus do not face the same challenge of balancing work and research training concurrently. Academic programs may also be appealing because of their existing accreditation processes, which are difficult for non-academic programs to establish.
There were two primary limitations to this systematic review. First, for the articles identified, relevant information on important features, including features that would support replicability, was missing. For example, it is possible that some programs offered ongoing mentorship, but we were unable to report this feature because it was not described in the paper. Information on the financial and material resources, qualifications, and number of trainers/facilitators needed to undertake capacity strengthening activities was poorly reported, which is not only a limitation of this review but may also weaken the ability to replicate the programs in other settings. A second limitation is that this review only included scientific articles with both the abstract and full paper available in English. Programs published in languages other than English, or presenting their results in the grey literature, may therefore have been overlooked. Though grey literature may offer more detailed information about training programs, its use is hampered by the difficulty of accessing reports years after their production and by limited information on the individuals involved in producing them. Further, because of publication bias in the scientific literature, this review may have missed training programs that were deemed less successful, less "innovative", or that had less academic collaboration. On the other hand, the limited number of articles and the limited detail within them serve as a call-to-action for individuals developing and leading research capacity strengthening activities to ensure that approaches and lessons learnt are shared more widely and in enough detail to facilitate the replication of their activities in other settings.