
Table 1 Findings and supporting evidence

From: Academic contributions to the development of evidence and policy systems: an EPPI Centre collective autoethnography


Incubating innovation

Insight: Key to successful innovation in research synthesis methods for decision making was working constructively across the interface between research and the wider world

Conventional methods minimally adapted for new fields provided scant evidence to inform policy or practice decisions; in this case, methods for reviewing the effects of clinical interventions were applied largely unchanged to reviewing health promotion interventions [67]

Novel synthesis methods were inspired by discussions with policy teams, practitioners and service users, with the aim of drawing learning from the available evidence to: better develop, implement and evaluate health promotion interventions [33, 76, 79, 80, 83, 104]; identify their active components [116]; investigate inequalities in health [41]; and reduce inequalities through community engagement [88]

Information technology was ‘designed from the bottom up’ by software developers working closely with systematic reviewers, initially within the EPPI Centre, and then with other organisations needing software to support systematic reviewing [114]

Insight: Successful innovation in research synthesis methods required sufficient time and collaborative learning to transform exchanges from mutual criticism across different academic, policy or practice lenses, to mutual understanding and, ultimately, the integration of different sets of knowledge into coherent research syntheses

Mutual criticism often resulted from short-term encounters: single researcher-led workshops or occasional meetings with service providers commonly led to heated discussions about competing intellectual positions (Minutes of steering group meeting, March 1996). An exception was the fourth in a series of workshops, where strong criticism of a systematic review was collated in a letter to the lead author, who subsequently updated the review in light of the criticism [79, 80, 83]

Collaborative learning came from working relationships between producers and users of systematic reviews sustained over several years, leading to synthesis methods being adapted for new fields such as health promotion (Oliver and Peersman 2001), health systems [47] and socioeconomic development [75]

Insight: Application of novel collaborative research synthesis methods accelerated uptake of findings and methods

Speedy uptake of evidence appeared to result from collaborative working, with evidence informing decisions within a few years. The UK Newborn Screening Programme Centre (established in 2002) published its UK national standards, policies and handbook within three years [40], at the same time as its underpinning research about communication with parents and the development of information and training resources [35, 36, 110]. Pre-publication findings from a Cochrane review [52] informed NICE guidelines on smoking cessation in pregnancy [9]. The literature about time lags in translational research in health [59] suggests this is faster than other estimates of the time between publication and guidelines: eight years as calculated by Grant et al. (2000) and 13 years as calculated by HERG (2008). Rapid uptake has become a feature of the more recent common practice of panels commissioning reviews to develop evidence-informed guidelines. For instance, a systematic review about home based records [56] informed the WHO guideline it was commissioned for in the same year [124], and health policy in Afghanistan only a year later [99]

Speedy uptake of synthesis methods: Similarly, novel methods co-developed with policy teams for reviewing health promotion [76] were applied independently within five years to inform (a) the Home Office about the drivers of perceptions of anti-social behaviour [54], and (b) the development of guidelines about health worker access to preventive health measures [125]

Spreading innovation

Insight: Spreading innovation, in this history, is better described as researchers sharing the innovation process with wider networks and regime organisations

Disseminating novel methods had limited success. Stakeholder involvement to shape systematic reviews [92] and to develop policy [109] was showcased by the Social Care Institute for Excellence (now part of NICE, https://www.scie.org.uk/almost-there), but evidence of use was limited to six and one citations respectively (Google Scholar)

Disseminating systematic review evidence: The Department for International Development placed systematic review reports in the public domain. Nevertheless, few of them appear to have been used unless they were also developed collaboratively with potential review users [70]

Collaboratively developing training offered cycles of refreshing innovations for new audiences. Critical appraisal skills training, originally designed for clinicians, was adapted collaboratively with and for: consumer health information organisations [58]; health promotion organisations [82]; and policy makers, practitioners and researchers in southern Africa [110]

Supporting collaborative learning helped researchers and policy makers work together to produce systematic reviews that informed policy decisions; reviews informed decisions more often when the working relationship with policy teams was acknowledged in review reports [70]

Research contributions to informing public policy were recognised by the Robert Boruch Award from the Campbell Collaboration in 2015 (https://www.campbellcollaboration.org/the-robert-boruch-award)

Insight: Uptake of innovations, in this history, is better described as uptake of innovating teams, whose continuing innovations in research synthesis or public involvement were inspired by working for and with the regime organisations

Evidence for education was stimulated by the Department for Education commissioning the EPPI Centre to support groups producing systematic reviews [3]

Evidence for health systems gained ground when the Alliance for Health Policy and Systems Research (now hosted by WHO) commissioned the EPPI Centre (2007–15) to work with them to establish and support systematic review centres in the global south [48]

Evidence for socio-economic development gained ground when the UK Department for International Development commissioned systematic review centres (2010–2019) to strengthen capacity in systematic reviews for socio-economic development [70, 72, 96] and to develop, collaboratively with the EPPI Centre, a tool for assessing the impact of systematic reviews on governments and NGOs [70]

Evidence for humanitarian aid: When developing their research methods guidance for health emergency and disaster risk management, the World Health Organization [126] invited EPPI Centre authors to contribute the chapter on using logic models in research and evaluation of health emergency and disaster risk management interventions

Embedding innovation

Insight: Embedding innovations was less about the uptake and embedding of innovative packages and more about co-developing and tailoring innovations with regime organisations, including our own university

Shared research agendas resulted from collaborations between research organisations and patient advocacy groups. Patient and public involvement developed by a team within the Health Technology Assessment programme [79, 80] was subsequently given an expanded remit across the National Institute for Health Research (NIHR). The James Lind Alliance, which first worked independently with patients and clinicians to develop research agendas [17], was later integrated into the NIHR

Evidence systems and guidance for regime organisations integrated new ways of working developed with the EPPI Centre and other partners [2, 5, 38, 42, 47, 60, 126]. Collaborative learning was applied again with PEERSS partners, which led to capacity-strengthening reforms in government departments in both the Caribbean and Brazil [7]

Research infrastructure: Our host university supported our collaboration with the National Coordinating Centre for Public Engagement to establish a new diamond open-access journal (free access for readers and contributors) about methods for public engagement with research (Institute of Education 2014 REF environment statement; ScienceOpen: https://uclpress.scienceopen.com/collection/UCL_RFA). These achievements were recognised by UCL's Institutional Leadership Award for Public Engagement in 2019. The EPPI Centre has similarly contributed advances in review methodology, information technology and research use to collaborative research infrastructure nationally and internationally (UCL Education Unit environment statement 2021)

Successful implementation of new policies: UK standards, policies and a handbook for newborn bloodspot screening resulted from a centre co-led by a clinician, a clinical scientist and a social scientist, commissioned by the Department of Health in 2002. The centre used a participative model of public and practitioner involvement in evidence-informed policy to create a collaborative network for the development of national newborn blood spot screening policy in the United Kingdom [107]. Parent information and professional training for newborn screening (evidence-informed and co-designed) were well received by clinicians, “well used and valued by both women and midwives” in the UK, and adopted by many programmes around the world [39, 107]

  1. Italic font indicates corroborating evidence from collaborators and others