Using developmental evaluation to support knowledge translation: reflections from a large-scale quality improvement project in Indigenous primary healthcare

Abstract

Background

Developmental evaluation is a growing area of evaluation practice, advocated for informing the adaptive development of change initiatives in complex social environments. The utilisation focus, complexity perspective and systems thinking of developmental evaluation suggest suitability for evaluating knowledge translation initiatives in primary healthcare. However, there are few examples in the literature to guide its use in these contexts and in Indigenous settings. In this paper, we reflect on our experience of using developmental evaluation to implement a large-scale knowledge translation research project in Australian Aboriginal and Torres Strait Islander primary healthcare. Drawing on principles of knowledge translation and key features of developmental evaluation, we discuss the key benefits and challenges of applying this approach to engage diverse stakeholders in using aggregated quality improvement data to identify and address persistent gaps in care delivery.

Discussion

The developmental evaluation enabled the team to respond to stakeholder feedback and apply learning in real time to successfully refine theory-informed research and engagement processes, tailor the presentation of findings to stakeholders and context, and support the project’s dissemination and knowledge co-production aim. It thereby contributed to the production of robust, usable research findings for informing policy and system change. The use of developmental evaluation appeared to positively influence stakeholders’ use of the project reports and their responses to the findings. Challenges included managing a high volume of evaluation data and multiple evaluation purposes, balancing facilitative sense-making processes and change with task-focused project management, and a lack of experience in using this evaluation approach. Use of an embedded evaluator with facilitation skills and background knowledge of the project helped to overcome these challenges, as did similarities observed between features of developmental evaluation and continuous quality improvement.

Conclusion

Our experience of developmental evaluation confirmed our expectations of the potential value of this approach for strengthening improvement interventions and implementation research, and particularly for adapting healthcare innovations in Indigenous settings. In our project, developmental evaluation successfully encompassed evaluation, project adaptation, capacity development and knowledge translation. Further work is warranted to apply this approach more widely to improve primary healthcare initiatives and outcomes, and to evaluate implementation research.

Background

Developmental evaluation (DE) is a growing area of evaluation practice, developed to accommodate emergent programmes and projects. DE is used to inform adaptive development of change initiatives in complex environments [1,2,3]; however, there is limited literature describing its use in Australian Aboriginal and Torres Strait Islander (hereafter respectfully referred to as Indigenous) health programmes [4] or in knowledge translation research [5, 6]. This article is based on our experience of using DE to support the implementation of a theory-informed process defined as ‘interactive dissemination’. The process engaged stakeholders with aggregated continuous quality improvement (CQI) data from Australian Indigenous primary healthcare (PHC) services. We draw on knowledge translation principles and features of DE to reflect on the rationale, benefits and challenges of using DE in this large-scale project. We discuss the potential of DE for strengthening improvement interventions and for supporting knowledge translation and dissemination in PHC contexts.

Indigenous people’s health and primary healthcare

Australia is a high-income country with large disparities in health outcomes between Indigenous and non-Indigenous people. The causes of this inequity include colonisation, land dispossession and associated trauma, socioeconomic inequality and racism [7]. Indigenous people access PHC through community-controlled and government-managed services established to meet their needs and through private general practices [8]. These PHC services are in diverse geographical settings and vary in size, resources and the range of services provided.

Improving health and well-being outcomes for Indigenous people in this complex healthcare environment requires change at multiple levels of the health system to support wide-scale improvement in the quality of PHC [9].

Knowledge translation: theory-informed and interactive

Effective knowledge translation is important for closing the gaps between what we know and what is actually done in PHC [10]. It is critically important for addressing prevailing health equity gaps between population groups, such as those that exist between Indigenous and non-Indigenous Australians [11]. Theory-informed knowledge translation and dissemination approaches are recommended when designing and evaluating interventions because they help to understand how knowledge is generated and used, to explain clinical and organisational behaviour, to inform strategy selection, and to understand effects [12, 13]. Much knowledge translation and dissemination literature describes the benefits of dialogue-based and interactive processes for moving research results into policy and practice [14,15,16,17]. In particular, participatory approaches that engage potential knowledge users as partners in solution- and impact-focused research are advocated [18, 19]. It is argued that bringing together users’ knowledge of the topic and implementation context with researchers’ expertise in methods and content results in relevant, actionable findings that are more likely to be used to improve care [20].

Consistent with these approaches, participatory action and partnership-based research are well established in CQI research in Australian Indigenous PHC [21, 22]. They have been used to co-develop evidence-based CQI tools and processes [23,24,25,26,27,28], to co-design and collaboratively conduct a large programme of system-based research [21, 22, 29], and to implement studies at the local level. These CQI research projects reflect understanding that successful improvement interventions in Indigenous contexts are those that incorporate Indigenous values and concepts of health and wellbeing [30, 31], draw on existing strengths, and are tailored to population health needs and to social, cultural, organisational and geographical settings [32,33,34].

Developmental evaluation: utilisation and innovation focused

DE uses systems thinking to consider how multiple parts of complex and dynamic systems (such as healthcare systems) are interrelated, and focuses on users and real use of evaluation findings [35]. These features suggest suitability for evaluating projects that involve complex health system and translation issues, and which seek to engage multiple stakeholders in both research and change [36]. DE has been used to generate feedback as innovations are tested and to adapt programmes or products to their operating environments [37,38,39]. It has been used to modify products to suit new or changing contexts and users [37] and to engage communities of practice in systems change [1]. Other uses include strengthening the impact of multi-stakeholder research networks [40, 41] and developing collaborative processes between agencies addressing social challenges [42, 43]. DE positions evaluators as facilitators of change and embedded partners in innovation, and actively engages stakeholders in research, sense-making and change processes [1, 44]. These features support the utility of DE for strengthening participatory research processes and knowledge translation strategies, and for evaluating programmes in Indigenous settings, where DE has been used to develop or support innovative programmes that blend cultural and evaluation principles in contextually grounded approaches [4, 45, 46].

Our research team applied DE in a novel interactive dissemination strategy. The ‘Engaging Stakeholders in Identifying Priority Evidence–Practice Gaps and Strategies for Improvement in Primary Health Care’ (ESP) project (Box 1) engaged stakeholders in co-producing knowledge to inform system improvement for Indigenous health.

Discussion

Why use developmental evaluation in the ESP project?

The ESP project was novel in several respects. It adapted knowledge translation theory [33, 47,48,49] to apply a CQI process at scale, using the largest available set of CQI data on Australian Indigenous PHC, and it sought to engage people working in policy, management, CQI facilitation, health practitioner and research roles – in different geographical, organisational, social and cultural contexts, and at different levels of the health system – in collective data interpretation and knowledge sharing. The ESP process aimed to draw on different types of knowledge (e.g. explicit, tacit, cultural) to identify common priorities, improvement barriers and enablers operating at individual, health centre/community and higher system levels, and possible ‘real-world’ solutions across the scope of clinical PHC. As would be expected, there was uncertainty about which processes, practices and products would work most effectively. Project implementation was certain to raise questions, challenges and successes that demanded real-time responses. We required an evaluation approach that could embrace this complexity and enable us to respond appropriately as needs and understandings evolved [1, 3, 37]. The approach also needed scope to appraise and adapt the theoretically informed research design [50].

Other factors favoured a DE approach. DE is characterised by repeated cycles of data collection, feedback, reflection and adaptation; the iterative research cycles of the ESP project were consistent with this feature of DE. Supporting innovators to bring about change that is tailored to group needs in complex, dynamic environments is a particular purpose of DE [35]. Our DE supported the engagement of stakeholders with CQI data to inform efforts to achieve multi-level system improvement in PHC systems for Indigenous people. Developmental evaluators are typically engaged as participant observers who guide data collection, inquiry and reflection-in-action [37, 51]. We had a team member who was able to undertake this role.

The objectives of the DE were to (1) explore facilitators and barriers to stakeholder engagement with the data and use of ESP project findings; (2) inform ongoing project refinement and implementation; and (3) assess the utility of the interactive dissemination process [52]. Figure 1 illustrates how the DE was concurrently and systematically applied in the interactive dissemination cycles. The developmental evaluator drew on multiple sources of data, including project records, respondent surveys and semi-structured stakeholder interviews, as outlined in the study protocol [52]. These sources were used to facilitate reflective processes through which the team, which comprised one Indigenous and three non-Indigenous members, critically appraised ESP implementation and planned responses. Agreed refinements were tested, increasing our understanding of what worked (and did not work) and informing modifications to the project design, processes and reports.

Fig. 1 Systematically applying developmental evaluation in interactive dissemination cycles

Benefits of using developmental evaluation

Continuous tailoring to strengthen stakeholder engagement and research outcomes

The DE as planned [52] dedicated effort and resources to the evaluation and refinement of the ESP process as it unfolded, enabling a systematic approach. It structured team time to regularly reflect on what occurred, analyse meaning and consider options for change. For example, a reflective workshop 3 months after project commencement was important for refining and consolidating ESP processes; team meetings were convened following rounds of stakeholder interviews; DE was a standing item in project administration meetings; and discussions took place whenever evaluation data suggested changes were needed. Meetings of our wider CQI research network also provided opportunities to share evaluation findings with stakeholders, discuss project adjustments and generate further research translation ideas (e.g. visual representation of common findings across ESP cycles in different areas of care) [53].

Incorporating feedback from the target audience for the ESP reports led to tailoring and improvement in the process and the quality of reports and other communication resources. Changes could be tested and refined with each iteration of the ESP dissemination process.

These processes were important for supporting and maintaining stakeholder engagement. The target audience was widely dispersed across Australia and we were a small team using an online dissemination process. Evaluation cycles of data collection, reflection and change offset our limited interpersonal contact with stakeholders – they enabled us to demonstrate that we were responsive to feedback and to incorporate our growing understanding of the factors affecting project participation and outcomes. We were also demonstrating a systematic process to continually improve ESP project implementation, in effect modelling CQI. This was perceived to strengthen the rigour and credibility of the research.

Knowledge contribution and knowledge sharing

The ESP project design was adapted from a systematic process developed by French et al. [48] to link interventions to modifiable barriers when addressing evidence–practice gaps. To capture stakeholder knowledge about barriers and enablers operating at health centre and wider system levels [33, 54], we adapted a questionnaire, based on the Theoretical Domains Framework [49], that explores individual attributes influencing care [47]. DE enabled the team to continually appraise and refine these innovations, and to adjust the project design (e.g. by merging two reporting and feedback phases into one). As a result, the ESP process successfully engaged stakeholders in identifying priority evidence–practice gaps, improvement barriers, enablers and strategies at individual, health centre and system levels in each area of care. It captured responses from people representing a range of roles, organisations and healthcare contexts. Input from Indigenous people (e.g. Indigenous health service staff, members of governing boards of health services) ranged from 10% of survey respondents for the child health ESP to around 52% of survey respondents for maternal health [55]. DE helped us understand how and how well the theory-based interactive processes worked, and whether and how much the intervention processes could be adapted without compromising the research outcomes.

In addition, the large amount of data generated by the DE enabled us to apply a theoretical framework post hoc to assess the utility of the interactive dissemination process. The i-PARIHS framework was identified as a suitable analytical tool because it highlights ‘how’ implementation is activated with intended recipients in their contextual settings. It comprises four key constructs – facilitation, the innovation or evidence, recipients, and context [56]. Use of i-PARIHS as an analytical framework provided a deeper understanding of how well the ESP project worked (and did not work) to engage stakeholders in knowledge co-production. The DE process emerged as a facilitator of successful project implementation [55].

Real-time responses and applied learning

Positioning the evaluator within the team as a facilitator of dialogue and change supported timely responses. For example, when some people expressed uncertainty about whether the ESP surveys required their input (e.g. some clinicians thought the survey questions were more suited to policy-makers and vice versa), we modified communication templates. The modifications conveyed how input from different professional groups added value to the research and how the findings could benefit their work.

A key benefit of DE is its developmental function. Our DE findings could be applied in real time to improve tailoring of the ESP project to stakeholders and context. What we learnt about engaging stakeholders with evidence, and about conducting participatory research at a systems level, was applied through actual changes to the research design, surveys, reports, communications and supporting resources [57] as the ESP project progressed. These changes could be appraised and refined through iterative DE processes. Examples of decisions and adaptations made in response to evaluation feedback are shown in Table 1.

Table 1 Examples of evaluation feedback, team decisions and adaptations

Developmental evaluation challenges

Managing complexity and uncertainty

The characteristics of the ESP project that suited a DE approach – the novel use of aggregated CQI data, a previously untested dissemination process, complex PHC environment and a diverse target audience – sometimes resulted in ambiguous findings and uncertainty about the best way forward. It took time to appreciate that such uncertainty was typical in undertaking DE and to be comfortable with sense- and decision-making processes that occurred opportunistically.

We needed to be flexible and respond strategically to what was unfolding. This sometimes required us to revise previous decisions in the light of emerging patterns in feedback. For example, we initially dismissed the idea of merging separate surveys identifying barriers/enablers and strategies to maintain fidelity to the model on which the research was based. This decision was revised when competing work demands and lack of time were persistently identified as barriers to engaging with ESP reports and completing the surveys. Following the change, we monitored the quality of survey data and added an evaluation question inviting feedback about the change.

Using an embedded evaluator

Team members had experience with traditional evaluation approaches that position the evaluator externally to ensure independence and objectivity. An evaluator who was embedded in the team as a participant observer, with in-depth knowledge of the project background and context, challenged this principle. However, we found that background knowledge supported more nuanced understanding of what was occurring in the ESP project and facilitated real-time tailoring to Indigenous PHC stakeholders (e.g. providing a group facilitation guide and working with CQI network members to encourage input that reflected cultural, community and service perspectives). Reflexive practice [58] reduced the risk of making assumptions about stakeholder needs. The evaluator was based in a different physical location to the team and this provided some independence from day-to-day project operations.

An embedded evaluator blurred role boundaries. As innovators we all became evaluators [3] and the evaluator was responsible for implementing some innovations (e.g. writing plain language summaries of ESP reports). Our prior experience in action research may have helped to prevent potential role tension. Rey et al. [5] liken a DE approach to conducting action research, explaining that DE evaluators engage in experiential learning cycles to both produce knowledge and facilitate change. In spanning the boundary between researchers and stakeholders, our evaluator helped to achieve the project’s knowledge co-production aim. There are examples of this boundary-spanning role being undertaken by academic researchers who are embedded in host organisations for knowledge co-production projects [59, 60]. In addition, prior working relationships and complementary skill sets were facilitative factors in managing these interactions and making best use of time and skills across the team.

Committing sufficient resources and managing data

Responding to continual processes of reflection and inquiry affected the project timeframe and resource needs. It required a range of skills and team capacity (e.g. to change data visualisation, to work flexibly with experts who assisted with analysis). Having a team leader committed to DE, as well as an experienced team member who wanted to undertake postgraduate study in this topic area, enabled us to surmount the normally significant challenge of resourcing a DE over an extended period. Such favourable circumstances may not be available to teams with fewer resources and tighter timeframes.

Managing high volumes of evaluation data and distinguishing between evaluation data and ESP project data were ongoing DE challenges. Collecting and synthesising large amounts of data in a timely way is an identified challenge of DE [37, 61], and our evaluation aimed to build a contextualised and integrated understanding of the findings and evaluation outcomes of a data-driven research project [52]. To achieve this, the DE drew on ESP project documentation and data, reviewed online survey data and ESP reports, monitored project adaptations, and collected and analysed data obtained through stakeholder interviews. This occurred concurrently with the ESP team’s production of 18 research reports and stakeholder surveys, 6 data supplements and other knowledge translation products. The evaluator had a direct role in some of these project tasks. Balancing DE processes with task- and results-focused ESP project management demanded decisive project leadership, good planning and teamwork, and flexible DE processes.

Lack of experience with this evaluation approach

No member of the team, including the evaluator, had previously participated in a DE. The uncertainty inherent in DE, and the paucity of literature describing methods used in DE, caused the evaluator to regularly reflect on whether our evaluation was indeed developmental. Patton recently identified eight principles that should be addressed within a developmental evaluation [1, 35]. To assist research teams considering the use of DE, we describe ways in which we now understand the evaluation of the ESP project to reflect these principles (Box 2).

Developmental evaluation and continuous quality improvement

We found DE to be congruent with the way we work in CQI. Evaluation literature identifies the purpose of DE as responding to changes (e.g. in understanding, participants or context) by doing something differently. Patton contrasts this with the improvement purpose of many formative evaluations [62], a comparison that suggests DE might be challenging for researchers coming from a quality improvement perspective. However, adapting the ESP project to improve the relevance and use of data and ESP reports among stakeholders seemed consistent with the DE purpose. Furthermore, both DE and CQI can involve complexity and systems thinking. Both approaches feature client-focused, participatory processes and both involve iterative data-informed reflection, decision-making and change.

Applying DE processes in the ESP project could be likened to using ‘plan-do-study-act’ cycles. We collected and interpreted data, worked out change strategies, implemented them, evaluated how they worked and repeated the cycle with different sets of PHC CQI data. DE processes also encouraged us to draw on CQI theory and practice as well as our experience of participatory research to think more deeply about the role of facilitation in the ESP project [55].

The ESP cycles could themselves be likened to scaled-up ‘plan-do-study-act’ cycles. ESP reports presented health centre performance data, which were used to identify improvement priorities and strategies that took account of contextual and workforce factors. However, applying these processes at scale to focus on system-wide improvement was new and involved complex interactions, making it difficult to predict the adaptations required to support engagement and ensure robust research findings. DE adequately addressed this challenge [37, 38].

Using developmental evaluation to advance knowledge translation

We observed that DE acted as a knowledge translation process. Firstly, the successful implementation of knowledge co-production at scale without intense facilitation effort appeared partly due to the facilitative function of DE. By supporting continuous adaptation and tailoring to stakeholders and context, DE helped to identify and foster the ESP facilitation efforts of key stakeholders and CQI champions in workplaces [55].

Secondly, participation in the surveys and evaluation interviews [52] prompted stakeholders to think more deeply about how the reports could or should be used for Indigenous health improvement. Evaluation data show that it stimulated them to use the reports in a variety of ways [63], to pass project information on to others, and to encourage others to use the reports or respond to the key findings. As a result, stakeholders working at different system levels used the ESP reports for complementary purposes (e.g. reflecting on individual practice, building team skills in data analysis, programme planning, influencing policy, developing new research) [63]. A multi-level improvement approach has a greater likelihood of achieving change [9]. An evaluation approach with the ability to strengthen evidence use at multiple levels therefore potentially has a role in creating synergy for improvement.

Thirdly, evaluation feedback provided guidance on encouraging people to engage with and discuss the data, reflect on practice, community and system contexts, and share perspectives on improving care. This included guidance to increase Indigenous stakeholder input into data interpretation (e.g. by providing resources to support the participatory interpretation of data). Stakeholders reported learning new skills in data analysis and being stimulated by the improvement ideas of others [55, 63]. CQI research in Indigenous PHC indicates that support for pooled knowledge assists engagement in improvement initiatives [21, 64, 65]. It is also recognised that co-production can have subtle impacts on research capacity-building and knowledge sharing, as well as demonstrable benefits for policy and practice [66]. The DE supported knowledge pooling and co-production (e.g. by informing survey refinement and encouraging group input). At a higher level, using a developmental approach informed adaptation of the theory-based processes used in the ESP project and, ultimately, our positive assessment of the utility of the interactive dissemination process [55].

Finally, the concurrent ESP and DE processes assisted in maintaining stakeholder engagement through iterative ESP cycles to identify common evidence–practice gaps and common perceptions of the enablers and barriers to addressing those gaps across different areas of care (e.g. child health, chronic illness) [67].

While improvement strategies need to take account of local context, these common findings can be used to target policies and system interventions to improve health service performance and Indigenous health outcomes.

Developmental evaluation limitations and future research priorities

The intended purposes of our evaluation went beyond adapting an innovative knowledge translation process. We needed to make judgements relating to its merit and utility (which aligns more with a summative than a developmental purpose), and to generate knowledge to inform future translation initiatives and evaluations [52]. In reality, we needed to apply a combination of evaluation approaches, including the use of an analytical framework to assess the success of project implementation.

In addition to the challenge of defining boundaries between the ESP project work and the DE, delineating ESP project data from DE data (e.g. in survey feedback) was often difficult. Despite flexible timeframes, data-related tensions regarding time and budget constraints emerged. For example, taking sufficient time to synthesise, reflect on and respond to evaluation findings was integral to the DE and important for maintaining stakeholder engagement. Conversely, project momentum was important because the ESP reports were valued as a source of robust PHC data available in real time. It was not within the scope of the project for the DE to adapt reports and processes for individual settings (e.g. PHC centres), and appropriate responses to DE data were not always feasible (e.g. the team was not resourced to facilitate groups for data interpretation, as consistently recommended by stakeholders). However, not acting on feedback risked disengagement by stakeholders. In addition, stakeholders had differing perspectives about the project changes needed. DE processes identified, but could not necessarily resolve, these tensions.

Future research could explore the use of DE to strengthen knowledge translation processes and to support Indigenous engagement in bringing about change. The use of DE to support interactive dissemination processes could be extended to engage PHC clients/consumers with CQI data for decision-making about health and context-specific improvement interventions.

Use of DE when applying interactive dissemination processes in other health settings would further the understanding of the elements and resources needed for successful knowledge co-production. DE should be further explored as a method for informing the scale-up of participatory research and improvement interventions and as an alternative to the more traditional process evaluation approaches adopted in implementation and improvement research.

Conclusion

Our experience of DE confirmed our expectations of the potential value of this approach for strengthening improvement interventions and knowledge translation research. In the ESP project, DE encompassed project implementation, evaluation, capacity development and knowledge translation. It supported the use of implementation theory to enhance the development and evaluation of our improvement research. While every situation and group will be different, the benefits of applying DE attest to its suitability for adapting and evaluating PHC innovations in Indigenous settings. Lessons learnt have enhanced our skills and knowledge about what works to engage Indigenous PHC stakeholders with data for knowledge co-production and system-wide change and, more generally, about how to add impact and value to CQI research through research translation. Available resources, including facilitation skills and time, and the scope for flexibility and change within a project or programme will influence the feasibility and benefits for teams adopting this evaluation approach. Further research is warranted to advance knowledge about the effective use of DE to improve translation and healthcare initiatives and outcomes.

Availability of data and materials

Not applicable.

Abbreviations

CQI: continuous quality improvement

DE: developmental evaluation

ESP: Engaging Stakeholders in Identifying Priority Evidence–Practice Gaps and Strategies for Improvement in Primary Health Care

PHC: primary healthcare

References

  1. Patton M, McKegg K, Wehipeihana N. Developmental Evaluation Exemplars: Principles in Practice. 1st ed. New York: The Guilford Press; 2016.

  2. Dickson R, Saunders M. Developmental evaluation: lessons for evaluative practice from the SEARCH program. Eval Int J Theory Res Pract. 2014;20(2):176–94.

  3. Preskill H, Beer T. Evaluating Social Innovation. Washington, DC: FSG Center for Evaluation Innovation; 2012.

  4. Togni S, Askew D, Rogers L, Potter N, Egert S, Hayman N, et al. Creating safety to explore: strengthening innovation in an Australian Indigenous primary health care setting through developmental evaluation. In: Patton M, McKegg K, Wehipeihana N, editors. Developmental Evaluation Exemplars: Principles in Practice. New York: The Guilford Press; 2016. p. 234–51.

  5. Rey L, Tremblay M, Brousselle A. Managing tensions between evaluation and research: illustrative cases of developmental evaluation in the context of research. Am J Eval. 2014;35(1):45–60.

  6. Conklin J, Farrell B, Ward N, McCarthy L, Irving H, Raman-Wilms L. Developmental evaluation as a strategy to enhance the uptake and use of deprescribing guidelines: protocol for a multiple case study. Implement Sci. 2015;10:91.

  7. Australian Institute of Health and Welfare. The Health and Welfare of Australia’s Aboriginal and Torres Strait Islander Peoples 2015. Canberra: AIHW; 2015.

  8. Bailie J, Schierhout G, Laycock A, Kelaher M, Percival N, O'Donoghue L, et al. Determinants of access to chronic illness care: a mixed-methods evaluation of a national multifaceted chronic disease package for Indigenous Australians. BMJ Open. 2015;5(11):e008103.

  9. Ferlie EB, Shortell SM. Improving the quality of health care in the United Kingdom and the United States: a framework for change. Milbank Q. 2001;79(2):281–315.

  10. European Science Foundation. Implementation of Medical Research in Clinical Practice. Strasbourg: European Science Foundation; 2012.

  11. Australian Institute of Health and Welfare. Australia’s Health 2018. Canberra: AIHW; 2018.

  12. Foy R, Øvretveit J, Shekelle PG, Pronovost PJ, Taylor SL, Dy S, et al. The role of theory in research to develop and evaluate the implementation of patient safety practices. BMJ Qual Saf. 2011;20(5):453–9.

  13. Eccles MP, Grimshaw JM, MacLennan G, Bonetti D, Glidewell L, Pitts NB, et al. Explaining clinical behaviors using multiple theoretical models. Implement Sci. 2012;7:99.

  14. Hailey D, Grimshaw J, Eccles M, Mitton C, Adair CE, McKenzie E, et al. Effective Dissemination of Findings from Research. Edmonton: Institute of Health Economics; 2008.

  15. Wilson PM, Petticrew M, Calnan MW, Nazareth I. Disseminating research findings: what should researchers do? A systematic scoping review of conceptual frameworks. Implement Sci. 2010;5:91.

  16. National Collaborating Centre for Methods and Tools. Fostering Interactive Exchange and Dissemination. Hamilton: McMaster University; 2011. http://www.nccmt.ca/resources/search/79. Accessed 20 Feb 2016.

  17. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50.

  18. Canadian Institutes of Health Research. Guide to Knowledge Translation Planning at CIHR: Integrated and End-of-Grant Approaches. Ottawa: Canadian Institutes of Health Research; 2012.

  19. Bowen S, Graham ID. Integrated knowledge translation. In: Straus SE, Tetroe J, Graham ID, editors. Knowledge Translation in Health Care: Moving from Evidence to Practice. 2nd ed. London: BMJ Books; 2013. p. 14–23.

  20. Kothari A, McCutcheon C, Graham I. Defining integrated knowledge translation and moving forward: a response to recent commentaries. Int J Health Policy Manag. 2017;6(5):299–300.

  21. Bailie R, Matthews V, Brands J, Schierhout G. A systems-based partnership learning model for strengthening primary healthcare. Implement Sci. 2013;8:143.

  22. Bailie R, Si D, Shannon C, Semmens J, Rowley K, Scrimgeour DJ, et al. Study protocol: national research partnership to improve primary health care performance and outcomes for Indigenous peoples. BMC Health Serv Res. 2010;10:129.

  23. Si D, Bailie RS, Dowden M, O'Donoghue L, Connors C, Robinson GW, et al. Delivery of preventive health services to Indigenous adults: response to a systems-oriented primary care quality improvement intervention. Med J Aust. 2007;187(8):453–7.

  24. Bailie RS, Si D, O'Donoghue L, Dowden M. Indigenous health: effective and sustainable health services through continuous quality improvement. Med J Aust. 2007;186(10):525–7.

  25. Nattabi B, Kanai S, Ferguson-Hill S, Mosca D, Murphy M, Bailie R. P13.06 Knowledge translation: development of a sexual health clinical audit tool to enhance adherence to evidence-based guidelines. Sex Transm Infect. 2015;91(Suppl 2):A194–A5.

  26. Percival N. Improving Health Promotion in Indigenous Primary Health Care: Is a Continuous Quality Improvement Approach Feasible? [PhD thesis]. Darwin: Charles Darwin University; 2014.

  27. Brimblecombe J, van den Boogaard C, Wood B, Liberato SC, Brown J, Barnes A, et al. Development of the Good Food Planning Tool: a food system approach to food security in Indigenous Australian remote communities. Health Place. 2015;34:54–62.

  28. McDonald EL, Bailie RS, Morris PS. Participatory systems approach to health improvement in Australian Aboriginal children. Health Promot Int. 2017;32(1):62–72.

  29. Bailie R, Si D, Connors C, Weeramanthri T, Clark L, Dowden M, et al. Study protocol: Audit and Best Practice for Chronic Disease Extension (ABCDE) project. BMC Health Serv Res. 2008;8:184.

  30. National Aboriginal Community Controlled Health Organisation. Aboriginal Health Definitions. Canberra: NACCHO; 2018. https://www.naccho.org.au/about/aboriginal-health/definitions/. Accessed 9 Nov 2018.

  31. Streak Gomersall J, Gibson O, Dwyer J, O’Donnell K, Stephenson M, Carter D, et al. What Indigenous Australian clients value about primary health care: a systematic review of qualitative evidence. Aust N Z J Public Health. 2017;41(4):417–23.

  32. Gibson O, Lisy K, Davy C, Aromataris E, Kite E, Lockwood C, et al. Enablers and barriers to the implementation of primary health care interventions for Indigenous people with chronic diseases: a systematic review. Implement Sci. 2015;10:71.

  33. Schierhout G, Hains J, Si D, Kennedy C, Cox R, Kwedza R, et al. Evaluating the effectiveness of a multifaceted, multilevel continuous quality improvement program in primary health care: developing a realist theory of change. Implement Sci. 2013;8:119.

  34. Department of Health. My Life My Lead – Opportunities for Strengthening Approaches to the Social Determinants and Cultural Determinants of Indigenous Health: Report on the National Consultations December 2017. Canberra: Commonwealth of Australia; 2017.

  35. Patton MQ. What is essential in developmental evaluation? On integrity, fidelity, adultery, abstinence, impotence, long-term commitment, integrity, and sensitivity in implementing evaluation models. Am J Eval. 2016;37(2):250–65.

  36. Gagnon ML. Moving knowledge to action through dissemination and exchange. J Clin Epidemiol. 2011;64(1):25–31.

  37. Patton MQ. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: The Guilford Press; 2011.

  38. Hummelbrunner R. Systems thinking and evaluation. Evaluation. 2011;17(4):395–403.

  39. Walton M. Applying complexity theory: a review to inform evaluation design. Eval Program Plann. 2014;45:8.

  40. Wutzke S, Rowbotham S, Haynes A, Hawe P, Kelly P, Redman S, et al. Knowledge mobilisation for chronic disease prevention: the case of the Australian Prevention Partnership Centre. Health Res Policy Syst. 2018;16:109.

  41. Bailie J, Cunningham FC, Bainbridge RG, Passey ME, Laycock AF, Bailie RS, et al. Comparing and contrasting ‘innovation platforms’ with other forms of professional networks for strengthening primary healthcare systems for Indigenous Australians. BMJ Glob Health. 2018;3(3):e000683.

  42. Murphy N. Nine guiding principles to help youth overcome homelessness: a principles-focused developmental evaluation. In: Patton M, McKegg K, Wehipeihana N, editors. Developmental Evaluation Exemplars: Principles in Practice. New York: The Guilford Press; 2016. p. 63–82.

  43. Asher J, Foote N, Radner J, Warren T. Science and how we care for needy young children: the Frontiers in Innovation initiative. In: Patton M, McKegg K, Wehipeihana N, editors. Developmental Evaluation Exemplars: Principles in Practice. New York: The Guilford Press; 2016. p. 103–24.

  44. Gagnon M. Knowledge Dissemination and Exchange of Knowledge. Ottawa: Canadian Institutes of Health Research; 2010. http://www.cihr-irsc.gc.ca/e/41953.html. Accessed 17 July 2019.

  45. McKegg K, Wehipeihana N, Becroft M, Gill J. Developmental evaluation's role in supporting community-led solutions for Māori and Pacific young people's educational success. In: Patton M, McKegg K, Wehipeihana N, editors. Developmental Evaluation Exemplars: Principles in Practice. New York: The Guilford Press; 2016. p. 125–42.

  46. Wehipeihana N, McKegg K, Thompson V, Pipi K. Cultural responsiveness through developmental evaluation: Indigenous innovations in sport and traditional Māori recreation. In: Patton MQ, McKegg K, Wehipeihana N, editors. Developmental Evaluation Exemplars: Principles in Practice. New York: The Guilford Press; 2016. p. 25–44.

  47. Huijg JM, Gebhardt WA, Crone MR, Dusseldorp E, Presseau J. Discriminant content validity of a theoretical domains framework questionnaire for use in implementation research. Implement Sci. 2014;9:11.

  48. French SD, Green SE, O'Connor DA, McKenzie JE, Francis JJ, Michie S, et al. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the theoretical domains framework. Implement Sci. 2012;7:38.

  49. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14:26–33.

  50. Laycock A, Bailie J, Matthews V, Bailie RS. Interactive dissemination: engaging stakeholders in the use of aggregated quality improvement data for system-wide change in Australian Indigenous primary health care. Front Public Health. 2016;4:84.

  51. Honadle BW, Zapata MA, Auffrey C, Vom Hofe R, Looye J. Developmental evaluation and the ‘Stronger Economies Together’ initiative in the United States. Eval Program Plann. 2014;43:64–72.

  52. Laycock A, Bailie J, Matthews V, Cunningham F, Harvey G, Percival N, et al. A developmental evaluation to enhance stakeholder engagement in a wide-scale interactive project disseminating quality improvement data: study protocol for a mixed-methods study. BMJ Open. 2017;7:e016341.

  53. Cunningham FC, Matthews V, Sheahan A, Bailie J, Bailie RS. Assessing collaboration in a national research partnership in quality improvement in Indigenous primary health care: a network approach. Front Public Health. 2018;6:182.

  54. Wagner EH, Austin BT, Davis C, Hindmarsh M, Schaefer J, Bonomi A. Improving chronic illness care: translating evidence into action: interventions that encourage people to acquire self-management skills are essential in chronic illness care. Health Aff. 2001;20(6):64–78.

  55. Laycock A, Harvey G, Percival N, Cunningham F, Bailie J, Matthews V, et al. Application of the i-PARIHS framework for enhancing understanding of interactive dissemination to achieve wide-scale improvement in Indigenous primary healthcare. Health Res Policy Syst. 2018;16:117.

  56. Harvey G, Kitson A. Implementing Evidence-Based Practice in Healthcare: A Facilitation Guide. London: Taylor & Francis Ltd.; 2015.

  57. Cawley J, Preskill H. What Are the Products of a Developmental Evaluation? Boston: FSG; 2014. https://www.fsg.org/blog/what-are-products-developmental-evaluation. Accessed 17 July 2019.

  58. Alley S, Jackson SF, Shakya YB. Reflexivity: a methodological tool in the knowledge translation process? Health Promot Pract. 2015;16(3):426–31.

  59. Greenhalgh T, Wieringa S. Is it time to drop the ‘knowledge translation’ metaphor? A critical literature review. J R Soc Med. 2011;104:501–9.

  60. Vindrola-Padros C, Pape T, Utley M, Fulop NJ. The role of embedded research in quality improvement: a narrative review. BMJ Qual Saf. 2017;26(1):70–80.

  61. Gamble J. A Developmental Evaluation Primer. Montreal: The J.W. McConnell Family Foundation; 2008.

  62. Patton MQ. A world larger than formative and summative. Eval Pract. 1996;17(2):131–44.

  63. Laycock AF, Bailie J, Percival NA, Matthews V, Cunningham FC, Harvey G, et al. Wide-scale continuous quality improvement: a study of stakeholders’ use of quality of care reports at various system levels, and factors mediating use. Front Public Health. 2019;6:378.

  64. Gardner KL, Dowden M, Togni S, Bailie R. Understanding uptake of continuous quality improvement in Indigenous primary health care: lessons from a multi-site case study of the Audit and Best Practice for Chronic Disease project. Implement Sci. 2010;5:21.

  65. Larkins S, Carlisle K, Turner N, Taylor J, Copley K, Cooney S, et al. ‘At the grass roots level it’s about sitting down and talking’: exploring quality improvement through case studies with high-improving Aboriginal and Torres Strait Islander primary healthcare services. BMJ Open. 2019;9:e027568.

  66. Beckett K, Farr M, Kothari A, Wye L, le May A. Embracing complexity and uncertainty to create impact: exploring the processes and transformative potential of co-produced research through development of a social impact model. Health Res Policy Syst. 2018;16:118.

  67. Bailie J, Laycock A, Matthews V, Bailie R. System-level action required for wide-scale improvement in quality of primary health care: synthesis of feedback from an interactive process to promote dissemination and use of aggregated quality of care data. Front Public Health. 2016;4:86.

  68. Bailie J, Schierhout G, Cunningham F, Yule J, Laycock A, Bailie R. Quality of Primary Health Care for Aboriginal and Torres Strait Islander People in Australia: Key Research Findings and Messages for Action from the ABCD National Research Partnership Project. Brisbane: Menzies School of Health Research; 2014.

  69. McPhail-Bell K, Matthews V, Bainbridge R, Redman-MacLaren ML, Askew D, Ramanathan S, et al. An “all teach, all learn” approach to research capacity strengthening in Indigenous primary health care continuous quality improvement. Front Public Health. 2018;6:107.

Acknowledgements

The authors acknowledge the active support, enthusiasm and commitment of staff in participating primary healthcare services, and members of the ABCD National Research Partnership and the Centre for Research Excellence in Integrated Quality Improvement in Indigenous Primary Health Care. We are grateful to Frances Cunningham, Gillian Harvey and Nikki Percival for feedback on the draft manuscript.

Funding

The ABCD National Research Partnership Project has been supported by funding from the National Health and Medical Research Council (545267) and the Lowitja Institute, and by in-kind and financial support from Community Controlled and Government agencies. Alison Laycock has been supported by a National Health and Medical Research Council Postgraduate Scholarship (1094595), and by the Centre of Research Excellence: An Innovation Platform for Integrated Quality Improvement in Indigenous Primary Health Care (CRE-IQI, funded by the NHMRC ID 1078927). Ross Bailie’s work has been supported by an Australian Research Council Future Fellowship (100100087).

Author information

Authors and Affiliations

Authors

Contributions

AL conceived the manuscript, synthesised stakeholder and team feedback, and wrote manuscript drafts. JB, VM and RB made substantial contributions to manuscript content and RB led the ESP project and supervised the writing process. All authors reviewed drafts, and read and approved the final manuscript.

Corresponding author

Correspondence to Alison Laycock.

Ethics declarations

Ethics approval and consent to participate

The evaluation was approved by the Human Research Ethics Committee (HREC) of the Northern Territory Department of Health and Menzies School of Health Research (2015–2329), the Central Australian HREC (15–288), the Charles Darwin University HREC (H15030) and participating organisations. All evaluation participants provided individual informed consent.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Reprints and permissions

About this article

Cite this article

Laycock, A., Bailie, J., Matthews, V. et al. Using developmental evaluation to support knowledge translation: reflections from a large-scale quality improvement project in Indigenous primary healthcare. Health Res Policy Sys 17, 70 (2019). https://doi.org/10.1186/s12961-019-0474-6
