Integrating case management for patients with complex needs in the ground practice: the importance of context in evaluative designs

Abstract

Responding to complex needs calls for integrating care across providers, settings and sectors. Among models to improve integrated care, case management has a good evidence base for facilitating the appropriate delivery of healthcare services. Since case management is a complex, multi-component intervention whose component parts interact in a non-linear manner, its effectiveness is largely influenced by the context in which it is implemented. This paper discusses how to respond to the challenges of evaluating and implementing complex interventions for patients with complex needs. Building on the example of case management, we suggest that documenting innovation effectiveness remains important, but that evaluation needs to include theory-based and systems perspectives. We also suggest that implementation science needs to be part of intervention design, with stakeholders engaged to define the most relevant research questions and implementation effectiveness, in order to optimize successful implementation and sustainability.

Contributions to the literature

  • This paper argues that evaluating the effectiveness of complex interventions through randomized controlled trials has structural limitations, and discusses the pros and cons of such designs.

  • We propose example designs for evaluating the theory of an intervention, drawing on the example of case management for people with complex needs.

  • It invites researchers and stakeholders to bring implementation science into intervention design early, to optimize adoption and sustainability.

Background

Eighteen per cent of patients in primary healthcare face multiple interacting challenges across the physical, mental and social dimensions of health [1]; these patients have the most complex health needs (referred to, hereafter, as “complex needs”). This proportion increases with age and varies by race and ethnicity [2]. Per the inverse care law [3], as the complexity of a patient’s needs increases, care availability and health equity decrease, and with them quality of life, while disability and mortality risk increase [4]. The COVID-19 pandemic has shone a light on the health inequities experienced by patients with complex needs [5]. Improving care and health equity for this population is a priority for healthcare systems worldwide [6].

Responding to complex needs calls for integrating care across providers, settings, and sectors. The World Health Organization suggests the following patient-led definition of integrated care: ‘My care is planned with people who work together to understand me and my carer(s), put me in control, coordinate and deliver services to achieve my best outcomes’ [7]. Reviews have demonstrated the impact of integrated care on access to and quality of care, patient satisfaction, and reduced hospitalization [8].

Among models to improve integrated care, evidence-supported case management [9,10,11] facilitates the appropriate delivery of healthcare services for patients with complex needs [7]. Case management is a highly variable, collaborative approach used to assess, plan, facilitate, and coordinate care to meet patient and family healthcare needs through communication and the coordination of available resources across all levels of health care, as well as sectors outside the health system [12].

When focusing on the care of patients with multiple clinical, behavioral and social dimensions that affect functioning and health, interventions involve many partners and are often complex [13]. This requires interacting and collaborating with underlying organizational systems and subsystems, and adaptive learning for rapid-cycle change. Multiple contextual issues, such as the setting of implementation, the providers involved, and organizational culture, need to be considered as part of implementation and generate issues requiring operational and clinical adaptation. Since case management is a non-linear, complex, multi-component intervention [14], its effectiveness is largely influenced by the context in which it is implemented [15].

To support the development and evaluation of complex interventions, the United Kingdom Medical Research Council (MRC) proposed an adapted phased approach [13, 16]. Their four-phase Framework builds on qualitative and quantitative evidence and includes development, feasibility/piloting, evaluation, and implementation [16]. It was recently updated to incorporate developments in complex intervention research [17]. The revised Framework places greater emphasis on the importance of context and on the need to understand interventions as events in systems that produce effects through interactions, including contextual factors associated with implementation.

Successful implementation of interventions that respond to complex care needs is critical to improving healthcare systems and outcomes [17]. This paper discusses how to respond to the challenges of evaluating and implementing complex interventions for patients with complex needs, building on the example of case management.

Evaluation and implementation science

Intervention effectiveness remains important

Pragmatic randomized controlled trials (RCTs) [18] remain indispensable for developing the foundation of evidence about a new intervention and are essential to document internal validity. Reviews of RCTs on case management, for example, documented reductions in emergency department costs and improvements in social and clinical outcomes (e.g. alcohol or drug use and social problems) for patients who frequently use healthcare services [9,10,11].

However, there are multiple challenges in conducting RCTs of complex multi-level interventions in on-the-ground practices with patients who have complex needs. RCT designs have mainly focused on internal validity, minimizing inherent organizational and clinical contextual variation and restricting patient populations [19, 20]. In addition, the time, expense and need for controlled research environments limit the generalizability and utility of findings, and often do not respond to the immediate needs of providers [21]. Partially because of this disconnect, limited biobehavioural research makes its way into practice [22]. Many reviews of RCTs conclude that the inability to translate RCT data into clinical care may limit their utility [18, 20], and many authors have therefore proposed alternative designs to traditional RCTs [23]. Cluster randomization at the practice level [24] acknowledges organizational and contextual variation and tests whether effects hold across practices despite that variation. At the patient level, stepped-wedge designs [25] allow patients to serve as their own controls over time, with changes after the intervention serving as key outcome indicators. Rather than controlling variation, these designs expect it and document it when reporting results. Contextual variation also helps to explain why it is so difficult to conduct meta-analyses of complex interventions for patients with complex needs. Meta-analyses of RCTs, very supportive when available, are not always feasible and cannot be the sole evaluation strategy.
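
To make the stepped-wedge logic concrete, here is a minimal Python sketch, under illustrative assumptions (the cluster count and one-step-per-period rollout are ours, not drawn from the trials cited above), of the staggered exposure schedule in which every practice starts under control and crosses to the intervention at a different period:

    # Minimal sketch of a stepped-wedge exposure schedule. Rows are
    # clusters (e.g. primary care practices), columns are time periods;
    # 0 = control, 1 = intervention. Cluster k crosses over at period
    # k + 1, so each cluster contributes observations under both conditions.
    def stepped_wedge_schedule(n_clusters: int, n_periods: int) -> list[list[int]]:
        if n_periods < n_clusters + 1:
            raise ValueError("need at least n_clusters + 1 periods")
        return [
            [1 if period > cluster else 0 for period in range(n_periods)]
            for cluster in range(n_clusters)
        ]

    for row in stepped_wedge_schedule(n_clusters=4, n_periods=5):
        print(row)
    # [0, 1, 1, 1, 1]
    # [0, 0, 1, 1, 1]
    # [0, 0, 0, 1, 1]
    # [0, 0, 0, 0, 1]

The triangle of control cluster-periods is what lets each practice serve as its own comparison while the rollout proceeds, rather than holding contextual variation constant.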

Many good RCTs concluding that an intervention is not effective are a strong argument against that intervention, which will then have to be significantly improved and re-evaluated. On the other hand, having almost all RCT findings document the effectiveness of complex interventions targeting patients with complex needs remains unlikely, because of variations in the key ingredients of the intervention, in the populations recruited, and in local contexts. Researchers and decision-makers will often have to contend with situations between those two ends.

Should we conduct a new RCT in each new context?

Some might argue that we should conduct a new RCT in each new context in which an intervention will be implemented. A more pressing question is whether RCTs are always the best design for multi-level interventions targeting patients with complex needs. We suggest that there must be a balance between the RCT focus on internal validity and the external validity that is crucial for data to be taken seriously on the ground, keeping in mind that evidence is usually not the main issue when translating research into practice [26]. Translation of research into practice is challenging if the local context is not well considered in replication. In addition to evidence, in the real world many feasibility aspects have to be considered in implementation design, such as budget, human resources, workflows for intervention and monitoring, and contextual adaptation. Given limited resources and the limited uptake of RCT data, investing resources into additional RCTs should be questioned, and may even be unethical, when RCTs have demonstrated effectiveness in controlled settings and populations but have limited practice uptake. In that case, alternative designs that are less expensive and resource-consuming may be more suitable for understanding contextual facilitators and increasing on-the-ground uptake [27].

But evaluation goes beyond effectiveness

The revised MRC Framework outlines the importance of considering strategies to maximise the usefulness of research results for decision-making [17], in contrast to focusing exclusively on obtaining unbiased estimates of effectiveness [28]. Research questions should be developed in partnership with stakeholders, using study designs that rapidly answer questions of stakeholder interest and promote the adoption of findings. Beyond effectiveness, evaluation should inform theory-based and systems perspectives [17].

Many designs may help identify the key ingredients of complex interventions [29]. For example, different kinds of synthesis have been conducted for case management with frequent users of healthcare services. A mixed systematic review [30] identified characteristics of case management that yield positive outcomes among frequent users with chronic disease in primary care. Sufficient and necessary characteristics were identified using configurational comparative methods (CCM) [31,32,33]. This review documented that, for case management to produce positive outcomes, it is necessary to identify the patients most likely to benefit from the intervention. By definition, patient complexity is heterogeneous in clinical presentation, effect on quality of life, and available support resources. High-intensity intervention or the presence of a multidisciplinary/interorganizational care plan was also associated with positive outcomes.
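
As a rough illustration of the logic behind CCM, the following Python sketch scores how consistently a candidate characteristic is sufficient or necessary for a positive outcome across a set of cases; the cases are invented for illustration and are not data from the cited review:

    # Crisp-set consistency scores, the core calculation behind
    # configurational comparative methods. Each case records whether a
    # candidate characteristic was present and whether the outcome was
    # positive. (Invented cases, for illustration only.)
    cases = [
        (True, True), (True, True), (True, False),
        (False, False), (False, True), (False, False),
    ]

    def sufficiency(cases):
        # Share of characteristic-present cases that also show the outcome.
        present = [outcome for condition, outcome in cases if condition]
        return sum(present) / len(present)

    def necessity(cases):
        # Share of positive-outcome cases in which the characteristic was present.
        positive = [condition for condition, outcome in cases if outcome]
        return sum(positive) / len(positive)

    print(f"sufficiency consistency: {sufficiency(cases):.2f}")  # 0.67
    print(f"necessity consistency:   {necessity(cases):.2f}")    # 0.67

In a full analysis, such scores are computed for every configuration of conditions in a truth table, and only configurations above a consistency threshold are interpreted as sufficient or necessary for the outcome.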

Realist approaches offer an opportunity to treat complex interventions as complex systems [34]. They focus not only on outcomes, but also on the causal mechanisms that explain how outcomes were reached and how context influenced them [35]. Such a focus is particularly appropriate when seeking to better understand novel interventions with little information available on their effectiveness, interventions that have demonstrated mixed patterns of outcomes, and interventions that will be brought to broader scale [36]. For example, a realist synthesis [37] examined how, and under what circumstances, case management in primary care improves outcomes among frequent users with chronic conditions [34]. This synthesis documented that a trusting relationship fostering patient and clinician engagement in the case management intervention was a key ingredient [37].

Complex interventions are often embedded in changing organizations and systems comprising many interconnected parts that produce their own patterns of behavior over time [38]. ‘A systems perspective suggests that interventions can be better understood with an examination of the system(s) in which they are embedded or the systems that they set out to change’ [17]. Consideration of the relationships between the intervention and its multiple contextual factors is key [39]. Network analysis, for example, is an approach that can be used alongside other study designs to understand changing relationships among structures within a system of individuals or organizations [17]. Case management research for people with complex needs could benefit from this kind of analysis.
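
As one concrete example of a systems-oriented analysis, the Python sketch below compares an invented inter-organizational referral network before and after a hypothetical case management intervention; the organizations, ties, and choice of metrics are illustrative assumptions, not data from any cited study (it requires the networkx package):

    # Quantifying changing relationships among organizations with two
    # simple network metrics: density (observed ties as a share of all
    # possible ties) and degree centrality (how connected each node is).
    import networkx as nx

    before = nx.Graph([("clinic", "hospital"), ("clinic", "pharmacy")])
    after = nx.Graph([
        ("clinic", "hospital"), ("clinic", "pharmacy"),
        ("clinic", "housing agency"),
        ("case manager", "clinic"), ("case manager", "hospital"),
        ("case manager", "housing agency"),
    ])

    for label, graph in [("before", before), ("after", after)]:
        centrality = {n: round(c, 2) for n, c in nx.degree_centrality(graph).items()}
        print(label, "density:", round(nx.density(graph), 2), "centrality:", centrality)

Tracked over time, such metrics can show whether an intervention actually densified ties across sectors or merely re-centred the network on a single coordinator.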

Implementation effectiveness starts with intervention design

An effective intervention needs to be designed to be useful, identifying important implementation considerations in the first phases of evaluation [17]. Identification of the factors influencing implementation and effectiveness thus becomes a core element of research design [29, 40]. Without being exhaustive, a few models can support research teams and stakeholders in considering implementation early in evaluation. PRISM, the Practical, Robust Implementation and Sustainability Model [41], proposes identifiable and measurable elements to assess context [42]. It evaluates how a healthcare program or intervention interacts with its recipients to influence program adoption, implementation, maintenance, reach, and effectiveness. Such application broadens the identification of contextual factors and enriches our dynamic understanding of multi-layer interventions. Implementation questions should be asked concomitantly with effectiveness and other evaluation questions. Curran et al. [43] propose three hybrid designs to assess effectiveness and implementation: (1) testing the effects of a clinical intervention on relevant outcomes while observing and gathering information on implementation; (2) dual testing of clinical and implementation interventions/strategies; and (3) testing an implementation strategy while observing and gathering information on the clinical intervention’s impact on relevant outcomes [43]. Chambers et al. [44] propose the Dynamic Sustainability Framework, which involves continued learning and problem solving, ongoing adaptation of complex interventions with a primary focus on the fit between interventions and multi-level contexts, and an expectation of ongoing improvement, rather than implementation of fixed interventions at risk of losing effectiveness over time [44]. A large part of implementation science research [45], therefore, ‘involves the development and evaluation of complex interventions to maximize effective implementation in practice and/or the policy of interventions that have already demonstrated effectiveness’ [17].

Barriers to and facilitators of effective implementation and contextual adaptation must be at the core of the evaluation strategy [17]. For example, a multiple embedded case study with a mixed-methods design identified the characteristics and contexts of case management programs that help improve patient self-management, the experience of integrated care, and healthcare services use [46]. This study underscored the necessity of an experienced, knowledgeable and well-trained case manager with strong interpersonal skills to optimize the implementation of case management programs, such that patients become more proactive in their care and their outcomes improve.

Early consideration of implementation implies involving stakeholders in all phases of the development and evaluation of a complex intervention from the beginning, to ensure that the most relevant research questions are asked and to increase the potential that the intervention will be widely adopted [17]. Collaboration between researchers and knowledge users throughout a study or research program is a strong predictor that findings will be used [47]. This collaboration may take different forms, ranging from consultation at certain phases of the study or research program to full engagement in all phases [47].

Conclusions

RCTs remain indispensable for developing the foundation of evidence about a new intervention and are important to document effectiveness, but evaluation should go beyond effectiveness to include theory-based and systems perspectives, choosing the designs appropriate to the research questions. Moreover, evaluation of implementation effectiveness should start with intervention design. While conducting evaluation studies, engaging stakeholders in defining the most relevant research questions and designs optimizes the chances of adoption and sustainability.

Availability of data and materials

Not applicable.

Abbreviations

RCT: Randomized controlled trials

COVID-19: Coronavirus disease 2019

MRC: Medical Research Council

CCM: Configurational comparative methods

PRISM: Practical, Robust Implementation and Sustainability Model

References

  1. de Oliveira CA, Weber B, Dos Santos JLF, et al. Health complexity assessment in primary care: a validity and feasibility study of the INTERMED tool. PLoS ONE. 2022;17:e0263702.

  2. Buttorff C, Ruder T, Bauman M. Multiple chronic conditions in the United States. Santa Monica, CA: Rand. 2017. https://www.rand.org/pubs/tools/TL221.html. Accessed 12 Dec 2022.

  3. Marmot M. An inverse care law for our time. BMJ. 2018;362:k3216.

  4. Schoen C, Osborn R, Squires D, Doty M, Pierson R, Applebaum S. New 2011 survey of patients with complex care needs in eleven countries finds that care is often poorly coordinated. Health Aff. 2011;30:2437–48.

  5. Government of Canada. Reducing COVID-19 risk in community settings: A tool for operators. https://www.canada.ca/en/public-health/services/publications/diseases-conditions/vulnerable-populations-covid-19.html. (2020). Accessed 30 May 2022.

  6. Blumenthal D, Chernof B, Fulmer T, Lumpkin J, Selberg J. Caring for high-need, high-cost patients: an urgent priority. N Engl J Med. 2016;375:909–11.

  7. World Health Organization. Integrated care models: an overview. https://www.euro.who.int/__data/assets/pdf_file/0005/322475/Integrated-care-models-overview.pdf. (2016). Accessed 30 May 2022.

  8. Baxter S, Johnson M, Chambers D, Sutton A, Goyder E, Booth A. The effects of integrated care: a systematic review of UK and international evidence. BMC Health Serv Res. 2018;18:350.

  9. Althaus F, Paroz S, Hugli O, et al. Effectiveness of interventions targeting frequent users of emergency departments: a systematic review. Ann Emerg Med. 2011;58:41–52.

  10. Kumar GS, Klein R. Effectiveness of case management strategies in reducing emergency department visits in frequent user patient populations: a systematic review. J Emerg Med. 2013;44:717–29.

  11. Soril L, Leggett L, Lorenzetti D, Noseworthy T. Reducing frequent visits to the emergency department: a systematic review of interventions. PLoS ONE. 2015;10.

  12. Case Management Society of America. What is a case manager? http://cmsa.org/who-we-are/what-is-a-case-manager/. Accessed 30 May 2022.

  13. Campbell M, Fitzpatrick R, Haines A, et al. Framework for design and evaluation of complex interventions to improve health. BMJ. 2000;321:694–6.

  14. Nurjono M, Yoong J, Yap P, Wee SL, Vrijhoef HJM. Implementation of integrated care in Singapore: a complex adaptive system perspective. Int J Integr Care. 2018;18:4.

  15. Dopson S, Fitzgerald L, Ferlie E. Understanding change and innovation in healthcare settings: reconceptualizing the active role of context. J Change Manag. 2008;8:213–31.

  16. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.

  17. Skivington K, Matthews L, Simpson SA, et al. Framework for the development and evaluation of complex interventions: gap analysis, workshop and consultation-informed update. Health Technol Assess. 2021;25:1–132.

  18. Ford I, Norrie J. Pragmatic trials. N Engl J Med. 2016;375:454–63.

  19. Khorsan R, Crawford C. How to assess the external validity and model validity of therapeutic trials: a conceptual approach to systematic review methodology. Evid Based Complement Alternat Med. 2014;2014:694804.

  20. Kessler R, Glasgow RE. A proposal to speed translation of healthcare research into practice: dramatic change is needed. Am J Prev Med. 2011;40:637–44.

  21. Peek CJ, Glasgow RE, Stange KC, Klesges LM, Purcell EP, Kessler RS. The 5 R’s: an emerging bold standard for conducting relevant research in a changing world. Ann Fam Med. 2014;12:447–55.

  22. Green LW. Making research relevant: if it is an evidence-based practice, where’s the practice-based evidence? Fam Pract. 2008;25(Suppl 1):i20–4.

  23. Sanson-Fisher RW, Bonevski B, Green LW, D’Este C. Limitations of the randomized controlled trial in evaluating population-based health interventions. Am J Prev Med. 2007;33:155–61.

  24. Dickinson LM, Beaty B, Fox C, et al. Pragmatic cluster randomized trials using covariate constrained randomization: a method for practice-based research networks (PBRNs). J Am Board Fam Med. 2015;28:663–72.

  25. Hughes JP, Granston TS, Heagerty PJ. Current issues in the design and analysis of stepped wedge trials. Contemp Clin Trials. 2015;45:55–60.

  26. Umscheid CA, Williams K, Brennan PJ. Hospital-based comparative effectiveness centers: translating research into practice to improve the quality, safety and value of patient care. J Gen Intern Med. 2010;25:1352–5.

  27. Hanley P, Chambers B, Haslam J. Reassessing RCTs as the ‘gold standard’: synergy not separatism in evaluation designs. Int J Res Method Educ. 2016;39:287–98.

  28. Deaton A, Cartwright N. Understanding and misunderstanding randomized controlled trials. Soc Sci Med. 2018;210:2–21.

  29. Petticrew M, Rehfuess E, Noyes J, et al. Synthesizing evidence on complex interventions: how meta-analytical, qualitative, and mixed-method approaches can contribute. J Clin Epidemiol. 2013;66:1230–43.

  30. Hudon C, Chouinard MC, Pluye P, et al. Characteristics of case management in primary care associated with positive outcomes for frequent users of healthcare: a systematic review. Ann Fam Med. 2019;17:448–58.

  31. Dy SM, Garg P, Nyberg D, et al. Critical pathway effectiveness: assessing the impact of patient, hospital care, and pathway characteristics using qualitative comparative analysis. Health Serv Res. 2005;40:499–516.

  32. Berg-Schlosser D, De Meur G, Rihoux B, Ragin CC. Configurational comparative methods: qualitative comparative analysis (QCA) and related techniques. Thousand Oaks, California: SAGE Publications, Inc.; 2009. https://methods.sagepub.com/book/configurational-comparative-methods. Accessed 12 Dec 2022.

  33. Hong QN, Pluye P, Bujold M, Wassef M. Convergent and sequential synthesis designs: implications for conducting and reporting systematic reviews of qualitative and quantitative evidence. Syst Rev. 2017;6:61.

  34. Pawson R, Tilley N. Realistic evaluation. London: Sage; 1997.

  35. Wong G, Westhorp G, Manzano A, Greenhalgh J, Jagosh J, Greenhalgh T. RAMESES II reporting standards for realist evaluations. BMC Med. 2016;14:96.

  36. Wong G, Greenhalgh T, Westhorp G, Pawson R. Realist methods in medical education research: what are they and what can they contribute? Med Educ. 2012;46:89–96.

  37. Hudon C, Chouinard MC, Aubrey-Bassler K, et al. Case management in primary care for frequent users of health care services: a realist synthesis. Ann Fam Med. 2020;18:218–26.

  38. Meadows DH, Wright D. Thinking in systems: a primer. White River Junction, Vermont: Chelsea Green Publishing; 2008.

  39. Williams B. Prosaic or profound? The adoption of systems ideas by impact evaluation. IDS Bull. 2015;46:7–16.

  40. Hudon C, Chouinard M-C, Lambert M, Diadiou F, Bouliane D, Beaudin J. Key factors of case management interventions for frequent users of healthcare services: a thematic analysis review. BMJ Open. 2017;7:e017762.

  41. Feldstein AC, Glasgow RE. A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Jt Comm J Qual Patient Saf. 2008;34:228–43.

  42. McCreight MS, Rabin BA, Glasgow RE, et al. Using the Practical, Robust Implementation and Sustainability Model (PRISM) to qualitatively assess multilevel contextual factors to help plan, implement, evaluate, and disseminate health services programs. Transl Behav Med. 2019;9:1002–11.

  43. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50:217–26.

  44. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117.

  45. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1:1.

  46. Hudon C, Chouinard MC, Bisson M, et al. Case management programs for improving integrated care for frequent users of healthcare services: an implementation analysis. Int J Integr Care. 2022;22:11.

  47. Canadian Institutes of Health Research. Knowledge user engagement. http://www.cihr-irsc.gc.ca/e/49505.html. (2016). Accessed 12 Dec 2022.

Acknowledgements

Not applicable.

Funding

Not applicable.

Author information

Contributions

C.H. wrote a first draft of the manuscript which was reviewed and improved by R.K. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Catherine Hudon.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Hudon, C., Kessler, R. Integrating case management for patients with complex needs in the ground practice: the importance of context in evaluative designs. Health Res Policy Sys 21, 9 (2023). https://doi.org/10.1186/s12961-023-00960-4
