
A novel methodological approach to participant engagement and policy relevance for community-based primary medical care research during the COVID-19 pandemic in Australia and New Zealand


Community-based primary care, such as general practice or urgent care, serves as the primary point of access to healthcare for most Australians and New Zealanders. Coronavirus disease 2019 (COVID-19) has created significant and ongoing disruptions to primary care. Traditional research methods have contributed to gaps in understanding the experiences of primary care workers during the pandemic. This paper describes a novel research design and method intended to capture the evolving impact of the COVID-19 pandemic on primary care workers in Australia and New Zealand. Recurrent, rapid-cycle surveys were fielded from May 2020 through December 2021 in Australia, and May 2020 through February 2021 in New Zealand. Rapid survey development, fielding, triangulated analysis and dissemination of results allowed close to real-time communication of relevant issues among general practice workers, researchers and policy-makers. A conceptual model is presented to support longitudinal analysis of primary care worker experiences during the COVID-19 pandemic in Australia and New Zealand, and key learnings from applying this novel method are discussed. This paper will assist future research teams in the development and execution of policy-relevant research in times of change and may inform further areas of interest for COVID-19 research in primary care.



Community-based primary care, such as general practice or urgent care, serves as the first point of access to healthcare for most Australians and New Zealanders [1,2,3]. As medical generalists, general practitioners, primary care nurse practitioners and practice nurses provide care for all ages, cultures, injuries and diseases. Being community-based provides opportunity for ongoing and longitudinal care across the lifespan [2, 3]. General practice was recognized as a vital component of health services during previous viral outbreaks, such as the 2009 H1N1 pandemic [4, 5]. The COVID-19 pandemic has likewise seen acknowledgement and involvement of community-based primary care, in particular general practice.

Early in the COVID-19 pandemic, Australia and New Zealand implemented strategies intended to support COVID-19 management and maintain usual medical practice in primary care [6]. Both countries saw rapid and widespread adoption of telehealth and phone triaging for general practice, creation of community-based COVID-19 testing and treatment centres and implementation of electronic prescribing to reduce interactions between healthy and potentially infectious people [6,7,8]. Personal protective equipment (PPE) for primary care workers was recommended, although access to it was often poor [9,10,11]. Physical distancing was enforced, with clinics requiring patients to wait outside until being called in for an appointment [11,12,13]. All strategies required clinics to rapidly create and communicate workplace policies to ensure safety of staff and patients. However, information and policies governing general practice have evolved, and continue to evolve, with volatile COVID-19 contexts [9, 10]. Despite both Australia and New Zealand having an up-to-date pandemic plan, and experiencing fewer COVID-19 infections per capita than other countries between 2020 and 2022 (prior to the omicron variant outbreaks) [14], COVID-19 created significant and ongoing disruptions to primary care. It is important to track, evaluate and report the impact of disruptions in primary care to learn from and adapt future pandemic planning.

Conducting rigorous, peer-reviewed research that can reliably inform health system decision-making has been challenging during the COVID-19 pandemic [15]. Early in the pandemic, evidence used to inform policy relied on lessons learned from non-COVID-19 illnesses (e.g. influenza), which while similar, were not reflective of the global burden and uncertainty related to COVID-19 [5, 6]. Traditional methods to gather experiential evidence during COVID-19 have been slow and of questionable quality [15], with high burden on busy clinicians [16], particularly given the lack of Australian or New Zealand primary care research infrastructure [17, 18]. As such, there was a need for data that could provide immediate utility or benefit, reflective of national or local pandemic contexts, while not placing undue response burden on participants.

Rapid, responsive, repeated, ‘light-touch’ methods have been valuable tools to capture the experiences of primary care workers during the COVID-19 pandemic, and the way that experiences vary across place and time. The objective of the paper is to describe and reflect on a novel, recurrent, rapid-cycle survey method employed to capture the evolving impact of the COVID-19 pandemic on primary care in Australia and New Zealand. The method is based on a similar survey initially developed and fielded in the United States and Canada, beginning in March 2020 [19, 20], and forms part of an international collaborative project among the four countries. Understanding the benefits, drawbacks and potential solutions of this approach may inform or facilitate more responsive and impactful research in future times of significant and precipitous change, uncertainty or crisis.


This study used a pragmatic approach [21, 22] and employed responsive, iterative, cross-sectional surveys using a combination of open and closed questions to explore and report the experience of Australian and New Zealand community-based primary care workers over the initial course of the pandemic. The study was conducted in accordance with the Declaration of Helsinki [23] and approved by Australian National University Human Research Ethics Committee (2020/273) and the University of Auckland Human Participants Ethics Committee (024659). The research team included clinicians, researchers, educators and policy-linked academics.


Surveys were conducted online, aimed at community-based primary care practitioners in Australia and New Zealand. Surveys were released from May 2020 to December 2021 in Australia, and from May 2020 to February 2021 in New Zealand. Surveys were fielded for 1 week at a time, every 2–4 weeks for the first 5 months (see Table 1), covering the initial peak of the pandemic, then at declining frequency as the pandemic progressed. Results reported are based on available data by date of publication.

Table 1 Timing of data collection and flash question themes

Participant eligibility

Eligible participants were primary care doctors, practice nurses, nurse practitioners or practice managers working in Australian or New Zealand community-based primary care practices from May 2020. Participants were required to have worked at the same practice for at least 12 months. GP registrars and students were excluded due to the lack of continuity during the study period. Secondary care clinicians were excluded. The exact number of potential participants within Australia and New Zealand is unknown given that primary care is delivered through private businesses in these two countries. Best estimates from government and registering organization counts and professional networks of the authors are approximately 31 500 GPs [24], 14 000 nurses [25] and one practice manager per general practice (~ 6900) [24] in Australia; and approximately 4000 GPs, 230 community-based urgent care doctors, 3400 practice nurses [26] and an estimated 900 practice managers in New Zealand.


A mixture of convenience and snowball sampling was employed for recruitment. Representative groups with large primary care worker memberships were considered the most appropriate targets for recruitment to ensure a wide reach and adherence to strict COVID-19 safety requirements. In Australia, potential participants were able to access surveys through a website hosted by the College of Health and Medicine, Australian National University [27]. Organizations including the Royal Australian College of General Practitioners (RACGP), Australian Medical Association and Primary Health Networks disseminated survey links through newsletters. Australian social media groups used by general practitioners and practice nurses also shared study details. Links to the New Zealand survey were disseminated via a number of organizations including the Royal New Zealand Colleges of General Practice and of Urgent Care (RNZCGP and RNZCUC, respectively), General Practice New Zealand, the Rural GP Network, the Practice Managers and Administrators Association New Zealand, primary health organizations, the New Zealand Medical Association and relevant Facebook pages. Implied consent was detailed in the participant cover sheet, stating that submission of the survey implied consent for use of de-identified data in publication.

All respondents in Australia and New Zealand could sign up to receive email alerts for each new survey. MailChimp was used to automate subsequent survey distribution. Participants were asked to complete each new survey once only, despite the possibility of being exposed to multiple recruitment methods. Multiple methods were used to reduce the likelihood of ballot stuffing: software settings limited completion to one per device; through a hashing process, each participant’s email address generated a unique alphanumeric token that could not be reverse engineered to identify the participant but could be used to ensure a single completion; and IP addresses and postal codes were compared to scan for any inconsistencies in submission origins.
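The token-based deduplication described above can be sketched as follows. This is a minimal illustration assuming a salted SHA-256 hash; the actual mechanism used by the survey platform is not published, so the salt, algorithm and token length here are illustrative assumptions.

```python
import hashlib

def email_token(email, salt="example-salt"):
    # Normalise the address, then hash it with a salt so the token
    # cannot be reversed to recover the email (illustrative scheme).
    normalised = email.strip().lower()
    digest = hashlib.sha256((salt + normalised).encode("utf-8")).hexdigest()
    return digest[:16]  # short alphanumeric token

seen_tokens = set()

def is_repeat_submission(email):
    # True if an equivalent address has already submitted this survey.
    token = email_token(email)
    repeat = token in seen_tokens
    seen_tokens.add(token)
    return repeat
```

Because the same address always yields the same token, repeat submissions can be detected without storing the address itself alongside the response data.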

Data collection tools

Surveys were built, managed and delivered by the United States research team at the Larry A Green Center using SurveyMonkey and MailChimp. Questions were designed jointly by the international research team, which included Australian and New Zealand researchers. Policy-relevant questions were posed by organizational and government representatives with the intention to inform decisions related to the pandemic response. The surveys used a combination of open (qualitative) and closed (quantitative) questions, with data combined and analysed using a mixed methods triangulation design [28]. Over time, surveys used a constructivist paradigm, with participant responses and stakeholder discussions interpreted by the research team to build understanding of the unique problems experienced in primary care, and sequentially inform future survey questions and analysis.

Surveys were designed to be short, taking on average 7 min to complete, and were open for 1 week. All surveys included a series of up to five ‘flash questions’, responsive to the contemporaneous epidemiological and social context, and ‘core questions’. Flash questions were designed to provide timely data on emergent issues in general practice. Topics of these questions were informed by the research team, government policy initiatives (flagged by policy-makers), national news, grey literature [29] and survey responses. In Australia, flash questions were also informed by Primary Health Network representatives (local policy implementers and advocates), and in New Zealand by consultation with general practice leaders. Flash questions were discussed by the research team and prioritized according to immediate policy needs and implications. Questions were adapted to ensure country-specific relevance wherever possible. Due to short turnaround times, piloting of flash questions was restricted to content and face validity and tested by clinical GPs within the research team. Table 1 outlines the timing of data collection and flash question themes. For information on specific questions and response formats, see Additional file 1: Table S2.

Core questions, included in every survey, were based on the survey designed by the Larry A Green Center [19] with adjustments to ensure country-specific language and relevance. For full detail on survey questions and response options, see Additional file 1: Tables S1 and S2. Core questions changed minimally over time, with answer options added relevant to the progression of the pandemic. Topics included perceived practice strain, specific stressors experienced, consultation format (e.g. telehealth or face-to-face), proportion of COVID-19 cases tested and managed, characteristics of participants (doctor, nurse, nurse practitioner, practice manager) and practice descriptors (e.g. billing model, urgent care, urban or rural). New Zealand was unable to collect data identifying Māori practices or region of practice, due to potential for identification. Australia added two survey response options later in the series due to developments of specific practice types for COVID-19. Table 2 outlines the core questions and response formats.

Table 2 Core questions and response formats asked repeatedly over time

Data management and analysis

Survey data were downloaded and analysed at the end of each survey period. Analysis was completed using SPSS v26 and SAS v9.4 (for Australian data) and Stata 15 (for New Zealand data). Regular data cleaning included creation of new variables to calculate rurality index for Australia on the basis of postcodes using the Australian Bureau of Statistics’ 2017 postcode to 2016 remoteness area concordance. Where a postcode mapped to more than one remoteness category, it was allocated to that with the highest population proportion.
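The postcode-to-remoteness allocation rule can be sketched as below. The rows are made-up stand-ins for the ABS 2017 postcode to 2016 remoteness area concordance, which maps some postcodes to several remoteness categories with a population share for each; the postcodes and shares are invented for illustration.

```python
# Illustrative concordance rows: (postcode, remoteness category,
# share of the postcode's population in that category).
concordance = [
    ("2620", "Major Cities of Australia", 0.35),
    ("2620", "Inner Regional Australia", 0.65),
    ("2900", "Major Cities of Australia", 1.00),
]

# Allocate each postcode to the category with the highest
# population proportion, as described above.
best = {}
for postcode, category, share in concordance:
    if postcode not in best or share > best[postcode][1]:
        best[postcode] = (category, share)

def remoteness(postcode):
    # Return the allocated remoteness category, or None if unknown.
    entry = best.get(postcode)
    return entry[0] if entry else None
```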

Single survey analysis of quantitative responses included frequencies and percentages to describe each cohort and the primary outcomes of interest, including: practice strain, stressors, consultation format and flash questions. All participants were counted as equal, though reports did include a breakdown of role within the practice [24, 28]. Single survey rapid analysis of open text responses was completed using inductive content analysis by reading, re-reading, developing initial codes and combining codes into general themes. Frequency of codes and relevance to specific flash questions, as well as the overarching research purpose, were used to inform themes. When applicable, triangulated approaches to analysis were employed where quantitative responses were used to support interpretation, inductive coding and thematic development [25]. Themes were reported alongside supportive quotes from participants, and where applicable alongside quantitative results.
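The descriptive quantitative step above amounts to tabulating frequencies and percentages per cohort; a minimal sketch, with hypothetical respondent roles rather than the study's data:

```python
from collections import Counter

# Hypothetical single-survey responses: each respondent's role.
roles = ["GP", "GP", "practice nurse", "practice manager", "GP", "GP"]

counts = Counter(roles)  # frequency of each role
percentages = {role: round(100 * n / len(roles), 1)
               for role, n in counts.items()}
```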

In Australia, results were prepared by two researchers within 2 weeks of survey closing. A 2–3 page summary was produced for each survey and checked for accuracy by another two researchers. Each report included a statement on the COVID-19 context in Australia sourced from Australian Government Department of Health data [30], as well as government media releases and news articles. The triangulated mixed methods approach to analysis supported quotes and themes being reported alongside quantitative responses where possible to provide context for identified themes [25]. Summaries were shared with the Department of Health, partner organizations and participants, and published online by the Australian National University [27].

New Zealand results were analysed within 1 week of the survey closing by two researchers. An executive summary and an infographic were created. These were posted on the University of Auckland project website [31]. Executive summaries were emailed to all participating organizations, the Director-General of Health, the Chief Scientific Advisor for the Ministry of Health (who disseminated summaries to key policy personnel in the Ministry) and to the media. Results of the 11 surveys were reported by print, radio or TV media on 21 occasions between June 2020 and February 2021 [32].

Each country merged its own survey data so that participants were treated as cases, and survey questions as variables. Participant responses were matched across surveys within each country on the basis of their survey token ID and/or email address to identify individuals over time (unique respondents). Total participant numbers, number of new respondents and number of returning participants were calculated for each survey series for each country. The number of surveys completed by each unique respondent was recorded, and a linked longitudinal dataset created using NVivo v12 (QSR International). Linking was important for identifying individual temporal patterns and possible sources of bias in the qualitative data, informing future analysis. To describe the total cohort, frequencies and percentages were used to report individual and practice characteristics of unique respondents.
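The cross-survey linkage can be sketched as a grouping of responses by token. The records below are hypothetical, with short tokens standing in for the hashed email IDs used for matching:

```python
from collections import defaultdict

# Hypothetical linked records: (survey number, participant token).
responses = [
    (1, "a1"), (1, "b2"),
    (2, "a1"),
    (3, "a1"), (3, "c3"),
]

# Group the surveys each token appears in.
surveys_by_respondent = defaultdict(set)
for survey_no, token in responses:
    surveys_by_respondent[token].add(survey_no)

unique_respondents = len(surveys_by_respondent)
surveys_completed = {t: len(s) for t, s in surveys_by_respondent.items()}
repeat_respondents = sum(1 for n in surveys_completed.values() if n > 1)
```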


In Australia, there were 1267 responses to 17 surveys, representing 682 unique participants. In New Zealand, there were 1519 responses to 11 surveys, representing 482 unique participants. Figure 1 outlines responses to each survey series by country, and the total number of participants, including new and returning participants.

Fig. 1
figure 1

Total new and repeat participant responses for each Australian and New Zealand survey

Response rates could not be estimated given the convenience and snowball recruitment strategies. Retention varied across surveys. Most participants completed only one survey in the Australian or New Zealand survey series (n = 559, 82.0%; and n = 291, 60.4%, respectively). Table 3 presents the number of surveys completed by number of unique respondents.

Table 3 Number of surveys completed by unique respondents in Australia (17 surveys, n = 682) and New Zealand (11 surveys, n = 482)

Most unique respondents were general practitioners. Compared with Australia, New Zealand had a higher proportion of practice managers (21.2%, n = 109, versus 3.1%, n = 21) and nurses (10.9%, n = 56, versus 3.5%, n = 24). Table 4 presents participant and practice characteristics for each unique respondent.

Table 4 Participant and practice characteristics for unique respondents from Australia (n = 682) and New Zealand (n = 482)

For both Australia and New Zealand, most unique respondents were from practices with four or more GPs (n = 583, 85.5%; and n = 346, 67.4%, respectively) and GP-owned and operated practices (n = 485, 71.1%; and n = 366, 71.3%, respectively). Approximately one fifth of respondents from Australia and New Zealand were from rural practices (n = 158, 23.2%; and n = 104, 20.3%, respectively). Few unique respondents reported being from a state- or territory-funded clinic (n = 35, 5.1%) or District Health Board-funded clinic (n = 43, 8.4%). While the general cohort is not statistically representative of general practice in Australia or New Zealand, the data are reflective of the range of practice characteristics for both countries, indicating the broad range of primary care voices being represented.

Figure 2 shows unique respondents by region. In Australia, unique respondents were obtained from all states and territories, with New South Wales and Victoria, the two largest jurisdictions, contributing the greatest number of respondents (n = 198, 29.7%; and n = 154, 23.9%, respectively). A total of 12 Australian respondents did not provide data for region. Due to potential identification of participants, New Zealand was unable to collect data on region.

Fig. 2
figure 2

Unique respondents by region of Australia (n = 682) and New Zealand (n = 482)

Compared to 2019/20 workforce statistics, respondents in this study were representative of GP distribution around Australia, apart from an overrepresentation from the Australian Capital Territory (11.2% versus 2.0%; χ2 = 380.01, p < 0.01), where the authors are based.
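The representativeness check above is a chi-square goodness-of-fit comparison of observed regional counts against reference workforce proportions; a minimal sketch, using invented counts and proportions rather than the study's data:

```python
def chi_square_gof(observed, expected_props):
    # Goodness-of-fit statistic: sum over cells of (O - E)^2 / E,
    # with expected counts derived from reference proportions.
    total = sum(observed)
    return sum((obs - total * p) ** 2 / (total * p)
               for obs, p in zip(observed, expected_props))

# e.g. 60 vs 40 respondents across two regions where the reference
# workforce split is 50/50:
stat = chi_square_gof([60, 40], [0.5, 0.5])  # -> 4.0
```

In practice, the statistic would be compared against a chi-square distribution with (number of regions − 1) degrees of freedom to obtain the p-value.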

To best interpret and understand the data from this study, we created a conceptual analytical design that considers each single survey as an individual dataset (results previously published) and considers the holistic dataset to inform how general practice has responded to COVID-19 over time. Our analytical design used an evolving iceberg metaphor (illustrated in Fig. 3): a conceptualization that addresses many of the weaknesses of rapid-cycle survey methodologies by highlighting the sequential and relatively constructed nature of what is knowable under extreme and changeable conditions. Collectively, responses from participants at any particular timepoint can be considered as a body of information represented by the ‘hummock’ of a given iceberg. The hummock (the section of the iceberg exposed above the water) changes over time and space in response to its environment or context. The hummock alone may provide little information about the shape and size of the whole iceberg, especially of the ‘bummock’ (the portion below the waterline, not visible). However, observation of changes in the hummock can help illuminate the forces to which the iceberg is exposed, and their progressive impact on its changing form.

Fig. 3
figure 3

Conceptual model for recurrent cross-sectional analysis, depicting icebergs changing due to environmental forces



This pragmatic study design employed responsive, rapid-cycle surveys to inform stakeholders and policy decision-makers about the experience of primary care workers throughout the COVID-19 pandemic in Australia and New Zealand. These methods, incorporating rapid survey development, fielding, analysis, dissemination of results and iterative development that reflected the realities of clinicians, allowed close to real-time communication between research and policy. It also enabled the voice of primary care practitioners to be heard during a period of extreme strain. An emerging challenge of the COVID-19 pandemic has been the need to develop and refine tractable yet robust research methods that can deliver usable results quickly while meeting conventional requirements for rigour and quality, and balancing risks of over-proliferation and wastage amid a “deluge of poor quality research” [33, 34]. While there is a need for flexibility and adaptation of traditional methods, reliability remains an overriding concern [35]. This discussion focusses on an examination of the unique methods employed in this study, and offers a conceptual model for understanding how the methods, their limitations and advantages can be understood and interpreted in the evolving pandemic context.

Methodological considerations to enhance rigour and interpretation

Pragmatic approaches and a constructivist paradigm commonly use novel combinations or adaptations of methodological approaches, applying the methods best suited to the research question(s) while avoiding philosophical or methodological polarization [21]. Theoretically, pragmatic approaches are concerned with “practical understandings of concrete, real world issues”, and data which can inform the development of actionable knowledge [36]. While the methods used in this study may challenge traditional statistical concepts of representativeness, the need for immediate and evolving data to inform decisions, with the dexterity to respond to the changing circumstances of the pandemic, was crucial [15]. The purpose of the project was to describe the experiences of primary care workers during the evolution of the pandemic and to use the findings to give a ‘voice’ to primary care workers by quickly and directly providing the description to policy-makers and representative bodies. As such, the research was exploratory and descriptive, leading to a constructivist view of the unfolding pandemic crisis and its impact in primary care. The concurrent collection of both quantitative and qualitative data supported a triangulated mixed methods analysis with inclusion of quantitative measures to specify and define issues in ways that were simple for respondents to grasp, and which allowed for meaningful interpretation of qualitative responses [28]. The pragmatic approaches used in this study enabled flexibility and responsiveness to a changing environment, adapting to the COVID-19 context, and accepting some of the practical constraints of rapid-cycle research [37, 38].
While not perfect by traditional empirical standards to confirm cause and effect relationships, these methods balance rapidity and rigour to produce evidence that may be ‘good enough’ to inform policy and urgent decision-making while remaining feasible to undertake in critical or heightened operating environments (e.g. a pandemic).

Rapid survey methodologies have been used effectively in public health and field epidemiology [39], though they involve trade-offs between precision and cost. While user-friendly and efficient, rapid surveys may be subject to selection bias, and outputs should be tailored with appropriate statistical and interpretive caution [37]. Despite these constraints, and although single rapid assessments can be limited in their utility and applicability, repeated assessments over time and among different groups can yield important insights [39]. Single survey samples obtained in this study are small and may not be representative, limiting inference and transferability of results. However, over multiple series, this study recruited a wide range of respondents representing a spectrum of practice characteristics, and focussed on generating an illustrative, sequential description of the collective experience of primary care professionals during the evolving COVID-19 pandemic. No other published study currently describes the experience of Australian and New Zealand primary care throughout the pandemic, progressively, as it was experienced. While our individual samples may not capture the entire story at any single interval, as illustrated using the iceberg model, they can iteratively provide information about the changing landscape of the hummock and the environment in which it occurs. As new features emerge, these may warrant investigation, exploration and description to inform future health system and primary-care-specific planning during times of change. Compared with no data, or data that are episodic and disconnected, the value of this method for informing policy decisions is high despite its limitations. Future research may choose to use more traditional and rigorous methods to delve deeper into experiences that were brought to the forefront in this study.

Strengths and limitations of the applied method

As expected when employing recurrent survey designs, recruitment and retention of participants across the survey series was challenging. The number of responses was higher early in the study, potentially due to the novelty of COVID-19 and the survey, the authority and reach of organizations involved in recruitment, and high motivation to contribute to an understanding of the impacts of COVID-19 in primary care. While surveys developed for this study were short (taking about 7 min to complete), retention of respondents dropped over time, with very few participants completing all surveys. Fewer responses over time may have been influenced by sustained high stress and fatigue of frontline primary care workers, and burden of (or reduced interest in) research participation over time. Retention appeared better in New Zealand compared with Australia, perhaps due to stronger media and organizational engagement in New Zealand [32] and lower impact of COVID-19. Of note, neither Australia nor New Zealand has national primary care research infrastructure, such as national practice-based research networks, to support recruitment, data collection or dissemination in primary care, meaning all research participation by primary care workers is wholly in addition to clinical workload, which likely increased participant burden and contributed to fewer responses over time [17, 18]. Limited research infrastructure meant convenience and snowball sampling via national organizations and professional networks was the best available approach for recruitment, though future researchers should note the potential restriction of participation to known and engaged parties. The online survey method was preferred during COVID-19 due to strict distancing measures, research ethics restrictions and strong reliance on email and online communication modalities. Still, future researchers should consider paper-based surveys, telephone and/or in-person communication to increase responses.
Finally, given the short, repeated nature of the survey for immediate feedback, we were unable to collect more contextual data such as primary care worker vaccination opinions or community attitudes towards COVID-19. Where possible, future researchers should triangulate with other datasets, grey literature and published opinion to support the validity of responses.

Limited retention of participants meant longitudinal analysis of individual trajectories within the data was not possible, and alternative approaches are required to describe the composite results of the survey series across time and space. A key strength of this methodological approach is our development of the iceberg metaphor, which provides a model for conceptualizing the relevance and interpretive limitations of our findings. The conceptual model supports integration of qualitative and quantitative data in two ways: data from all participants at a single timepoint can be synthesized and summarized as a unit, then differences and similarities across timepoints can be compared. Crucially however, the changing medical, economic and political context of the COVID-19 environment may have shaped participant responses between surveys. Building on existing rapid feedback reports based on individual surveys [27, 31], and incorporating publicly available government information and media reports, recurrent cross-sectional analysis [40] can be used to tell the story of how COVID-19 and associated policy and public health responses have impacted community-based primary care in Australia and New Zealand over time. Current findings reported elsewhere [24, 28] have highlighted emergent issues, including perceptions on border closures, rural–urban differences in impact [41, 42] and the burden of vaccine counselling experienced in primary care [43].

Lessons learned

Reflecting on our approach, we offer suggestions for future researchers undertaking similar studies. Early engagement and buy-in from stakeholders was key. Collaboration with influential organizations, such as RACGP and RNZCGP in this study, was essential to assist with recruitment, dissemination of findings and to provide feedback. We established early engagement with policy-makers through existing professional networks to provide insight into issues of relevance and to convey findings that might help shape decision-making. Connecting with the media was also important to help disseminate immediate findings and draw the attention of both prospective and previous participants. In this study we placed immediate priority on professional engagement and policy impact, with academic publication a secondary, longer-term priority. While rapid translation of evidence into practice and policy is becoming more valued in the research world, traditional metrics of publication are still highly regarded [44, 45], and studies employing this method should also plan for longitudinal analysis and dissemination to ensure maximum impact.


The COVID-19 pandemic has been a time of clinical and social uncertainty and upheaval that has placed a profound burden on primary care professionals. Understanding these changes has required an equally rapid adaptation in research methods. This rapid-cycle, recurrent survey identified and responded to immediate issues experienced in community-based primary care settings, highlighting, probing and communicating these to relevant professions and policy decision-makers in near real-time instalments. As such, the method proved feasible and accessible to implement, particularly during times of rapid change. Future research using a repeated cross-sectional approach should consider applying the conceptual model presented here to provide a rich, longitudinal description of data.

Availability of data and materials

Data generated and used for this paper are not publicly available because they are re-identifiable. De-identified data may be made available on reasonable request to the country team leaders: for Australian data, contact Professor Kirsty Douglas; for New Zealand data, contact Professor Felicity Goodyear-Smith.


  1. Australian Institute of Health and Welfare. Primary health care. Canberra; 2020. Contract No.: Cat. no. AUS 232.

  2. Goodyear-Smith F, Ashton T. New Zealand health system: universalism struggles with persisting inequities. Lancet. 2019;394(10196):432–42.

  3. The Royal Australian College of General Practitioners. Vision for general practice and a sustainable healthcare system. East Melbourne, Victoria; 2019.

  4. Australian Government Department of Health. Australian health management plan for pandemic influenza. Canberra: Australian Government; 2019.

  5. Desborough J, Dykgraaf SH, Phillips C, Wright M, Maddox R, Davis S, et al. Lessons for the global primary care response to COVID-19: a rapid review of evidence from past epidemics. Fam Pract. 2021.

  6. Desborough J, Hall Dykgraaf S, de Toca L, Davis S, Roberts L, Kelaher C, et al. Australia’s national COVID-19 primary care response. Med J Aust. 2020;213(3):104–6.

  7. Wilson G, Currie O, Bidwell S, Saeed B, Dowell A, Halim A, et al. Empty waiting rooms: the New Zealand General Practice experience with telehealth during the COVID-19 pandemic. NZ Med J. 2021;134(1538):89–101.

  8. Roberts L, Desborough J, Hall Dykgraaf S, Burns P, Kidd M, Maddox R, et al. Integrating general practice into the Australian COVID-19 response: a description of the GP Respiratory Clinic program in Australia. 2021.

  9. Baddock K. COVID-19-the frontline (a GP perspective). N Z Med J. 2020;133(1513):8–10.

  10. Sotomayor-Castillo C, Nahidi S, Li C, Hespe C, Burns PL, Shaban RZ. General practitioners’ knowledge, preparedness, and experiences of managing COVID-19 in Australia. Infect Dis Health. 2021.

  11. Halcomb E, McInnes S, Williams A, Ashley C, James S, Fernandez R, et al. The experiences of primary healthcare nurses during the COVID-19 pandemic in Australia. J Nurs Scholarsh. 2020;52(5):553–63.

  12. Borrello E, Pancia A, L. M. GPs under coronavirus pressure with masks in short supply and fears of what is to come. ABC News; 24 March 2020.

  13. Australian Medical Association. AMA COVIDSafe Practice Guide: AMA; 2020.

  14. Ritchie H, Mathieu E, Rodes-Guirao L, Appel C, Giattino C, Ortiz-Ospina E, et al. Explore the global situation: statistics and research: Coronavirus Pandemic (COVID-19). Our World in Data; 2021.

  15. Yazdizadeh B, Majdzadeh R, Ahmadi A, Mesgarpour B. Health research system resilience: lesson learned from the COVID-19 crisis. Health Res Policy Syst. 2020;18(1):1–7.

  16. Tremblay S, Castiglione S, Audet L-A, Desmarais M, Horace M, Peláez S. Conducting qualitative research to respond to COVID-19 challenges: reflections for the present and beyond. Int J Qual Methods. 2021;20:16094069211009680.

  17. Pirotta M, Temple-Smith M. Practice-based research networks. Aust Fam Physician. 2017;46(10):793–5.

  18. Lau P, Barnes K, Russell G, J. E. National primary health care practice-based research networks: Australasian Association for Academic Primary Care Inc.; 2022.

  19. Larry A Green Centre. Quick COVID-19 survey Richmond, VA, USA: Larry A Green Centre; 2021.

  20. Primary and Integrated Health Care Innovations Network. Canadian Quick COVID-19 Primary Care Survey Canada Canada: SPOR PIHCI; 2021.

  21. Onwuegbuzie AJ, Leech NL. On becoming a pragmatic researcher: the importance of combining quantitative and qualitative research methodologies. Int J Soc Res Methodol. 2005;8(5):375–87.

  22. Ramanadhan S, Revette AC, Lee RM, Aveling EL. Pragmatic approaches to analyzing qualitative data for implementation science: an introduction. Implement Sci Commun. 2021;2(1):1–10.

  23. General Assembly of the World Medical Association. World Medical Association Declaration of Helsinki: ethical principles for medical research involving human subjects. J Am College Dent. 2014;81(3):14–8.

  24. Steering Committee for the Review of Government Service Provision. Report on Government Services 2021: Part E, Section 10. Canberra; 2021.

  25. Australian Practice Nurses Association. General practice nursing: APNA; 2021.

  26. The Nursing Council of New Zealand. Te Ohy Mahi Tapuhi o Aotearoa. The New Zealand Nursing Workforce: a profile of nurse practitioners, registered nurses and enrolled nurses 2018–2019. Wellington, New Zealand; 2020.

  27. Douglas K, Barnes K, Hall Dykgraaf S, Douglas K. COVID-19 General practice clinicians: 5 minute survey: College of Health and Medicine, Australian National University; 2021.

  28. Creswell JW, Clark VLP. Designing and conducting mixed methods research. Thousand Oak: Sage publications; 2017.

  29. Outcomes Health. COVID-19 Data Insight Papers 2021. 2021.

  30. Australian Government Department of Health. Coronavirus (COVID-19) at a glance infographic collection: Australian Government; 2021.

  31. Goodyear-Smith F, Bui N, Eggleton K. Quick COVID-19 New Zealand Primary Care Survey - Results: The University of Auckland; 2021.

  32. The University of Auckland. Media Summary: Quick COVID-19 New Zealand Primary Care Survey 2021 [Available from:

  33. Clarke M. How can we avoid research waste during the COVID-19 pandemic and plan for the future? BMJ Opinion. 2020 [updated April 21, 2020; accessed December 2021].

  34. Glasziou PP, Sanders S, Hoffmann T. Waste in covid-19 research. London: British Medical Journal Publishing Group; 2020.

  35. Karmakar S, Dhar R, Jee B. Covid-19: research methods must be flexible in a crisis. BMJ. 2020;370: m2668.

  36. Kelly LM, Cordeiro M. Three principles of pragmatism for research on organizational processes. Methodol Innov. 2020;13(2):2059799120937242.

  37. Macintyre K. Rapid assessment and sample surveys: trade-offs in precision and cost. Health Policy Plan. 1999;14(4):363–73.

  38. Bradt DA, Drummond CM. Rapid epidemiological assessment of health status in displaced populations—an evolution toward standardized minimum essential data sets. Prehosp Disaster Med. 2003;18(1):178–85.

  39. UNICEF. Chapter 4: Key lessons from the documented assessments. In: Undertaking Rapid Assessments in the COVID-19 context. Kathmandu Nepal; 2021.

  40. Grossoehme D, Lipstein E. Analyzing longitudinal qualitative data: the application of trajectory and recurrent cross-sectional approaches. BMC Res Notes. 2016;9(1):1–5.

  41. Eggleton K, Bui N, Goodyear-Smith F. Making sure the New Zealand border is not our Achilles heel: repeated cross-sectional COVID-19 surveys in primary care. NZ Med J. 2021;134(1538):68–76.

  42. Eggleton K, Bui N, Goodyear-Smith F. COVID-19 impact on New Zealand general practice: rural-urban differences. Rural Remote Health. 2022;22(1):7185.

  43. O’Brien K, Barnes K, Dykgraaf SH, Douglas K. COVID-19 vaccinations and counselling—a mixed-methods survey of Australian general practice in July 2021. Aust J Prim Health. 2022;28(5):399–407.

  44. Hicks D, Wouters P, Waltman L, De Rijcke S, Rafols I. Bibliometrics: the Leiden Manifesto for research metrics. Nat News. 2015;520(7548):429.

  45. Boland L, Brosseau L, Caspar S, Graham I, Hutchinson AM, Kothari A, et al. Reporting health research translation and impact in the curriculum vitae: a survey. Implement Sci Commun. 2020;1(1):1–11.


Acknowledgements

This study is part of an international collaboration and case comparison series between the USA, Canada, New Zealand and Australia, led by the Larry A Green Center in Richmond, VA, USA. The authors sincerely thank Sarah Reves and Jonathan O’Neal from the Larry A Green Center for their logistical support and management of the survey.


Funding

Survey work of the Green Center was supported by the Agency for Healthcare Research and Quality, the Morris-Singer Foundation and the Samueli Foundation. The New Zealand arm received funding from an MBIE COVID-19 Innovation Acceleration Grant, and no funding was received for the Australian or Canadian arms. No funder had any role in data collection, analysis or interpretation.

Author information



Contributions

All authors were involved in study conception, design and analysis. KB and SH drafted the manuscript, and NB and KOB analysed and drafted the results. All authors edited and approved the final manuscript.

Corresponding author

Correspondence to Katelyn Barnes.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the Australian National University Human Research Ethics Committee (2020/273) and the University of Auckland Human Participants Ethics Committee (024659). Implied consent was detailed in the participant cover sheet, which stated that submission of the survey implied consent for use of de-identified data in publications.

Consent for publication

Not applicable.

Competing interests

The authors declare they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1:

Tables S1 and S2. Questions and answer formats for all surveys conducted in Australia and New Zealand during the study period.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit The Creative Commons Public Domain Dedication waiver ( applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Barnes, K., Hall Dykgraaf, S., O’Brien, K. et al. A novel methodological approach to participant engagement and policy relevance for community-based primary medical care research during the COVID-19 pandemic in Australia and New Zealand. Health Res Policy Sys 22, 13 (2024).
