Table 1 Overview of SPIRIT’s process effects and data sources

From: Policymakers’ experience of a capacity-building intervention designed to increase their use of research: a realist process evaluation

Desired process effects for the trial

Observed process effects

Supporting data sources

1. Leaders espouse SPIRIT and its goals

All CEOs disseminated initial information about their agency’s participation in SPIRIT, but only four had a continuing, visible role in supporting the intervention, e.g. sending updates and attending workshops; some executive members participated in each site, but to very different extents, ranging from a half-hour ‘drop-in’ to repeated and enthusiastic participation; many managers talked about SPIRIT in team meetings and encouraged their staff to attend

Interviews at two time points (early-intervention ‘context’ and post-intervention ‘perceptions and impact’), ad hoc conversations with participants

2. Liaison people facilitate the intervention effectively

The use of a liaison person was very effective in the sites where the liaison person was enthusiastic about SPIRIT; four of the six worked hard to promote, tailor and administer the intervention, harnessing insider knowledge and using creative strategies, whereas the other two did not tailor or promote the intervention as thoroughly and expressed negative views to colleagues about SPIRIT

Observations of workshops, interviews and conversations as above, feedback from the SPIRIT team about their communications with liaison people

3. Targeted policymakers participate in, and are receptive to, intervention activities

Participation levels were good in that they met the SPIRIT team’s expectations for each site; each agency targeted different groups for different components so proportions and types of participants varied, but liaison people were satisfied with attendance and were occasionally surprised by very high numbers; attendance at workshops averaged between 11 and 20 participants per workshop, with between 102 and 158 total occasions of attendance across the six sites; there was full participation in other activities (e.g. trialling the commissioned research services); receptivity varied tremendously within, but especially between, agencies: see next section for more details, including possible reasons

Quantitative fidelity data from observations (using checklists and sign-in sheets); observations, interviews and conversations as above

4. Participants actively contribute to the content of those activities

Where there was opportunity, participants contributed greatly to workshop content via questions, discussion and case examples; interactivity was limited on some occasions in all agencies, usually because the presenter provided few opportunities; in larger groups, more senior staff tended to dominate, but other participants said this was still useful. Some liaison people helped craft workshop content and provided agency-based case examples; one agency co-presented a workshop; the agency staff nominated to test the research commissioning service were actively involved

Observations of workshops, including descriptive accounts of interactions and dynamics

5. Participants identify potentially useful ideas, techniques and/or resources

94% of those who completed a feedback form said they found workshops to be both relevant to their work and realistic about policy challenges and constraints; many interviewees identified specific benefits from SPIRIT, including improved awareness of useful researchers and research resources, understanding of the evidence relating to a policy problem and access to existing agency resources

Participant feedback forms, observations of workshops, interviews and ad hoc conversations with participants and liaison people

6. Participants use, or plan to use, these ideas, techniques and/or resources

Workshops facilitated less discussion than intended about how learning might be applied, but 95% of participants who completed a feedback form agreed, “It is likely that I will use information from this workshop in my work”; some interviewees said they planned to use ideas or resources, and a few had done so, especially newer staff; three liaison people had management-approved plans underway for research-focused education and/or systems improvement, e.g. mandated consideration of research in policy proposals; two agencies had plans to use their commissioned research products

Desired process effects for the evaluation

Observed process effects

Supporting data sources

7. Liaison people facilitate data collection effectively

All liaison people facilitated data collection sufficiently, although it was occasionally delayed and required prompting; where liaison people championed SPIRIT, they used additional strategies to encourage participation in data collection; in one agency this achieved a 100% response rate

Outcome measures completion figures, interviews with participants and liaison people, feedback from SPIRIT team

8. Targeted participants take part in data collection

In all agencies, there was full participation in the two interview-based measures, but more variable responses to the anonymous online survey; response rates dipped at the second measurement point, but stabilised after the survey was shortened; overall, the online survey response rate was 56% and there was a mean 74% response rate for process evaluation feedback forms; only three-quarters of invitees took part in a process evaluation interview

Outcome measures completion figures, interviews with participants and liaison people

9. The benefits of the intervention are judged to outweigh the burdens of the trial

Interviewees differed considerably in their assessments of the intervention, but where they felt it had benefits, these were deemed to outweigh the trial’s burdens; this included those liaison people who championed SPIRIT from the start; workshops with high-profile and dynamic ‘service-orientated’ presenters were especially valued; nearly 98% of all feedback form respondents agreed with the statement, “It is likely that SPIRIT will benefit my agency”

Early-intervention and post-intervention interviews, ad hoc conversations with participants and liaison people, feedback form data