Design
Coding of the intervention components and parameters against the underpinning theory demonstrated that the intervention reflected its theoretical underpinnings (Table 2).
Table 2: Coding of the intervention components to Normalisation Process Theory (NPT; 13).
NPT Domain: Construct | Present | Evidence
Coherence: Differentiation | Yes | Overview of what intervention practices will need to do for the IPCAS trial: • Invite participants to an enhanced stroke review (including sending the 15-item checklist). • Perform structured stroke reviews and record outcomes on the trial-specific practice template. • Distribute information about the “My Life After Stroke” (MLAS) programme. • Provide a DPoC telephone service for stroke survivors and their carers. • Attend a one-off 2-hour meeting with staff from the community and hospital stroke teams. Source of evidence from relevant intervention materials: Page 1 (IPCAS Training manual v0.4)
Coherence: Communal specification | Yes | Communication with specialists: • Describe who will be attending (GPs, staff conducting reviews, acute, ESD, community). • Describe the structure and purpose of the communication meeting: - To build relationships - To discuss how best to contact each service (e.g. named contact or central number?) - To discuss re-referrals to secondary services. • Discuss who from the practice will be able to attend (e.g. PI and/or staff conducting reviews). Page 4 (IPCAS Training manual v0.4)
Coherence: Individual specification | Yes | Practice staff pair up with a member of the research team (who will act as the patient) and go through the vignette: • Ask the practice staff to vocalise/discuss how they might approach the situation - What would they want to know? / How would they ask it? Page 6 (IPCAS Training manual v0.4)
Coherence: Internalisation | Yes | • Discuss with the staff how they would normally go about reviewing the needs of a patient. - Prompts: Is this a new problem? / Has it recently worsened? / Has the patient seen anyone about it before? Action plan (e.g. consultations with GPs / other services / use the service mapping?) Page 5 (IPCAS Training manual v0.4)
Cognitive participation: Initiation | Yes | Content: Briefly explain the purpose and structure of the training. • All staff will be present for the first half of the training (1 hour). • All staff will need to sign the trial paperwork at the end of the first half. • Section 6, structured review core training, will be delivered only to the staff members who will be conducting stroke reviews; this will last roughly another hour. Page 1 (IPCAS Training manual v0.4)
Cognitive participation: Enrolment | Yes | • Who will attend the communication meeting and what are the best days/times for them. Page 3 (IPCAS Training manual v0.4)
Cognitive participation: Legitimation | Yes | Direct point of contact: Make sure the practice admin lead (or equivalent) is aware that they will need to disseminate what is discussed to the rest of the practice team (e.g. reception staff). • Our development work highlighted that stroke survivors often don’t know who to contact if they have problems relating to their stroke; the DPoC component aims to address this. Page 3 (IPCAS Training manual v0.4)
Cognitive participation: Activation | Yes | Resources: • DPoC guidance document. • Screenshots of the DPoC practice template and completion work instructions. • Hard-copy directory of stroke services and ‘cheat sheet’ for staff conducting reviews. • Electronic version of the service mapping Excel file. - Research team to file copies of all of these documents in the ISF. Page 3 (IPCAS Training manual v0.4)
Collective action: Interactional workability | Yes | • How the practice will operationalise the DPoC and who will be acting as the contact. Page 3 (IPCAS Training manual v0.4)
Collective action: Relational integration | Yes | • Talk staff through the Excel spreadsheet and how to search it. - Research team to ask what format would best suit the practice and where the resources will be kept. Page 4 (IPCAS Training manual v0.4)
Collective action: Skill-set workability | Yes | • How the practice will operationalise the enhanced stroke reviews and when they are planning to complete them. Page 3 (IPCAS Training manual v0.4)
Collective action: Contextual integration | Yes | • What format for the service mapping resource is best for the practice and where it will be stored. • How stroke reviews are usually done in the practice. Page 3 (IPCAS Training manual v0.4)
Reflexive monitoring: Systematisation and Communal appraisal | Yes | The process of assessing fidelity of delivery was a formal way of assessing whether providers were delivering the intervention as intended. This was part of the regular communication between the research team and intervention providers, as below. Phone calls with the research team: • The research team would like to call the practice (after every 5 reviews) to discuss any problems and provide feedback. - Research team to discuss which days/times would be best for these calls. Page 6 (IPCAS Training manual v0.4)
Reflexive monitoring: Individual appraisal | Yes | • Outcomes of the review should be recorded on the practice template. Page 5 (IPCAS Training manual v0.4)
Reflexive monitoring: Reconfiguration | Yes | Stroke review: • Briefly describe the enhanced stroke review. - Research team to record how stroke reviews are usually done in the practice. - Research team to record how enhanced stroke reviews will be operationalised. Page 4 (IPCAS Training manual v0.4)
Convergence between intervention and control conditions
All 46 practices reported on their stroke review practices (i.e., ‘usual care’) before intervention rollout. Most practices undertook reviews with stroke survivors in person (n= 39) annually (n= 44). Reviews of stroke-related needs were usually done as part of a multimorbidity review (n= 32), and were undertaken by various professionals including GPs, nurses and healthcare assistants. The reviews covered Quality and Outcomes Framework (QOF) indicators (i.e., blood pressure reading, cholesterol and blood tests) as well as smoking status, Body Mass Index, and ‘lifestyle’ behaviours (i.e., diet and physical activity).
The intervention and control conditions shared some components, notably a structured review and measurement of QOF indicators; these overlapping elements reflect partial convergence between the experimental conditions. The remaining intervention components (i.e., directory of stroke services, direct point of contact, enhanced communication, bespoke training of healthcare professionals, MLAS) were unique to the intervention condition, i.e., absent from the control condition. We found no evidence of contamination between clusters.
Adaptations to the intervention
We observed three adaptations to the intervention. To support improved communication between primary and secondary care, it was intended that the research team would set up face-to-face meetings between the specialist stroke team and general practice staff. However, this proved logistically difficult to arrange because of the limited availability of personnel in both settings to attend such meetings. A pragmatic approach was therefore adopted whereby primary care staff were provided with videos of the specialist staff explaining their service and how the practice could contact them.
Second, although it was originally planned that a practice nurse would deliver the reviews, some practices used a GP, a healthcare assistant, or a research nurse external to the practice to deliver them.
Finally, it was planned that a group of MLAS facilitators separate from the research team would deliver the courses; however, due to facilitator attrition, members of the research team who were trained in MLAS delivered some of the courses.
Training
IPCAS training sessions
Sixty-three HCPs (24 nurses, 18 GPs, 18 practice administrators, 3 healthcare assistants) from 23 general practices were trained between June 2018 and July 2019. Nineteen training sessions were conducted; sessions lasted between one and two hours, with an average of three trainees per session.
Four training sessions were audio-recorded (21%; 4/19). Training fidelity was high: 96.1% (SD= 6.0; range= 87.5-100.0) of the 16 planned components were delivered (Table 3 and Additional File Table e1).
Table 3: Fidelity of training scores for IPCAS sessions
Session ID | Trainer | Fidelity of training score (%)*
A | 01 | 30 (93.8)
B | 01 | 32 (100.0)
C | 01 | 31 (96.9)
D | 02 | 30 (93.8)
*Scoring: 2= done, 1= partially done, 0= not done (maximum score= 32 from 16 items). The two independent raters (RA, JG) achieved 86% agreement (prevalence- and bias-adjusted kappa (PABAK)= 0.72).
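For readers checking the arithmetic, the sketch below (illustrative only, not the study's analysis code) shows how a session's fidelity percentage follows from the 0/1/2 item scores and how the reported PABAK value follows from 86% observed agreement under the standard two-rater formula; the item-level split for session A is hypothetical, chosen only to give the reported raw total.

```python
# Illustrative sketch (not the authors' analysis code): reproducing a fidelity
# percentage from Table 3 and the reported PABAK value.

def fidelity_percent(item_scores, max_per_item=2):
    """Fidelity % = total item score / maximum possible score * 100."""
    max_total = max_per_item * len(item_scores)
    return 100 * sum(item_scores) / max_total

# Session A: 16 items summing to a raw score of 30; the 14/2 split between
# "done" and "partially done" is hypothetical, chosen only to give that total.
session_a = [2] * 14 + [1] * 2
print(round(fidelity_percent(session_a), 1))  # 93.8, as in Table 3

def pabak(observed_agreement):
    """Prevalence- and bias-adjusted kappa for two raters: 2 * Po - 1."""
    return 2 * observed_agreement - 1

print(round(pabak(0.86), 2))  # 0.72, matching the reported PABAK
```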
In interviews, trainees reported that training met their needs adequately, giving them the confidence to deliver the structured review.
“No, once I'd started obviously the first patient was a bit, er, oh my word, am I doing this right or not. But after a couple I found that all the stuff that I had been taught and I've learnt actually came in useful and it all came flooding back after a couple.” [Healthcare assistant]
Perceived benefits of training included having a clear understanding of the research methodology and enhanced knowledge of the heterogeneity of stroke survivors. Furthermore, having ongoing support and knowing the research team was always on hand helped build and maintain confidence.
“It was quite an eye-opener for me [...] to see how patients’, you know, experiences were very different...Yeah, yeah [...] their journey post stroke had been very different.” [Nurse]
MLAS training sessions
Two 3-day training sessions were conducted for MLAS in May 2018 and January 2019. The first session had 13 trainees and the second session had 10 new trainees and 10 trainees from the first session (who attended the course as a refresher). Six facilitators dropped out: three before running any MLAS courses and three after delivering one MLAS course.
All three days of the first MLAS training session were video-recorded. Training fidelity was high: 87.5% of all planned training components were delivered (88.9% of planned materials; 86.9% of planned content) (Table 4).
Table 4: Fidelity of training scores from coded video-recordings*†¥
Training Day | Materials score (%) | Content score (%) | Day total score (%)
Day 1 | 36 (81.8) | 90 (90.0) | 126 (87.5)
Day 2 | 40 (100.0) | 95 (89.6) | 135 (92.5)
Day 3 | 36 (85.7) | 60 (79.0) | 96 (81.4)
*Scoring: 2= done, 1= partially done, 0= not done
†Number of items scored: Day 1 – 72 (22 materials; 50 content); Day 2 – 73 (20 materials; 53 content); Day 3 – 59 (21 materials; 38 content)
¥Note: 16 items (8 planned content; 8 materials – all from Day 3) were excluded from the analysis because video-recorded data were not obtained due to the nature of the session.
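The overall training fidelity figures quoted above (87.5% overall; 88.9% materials; 86.9% content) are consistent with pooling the raw day-level scores against the item counts in the table footnote. A minimal sketch of that calculation, offered as an illustration rather than the study's analysis code:

```python
# Illustrative check: pool raw MLAS training scores across the three recorded
# days, using the item counts from the table footnote (maximum 2 per item).

days = {
    # day: (materials_score, materials_items, content_score, content_items)
    1: (36, 22, 90, 50),
    2: (40, 20, 95, 53),
    3: (36, 21, 60, 38),
}

mat_score = sum(d[0] for d in days.values())    # 112
mat_max = 2 * sum(d[1] for d in days.values())  # 126
con_score = sum(d[2] for d in days.values())    # 245
con_max = 2 * sum(d[3] for d in days.values())  # 282

print(round(100 * mat_score / mat_max, 1))      # 88.9 (planned materials)
print(round(100 * con_score / con_max, 1))      # 86.9 (planned content)
print(round(100 * (mat_score + con_score) / (mat_max + con_max), 1))  # 87.5 (overall)
```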
Evaluation forms:
Feedback from MLAS training evaluation forms was positive (n=11 for day 1 and n=13 for days 2 and 3; Additional File Table e2). The majority of respondents strongly agreed or agreed with most questions about knowledge of the MLAS curriculum and expected facilitator behaviours.
Delivery
IPCAS
Structured reviews were delivered by 24 healthcare professionals (19 nurses, 1 GP, 3 healthcare assistants, 1 research administrator) from 23 general practices. The median practice list size was 11,168.5 (95% CI 8,775, 14,317).
Thirty-four structured reviews (8%; 34/421) across 17 GP practices were audio-recorded, and 47 structured phone calls were made to 25 HCPs across 22 GP practices. Audio-recorded observations indicated that reviews were delivered with moderate fidelity: 68.6% (SD= 8.2; range= 13.9-100.0), whereas structured phone calls indicated high fidelity: 83.1% (SD= 9.1; range= 41.7-100.0) (see Additional File, Tables e3 and e4).
Of nine delivery questionnaire items, audio-recorded observations found one was low fidelity (use of the service mapping tool), five were moderate fidelity and three were high fidelity. In contrast, the structured phone calls found that none were low fidelity; four were moderate fidelity and five were high fidelity (Table 5).
Table 5: Fidelity of delivery of structured review average scores per item for audio-recordings and structured phone calls to healthcare providers
Delivery questionnaire item† | Audio-recordings (n=34): average score (max = 2)* | Audio-recordings (n=34): average % score | Structured phone calls (n=47): average score (max = 2)* | Structured phone calls (n=47): average % score
1a – stroke survivor completed checklist | 1.7 | 87.0 | 1.3 | 66.0
1b – discussed up to 3 needs | 1.8 | 91.0 | 1.7 | 87.0
3a – discussed action plan | 1.4 | 72.0 | 1.5 | 76.5
3b – logged/reviewed actions | 1.3 | 63.0 | 1.4 | 70.0
4a – provided MLAS information/leaflet | 1.7 | 84.0 | 1.8 | 91.5
4b – provided instructions for accessing MLAS | 1.4 | 69.0 | 1.9 | 94.5
5a – explained Direct Point of Contact service | 1.2 | 60.5 | 2.0 | 99.0
5b – provided instructions for accessing Direct Point of Contact | 1.1 | 56.0 | 2.0 | 99.0
6 – used service mapping tool | 0.7 | 35.5 | 1.3 | 65.0
†Note: 2a-2c excluded because they were optional items
*Scoring: 2= yes, 1= unsure, 0= no (maximum score= 18 from 9 items)
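In Table 5, each average % score is the mean item score expressed as a proportion of the item maximum of 2. A minimal sketch of that arithmetic, using hypothetical scores rather than trial data (published rounding conventions may differ slightly):

```python
# Illustrative sketch: how an average item score (maximum 2) maps to the
# average % score in the adjacent column. Example scores are hypothetical.

def summarise_item(scores, max_per_item=2):
    mean_score = sum(scores) / len(scores)
    pct = 100 * mean_score / max_per_item
    return round(mean_score, 1), round(pct, 1)

# e.g. an item scored "yes" (2) in 30 of 34 audio-recorded reviews and "no" (0) in 4
example_scores = [2] * 30 + [0] * 4
print(summarise_item(example_scores))  # (1.8, 88.2)
```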
Interviews generated three themes relating to HCPs’ experience of delivering IPCAS and factors influencing delivery.
HCPs' experience of delivering IPCAS
The needs checklist was considered useful: it focused the conversation, thereby optimising consultation time, and facilitated patient-centred consultations by encouraging stroke survivors to discuss their needs openly and identify the issues important to them.
“… with them coming in with that pre- filled in it really did focus one’s mind on areas that were obviously important to the patient. So yeah, it was good.” [General practitioner]
However, some HCPs found the checklist too lengthy.
HCPs had mixed views about the usefulness of the service mapping tool. Some found it too lengthy and complex; however, others reported that, after familiarisation, it became easier to use.
“Yes, that was as clear as mud! That was a very long list of…that definitely could be improved […] Yeah, and it was very small print, and it was very, kind of, you really were having to, you know, it…yeah […] The print was small, there was too much on one…was it a piece of paper or was it a screen?” [Practice nurse]
HCPs reported that they had received clear instructions on how to deliver the direct point of contact; however, all but one participant reported that they had not knowingly received any communications from the stroke survivors using this service. Practice staff perceived that a lot of effort was put into setting up the direct point of contact service.
“None of the patients had made any contact with us on the direct point of contact, because we have a special template on for that, and all the staff were educated as to what to do if a patient who is on the IPCAS trial did phone in and needed to speak to either myself or the nurse who was involved, and we’ve not had one phone call about that. So that was interesting.” [General practitioner]
HCPs varied in how they introduced MLAS to stroke survivors: some simply handed out the MLAS leaflet, whereas others explained the course in detail.
Action planning, whether providing further information/advice or making onward referrals to health and social care services, appeared to make little or no difference to how HCPs referred their patients.
“I think I did a normal routine referral for somebody to a physiotherapist, and I think I did one to occupational therapy to check the house, but it didn’t change the way I would normally do it. Certainly didn’t make any difference at all. I would have done it if I’d seen the patient ordinarily, but it just came up at the IPCAS meeting that they needed that so I did the referral.” [General practitioner]
Factors influencing delivery
Mostly, the healthcare staff who delivered the IPCAS intervention were based in the general practice. However, in five practices, nurses external to the practice delivered the intervention. These external nurses experienced challenges relating to not knowing the patient beforehand and a lack of familiarity with the practice's IT hardware and administrative processes. In contrast, where the nurse conducting reviews was an integral member of staff within the surgery, there was often an already established rapport with the patient, which led to smoother delivery.
“I think, because I’ve seen these patients or I had seen them for the last five, six years, a lot of information that we were trying to find out, we had already discussed in previous appointments, not maybe as much in depth …” [Practice nurse]
Having the clinical autonomy to make referrals made a difference to how an action plan could be executed; some staff were restricted, for example by being reliant upon GPs to refer patients.
“...it has got to be through a GP, because a lot of the hospital-based clinics don’t like nurse referrals, so, you know.” [Practice nurse]
HCPs who had previously worked in hospital found this experience helped in delivering the IPCAS review. Experience of technology use influenced delivery; for example, HCPs with limited IT experience struggled with the templates, whereas others found them straightforward and easy to use.
MLAS
Twenty-two MLAS courses were conducted. Six (27.3%; 6/22) MLAS sessions were observed. MLAS fidelity of delivery was high: 86.4% (SD= 5.4, range= 78.1-92.2, 95% CI= 81.6, 89.2) (Table 6).
Table 6: Fidelity of delivery of MLAS (observations)
Course number | Session observed | Raw fidelity of delivery score/ maximum score | % score
18 | Group session 1 | 82/96 | 89.1
24 | Group session 1 | 75/96 | 78.1
12 | Group session 2 | 93/102 | 92.2
37 | Group session 2 | 90/102 | 88.2
18 | Group session 3 | 80/98 | 81.6
22 | Group session 4 | 91/102 | 89.2
Receipt and enactment
IPCAS
Of 522 intervention participants, 80.7% (421/522) attended the structured review. Reviews lasted 27.9 minutes on average (range 16.9-39.4) (see Additional File 1, Table e5).
For the checklist and action plan, data were available for 93.4% (393/421) of participants from 22 GP practices.
Over half of the participants (56.3%; 237/421) had at least one action plan recorded. Across 22 practices, 431 action plans were recorded: 29.5% (127/431) follow-up appointments, 25.3% (109/431) referrals, and 45.2% (195/431) advice (see Additional File 1, Table e7).
Sixty-seven participants (15.9%; 67/421) from 22 GP practices completed a fidelity of receipt questionnaire by telephone. Participants reported receiving 63.2% of structured review components, indicating moderate fidelity (see Additional File 1, Table e8). Of the nine receipt questionnaire items, two were low fidelity, five were moderate fidelity, and two were high fidelity (Table 7).
Table 7: Fidelity of receipt of structured review average scores per item (n= 67 participants across 22 practices)
Receipt questionnaire item† | Average score (maximum score per item = 2)* | Average % score
1 – attended structured review | 2.0 | 98.5
2a – completed 15-item checklist | 1.7 | 87.0
2b – discussed up to 3 needs | 1.4 | 69.5
4a – discussed action plan | 1.0 | 51.0
4b – opportunity to review/note agreed action plan | 1.0 | 50.0
5a – received MLAS information/leaflet | 1.6 | 79.0
5b – received instructions for accessing MLAS | 1.4 | 72.0
6a – received information about Direct Point of Contact | 0.7 | 35.5
6b – received instructions on accessing Direct Point of Contact | 0.6 | 31.0
*Scoring: 2= yes, 1= unsure, 0= no
†N.B.: Items 3a-c excluded from the analysis because they were optional items
Qualitative interviews were conducted with 19 intervention participants. Four themes were identified: Views and experiences of structured stroke reviews; Perceptions of eligibility for stroke care or support influential to engagement; Engagement with other intervention components, materials and resources; and Benefits gained from participation in IPCAS.
Views and experiences of structured stroke reviews
Stroke survivors had mixed opinions of the value of structured stroke reviews. Some intervention participants reported feeling cared for and that reviews addressed issues they might not have associated with their stroke (e.g., daily living activities, exercise). In contrast, several stroke survivors could not distinguish the IPCAS structured stroke review from other contacts with the GP practice, owing to high volume of attendance in primary care for other health issues, or impaired memory. Barriers to attendance at the structured stroke review were work commitments and identifying as well-recovered from their stroke.
“I was invited to go to the GPs, but the days that they…because they wanted to do an interview there, and I’d had to say, I’m more than willing to do the interview, but I work fulltime, I physically can’t get there to do that at the moment.” (Female, 35 years old, 10 years 11 months post-stroke, East Midlands)
Perceptions of eligibility for stroke care or support influential to engagement
Intervention participants who were several years post-stroke found the IPCAS model of care less relevant or useful to them; however, they acknowledged that structured stroke reviews could be beneficial at earlier stages post-stroke.
“What most people need, I would imagine, is support when they first come out and then to review it as they go along.” (Male, 84 years old, 5 years 4 months post-stroke, Norfolk)
A barrier to attendance/participation in IPCAS activities was participants’ views that they were fit and well, and therefore any support or care offered to them should be allocated to those experiencing severe post-stroke impacts.
“I didn’t think I was serious enough. I thought I was so fit I thought I’d be taking up space for somebody else.” (Male, 77 years old, 1 year 6 months post-stroke)
Participants emphasised the importance of providing individualised or tailored support. Some intervention participants reported the 15-item checklist questions did not match their age or experience of stroke (i.e. not severely affected/recovered fully).
Engagement with other intervention components, materials and resources
Many participants did not recall receiving information about the direct point of contact; of those who were aware of it, none accessed it because they felt it was unnecessary. Many reported that the direct point of contact was a good idea in principle; however, others thought it was superfluous given the availability of other services such as 111 or an established relationship with their GP surgery.
Participants who completed the checklist found it useful for identifying needs and as an opportunity to share their experiences or symptoms with a healthcare professional, which they had not had the opportunity to do in the past. Participants reported that the checklist enhanced their understanding of their post-stroke experiences and helped them address concerns where needed.
“Yes, it [15-item checklist] was useful, because it focused me on what has happened to me since or what I’ve found problems, and I think exercise was one of the main things”. (Female, 74 years old, 4 years 6 months post-stroke, Ipswich)
Some participants found the tailored service directory (‘service mapping tool’) provided during MLAS useful, specifically information on services that were local to them that they wouldn’t have heard about otherwise.
“It [service mapping tool] was very helpful, especially when they gave […] us a sheet of different organisations to try and help. One was to do with transport.” (Male, 61 years old, 6 years 4 months post-stroke, Norfolk)
Benefits gained from participation in IPCAS
Participants found that through engaging with intervention components such as the 15-item checklist of needs and MLAS, they deepened their knowledge and understanding of stroke impacts, increasing their confidence to seek support where needed.
“I feel that now, albeit it is a while afterwards, if there was a problem and I thought it was connected, I feel a lot more confident in ringing the doctors and not having to explain myself.” (Female, 65 years old, 4 years 2 months post-stroke, Norfolk)
Concerning action planning, some participants reported being offered and accepting extra support or referrals (e.g. memory assessment, physiotherapy) to services at the stroke review.
“One thing…yes, he did suggest [at the stroke review] that I had a memory test, and he wasn’t available to do it, he said I could see one of his colleagues and I did go for that.” (Female, 74 years old, 4 years 6 months post-stroke, Ipswich)
MLAS
Of the 420 participants invited to an MLAS course, only 139 took part, of whom 102 completed it. However, those who did attend appeared to value it. Participant data on receipt and enactment of the MLAS self-management programme are reported elsewhere (14).