Setting and context
Ontario is Canada’s largest province, with a population of more than 14 million people and 123 acute care and 58 rehabilitation hospitals spread across 14 Local Health Integration Networks (LHINs), the province’s designated geographical health regions. The hospitals are commonly grouped as rehabilitation or acute care and as academic, community, or small hospitals.[13, 14] This project was the result of an Accelerated Research to Improve Care (ARTIC) grant, government funding for province-wide quality improvement (QI) implementation of evidence-based initiatives.[15] Following the consolidated framework for implementation research (CFIR),[16] factors known to facilitate successful implementation were built into both the process of identifying participating hospitals and the support provided during implementation. As part of the project application process, hospitals were selected based on the presence of an executive sponsor and a clear rationale for implementing PODS. Organizations were also required to have previous quality improvement experience and sufficient capacity for implementation. Hospitals were chosen from all 14 LHINs and from within all hospital categories to be representative of the province as a whole.
Of 42 hospitals that applied, 21 were invited to participate in a community of practice (CoP) and received a stipend to support their implementation and data collection. The hospitals involved all had a high level of organizational commitment and readiness, but ranged widely in size, geographic area, target patient population, discharge process (i.e., which members of the healthcare team provided patient education at discharge), whether PODS was implemented in isolation or as part of broader discharge process improvements, and whether the process was supported through the electronic medical record (EMR). There were eighteen acute care and three rehabilitation hospitals; eight were considered academic hospitals, five were large community hospitals, and eight were small community hospitals with fewer than 100 beds (Table 1).
Table 1
Description of target populations in each hospital implementing PODS (n = 21 hospitals).
| ID | Type | Size | Target population | Rationale for implementing PODS | Main responsible provider | When is it done |
|----|------|------|-------------------|---------------------------------|---------------------------|-----------------|
| 1 | A | A | Mental health inpatients | Poor existing process | Social worker or nurse | Week of discharge |
| 2 | A | A | All inpatients | Quality and consistency | Physician | Week of discharge |
| 3 | A | A | All medicine acute and sub-acute | Quality and consistency | Physician | Week of discharge |
| 4 | A | A | All inpatients | General patient-centred care | Multidisciplinary | Week of discharge |
| 5 | A | A | Medicine, chronic disease, oncology, surgery | General patient-centred care | Nurse | Day before discharge |
| 6 | A | A | Medicine, focus on elderly | General patient-centred care | Team | Day of discharge |
| 7 | A | C | All surgery | General patient-centred care | Nurse | Over whole stay |
| 8 | A | C | Mental health ED and inpatient followed by rehab | Reduce readmissions | Nurse | Week of discharge |
| 9 | A | C | All inpatients and ED (QBPs at first) | Poor existing process | Nurse | Day before discharge |
| 10 | A | C | Surgery | Poor existing process | Nurse | Day before discharge |
| 11 | A | S | Medicine, surgery, and rehab | Poor existing process | Nurse | Day before discharge |
| 12 | A | S | Medicine, surgery, and ED | Poor existing process | Nurse | Week of discharge |
| 13 | A | S | All inpatients (target CHF, COPD, and stroke at first) | Reduce readmissions | Nurse | Week of discharge |
| 14 | A | S | All inpatients and ED | General patient-centred care | Nurse | Day before discharge |
| 15 | A | S | All inpatients | Quality and consistency | Nurse | Week of discharge |
| 16 | A | S | All geriatric inpatients | General patient-centred care | Nurse | Week of discharge |
| 17 | A | S | Medicine, surgery, and obstetrics | Poor existing process | Nurse | Day before discharge |
| 18 | A | S | Medicine, surgery, and obstetrics | Quality and consistency | Nurse | Week of discharge |
| 19 | R | A | All inpatients (rehab includes stroke) | Poor existing process | Social worker or nurse | Over whole stay |
| 20 | R | A | Inpatient stroke | Quality and consistency | Nurse | Over whole stay |
| 21 | R | C | Rehab including stroke | General patient-centred care | Nurse | Week of discharge |

Type: A = acute, R = rehab; Size: A = academic, C = community, S = small
Our team supported the implementation of PODS at the 21 hospitals in three stages: (1) Start Up: hospitals were guided through the process of adapting the PODS tool and process together with stakeholders; (2) Plan Do Study Act (PDSA): hospitals “went live” with either a pilot group or their full target group, and then iteratively tested, refined, and evaluated the tool and process while implementing; and (3) Scale Up: hospitals spread the use of the tool and mentored others who were interested. Using a supported community of practice model, our team hosted regular meetings for education, knowledge sharing, and mentorship. A website was developed to house central resources and collective knowledge.[17] Project teams were encouraged to engage with patients and families throughout the project. A project advisory group with representation from hospitals, the community, and patients, including authors (SHG, TH, AC, HA, CB, KO), was formed to guide the project. The group provided advice on project implementation, evaluation, and interpretation of results.
The project started in April 2017, with hospitals beginning implementation of PODS between April 2017 and March 2018; the majority of hospitals “went live” in October and November of 2017. Evaluation data was collected between April 2017 and December 2018 and spanned all three stages: Start Up, when hospitals had yet to discharge any patients with a PODS, and PDSA and Scale Up, when hospitals expanded the number of patients receiving the intervention.
Study design
Due to variation in adaptation and implementation across organizations, we chose to use an effectiveness-implementation hybrid design that uses both quantitative and qualitative methods for evaluating complex interventions.[18] Specifically, we employed a hybrid design that tested implementation while also collecting information on the intervention and related outcomes.
Data collection and analysis
Implementation was studied by examining the percentage of the target population reached, the pattern and quality of implementation, and spread of implementation outside of the target group. Individual, organizational, and system factors impacting implementation were also studied. Effectiveness was studied through patient-centred processes, patient understanding, patient and provider experience, and unscheduled healthcare resource utilization.
Quantitative
Implementation data was collected through hospitals’ quarterly reports of implementation reach within the target population and spread outside of the target population. Additionally, during end-of-project interviews, project teams were asked to rate the quality of their implementation using measures of consistency, completeness, quality of the content of their PODS, and quality of their process for providing PODS to patients. Factors impacting implementation were collected through demographic data on the target populations, reported quarterly by hospitals, and through information about the organization and its discharge process collected during end-of-project interviews with each project team (see Additional File 1 for the interview guide). Factors were chosen based on whether the project advisory team considered them important, whether they were present in the literature on implementation success, or whether they emerged as a theme in the qualitative data collected throughout the project.
Effectiveness data was collected through provider and patient/family surveys developed for this project (see Additional File 1), as well as hospital-reported data. Providers involved in the discharge process, as well as a random sample of patients and families in the target population, completed hospital-administered surveys pre- and post-implementation. Provider survey measures included the use of teach-back methods and involvement of family during discharge education, to assess whether using PODS impacted important, evidence-based patient-centred processes. Additionally, providers reported whether the intervention added to workload and whether it added value to the discharge experience, chosen as balancing measures that might impact the success and sustainability of the implementation. Patient and family survey measures included whether the patient had a discussion in hospital about the help they would need at home, a rating of understanding of medications, a rating of understanding of what to do if worried about their condition, and whether the intervention added value to the discharge experience. These questions were chosen because they relate to transitions in care and appear in the Canadian Institute for Health Information Patient Experience Survey for Inpatient Care, a validated survey used at many Canadian hospitals.[28] Quarterly submissions from hospitals included data on all-cause 30-day return ED visits and readmissions for the target population.
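To make the utilization outcome concrete, the logic for deriving an all-cause 30-day readmission (or return ED visit) flag from discharge and subsequent admission dates can be sketched as follows. This is a hypothetical illustration, not the study’s actual code (the analyses were done in R), and the function and field names are assumptions.

```python
# Hypothetical sketch of an all-cause 30-day readmission flag derived from
# a discharge date and the date of the next admission (if any).
from datetime import date
from typing import Optional

def readmitted_within_30_days(discharge: date, next_admission: Optional[date]) -> bool:
    """True if the next admission falls 1-30 days after discharge."""
    if next_admission is None:
        return False  # no subsequent admission on record
    return 0 < (next_admission - discharge).days <= 30

# Illustrative dates only
print(readmitted_within_30_days(date(2017, 10, 1), date(2017, 10, 25)))  # True
print(readmitted_within_30_days(date(2017, 10, 1), date(2017, 11, 15)))  # False
```

The same windowed-date logic applies to 30-day return ED visits, with the ED visit date in place of the admission date.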
Qualitative
Qualitative data was collected throughout the project through surveys and presentations at community of practice meetings and through open-ended questions in both the provider and the patient and family surveys. Themes that emerged from these data were used to create a semi-structured interview guide developed specifically for this project (see Additional File 1), which was then used to conduct end-of-project telephone interviews with project teams from each of the 21 hospitals. Interviews were conducted by authors SHG (PhD) and CM, who were coordinating the CoP and are trained in qualitative research methods. Project teams knew the purpose of the qualitative data collection, which covered each hospital’s implementation process, the patient and provider experience, and barriers and facilitators impacting the implementation, guided by the CFIR framework.[16] Interviews lasted between 30 and 60 minutes and were recorded and transcribed for analysis.
Statistical analysis
Implementation was evaluated by measuring the proportion of patients in the target population who received a PODS in each quarter. To this end, a generalized linear mixed effects model was used, with fixed effects for quarter relative to implementation and for between-site covariates, and random slopes and intercepts for each site to model within-site temporal trends. PODS spread outside of the target population was assessed using a zero-inflated mixed effects negative binomial model. Exploratory analysis examined associations between patient-, hospital-, and process-level factors, as well as quality ratings, and whether a hospital had high implementation within the target population (defined as 75% of the target population reached in the fourth quarter after implementation). Factors were chosen as supported by the literature and the qualitative findings.
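The reach measure and the high-implementation threshold described above can be sketched in a few lines. This is an illustrative simplification, not the study’s code (the modelling was done in R); the function names, counts, and the assumption that the threshold is met at exactly 75% are ours.

```python
# Hypothetical sketch of quarterly reach and the "high implementation" flag
# (75% of the target population reached in Q4 after implementation).

def quarterly_reach(received, eligible):
    """Proportion of the target population receiving a PODS each quarter."""
    return [r / e if e else 0.0 for r, e in zip(received, eligible)]

def is_high_implementer(received, eligible, threshold=0.75):
    """Flag a site whose reach meets the threshold in the fourth quarter."""
    reach = quarterly_reach(received, eligible)
    return reach[3] >= threshold  # index 3 = Q4 after implementation

# Example: one site's PODS counts over four quarters after implementation
received = [10, 40, 80, 95]
eligible = [100, 100, 100, 100]
print(quarterly_reach(received, eligible))   # [0.1, 0.4, 0.8, 0.95]
print(is_high_implementer(received, eligible))  # True
```

The mixed effects model then treats these quarterly proportions (or the underlying counts) as the outcome, with site-level random slopes and intercepts capturing each hospital’s own trajectory.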
Numerical summaries of provider, patient, and family reported data on patient-centred processes, patient understanding, and discharge experience were assessed by comparing means before and after implementation using generalized estimating equations (GEE) to account for within-site clustering of responses.[19, 20] Change in healthcare utilization was analysed by plotting change over time. Linear mixed effects regression was used to assess the percentage of 30-day return ED visits and readmissions, with the within-site percentage of patients receiving PODS as the predictor of interest.
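The core idea behind the clustered pre/post comparison can be illustrated with a simplified sketch: average responses within each site first, then across sites, so that large sites do not dominate the estimate. The actual analysis used GEE, which additionally provides standard errors that are valid under clustering; the data and helper names below are hypothetical.

```python
# Simplified, hypothetical illustration of a cluster-aware pre/post
# comparison of a binary survey measure (1 = provider used teach-back).
from collections import defaultdict

def site_level_mean(responses):
    """responses: list of (site_id, value) pairs; returns mean of site means."""
    by_site = defaultdict(list)
    for site, value in responses:
        by_site[site].append(value)
    site_means = [sum(vals) / len(vals) for vals in by_site.values()]
    return sum(site_means) / len(site_means)

# Illustrative responses from two hospitals, pre- and post-implementation
pre = [("H1", 0), ("H1", 1), ("H2", 0), ("H2", 0)]
post = [("H1", 1), ("H1", 1), ("H2", 1), ("H2", 0)]
change = site_level_mean(post) - site_level_mean(pre)
print(round(change, 2))  # 0.5
```

GEE generalizes this by estimating the pre/post difference while modelling the within-site correlation structure directly, rather than collapsing to site means.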
The analyses and data visualizations were performed by John Matelski using R version 3.6.2, supervised by George Tomlinson (Biostatistics Research Unit, UHN).
Qualitative data was analysed using an iterative constant comparative process involving descriptive and interpretive analyses, open coding, and identifying themes in the data. Study leads and members of the research team (SHG, TH, CM) read transcripts, then met to discuss initial codes and develop a preliminary coding framework. Interpretation of themes was discussed among team members to achieve consensus. Theoretical saturation, constant comparative analysis, trustworthiness, and validity checks provided assurance of data quality and rigor.
Using a triangulation approach, qualitative and quantitative results were interpreted together to understand factors that may influence successful implementation and to draw inferences from the data.
This study was approved by the University Health Network Research Ethics Board.