Participant characteristics
Sixty-two professionals with experience in implementation science, health economics or digital health were invited to participate in the study. Interviews were conducted with 16 consenting participants: five implementation scientists, two health economists, four digital health specialists and five with experience across more than one of these fields (Fig. 1). Participants worked across a range of healthcare disciplines, clinical areas and settings including nursing, surgery, maternal health, nutrition and dietetics, pharmacy, heart disease, lung cancer, clinical excellence, information systems, and digital health including telehealth and AI. Most participants were female (n = 10), worked in academia (n = 14), and were located in the same geographical region as the research team (n = 9) (Table 2).
Table 2
Participant characteristics.
Characteristic | Count |
Gender |
Female | 10 |
Male | 6 |
Location |
National (Australia) | 14 |
  Queensland (local) | 9 |
  New South Wales | 2 |
  Victoria | 2 |
  South Australia | 1 |
International | 2 |
  United States | 1 |
  Canada | 1 |
Industry |
Academic and Healthcare | 7 |
Academic only | 7 |
Government | 1 |
Consultancy | 1 |
Themes
Five major themes, each containing two or three subthemes, were derived from the data (as seen in Fig. 2); they are summarised in Table 3 and explained in more detail below. Three themes (types of costs, why implementation is costed, and how to cost implementation) were developed deductively, a priori, from the interview guide based on the research question. Two themes (terminology and boundaries of implementation, and barriers and enablers to costing implementation) emerged inductively from the data.
Table 3
Themes and subthemes explanation. (NVivo File V6.2).
Theme | Subtheme | Description |
Types of costs | Implementation costs | Explicit costs and resources incurred due to the implementation process. These explicit costs can be mapped to the ERIC framework [17], except for the costs of a project manager and of workflow alterations. |
Non-implementation costs considered | Explicit costs and resources relating to the intervention rather than the implementation process. Such costs include the intervention itself or related necessary legislation. Digital interventions had costs relating to cyber security and the digital backbone. Other costs included physical space. |
Terminology and boundaries of implementation | Lack of common language | Differing terminology used across the fields of implementation science, health economics and digital health. |
Pre-implementation phase | The initial phase in the implementation process. This phase includes considerations about feasibility, planning, deployment (IT systems), procurement, and required research governance. |
Peri-implementation phase | A phase within the implementation process which involves active implementation, or roll-out of the intervention. |
Post-implementation phase | The final stage of the implementation process. This phase involves outcome evaluation, sustainment, delivery, and scale. |
Why implementation is costed | Supporting high value care | Implementation was costed to support high value care by demonstrating the value of the clinical innovation or of implementation strategies, by informing financial decision making and those most affected by the decision, and determining the future scalability. |
Gap in implementation science research | Implementation was often not costed, which participants identified as a gap in implementation science research. |
How to cost implementation | Data Collection | The process of collecting data on implementation costs including the burden on the collector, and the sources or access to informative data. |
Data Analysis | Approaches used in practice to determine implementation costs including staff time tracking, best estimates, economic evaluations, and fixed budgets. |
Attributes | Recurring characteristics in the process of data collection and analysis to determine implementation costs. Attributes included built in flexibility or variability, an ability to differentiate implementation and intervention costs, digital recording, planned in advance, and practical tools. |
Barriers and enablers to costing implementation | Collaboration enables | Collaboration between disciplines enabled the process of costing implementation, even if there was a lack of expertise in implementation science within the collaboration. |
Challenges to costing implementation | Barriers to costing implementation included difficulties demonstrating value, the use of existing resources, implementers with part-time appointments, intangible costs, and a lack of awareness and funding. |
Terminology and boundaries of implementation
Across the fields of implementation science, health economics and digital health, terminology differed and caused confusion when costing implementation. In digital health, “they [digital health solutions] typically get kind of deployed, as in I want to put this system in and then I want to use it right? So that that business of putting it in and using it is what we describe as implementation. …deployed/implementation, we use them interchangeably” [digital health expert]. In implementation science, by contrast, the process of implementation was broader and included considerations at the patient, provider, system and/or policy levels. A common language across the fields was lacking, yet was important to costing implementation.
“Even within digital health, you get people with clinical backgrounds, you get people from technical backgrounds and they talk a different language sometimes. But digital health sort of brings that together somewhat. Then implementation science framework background is different again to health economists. Yeah, I agree that common language is really important. And that's why we need to be clear about this sort of stuff.” – digital health expert/health economist
The boundaries of implementation were difficult for participants to delineate. This resulted in some difficulties when identifying what was and was not an implementation cost, despite the clear distinction between implementation and intervention costs outlined below.
“The line as to when that becomes implementation, when that starts to be implementation is probably a bit murky.” – digital health expert/health economist
Although the bounds of implementation were unclear to the participants, certain activities and associated costs (both implementation and non-implementation costs) were often discussed in phases. We defined these phases as pre-implementation, peri-implementation or post-implementation and these are summarised in Fig. 3. The phases were not linear but were discussed in a logical order, from pre to post, while acknowledging the cyclical nature of implementation projects. Arranging the activities and associated costs into these phases helped to circumvent the issue of a lack of common language (mentioned above) in our analysis.
Types of costs
When costing digital health implementation, participants outlined various types of costs, which fell into the subthemes of either implementation costs or non-implementation costs. Participants could delineate implementation costs from intervention costs.
“I think there's clearly a distinction between implementation costs and the costs that we're trying to intervent.” – health economist
“… so your intervention will be- have some sort of ongoing cost. Most of them do, whether it's staff or like your ongoing digital support costs. Whereas your implementation ones should have a- I would normally say should have a reasonable time frame associated with them. They're not forever because eventually something should become part of standard practice”. – health economist
The most frequently mentioned clusters of implementation strategies mapped to the ERIC framework were ‘use evaluative and iterative strategies’, followed by ‘develop stakeholder interrelationships’ [17]. There were also implementation costs identified which could not be readily mapped to this framework: the cost of a project manager and costs associated with workflow alterations. A project manager’s role ranged from conducting some implementation strategies to completing the administrative duties needed to progress the project. A project manager appeared to facilitate implementation and was therefore considered an implementation cost. Participants mentioned that it was important to understand current clinical workflows and how they may be impacted by the introduction of the intervention, “or you’re never going to get your clinicians to do anything” [health economist]. Consequently, the need for workflow alterations may be considered within the scope of implementation processes and costed accordingly.
Non-implementation costs mentioned by participants related to the intervention itself, rather than the implementation process. For digital interventions, these may include costs relating to cyber security and the digital backbone, including hardware, software, ongoing management of a database, and ongoing technical management.
Physical space was mentioned by some health economists as a potential implementation cost but it would not be included in their costing analysis if it was not a “big item” [health economist/implementation scientist/digital health expert] or if “slack is built into the system” [health economist] allowing meetings to be conducted internally rather than renting a space (which would have been costed).
Why cost implementation
Capturing implementation costs was perceived to be important for demonstrating the value of the intervention, particularly to decision makers tasked with continuing or scaling the intervention. Implementation costs were used to show that the intervention was either cost saving or was justified by other benefits including improved patient experience, patient safety, and clinical outcomes. It was also suggested that implementation costs can be used to inform the future scalability of the intervention.
“You have to show some benefit to the system. It either has to be beneficial for outcomes, clinical, because that's cost saving. Or beneficial for patient experience, because that's really important to health systems and should be. Or financially beneficial. So it has to fulfil at least one of those 3 criteria I think if you're gonna do a service redesign.” – digital health expert
Including implementation costs in grant proposals or business cases can also assist in informed financial decision making, including when to proceed with pilot implementation projects.
“I've been involved with projects that get up one of two ways. They either get a small pilot grant to run a pilot, in which case you need to outline a budget and how you're going to do something. Or through a business case…If it goes on to be a permanent service, that you know you again, you need a budget to justify how you're going to do it. Yeah, because you know the execs are not going to support something that hasn't got funding behind it.”– health economist/digital health expert
Implementation costs reported during and at the end of implementation projects assisted in decisions to continue or scale the project.
“I really want to produce a really informative report that outlines what it would take to implement this digital health initiative statewide.” – implementation scientist/health economist/digital health expert
The value of implementation strategies could also be captured via implementation costing. While it was accepted that implementation strategies are necessary for successful implementation, costing is still important to demonstrate that funds are being used appropriately. Some funding bodies required ongoing reporting of spending, in which implementation costs were included. For others, there was no requirement to cost implementation, even though they believed it was important. Even when not a requirement, some participants still reported implementation costs as part of their project management practice. Some included implementation costs in disseminated reports or publications to assist others who may want to replicate the project in their own institutions.
The value of evaluating implementation costs differed between audiences. It was important to implementers and health economists. However, those with experience implementing digital health initiatives in health services did not share this perception.
“And deciding whether or not they're [implementation costs] actually relevant to the end user of your evaluation. It really depends who your evaluation is for.” – health economist/digital health expert
For some participants, the lack of research and knowledge of implementation costs within implementation science contributed to their decision to cost implementation. These participants purposefully gathered implementation cost data to address this under-researched area of implementation science.
“Because I think it's been such a massive gap in implementation science.” – implementation scientist
How to cost implementation
Participants discussed ways in which they retrieved informative data to cost implementation. Publicly available information, including pay rates and awards, could be used as a resource. However, instead of using this resource, most participants contacted relevant teams within the organisation (for example, the finance team) to obtain the salaries of the personnel involved. Navigating large organisations to obtain this information was at times difficult. If no other information was available, participants reported collecting the information through methods including template completion, interviewing, surveying, and manually counting units. Template completion was the most frequently used. Most participants combined contacting relevant teams within the organisation with primary collection of information.
“So it sort of goes from easy if you can just … have somebody pull the data all the way through to sort of grinding to get the data yourself.”– health economist/digital health expert
Implementation was frequently costed via staff time tracking, where staff time was tracked against specified activities and salaries were applied to calculate the cost associated with each activity. Although labour intensive, this method was not seen as complex and could provide contextual insight specific to the site. However, significant variation in practice was reported. Detailed approaches captured all activities and personnel involved in the implementation, while simpler approaches estimated wages only for the major personnel involved. Some project managers developed an activity template and had staff complete it prospectively with their own time allocations; others estimated staff time without asking staff to complete it themselves.
“We like breakdown every single task that they're gonna do in the study. It's not always super accurate, but you know we get as close as we can.”– digital health expert/implementation scientist
“So every time one of those outreach workers did anything we asked them to complete a form. So…whenever they talked to the women we wanted, like the minutes and sort of what was done in the implementation activities.” – health economist
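The arithmetic behind the time-tracking approach described above (minutes logged against predefined activities, costed at each staff member's salary rate) can be sketched as follows; the activity names, roles, rates, and times are hypothetical illustrations, not data from the study:

```python
# Minimal sketch of the staff time-tracking costing approach participants
# described: sum minutes logged per implementation activity, costed at
# each staff member's hourly salary rate. All figures are hypothetical.

def cost_activities(time_log, hourly_rates):
    """Return total cost per activity: minutes x hourly rate / 60."""
    costs = {}
    for staff, activity, minutes in time_log:
        costs[activity] = costs.get(activity, 0.0) + minutes / 60 * hourly_rates[staff]
    return costs

# Hypothetical prospective time log: (staff role, activity, minutes)
time_log = [
    ("outreach_worker", "stakeholder meeting", 90),
    ("outreach_worker", "staff training", 120),
    ("project_manager", "stakeholder meeting", 90),
]
hourly_rates = {"outreach_worker": 40.0, "project_manager": 60.0}

print(cost_activities(time_log, hourly_rates))
# -> {'stakeholder meeting': 150.0, 'staff training': 80.0}
```

In practice the same calculation would typically be done in a spreadsheet; the point is that the method is arithmetically simple, and the burden participants reported lies in collecting reliable minutes, not in the analysis.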
The collection of appropriate data to cost implementation was seen as a burdensome task. This was particularly true for collectors who were not part of the implementation project team (for example, clinical staff using the intervention) and when tracking staff time, as it required the personnel involved to collect the data themselves.
“*sigh* staff time tracking is difficult because it requires your staff members to be on board.” – health economist/digital health expert
Achieving high accuracy and precision through frequent and comprehensive data collection was also seen as burdensome. For costing via staff time tracking, there was the additional difficulty of delineating time spent on implementation-related roles and responsibilities from usual job duties.
"It was such a burden to capture that level of precision." – health economist
“And the problem is, if you're talking implementation time, your staff aren't going to differentiate between implementation as such like and just time doing their job and everything. So you also have to have your categories really clear for what you want them to be tracking. And you can't have too many, otherwise they won't pay any attention or use any of them. So time tracking is like amazing in theory and really difficult in practice.”– health economist/digital health expert
Utilising incentives, involving the collectors in the design of data collection, and incorporating the task of data collection into other required tasks were suggested strategies to encourage completion of data collection.
“If you ask someone else to do that on top of the existing work, and there’s no incentive for them to do it, then that’s going to seem like a huge task. But if it was someone who, as part of the project implementation they were expected to record who was at the meeting and people wrote down their job classification, that wouldn’t be unreasonable.”– health economist/ digital health expert
Data collection was also aided when implementation activities were clearly defined in advance, commonly through a purpose-built template. Participants expressed the importance of having a few clear categories for collecting the required information, planned in advance. At times this planning drew on advice from health economists; at other times it served to define roles and responsibilities for implementers.
“So all of that was planned out ahead of time with a health economist and getting advice from him in the design of the project”– implementation scientist
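A purpose-built template with a small, fixed set of activity categories defined in advance, as participants recommended, could be sketched as below; the category names and fields are illustrative assumptions only, not an instrument used in the study:

```python
# Sketch of a prospective time-tracking template with a small, fixed set
# of pre-planned activity categories, as participants recommended
# ("you have to have your categories really clear ... and you can't have
# too many"). Category names are illustrative assumptions only.

CATEGORIES = ("stakeholder meeting", "staff training", "audit and feedback")

def record_entry(log, staff, category, minutes):
    """Append a time entry, rejecting any category not planned in advance."""
    if category not in CATEGORIES:
        raise ValueError(f"Unplanned category: {category!r}")
    log.append({"staff": staff, "category": category, "minutes": minutes})

log = []
record_entry(log, "nurse_1", "staff training", 45)  # accepted
try:
    record_entry(log, "nurse_1", "emails", 10)  # not a planned category
except ValueError:
    pass  # rejected: keeps the category list clear and few
```

Rejecting unplanned categories at entry time is one way to keep the collected data aligned with the cost categories agreed in the study design, rather than cleaning free-text activity descriptions afterwards.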
Other, less common approaches to data collection were also reported. Participants discussed estimating implementation costs from expert opinion, usually drawing on experience from similar projects. Economic evaluations were also mentioned, although implementation costs were not frequently included in these types of analyses. Some participants expressed that the amount of available funding determined the implementation costs, effectively a fixed budget.
“I think a lot of the time, it's based on what had been done in similar projects.” – health economist/implementation scientist
Several attributes were considered important by participants for the successful collection of data about implementation costs. Costing implementation and recording the required information digitally was favoured by participants, using programs that were available and familiar, including MS Excel, RedCap and Qualtrics. Other considerations included the flexibility of tools and capture formats to suit local teams, as well as ease of integration with statistical analysis software. Participants expressed a desire for practical, pragmatic, and simple tools for local implementers. A checklist-like format was suggested, as was aligning the input with data already collected for another purpose.
“If we're listening to what people want on the ground, they want tools. And they want really practical tools and they want tools that directly help them solve the problems that they're creating.” – implementation scientist
Barriers and enablers to costing implementation
Collaboration across disciplines enabled the overall implementation process, as well as the costing of implementation, even when implementation science expertise was lacking within the collaboration. Some participants had not heard of implementation science prior to starting implementation projects but had used analogous approaches in the past. Most participants mentioned that multidisciplinary collaborations provided a rich range of perspectives which supported the implementation project. Collaboration from the beginning of a project, particularly during study design, was most beneficial and often sought out.
Participants expressed several challenges associated with costing implementation. Implementation projects were often underpowered, with limited data available for meaningful analysis beyond descriptive analysis. In addition, if the primary objective was not achieved, further analysis (including of implementation costs) was typically not performed.
“…it didn't meet the primary objective and didn't end up going down that pathway of analysing the data further.”– digital health expert
Existing resources were perceived as difficult to cost because either no cost was incurred or the additional labour to cost implementation had “limited benefit unless there's some bigger picture” [health economist/digital health expert]. Additionally, as previously mentioned, costing labour associated with implementation was challenging when staff had to differentiate between their implementation and clinical (or regular) duties.
“The videos didn't cost us anything 'cause we have in-house marketing teams, so it didn't cost, we didn’t get a bill for that.” – digital health expert
Intangible costs, including soft skills, personal reflection time, existing relationships, level of authority, and mental load, were highlighted as contributing to implementation but challenging to cost. Additionally, implementation activities may not be costed because they were not identified, did not require reporting, or were not considered important, perpetuating the reported lack of awareness of costing implementation.
“I mean because when we've done evaluations in the past nobody ever asked for those.”– digital health expert
Costing implementation was a challenge when funding was not available for a long enough period for proper evaluation, or to fill gaps in expertise such as health economics.
“No, I haven't [evaluated implementation costs]... We did write in a health economist, but we just didn't get the funding.” – implementation scientist