Modified e-Delphi process
Eighteen professionals with experience in implementation science, health economics and/or digital health were invited to participate in the study. Fourteen professionals expressed interest in participating, but two were lost prior to Round 1. The final expert panel of twelve consenting participants contained a sufficient representation of the desired expertise: 50% had expertise in implementation science, 50% had expertise in health economics and 58% had expertise in digital health (Additional File 4: Table 1). Participants included: two implementation scientists, one health economist, three digital health specialists and six with experience across multiple fields (Additional File 4: Fig. 1). Participants worked across a range of healthcare disciplines, clinical areas and settings including nursing, surgery, maternal health, nutrition and dietetics, lung cancer, infectious disease, clinical excellence, and digital health (including telehealth and artificial intelligence). Most participants were female (n = 8, 67%), worked in academic contexts (n = 11, 92%), and were located in Australia (n = 7, 58%) (Additional File 4: Table 2).
In Round 1, consensus was reached on almost all questions; the exceptions were a question asking whether research activities should be considered an implementation cost (Additional File 4: Table 3 - question 2.3.1: 42% agreement) and two questions regarding the supporting material ‘Appendix C: Common activities and resources to operationalise implementation strategies’ (Additional File 4: Table 3 - question 3.5.1: 42% agreement and question 3.5.2: 50% agreement). Percentage agreements for all questions in Round 1 can be found in Additional File 4: Table 3. Feedback and comments from Round 1 resulted in changes to the costing instrument (discussed below). Round 2 of the e-Delphi was used to obtain consensus on the components that did not reach consensus in Round 1, as well as on additional questions regarding updates made to the instrument in response to Round 1.
The costing instrument was updated in response to the feedback from Round 1, summarised in Additional File 4: Table 4, and consensus was achieved on these updates in Round 2 (Additional File 4: Table 3 - question 4.1.1: 100% agreement; question 4.2.1: 92% agreement; question 4.2.2: 92% agreement). Integral questions related directly to the design and components of the instrument, whereas non-integral questions related indirectly to the instrument, including its use and users. As consensus was reached on the integral questions, the e-Delphi process was terminated after Round 2. Three non-integral items did not reach consensus in Round 2; these are described below. Percentage agreements for all questions in Round 2 can be seen in Additional File 4: Table 3.
Areas of non-consensus
The nature of research costs
The responses from Round 1 indicated that including research costs as an implementation cost is dependent on the study type and reason. For example, research for the purpose of furthering implementation science knowledge may not be relevant when quantifying implementation costs, as these costs would not extend to other institutions or sites considering the implementation of a particular innovation. Conversely, research costs may be relevant to include as an implementation cost when conducting quality improvement studies or when the intervention would otherwise not be implemented without local evidence to support its safety, efficacy, or cost-effectiveness. As a result of this feedback, it was decided to acknowledge research costs as being a potentially relevant implementation cost within the costing instrument, with an explanation that the relevance of research costs is context-specific and should be determined by the user of the instrument. Consensus was achieved on this update to the costing instrument in Round 2 (Additional File 4: Table 3 - question 2.1.1: 100% agreement).
The user’s prior implementation science knowledge
The initial implementation costing instrument prototype included supporting material designed to provide reference explanations for the user on implementation science concepts, including phases and common implementation strategies, activities, and resources (Additional File 2: Appendix A, B, C). The information on implementation phases reached consensus (Additional File 4: Table 3 - question 2.1.2: 75% agreement), but some respondents felt it gave a linear impression of implementation, when such processes are often iterative. Providing examples of common implementation strategies reached consensus (Additional File 4: Table 3 - question 3.3.1: 75% agreement), but some respondents suggested including references to key implementation science articles for those lacking foundational knowledge. The purpose of providing examples of common activities and resources was not clear to participants and (as mentioned above) did not reach consensus (Additional File 4: Table 3 - question 3.5.1: 42% agreement and question 3.5.2: 50% agreement). The research team considered that the mixed responses to these supporting materials were likely due to ambiguity in the scope of the instrument.
In response to the feedback from Round 1, the research team decided to refine the purpose and content of the instrument to align more clearly with its intended aim: to provide practical, user-friendly templates that assist in the collection of appropriate costing data. It was determined that reference explanations intended to educate users on implementation science phases and strategies were beyond the scope of this costing instrument. Hence, the supporting education-related materials were removed from the costing instrument (Additional File 2: Appendix A, B, C). This information was replaced with references to key studies within the implementation science literature to assist users in deepening their understanding as required. These updates were made to the costing instrument in Round 2, and consensus was achieved on both the removal of the supporting material (Additional File 4: Table 3 - question 4.4.2: 83% agreement) and the refined scope of the instrument (Additional File 4: Table 3 - question 3.1.1: 92% agreement).
Through this refinement, the research team recognised that there was an implicit assumption that the user will likely have some prior understanding of implementation science, which we contend is reasonable given the intention to use and cost implementation strategies. In Round 2, we asked the participants if it is appropriate to assume users of the costing instrument will have some level of prior implementation science knowledge; this statement did not reach consensus (Additional File 4: Table 3 - question 3.1.3: 67% agreement).
Specificity to the digital health setting
The costing instrument was initially framed for application in digital health contexts, and a suggestion from Round 1 indicated that more digital health-specific examples would be helpful. Given the refinements to the overall instrument scope (outlined above), the research team was also prompted to consider making the instrument more generic in nature to allow for potential application beyond digital health contexts. The rationale was that the costing categories for implementation strategies (as opposed to specific interventions or technologies) used within digital health contexts may be transferable across settings. Although consensus was not reached on this update to the costing instrument in Round 2 (Additional File 4: Table 3 - question 5.1.2: 67% agreement), most participants recognised it was plausible the instrument could be generic. The research team concluded that subsequent piloting would be required to confirm or refute the extent to which the instrument is generalisable beyond digital health.
Additional digital formats
In response to the feedback from Round 1, the digital functionality of the costing instrument was improved. An electronic version of the data collection templates was created in Microsoft Excel, including the use of ‘drop-down’ options where possible to optimise data quality. The Excel file included two additional summary tables that automatically populated with data entered in the templates. Consensus was achieved on this update to the costing instrument in Round 2 (Additional File 4: Table 3 - question 6.1.1: 92% agreement; question 4.1.2: 92% agreement). Participants were satisfied with the Microsoft Excel version and did not indicate interest in any of the additional digital formats suggested, including REDCap, Microsoft Word, or PDF (Additional File 4: Table 3 - question 6.1.3: 33% agreement).
The final implementation costing instrument
The final costing implementation strategies (Cost-IS) instrument is presented through a worked example below (Additional File 5). The aim and scope of the instrument is to collect data on the costs associated with implementation strategies for digital health solutions. The instrument comprises three data collection templates and can be found online at https://cost-is.github.io/instrument/.
Cost-IS Template 1: Planning
The purpose of Template 1 is to help identify the specific data items that need to be collected, allowing for comprehensive and targeted data collection in the later templates. In Template 1, users document the relevant implementation strategies and then outline which activities are needed to operationalise each strategy. Both the labour and non-labour resources used to deliver the activities are listed in the final column. Table 1 provides a worked example of Template 1, including four implementation strategies with their associated activities and resources.
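The planning step amounts to recording, for each implementation strategy, its activities and the resources that deliver them. The sketch below is purely illustrative: the nested-dictionary representation is our own assumption, not part of Cost-IS, and the names are drawn from the worked example in Table 1.

```python
# Hypothetical in-memory representation of a Cost-IS Template 1 record:
# each implementation strategy maps to the activities that operationalise
# it and the labour/non-labour resources used to deliver those activities.
template_1 = {
    "Audit and feedback": {
        "activities": [
            "Meet with stakeholders to identify outcomes",
            "Retrieve and analyse data on outcomes",
            "Present data to stakeholders",
        ],
        "resources": ["Project officer", "Team leader A", "Team leader B"],
    },
    "Involve existing governing structures": {
        "activities": ["Meet with executives", "Meet with clinical team/s"],
        "resources": ["Project officer", "Executive A", "Executive B"],
    },
}

# List the activities planned for a given strategy.
print(template_1["Audit and feedback"]["activities"])
```

In the actual instrument these entries live in an Excel template; the point here is only the strategy → activities → resources hierarchy that Template 1 captures.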
Table 1. Cost-IS template 1 worked example

| Strategy | Activities | Resources |
|---|---|---|
| Audit and feedback | • Meet with stakeholders to identify outcomes • Retrieve and analyse data on outcomes • Present data to stakeholders | • Project officer • Team leader A • Team leader B • Clinical team A - champion • Clinical team B - champion |
| Involve existing governing structures | • Meet with executives • Meet with clinical team/s | • Project officer • Executive A • Executive B • Team leader A • Team leader B |
| Identify and prepare champions | • Engage with stakeholders to identify potential champions • Engage (meetings or emails) with possible champions • Ongoing support for champions | • Project officer • Team leader A • Team leader B • Clinical team A - champion • Clinical team B - champion |
| Train-the-trainer | • Adapt training with stakeholders • Train the champions to be trainers • Create opportunities for the trainers to train others • Monitor training progress | • Project officer • Team leader A • Team leader B • Clinical team A - champion • Clinical team B - champion • Training material • Training room |
Cost-IS Templates 2A/B: Data collection
Templates 2A and 2B are used to collect the data necessary to quantify the implementation costs; 2A collects data on labour resources associated with the implementation strategies, while 2B collects data on non-labour resources. In the worked example of Template 2A (Table 2), all activities associated with the hypothetical implementation were recorded. Each activity instance was given a unique index number, as some activities occurred more than once. Similarly, a purpose was recorded for each activity to distinguish it from other similar activities. The implementation strategy relating to each activity was documented in the same row. Personnel involved in the activity were then documented, with each personnel type/role recorded on a separate row; roles were distinguished by wage rate or title classification. For each activity, the number of personnel in each role and the time that role spent on the activity were recorded. The digital version of this template includes two additional columns that automatically calculate labour costs once the columns presented in Table 2 are completed. In the digital version, entries in the ‘Activity’, ‘Strategy’ and ‘Role’ columns are restricted by drop-down menus containing the items listed in Template 1. Template 1 can be completed iteratively as required by the project.
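The labour-cost calculation that the digital template performs automatically is simple arithmetic: hourly wage rate × number of personnel × time in minutes ÷ 60. A minimal sketch of that calculation follows; the function name and signature are our own illustration (Cost-IS itself performs this in Excel), and the figures come from the first row of the worked example in Table 2.

```python
def labour_cost(hourly_wage: float, n_personnel: int, total_minutes: float) -> float:
    """Labour cost of one Template 2A row: wage rate x headcount x hours."""
    return hourly_wage * n_personnel * (total_minutes / 60)

# First row of Table 2: a project officer ($85.21/hour)
# spending 30 minutes meeting with executives.
print(labour_cost(85.21, 1, 30))
```

Summing this quantity over all rows of Template 2A gives the total labour cost of the implementation strategies; Template 2B contributes the non-labour costs.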
Table 2. Cost-IS Template 2A worked example

| Index | Activity | Purpose | Strategy | Role | Hourly wage rate ($) | No. of personnel involved | Total person-time (mins) |
|---|---|---|---|---|---|---|---|
| 1 | Meet with executives | present intervention aims and outcomes | Involve existing governing structures | Project officer | 85.21 | 1 | 30 |
| 1 | Meet with executives | present intervention aims and outcomes | Involve existing governing structures | Executive A | 134.08 | 1 | 30 |
| 1 | Meet with executives | present intervention aims and outcomes | Involve existing governing structures | Executive B | 150.48 | 1 | 30 |
| 2 | Meet with clinical team/s | present intervention aims and outcomes | Involve existing governing structures | Project officer | 85.21 | 1 | 30 |
| 2 | Meet with clinical team/s | present intervention aims and outcomes | Involve existing governing structures | Team leader A | 97.29 | 1 | 30 |
| 2 | Meet with clinical team/s | present intervention aims and outcomes | Involve existing governing structures | Team leader B | 115.62 | 1 | 30 |
| 3 | Meet with stakeholders to identify outcomes | meeting to identify what needs to be audited and how to feed it back | Audit and feedback | Project officer | 85.21 | 1 | 60 |
| 3 | Meet with stakeholders to identify outcomes | meeting to identify what needs to be audited and how to feed it back | Audit and feedback | Team leader A | 97.29 | 1 | 60 |
| 3 | Meet with stakeholders to identify outcomes | meeting to identify what needs to be audited and how to feed it back | Audit and feedback | Team leader B | 115.62 | 1 | 60 |
| 4 | Engage with stakeholders to identify potential champions | email asking for champion suggestions | Identify and prepare champions | Project officer | 85.21 | 1 | 10 |
| 4 | Engage with stakeholders to identify potential champions | champion suggested via email | Identify and prepare champions | Team leader A | 97.29 | 1 | 10 |
| 4 | Engage with stakeholders to identify potential champions | champion suggested via email | Identify and prepare champions | Team leader B | 115.62 | 1 | 10 |
| 5 | Engage (meetings or emails) with possible champions | met with clinical team A champion | Identify and prepare champions | Project officer | 85.21 | 1 | 30 |
| 5 | Engage (meetings or emails) with possible champions | met with clinical team A champion | Identify and prepare champions | Clinical team A - champion | 78.80 | 1 | 30 |
| 6 | Engage (meetings or emails) with possible champions | met with clinical team B champion | Identify and prepare champions | Project officer | 85.21 | 1 | 30 |
| 6 | Engage (meetings or emails) with possible champions | met with clinical team B champion | Identify and prepare champions | Clinical team B - champion | 56.87 | 1 | 30 |
| 7 | Adapt training with stakeholders | discuss training with stakeholders and adapt to clinical context if needed | Train-the-trainer | Project officer | 85.21 | 1 | 60 |
| 7 | Adapt training with stakeholders | discuss training with stakeholders and adapt to clinical context if needed | Train-the-trainer | Team leader A | 97.29 | 1 | 60 |
| 7 | Adapt training with stakeholders | discuss training with stakeholders and adapt to clinical context if needed | Train-the-trainer | Team leader B | 115.62 | 1 | 60 |
| 7 | Adapt training with stakeholders | discuss training with stakeholders and adapt to clinical context if needed | Train-the-trainer | Clinical team A - champion | 78.80 | 1 | 60 |
| 7 | Adapt training with stakeholders | discuss training with stakeholders and adapt to clinical context if needed | Train-the-trainer | Clinical team B - champion | 56.87 | 1 | 60 |
| 8 | Adapt training with stakeholders | Incorporate adaptations to training | Train-the-trainer | Project officer | 85.21 | 1 | 60 |
| 9 | Train the champions to be trainers | same as activity | Train-the-trainer | Project officer | 85.21 | 1 | 60 |
| 9 | Train the champions to be trainers | same as activity | Train-the-trainer | Clinical team A - champion | 78.80 | 1 | 60 |
| 9 | Train the champions to be trainers | same as activity | Train-the-trainer | Clinical team B - champion | 56.87 | 1 | 60 |
| 10 | Create opportunities for the trainers to train others | book meeting room for monthly training sessions for champions to train | Train-the-trainer | Project officer | 85.21 | 1 | 15 |
| 11 | Ongoing support for champions | check in with champions | Identify and prepare champions | Project officer | 85.21 | 1 | 30 |
| 12 | Monitor training progress | request current training numbers | Train-the-trainer | Project officer | 85.21 | 1 | 10 |
| 13 | Retrieve and analyse data on outcomes | same as activity | Audit and feedback | Project officer | 85.21 | 1 | 30 |
| 14 | Ongoing support for champions | check in with champions | Identify and prepare champions | Project officer | 85.21 | 1 | 30 |
| 15 | Monitor training progress | request current training numbers | Train-the-trainer | Project officer | 85.21 | 1 | 10 |
| 16 | Retrieve and analyse data on outcomes | same as activity | Audit and feedback | Project officer | 85.21 | 1 | 30 |
| 17 | Present data to stakeholders | ensure stakeholders are happy with progress, and address any issues | Audit and feedback | Project officer | 85.21 | 1 | 30 |
| 17 | Present data to stakeholders | ensure stakeholders are happy with progress, and address any issues | Audit and feedback | Clinical team A - champion | 78.80 | 1 | 30 |
| 17 | Present data to stakeholders | ensure stakeholders are happy with progress, and address any issues | Audit and feedback | Clinical team B - champion | 56.87 | 1 | 30 |
| 17 | Present data to stakeholders | ensure stakeholders are happy with progress, and address any issues | Audit and feedback | Team leader A | 97.29 | 1 | 30 |
| 17 | Present data to stakeholders | ensure stakeholders are happy with progress, and address any issues | Audit and feedback | Team leader B | 115.62 | 1 | 30 |
Summary table examples
Summary tables can be readily created from the data in the completed templates, in whatever form the analyst finds meaningful. The templates were designed to collect data at varying levels of detail to accommodate the wide range and adaptable nature of implementation projects. Table 3 and Fig. 1 demonstrate how implementation costs from the worked example can be summarised by both role and implementation strategy.
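Summaries of this kind are produced by grouping the row-level labour costs from Template 2A by role and strategy and summing within each group. The sketch below uses only the Python standard library to show the idea; the row data are a small excerpt from the worked example (the three index-1 and one index-2 project-officer rows of Table 2), and the grouping code is our own illustration, not part of the instrument.

```python
from collections import defaultdict

# (role, strategy, hourly wage $, n personnel, minutes) — excerpt from Table 2.
rows = [
    ("Project officer", "Involve existing governing structures", 85.21, 1, 30),
    ("Executive A", "Involve existing governing structures", 134.08, 1, 30),
    ("Executive B", "Involve existing governing structures", 150.48, 1, 30),
    ("Project officer", "Involve existing governing structures", 85.21, 1, 30),
]

# Sum labour cost (wage x headcount x hours) within each (role, strategy) group.
totals = defaultdict(float)
for role, strategy, wage, n, minutes in rows:
    totals[(role, strategy)] += wage * n * minutes / 60

for (role, strategy), cost in sorted(totals.items()):
    print(f"{role} / {strategy}: ${cost:.2f}")
```

Run over this excerpt, the project officer's total for ‘Involve existing governing structures’ comes to $85.21, matching the corresponding cell in Table 3; applying the same grouping to all rows of Template 2A reproduces the full summary table.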
Table 3. Cost-IS summary table worked example (role and strategy)

| Personnel | Train-the-trainer ($) | Audit and feedback ($) | Involve existing governing structures ($) | Identify and prepare champions ($) | Labour totals ($) |
|---|---|---|---|---|---|
| Project officer | 305.34 | 213.03 | 85.21 | 184.62 | 788.19 |
| Team leader B | 115.62 | 173.43 | 57.81 | 19.27 | 366.13 |
| Team leader A | 97.29 | 145.94 | 48.65 | 16.22 | 308.09 |
| Clinical team A - champion | 157.61 | 39.40 | - | 39.40 | 236.41 |
| Clinical team B - champion | 113.75 | 28.44 | - | 28.44 | 170.62 |
| Executive B | - | - | 75.24 | - | 75.24 |
| Executive A | - | - | 67.04 | - | 67.04 |
| Total | 789.60 | 600.23 | 333.94 | 287.94 | 2,011.72 |
Figure 1. Cost-IS summary figure worked example (role and strategy).