Design
This was a prospective, single-group, pre-post educational intervention study with blinded outcome assessment.
Participants and Setting
First- and second-year medical students who had completed the 12-week pre-clinical cardiovascular and pulmonary core curriculum at the John A. Burns School of Medicine (JABSOM), University of Hawaii, USA, were eligible for the study. We recruited participants through e-mail and public postings. The study was conducted at the JABSOM SimTiki Simulation Center (SimTiki) between September 2019 and June 2020. The University of Hawaii Human Studies Program approved the study (protocol number: 2019-00265). All participants provided informed consent, and all data were de-identified after collection. No incentives or reimbursements were provided to participants. We carried out this study in accordance with the Code of Ethics of the World Medical Association (Declaration of Helsinki).
Cardiac POCUS Curriculum
In our previous pilot study, we developed a basic cardiac POCUS curriculum for pre-clinical medical students based on the ASE-recommended framework that encourages the use of a flipped classroom/blended-learning model with online modules [20]. Student goals for this curriculum were to independently obtain basic cardiac POCUS views in a healthy volunteer and to identify normal anatomic structures seen in cardiac POCUS views. Concepts of curriculum design were underpinned by educational principles for effective learning and skill retention, which include concurrent feedback, deliberate practice, mastery learning, and range of difficulty [21]. Curriculum developers were echocardiography subject matter experts including a Fellow of the European Society of Cardiology (KA), a Fellow of the American Society of Echocardiography (KK), and experienced simulation curriculum developers (BWB and JJL). The curriculum timeline is shown in Fig. 1. The cardiac POCUS curriculum included a pre-training self-study of the ASE cardiac POCUS online module and a hands-on training session with a healthy volunteer. The students used an HHU probe (Butterfly iQ; Butterfly Network, Inc., Guilford, CT, USA) with a 9.7-inch tablet display during the training. Student image acquisition skill and anatomical knowledge were assessed before, immediately after, and 8 weeks after training.
ASE cardiac POCUS online module for medical students: The ASE POCUS task force has a free cardiac POCUS online module for medical students (https://aselearninghub.org/). We utilized the ASE online module titled “Cardiovascular Point-of-Care Imaging for the Medical Student and Novice User” as the pre-training didactic. The complete ASE online module comprised 8 sub-modules: Introduction, Basic Anatomy Correlating to Cardiac POCUS Views (module A), Complete Cardiac POCUS Scan (module B), Integrated Cardiac Point-of-Care and Physical Exam (module C), Pathology-I (module D), Pathology-II (module D), Teaching the Teacher (module E), and Standards and Testing (module F). Our pre-training self-study curriculum included the first 4 ASE modules on normal anatomy and physiology (Introduction, modules A, B, and C), which were matched to the learner level of pre-clinical medical students without extensive prior knowledge of cardiac pathology. The 4 ASE modules were designed to be completed in approximately 35 min. Students independently reviewed the online modules 1 day to 1 week before hands-on training.
5 cardiac POCUS views selection: We selected 5 cardiac POCUS views for hands-on training: parasternal long-axis (PLAX), papillary muscle level of parasternal short-axis (PSAX), apical 4-chamber (A4C), subcostal 4-chamber (S4C), and subcostal inferior vena cava (SIVC) views. The 5-view selection was based on recommendations by the World Interactive Network Focused on Critical Ultrasound [2], European Association of Cardiovascular Imaging [22], and ASE [5].
Cardiac POCUS hands-on training session: One instructor (SJ) delivered a 30-min interactive 1-on-1 lecture using PowerPoint slides of the ASE online module and a life-size model heart (Cardiac POCUS lecture). Content of the lecture is provided in Additional File 1, and a pre-recorded video of the 5-view image acquisition instruction in the lecture is in Additional File 2 (https://youtu.be/3PfRzsYjKQg) (the video is a short, edited version of the actual video for this article). Following the lecture, students engaged in supervised, 1-on-1 hands-on training of the 5-view image acquisition on a thin, healthy male volunteer for 30 min (Cardiac POCUS hands-on training). The instructor assumed the role of the healthy volunteer during the hands-on training while providing concurrent verbal and tactile feedback to guide student skill development. During hands-on training, students deliberately practiced until they obtained each image with clinically acceptable quality. Image acquisition instruction was designed with reference to an imaging protocol in the ASE comprehensive transthoracic echocardiography guidelines and a point-of-care ultrasound textbook [23, 24]. The main instruction points for the 5-view image acquisition are presented in Additional File 3.
Skill Test Scoring System
Skill test: We assessed image acquisition skill at pre-, immediate post-, and 8-week post-training, using a 10-point maximum skill test scoring system. The skill test is demonstrated in Additional File 4 (https://youtu.be/9KOO_vdNf-c) (one of the authors, JJL, played the role of a student in the video). During the skill test, students demonstrated the 5 cardiac POCUS views on the same single healthy volunteer as in the hands-on training without guidance. Students were given 2 min to obtain each view, for a total of 10 min for 5 views. Once the students found their “best” view, they pressed the record button on the tablet to capture a 5-second clip. Students were allowed to record a maximum of 2 clips for each view. If they had 2 recordings, they selected a single recording for evaluation. We utilized the Butterfly iQ application predefined cardiac ultrasound preset for gain and other ultrasound imaging parameters [25]. We preset the imaging depth to 16 cm for PLAX and PSAX, 18 cm for A4C, and 20 cm for S4C and SIVC. The healthy volunteer was in the left decubitus position for PLAX, PSAX, and A4C, and the supine position with bent knees for S4C and SIVC. The healthy volunteer controlled his respiratory rate at 6 breaths per min and held his breath for 5 seconds when the view recording started.
10-point maximum skill test scoring system: We developed a 10-point maximum scoring system by modifying an existing assessment tool for transthoracic echocardiography views in our previous pilot study [20, 26]. The scoring system was designed to assess the 5-view image quality for rapid bedside cardiac assessment, not for a formal diagnostic comprehensive echocardiography examination. The 10-point maximum skill test scoring system rated the 5 views; each received a score ranging from 0 to 2 points (Table 1). Each view was assessed as excellent (2 points), acceptable (1 point), or poor (0 points) for cardiac POCUS use. The scores from the 5 views were summed for a 10-point maximum test score. Excellent quality reference images and videos of the 5 views obtained by a cardiologist (MI) on the healthy volunteer are in Fig. 2A and Additional File 5 (https://youtu.be/DrPp2C7ET8c). Examples of acceptable and poor quality images and videos obtained by participants are in Figs. 2B, 2C, and Additional Files 6, 7 (https://youtu.be/fCuYUNW87XY, https://youtu.be/25wj2ml51Pk), respectively. After de-identifying the skill test clips, including their time-point labels (pre-, immediate post-, and 8-week post-training), we downloaded the de-identified clips into an electronic database and arranged them in randomized order using the random number table in Microsoft Excel for blinded assessment. Three independent blinded raters scored the image quality using the scoring system, and the average of the scores from the 3 raters was then utilized as a representative score. The 3 raters were echocardiography experts. In our pilot study, the skill test scoring system demonstrated excellent interrater reliability and test-retest reliability of the 3 raters [20]. It also demonstrated outstanding discriminatory ability between novices and experts for echocardiography in a validation study using skill test scores from 60 medical students in our pilot study and the current study (Additional File 8).
Table 1
10-point maximum skill test scoring system
5 cardiac POCUS views | Points | Rating | Image quality criteria |
PLAX | 2 | Excellent: | All 7 chambers and anatomical structures (LA, LV, LVOT, RV, AV, MV, and IVS) visualized or similar to the excellent quality reference*. |
1 | Acceptable: | One chamber (LA, LV, or RV) severely foreshortened or 1 anatomical structure (LVOT, AV, MV, or IVS) not visualized well. |
0 | Poor: | Any 2 chambers or structures (LA, LV, LVOT, RV, AV, MV, and IVS) severely foreshortened/not visualized well, the left and right sides of the image are flipped, raters do not recognize the view as a parasternal long-axis view, or no image obtained. |
PSAX | 2 | Excellent: | All 4 chambers and anatomical structures (round LV, RV, papillary muscles, and IVS) visualized or similar to the excellent quality reference*. |
1 | Acceptable: | One chamber or anatomical structure (round LV, RV, papillary muscles, or IVS) not visualized well, oval LV, significant lateral wall drop out of LV compared with the excellent quality reference*, or mitral level of parasternal short-axis view. |
0 | Poor: | Any 2 chambers or anatomical structures (round LV, RV, papillary muscles, and IVS) not visualized well, apical level or aortic valve level of parasternal short-axis view, the left and right sides of the image are flipped, raters do not recognize the view as a parasternal short-axis view, or no image obtained. |
A4C | 2 | Excellent: | All 8 chambers and anatomical structures (LA, LV, RA, RV, MV, TV, IAS, and IVS) visualized or similar to the excellent quality reference*. |
1 | Acceptable: | One chamber (LA, LV, RA, or RV) severely foreshortened, 1 anatomical structure (MV, TV, IAS, or IVS) not visualized well, aortic outflow added (5-chamber view), or significant lateral wall drop out of LV compared with the excellent quality reference*. |
0 | Poor: | Any 2 chambers or anatomical structures (LA, LV, RA, RV, MV, TV, IAS, and IVS) not visualized well, the left and right sides of the image are flipped, raters do not recognize the view as an apical 4-chamber view, or no image obtained. |
S4C | 2 | Excellent: | All 7 chambers and anatomical structures (LA, LV, RA, RV, IAS, IVS, and liver) visualized or similar to the excellent quality reference*. A left-right flipped image does not affect the subcostal 4-chamber view scoring. |
1 | Acceptable: | One chamber or anatomical structure (LA, LV, RA, RV, IAS, IVS, or liver) severely foreshortened/not visualized well or aortic outflow added (5-chamber view). |
0 | Poor: | Any 2 chambers or anatomical structures (LA, LV, RA, RV, IAS, IVS, and liver) not visualized well, raters do not recognize the view as a subcostal 4-chamber view, or no image obtained. |
SIVC | 2 | Excellent: | IVC visualized in a longitudinal fashion, connection of IVC to RA visualized clearly, and IVC diameter ≥ 1.0 cm at 2 cm from the RA-IVC junction, or similar to the excellent quality reference*. A left-right flipped image does not affect the subcostal IVC view scoring. |
1 | Acceptable: | IVC diameter ≥ 1.0 cm at 2 cm from the RA-IVC junction, but no clear connection of IVC to RA, or IVC not visualized in a longitudinal fashion. |
0 | Poor: | IVC diameter < 1.0 cm at 2 cm from the RA-IVC junction, descending aorta imaged instead of IVC, raters do not recognize the view as a subcostal IVC view, or no image obtained. |
AV, aortic valve; A4C, apical 4-chamber view; IAS, interatrial septum; IVC, inferior vena cava; IVS, interventricular septum; LA, left atrium; LV, left ventricle; LVOT, left ventricle outflow tract; MV, mitral valve; PLAX, parasternal long-axis view; POCUS, point-of-care ultrasound; PSAX, papillary muscle level of parasternal short-axis view; RA, right atrium; RV, right ventricle; SIVC, subcostal inferior vena cava view; S4C, subcostal 4-chamber view; TV, tricuspid valve. |
The 2-point maximum scores for each of the 5 cardiac POCUS views are added for the 10-point maximum skill test score. *Excellent quality reference refers to an image obtained by the cardiologist (MI) on the healthy volunteer used for all skill tests (Fig. 2A and Additional File 5). Adapted from Jujo et al. [20]. |
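As a minimal sketch of how the scoring system combines ratings, the snippet below averages 3 blinded raters' 0–2 scores per view and sums the 5 views for the 10-point maximum score; it also shuffles de-identified clip IDs for blinded assessment. All values and IDs are hypothetical, and `random.shuffle` stands in for the random number table in Microsoft Excel used in the study.

```python
import random
import statistics

# Hypothetical example: each of the 5 views is scored 0-2 by 3 blinded
# raters; the per-view score is the mean of the 3 ratings, and the
# skill test score is the sum over the 5 views (10-point maximum).
ratings = {  # view -> (rater1, rater2, rater3); illustrative values only
    "PLAX": (2, 2, 1),
    "PSAX": (1, 1, 1),
    "A4C":  (2, 1, 2),
    "S4C":  (0, 1, 0),
    "SIVC": (2, 2, 2),
}

view_scores = {view: statistics.mean(r) for view, r in ratings.items()}
skill_test_score = sum(view_scores.values())
print(round(skill_test_score, 2))  # → 6.67

# Blinded assessment: shuffle de-identified clip IDs so raters cannot
# infer the time point (pre-, immediate post-, or 8-week post-training).
clip_ids = [f"clip_{i:03d}" for i in range(30)]
random.shuffle(clip_ids)
```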
Knowledge Test Scoring System
We assessed the anatomical knowledge of students before, immediately after, and 8 weeks after training, using an identical knowledge test on Google Forms (Fig. 1). The knowledge test consisted of 40 multiple-choice questions identifying normal anatomic structures seen in the 5 cardiac POCUS views. The 40-point maximum knowledge test scoring system is in Additional File 9. This scoring system demonstrated outstanding discriminatory ability between novices and experts for echocardiography in a validation study using knowledge test scores from 59 medical students in our pilot study and the current study (Additional File 10).
Outcome Measures
We measured the following curriculum learning-effect outcomes: the primary outcome was [ⅰ], and the secondary outcomes were [ⅱ]–[ⅷ].
Skill test score improvement
[ⅰ] skill test score difference between pre-training and 8-week post-training and [ⅱ] the difference between pre-training and immediate post-training.
Knowledge test score improvement
[ⅲ] knowledge test score difference between pre-training and 8-week post-training and [ⅳ] the difference between pre-training and immediate post-training.
5-point Likert scale questionnaire
We administered 5-point Likert scale questionnaires using Google Forms to measure [ⅴ] overall curriculum satisfaction, [ⅵ] the ASE online module satisfaction, and [ⅶ] hands-on training satisfaction at immediate post- and 8-week post-training. Questionnaires also assessed [ⅷ] student motivation to purchase a personal HHU at pre-, immediate post-, and 8-week post-training.
Subgroup analysis
Based on our pilot study finding of individual variation in skill retention [20], we planned subgroup analyses to investigate factors that affected skill retention. Because visual inspection revealed substantial variation in 8-week post-training skill test scores, we compared demographic factors between students with 8-week post-training skill test scores of 5 or higher and those with scores below 5 to investigate the reason for the variation.
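The threshold split used for the demographic comparison can be sketched as follows (participant IDs and scores are hypothetical):

```python
# Hypothetical 8-week post-training skill test scores keyed by
# participant ID (each score is an average of 3 raters, 0-10 scale).
scores_8wk = {"P01": 7.3, "P02": 4.0, "P03": 5.0, "P04": 2.7, "P05": 8.0}

# Subgroup split for the demographic comparison: score >= 5 vs < 5.
high_retention = {pid for pid, s in scores_8wk.items() if s >= 5}
low_retention = {pid for pid, s in scores_8wk.items() if s < 5}
```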
Interrater Reliability of the Skill Test Scoring System
We assessed interrater reliability of the skill test scoring system with intraclass correlation coefficient (ICC) using all skill test scores (pre-, immediate post-, and 8-week post-training).
Sample Size and Power Calculation
Sample size calculation was based on our pilot study with 6 pre-clinical medical students [20]. The pilot study showed that the mean skill test score difference between pre-training and 8-week post-training was 2.28 points [standard deviation (SD), 4.44]. Using this estimate, 25 participants were required to provide 80% power with a one-sided alpha level of 0.05. Assuming an approximately 15% participant withdrawal from the study, the sample size required was 29. Participation in the study was voluntary; thus, sampling was not random. To increase representativeness of the sample and precision of the outcome measures, we continued to recruit participants for the planned study period even after the statistical sample size was achieved.
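The calculation can be reproduced approximately with the standard normal-approximation sample-size formula for a mean difference; exact calculations based on the t distribution give the slightly larger n of 25 reported above. The pilot-study values are from the text, but the formula below is a generic sketch, not the authors' actual software computation.

```python
import math
from statistics import NormalDist

# Pilot-study estimates: mean skill test score difference and SD.
md, sd = 2.28, 4.44
z_alpha = NormalDist().inv_cdf(1 - 0.05)  # one-sided alpha = 0.05
z_beta = NormalDist().inv_cdf(0.80)       # power = 80%

# Normal-approximation sample size, then ~15% withdrawal inflation.
n = math.ceil(((z_alpha + z_beta) * sd / md) ** 2)   # → 24 (t-based: 25)
n_with_dropout = math.ceil(n * 1.15)
```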
Statistical Analysis
All statistical analyses were performed using BellCurve for Excel (Social Survey Research Information Co., Ltd.). All numeric variables are presented as mean and SD, or median and interquartile range. The mean difference (MD) between pre- and 8-week post-training data was calculated with an unpaired t-test and presented as mean and SD with 95% confidence interval (CI) and effect size (ES). The MD between pre- and immediate post-training data was calculated with a paired t-test and presented as mean and SD with 95% CI and ES. We interpreted the clinical significance of ES according to Cohen’s ES guidelines (ES of 0.2–0.5 = small, 0.5–0.8 = moderate, and > 0.8 = large) [27, 28]. ICC estimates and their 95% CIs were calculated based on a mean rating (k = 3), absolute agreement, two-way random-effects model for interrater reliability and a two-way mixed-effects model for test-retest reliability [29].
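The interrater-reliability statistic specified above, the two-way random-effects, absolute-agreement, mean-of-k-raters ICC (ICC(2,k) in Shrout–Fleiss notation), can be sketched in plain Python from its ANOVA mean squares; the ratings matrix below is illustrative, not study data.

```python
def icc2k(data):
    """ICC(2,k): two-way random effects, absolute agreement, average
    of k raters. data: list of subjects, each a list of k rater scores."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)              # between-subjects mean square
    msc = ss_cols / (k - 1)              # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))   # residual mean square
    return (msr - mse) / (msr + (msc - mse) / n)

# Illustrative per-clip scores (0-2 scale) from 3 raters on 4 clips.
ratings = [[1.8, 2.0, 1.9], [0.9, 1.1, 1.0], [1.5, 1.4, 1.6], [0.2, 0.3, 0.1]]
print(round(icc2k(ratings), 3))  # → 0.994
```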
This manuscript adheres to the Guideline for Reporting Evidence-based practice Educational interventions and Teaching (GREET) with the GREET checklist (Additional File 11) [30].