In 2021, we introduced a comprehensive, transversal simulation-based curriculum, called SIMCLUB, for residents of twelve residency schools at Humanitas University and IRCCS Humanitas Research Hospital. The clinical objective was to equip residents with knowledge of how to deal with common emergencies and to increase their confidence in managing unpredictable events in different settings and with diverse teams. A pre- and post-test questionnaire was used to assess the impact on students’ confidence and skill acquisition. The structural objective, on the other hand, aimed to address implementation effectiveness and comprised a qualitative evaluation. Guided by a project manager, the team critically analyzed the process and outcomes throughout and described areas where improvements could be made. Lessons learnt from the implementation were carefully documented, ensuring that valuable insights were captured for future reference.
Clinical Simulation
Every month we ran two high-fidelity simulation cases during a two-hour session, followed by a debriefing with expert faculty. The simulation scenarios changed every month, and each was limited to a maximum of four residents, while the remaining residents (average participation rate of 39 residents per event) watched passively from an auditorium. Topics were chosen according to needs or urgencies within the hospital and designed together with clinical experts in the respective field. The topics varied each session but were designed to cover a transversal set of skills and competencies. Each case was required to depict an actual acute emergency event; examples included a patient fainting after hip replacement, a patient with high fever, seizures, etc.
Each simulation case was high-fidelity in character, replicating the clinical routine as realistically as possible. During the simulation scenario, at least two tutors managed the case from the control room, and one tutor was present in the simulation room to adjust the scenario in case of difficulties. All cases were pre-briefed by the simulation team four hours prior to the session. For quality control, all cases followed a defined standard (a Simulation Case Charter including objectives, medical history, diagnostic pathways, therapeutic pathways, and investigation printouts). This was important to optimize time, efficacy, and quality, as guidelines supported the automation and standardization of tasks. All tutors were familiar with the transcript of the cases and the checklists, and moderators were highly trained in leading debriefing sessions. In addition, a train-the-trainer course was offered beforehand to all tutors to ensure adequate knowledge of the processes and to guarantee quality control. Experienced tutors conducted the simulation scenarios, and expert consultants from the respective fields led the debriefing discussions using relevant facts and guidelines.
Project management intervention
We used the Harvard Project Management theory to facilitate the implementation of the simulation-based training curriculum (16, 17). Based on these recommendations, we split the curriculum into four phases (planning, build-up, execution, and close-out) and developed and used tools including a project charter, project plan, risk management plan, and project monitoring and control mechanisms. Table 1 describes the four phases, including the specific steps necessary for success.
Table 1
Project Management Phases
| | PLANNING | BUILD-UP | EXECUTION | CLOSING |
| --- | --- | --- | --- | --- |
| GENERAL | Determine the problem (incoming residents have different levels of knowledge and feel unprepared to manage emergencies) | Schedule and specify assignments (use transversal topics and refine arguments aligned with the needs of the hospital, e.g. a focus on sepsis mortality) | Monitor and control the output (document the challenges of each simulation session) | Analyze the data and the performance |
| SPECIFICS | Define objectives, scope, resources, and tasks | Develop a budget; refine timeline and milestones | Report progress; identify and mitigate risks | Evaluate standardization of processes |
| STAKEHOLDER | Identify stakeholders (physicians, marketing, internal communication, IT, engineers, funders, students) | Focus on awareness building and buy-in (communication channels such as WhatsApp©, email, intranet, billboards) | Hold regular stakeholder meetings; repeatedly present performance with scientific data | Debrief with the team (invite all stakeholders to improve future collaboration) |
| TOOLS | Project charter and 12-month project plan | Financial analysis; scheduling tools (GANTT) | Risk charter (RAID log); quality management | Lessons-learnt report; cost analysis |
Legend: This table illustrates the specific actions, stakeholders, and the tools used to effectively structure the different phases of the project.
*RAID: Risks, Assumptions, Issues, and Dependencies. *GANTT chart: illustrates work completed over a period of time in relation to the time planned for the work.
Project phases
The planning phase defined the new project and established the curriculum's scope, objectives, and course of action. Objectives were defined as 1) Evaluation of the effectiveness of using project management theory to implement a simulation-based training curriculum and 2) Assessment of the impact of the simulation-based training curriculum on the clinical emergency preparedness of medical residents. Participants were expected to leave the course with increased knowledge and confidence to manage critical clinical situations in the hospital.
Early on, we involved the stakeholders: directors of the selected residency schools; participants from IT, communication, and marketing; the scenario design team, including instructional designers, content editors, subject matter experts, and adjunct faculty, all of whom had specific roles and responsibilities; and finally the project manager, responsible for overseeing the planning, execution, and successful completion of the project. External stakeholders may include medical societies and potential course sponsors.
During the build-up phase, we met weekly with the small teams to discuss progress, barriers, areas for improvement, and milestones to be reached. These meetings were documented in a risk mitigation logbook and followed up accordingly. A budget was specified to cover all expenses for marketing and running the course. Challenges in this phase were mainly linked to marketing and communication, namely identifying the most effective awareness-building channels and increasing the directors' buy-in. We started with an e-mail campaign but progressively extended the communication strategy to WhatsApp messaging, social media, and presence at the monthly Morbidity and Mortality Meetings. To provide a fully immersive and interdisciplinary experience, we initially addressed all residents at Humanitas University directly, a so-called “bottom-up” approach. Despite initial success with this approach, we quickly saw a drop in participation, mainly linked to residents' priorities shifting from education to increased clinical responsibilities. As a result, we changed our communication strategy to a “top-down” approach, working mainly on incentivizing the directors of the residency schools, which significantly improved buy-in and sustained participation rates.
Also during the build-up phase, the strategic direction was translated into a schedule of eight simulation sessions. The focus was on communicating closely with all stakeholders to define topics and learning objectives. A precise curriculum was developed and subsequently implemented during the third phase. Moreover, during this phase we elaborated our communication strategy, which included social media ads, billboards, and a traditional newsletter to maximize awareness of the program. We kept a project monitoring log in which we defined the processes to follow for communication, marketing, handling of enrollments, logistics, simulation scenarios, and budget.
In mid-2021, we entered the execution phase, and the first of the eight sessions took place. Each session lasted two hours and followed the scheme of two simulated clinical cases of 30 minutes each, followed by a 60-minute expert-led debriefing session. Details on the sessions can be found in the previous section.
The final close-out phase aimed to review the achievement of key objectives; we conducted a comprehensive review of performance, outcomes, and deliverables, including a cost-effectiveness analysis to determine the sustainability of the project. In addition, ongoing communication with all stakeholders was essential to create a positive culture around the project and to share its success. Finally, we identified areas for improvement and finalized the lessons learnt to refine the potential scalability of the project.
Evaluation
To evaluate the course's effectiveness, we prepared a multiple-answer questionnaire of ten questions for each session. The test was sent electronically to all participants three hours before the event and again immediately after it. In parallel, all participants completed a self-reported questionnaire evaluating clinical knowledge and non-technical skills such as leadership, situational awareness, teamwork, and communication. During the first 12 months, we changed the mode of examination to a live evaluation using the Wooclap© online quiz. The questionnaires were based on Kirkpatrick’s model, evaluating reaction, learning, behavior, and results. Satisfaction was rated on a Likert scale. For the organizational evaluation, we conducted regular quality control meetings, assessed risks and barriers during implementation, and derived lessons learnt. After each milestone meeting, team members could make suggestions for improvement based on their expert opinion. The key recommendations and areas for improvement identified from these data were used to structure the lessons-learnt and risk mitigation report. To capture the user’s perspective, we also conducted satisfaction surveys among faculty members, asking about their satisfaction with and evaluation of the implementation process.
Data analysis
All data collected by the faculty members and the project manager were compiled for statistical analysis. Raw data were exported into Microsoft Excel® software (Microsoft Corporation, Redmond, WA, USA), which was then used to calculate descriptive statistics. T-tests for “before and after” comparisons were carried out, and p-values below 0.05 were considered statistically significant.
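As an illustration of this before-and-after comparison, the sketch below computes a paired t statistic from hypothetical pre- and post-session quiz scores. The scores, group size, and pairing of responses are assumptions made for the example only, not the study's data; the actual analysis was performed in Excel.

```python
import math
from statistics import mean, stdev

def paired_t_statistic(pre, post):
    """Paired t statistic for before/after scores (one pre/post pair per resident).

    Returns the t statistic and the degrees of freedom (n - 1).
    """
    diffs = [after - before for before, after in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n)), n - 1

# Hypothetical pre- and post-session quiz scores (out of 10) for eight residents
pre = [5, 6, 4, 7, 5, 6, 5, 4]
post = [7, 8, 6, 8, 7, 8, 7, 6]

t, df = paired_t_statistic(pre, post)
# With df = 7, the two-tailed critical value at alpha = 0.05 is about 2.365;
# an |t| above that threshold corresponds to p < 0.05.
print(round(t, 2), df)
```

In practice, a dedicated routine such as `scipy.stats.ttest_rel` would also return the exact p-value; the manual version above only shows the arithmetic behind the comparison.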