We selected a User Satisfaction Survey to explore the efficacy of the digitized OSCE system compared with the traditional paper-based system. As part of the Children’s Health academic requirement, fifth-year medical students at the College of Medicine, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia, must be assessed clinically through an end-of-year OSCE examination following their Children’s Health course. The e-learning unit at the College of Medicine decided to digitize the OSCE examination for all medical students to meet its strategic academic plans, and the Department of Pediatrics was chosen for the trial. Numerous meetings and brainstorming sessions were held to define the requirements and to determine how effectively and efficiently they could be met with the available resources, customized to the needs of the department.
QuestionPro is web-based software for creating and distributing surveys. It consists of an interface for creating survey questions, tools for distributing surveys via email or website, and tools for analyzing and viewing the results.21 To fit our purpose as an OSCE management solution, we built a full examination platform on its survey features and went one step further by incorporating QR codes. QuestionPro has been “globally recognized by multiple educational, business, research, and marketing institutes for over ten years.”21
Assessment documentation, station selection, and scoring criteria were chosen, formulated, reviewed, and agreed upon by the OSCE committee faculty members of the department. Subsequently, all information was handed to the e-learning support team to be uploaded to the newly generated assessment system using QuestionPro software. Prior to the OSCE date, OSCE assessors, circuit coordinators, and student invigilators were trained to use the electronic system, and technical support was available at the time of the OSCE assessments. Furthermore, an introductory session was held to introduce the students to the new electronic system.
Regarding OSCE scoring, we provided assessors with three to five scoring options for each question of the assessment, drawn from the following scale, with which to rate each student’s performance: Not Done, Inadequately Done, Partially Done, Adequately Done, and Well Done. We assigned a different scoring weight to each question based on its difficulty, complexity, and number of possible answers (Fig. 1).
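The exact weighting scheme was configured within the assessment platform; the following Python sketch only illustrates how a weighted station score could be computed, under assumed numeric values for the rating labels and assumed per-question weights (neither reflects the actual exam configuration).

```python
# Illustrative sketch only: the numeric rating values and per-question weights
# below are assumptions, not the configuration used in the actual exam.
RATING_VALUES = {
    "Not Done": 0,
    "Inadequately Done": 1,
    "Partially Done": 2,
    "Adequately Done": 3,
    "Well Done": 4,
}

def station_score(responses, weights):
    """Compute a weighted station score as a percentage.

    responses: {question_id: rating label chosen by the assessor}
    weights:   {question_id: weight reflecting difficulty/complexity}
    """
    max_rating = max(RATING_VALUES.values())
    earned = sum(RATING_VALUES[responses[q]] * w for q, w in weights.items())
    possible = sum(max_rating * w for w in weights.values())
    return 100.0 * earned / possible

# Hypothetical example: three questions with different weights
weights = {"Q1": 1.0, "Q2": 2.0, "Q3": 1.5}
responses = {"Q1": "Well Done", "Q2": "Partially Done", "Q3": "Adequately Done"}
print(round(station_score(responses, weights), 1))
```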
This new COES, which was customized “in-house” at the e-learning unit of the College of Medicine at Imam Abdulrahman Bin Faisal University, was used to store and analyze data electronically. Moreover, student feedback was sent to students electronically using the student email system.
This study was approved by the Institutional Review Board (IRB) of Imam Abdulrahman Bin Faisal University through an expedited review (IRB approval number IRB-2020-01-048). The datasets used and/or analyzed during this study are available from the corresponding author on reasonable request.
OSCE layout for the students
A total of 139 fifth-year medical students used the new electronic OSCE assessment system in December 2019. They were assessed by 30 examiners from the faculty board of the Department of Pediatrics using portable tablets (iPads) provided by the Deanship of e-learning. The OSCE comprised five separate stations. Students were divided into three parallel circuits (A, B, and C) operating simultaneously to accommodate the large number of examinees. Each circuit comprised the same five stations in the same systematic order. Prior to the exam, 12–14 students were assigned to each of four rotations per circuit. This distribution of students was meticulously generated using Microsoft Excel and reviewed by three different members of the OSCE exam committee to eliminate any individual or technical errors. Each student completed a history taking and discussion station, a pediatric surgery case scenario, data interpretation, physical examination, and counseling, each lasting eight minutes.
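The actual distribution was prepared in Microsoft Excel and manually reviewed; a minimal Python sketch of an equivalent assignment of students to three circuits and four rotations might look as follows (the placeholder student IDs and the round-robin rule are illustrative assumptions, not the committee’s actual procedure).

```python
# Illustrative sketch: assigns students to 3 circuits (A, B, C) and 4 rotations
# in round-robin fashion. The real distribution was prepared in Microsoft Excel
# and reviewed by three OSCE exam committee members.
from itertools import cycle

def assign_students(student_ids, circuits=("A", "B", "C"), rotations=4):
    slots = cycle((c, r) for r in range(1, rotations + 1) for c in circuits)
    return [
        {"student": sid, "circuit": c, "rotation": r}
        for sid, (c, r) in zip(student_ids, slots)
    ]

students = [f"S{i:03d}" for i in range(1, 140)]  # 139 fifth-year students
assignments = assign_students(students)
print(assignments[0])  # {'student': 'S001', 'circuit': 'A', 'rotation': 1}
```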
The Computerized Web-based OSCE Evaluation System
The duration of each station accounted for the time required for the student’s assessment, the scanning of the student’s QR code, and a two-minute safety margin should any electronic issue arise. A color-coded QR ID card was given to each student before entering the exam, to be scanned by each assessor using their iPad at the beginning of the station (Figs. 2 & 3).
The coded card displayed the student’s data (name and university number) and was encrypted to match the circuit, assessor, and rotation assigned to each student. Once the assessor scanned the QR code on the student’s ID card, a purpose-designed online page opened on the assessor’s tablet showing the student’s data, circuit, and rotation numbers for second-step verification (Fig. 4). Subsequently, the assessor was asked to choose their assigned station number from the five stations shown on that page, which also displayed the relevant assessors’ names under each station. After selecting a station, the assessor graded the student’s performance within the given time frame (Fig. 5).
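The encoding used by the QuestionPro-based system itself is not described here; the Python sketch below merely illustrates how an ID-card payload carrying the name, university number, circuit, and rotation could be generated as a QR code and later checked during second-step verification. The payload fields, the plain JSON encoding (without the encryption used in the real system), and the use of the `qrcode` library are all assumptions for illustration.

```python
# Illustrative sketch: encodes a student's exam assignment as a JSON payload in
# a QR image and checks it against the expected circuit and rotation. The real
# system encrypted the card data; this unencrypted payload is an assumption.
import json
import qrcode  # pip install qrcode[pil]

def make_id_card(student):
    payload = json.dumps({
        "name": student["name"],
        "university_number": student["university_number"],
        "circuit": student["circuit"],
        "rotation": student["rotation"],
    })
    img = qrcode.make(payload)                      # QR image for the ID card
    img.save(f"{student['university_number']}.png")
    return payload

def verify(payload, expected_circuit, expected_rotation):
    data = json.loads(payload)                      # second-step verification
    return (data["circuit"] == expected_circuit
            and data["rotation"] == expected_rotation)

card = make_id_card({"name": "Student A", "university_number": "2190001",
                     "circuit": "A", "rotation": 1})
print(verify(card, "A", 1))                         # True
```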
Once the time was up, the assessor submitted the form, and the data were recorded in the system. After submission of the performance questionnaire, an automated message appeared containing the student’s data and confirming the submission. Notably, submission of performance forms was only allowed once all questionnaire items had been completed, to prevent missing data. Finally, the software can be used to download the raw data in different formats (MS Excel, MS PowerPoint, Adobe PDF) during or after the completion of the exam. The serial number can be used to merge the data for each student into a single MS Excel file using the MS Excel “sort” function, and the sum and average formulas can then be added manually (Fig. 6).
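The merging and summary formulas were applied manually in Excel; for illustration only, an equivalent operation could be scripted in Python with pandas as sketched below. The column names and the one-file-per-station export layout are assumptions about the raw export, not the actual file structure.

```python
# Illustrative sketch: merges per-station exports on the student serial number
# and adds total and average scores. Column names and file layout are
# assumptions; the actual processing was done manually in MS Excel.
import pandas as pd

def merge_station_files(paths):
    frames = [pd.read_excel(p) for p in paths]       # one export per station
    merged = frames[0]
    for frame in frames[1:]:
        merged = merged.merge(frame, on="serial_number", how="outer")
    score_cols = [c for c in merged.columns if c.startswith("station_")]
    merged["total_score"] = merged[score_cols].sum(axis=1)
    merged["average_score"] = merged[score_cols].mean(axis=1)
    return merged.sort_values("serial_number")

# Hypothetical usage:
# results = merge_station_files(["station1.xlsx", "station2.xlsx",
#                                "station3.xlsx", "station4.xlsx",
#                                "station5.xlsx"])
# results.to_excel("osce_results.xlsx", index=False)
```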
Overall User Satisfaction Survey
After the completion of the exam, we asked the assessors to complete an overall satisfaction survey about their experience with the new electronic system (COES). The survey was developed from previous work following an extensive literature review in the field of electronic OSCE management.16,22 The 25-item questionnaire was divided into three sections: the OSCE software user evaluation (3 items); the usage of the electronic OSCE system and its training (10 items); and the OSCE assessment process itself (12 items). For the 22 items related to the usage of the electronic OSCE system and the assessment process, assessors chose from the following options: Strongly agree, Agree, Disagree/Strongly disagree, or No judgment. For the OSCE software user evaluation (3 items), the options were Excellent, Good, Fair, or Poor. Additionally, three further questions were included to support the analysis: the assessor’s age, gender, and whether they owned a tablet device at home. The assessors’ answers were recorded and analyzed.