OSCE Setting and Participants
We conducted the examination three months after the NNPs started their careers at our institution. At each station, we tested the clinical skills relevant to the issue addressed: history taking, conducting a complete physical examination, problem-directed management, interpersonal communication, and required-procedure techniques (Table 1). We created a 4-station OSCE: (1) care of patients with fever (Station A), (2) medication administration (Station B), (3) care of patients with abdominal pain (Station C), and (4) care of intravenous lines (Station D) (Fig. 1).
Table 1
Description of the four OSCEa stations for assessing the competence of novice nursing practitioners.
Station | Description of competence | Task | Skills tested |
A | Care of patients with fevers | Evaluate performance of history taking and symptom assessment | 1. History taking: medical history, drug history, travel history, and symptoms 2. Explain the procedure for fever management and its purpose. 3. Answer questions and provide emotional support. 4. Education, including home care |
B | Medication administration | Evaluate performance of medication administration and patient education | 1. Perform 5 ‘Rights’: right drug, right route, right time, right dose, and right patient. 2. Know medication-associated risks. 3. Know the effects: primarily intended effect, and related effects of the pharmacological properties. 4. Communicate clearly. 5. Monitor adverse effects. 6. Report errors and adverse events. |
C | Care of patients with abdominal pain | Evaluate performance of history taking, physical examination and team work | 1. Perform comprehensive assessment of abdominal pain, including location, characteristics, onset, duration, frequency, quality, intensity, severity, precipitating, and relief factors. 2. Communicate with the interdisciplinary team. |
D | Care of patients with IVb lines | Observe administration of IV medication and monitoring of the line to ensure that it is working without complications | 1. TOUCH: check for a temperature change (heat or warmth), redness, or leakage at the IV site. 2. LOOK: confirm that the IV site is dry and visible at all times. 3. COMPARE: check for swelling in the limb with the IV line, comparing it with the opposite limb without the IV line. 4. EDUCATE: provide information about IV care to patients and caregivers. |
a OSCE, objective structured clinical exam; b IV, intravenous |
We chose these four clinical scenarios because nursing staff with over 10 years of clinical experience at our institution reviewed the relevant literature [3, 19–22] and recommended them, judging that proper handling of these clinical problems is essential for NNPs at the beginning of their careers. Furthermore, over 90% of the experts on the OSCE education committee of our institution agreed on the importance and practicality of each clinical scenario. The internal consistency of the OSCE stations was tested using Cronbach’s alpha; the overall Cronbach’s α coefficient was 0.791, indicating good internal consistency, with only minor variation across stations.
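For readers who want to reproduce the reliability check, Cronbach’s alpha can be computed directly from each examinee’s per-station scores. The following is a minimal Python sketch; the study’s analysis was actually run in SPSS, and the scores below are hypothetical, not the study data:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a list of examinees' per-station scores.

    scores: one row per examinee; each row holds the score obtained
    at each station (the "items" of the scale).
    """
    k = len(scores[0])                                  # number of stations
    item_vars = [pvariance(col) for col in zip(*scores)]
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical scores for five examinees across the four stations (A-D).
demo = [
    [18, 20, 17, 19],
    [15, 16, 14, 15],
    [20, 21, 19, 20],
    [12, 13, 12, 14],
    [17, 18, 16, 17],
]
alpha = cronbach_alpha(demo)
```

Because these illustrative rows are highly consistent across stations, the resulting alpha is near 1; real data such as the study’s α of 0.791 reflect more item-level variation.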
We enrolled 55 NNPs from different work units who had obtained a bachelor’s degree in nursing in Taiwan but had no internship experience in clinical practice as of July 2016. None of the NNPs had ever taken part in an OSCE before this study. Before entering the OSCE, the NNPs had to complete training in core professional skills: a 5-day orientation comprising standard training courses designed and verified by the Department of Nursing at our institution [3]. Three months after the orientation courses, the NNPs were assessed at the end of the module through a formative OSCE. The NNPs were familiarized with the OSCE procedure under the guidance of instructors, who were nursing staff at our institution with comprehensive training. The instructors encouraged the NNPs to discuss the core elements and gave each NNP feedback on his or her achievements, deficiencies, and opportunities for improvement.
Implementation, Instruments, and Evaluations at the OSCE Stations
Each station had one standardized patient (SP) and one examiner. The SP was a person who had completed at least eight hours of standard training provided by the Taiwan Association of Standardized Patients and was capable of simulating the signs and symptoms of diseases, mimicking clinical scenarios, and providing feedback to the NNPs. The examiner was a nursing faculty rater who had completed an OSCE education training program and was certified by our institution and the Taiwan Nursing Association. The raters acted as passive evaluators and were instructed not to guide or prompt the participants.
At the beginning of the test at each OSCE station, participants had 1 minute to read a written description of the required tasks. Participants then spent 10 minutes at each station: 8 minutes of observed performance and 2 minutes of immediate verbal feedback from the station examiner (Fig. 1). The examiner assessed the participants’ clinical skills, strategies, and interpretation of clinical problems (Table 1) and graded them according to a checklist for each skill. The checklist consisted of 10–12 items rated on a 3-point scale: 0 (failed to perform), 1 (performed poorly or out of sequence), and 2 (performed appropriately in the correct sequence). Kendall's coefficient of concordance was 0.781 (p < .0001), indicating a significant correlation between the examiners' scores and thus good inter-rater agreement. We also recorded certain practices, such as greeting the patient and hand decontamination, but did not count these elements toward the participants’ overall scores. We recorded the sum of the scores from all the checklist items for each station, and the participants received their own performance-analysis reports after the OSCE (Fig. 2). The instructors arranged an 80-minute debriefing session to review the report and help the NNPs understand the core (i.e., clinically important) elements of the stations. We used the “borderline-group method” to establish the standard “pass” score: the pass score for each station was the mean score of the NNPs whose OSCE performances were rated “borderline” at that station [22].
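As a concrete illustration of the borderline-group method, the pass mark for a station is simply the mean checklist score of the examinees whom the examiner globally rated “borderline”. A minimal Python sketch; the checklist scores and the rating labels below are hypothetical (the paper does not list the global-rating scale):

```python
from statistics import mean

def borderline_pass_score(ratings):
    """Borderline-group standard setting: the pass mark for a station
    is the mean checklist score of examinees whose global rating
    was 'borderline'.

    ratings: list of (checklist_score, global_rating) tuples;
    global_rating labels ('fail'/'borderline'/'pass') are assumed
    for illustration.
    """
    borderline = [score for score, label in ratings if label == "borderline"]
    return mean(borderline)

# Hypothetical station data: checklist totals plus the examiner's
# global judgement for six examinees.
station_a = [
    (9, "fail"), (11, "borderline"), (13, "borderline"),
    (12, "borderline"), (16, "pass"), (18, "pass"),
]
cutoff = borderline_pass_score(station_a)  # mean of 11, 13, 12 -> 12
```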
Participants were required to complete a questionnaire before the implementation (pretest) and after the end of the OSCE program (posttest). The questionnaire was a modified version of a tool used in a previous report, collecting basic learning and personal background information plus a Nursing Competency Questionnaire (NCQ), a Stress scale, and a Satisfaction with Learning scale [3]. The NCQ, a 26-item instrument using a 5-point scale, was designed to evaluate nursing competency across five domains: taking a medical history (5 items), physical assessment (3 items), interpersonal communication (7 items), problem-directed management (5 items), and problem-required skills (6 items). The Stress scale consisted of 10 statements about stressful nursing situations; respondents rated each situation on a 5-point scale (1 = not stressful at all to 5 = extremely stressful). The Satisfaction with Learning scale was a 3-item instrument measuring the nurses’ satisfaction with learning, with regard to obtaining input from trainers, on a 5-point scale from 1 (strongly disagree) to 5 (strongly agree). Seven experts, including three attending physicians and four senior nursing supervisors, were invited to validate the questionnaire; a test of internal reliability was conducted with ten senior nurses who had more than 5 years of working experience. Next, the experts rated its content validity, which yielded a content validity index (CVI) of 0.89–0.91.
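The content validity index reported above is conventionally computed as the proportion of expert raters who judge an item relevant (a score of 3 or 4 on the usual 4-point relevance scale). A short Python sketch, assuming that convention and using hypothetical ratings from seven experts:

```python
def item_cvi(ratings):
    """Item-level content validity index: the proportion of expert
    raters who scored the item as relevant (3 or 4 on a 4-point
    relevance scale, assumed here)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

# Hypothetical relevance ratings from seven experts for two items.
item1 = [4, 4, 3, 4, 3, 4, 4]   # all seven rate it relevant -> CVI = 1.0
item2 = [4, 3, 2, 4, 3, 4, 3]   # six of seven               -> CVI = 6/7
```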
Ethical Considerations
Data collection began after the research ethics committee at the host hospital had approved the study protocol (IRB approval number: 104-9928B). Subsequently, we held a meeting with the NNPs to explain the program and the study, including the study’s purpose and procedures, the participants’ rights, and confidentiality. We sent this information, together with a cover letter and the questionnaire, to the participants before data collection in a self-addressed stamped envelope. For their convenience, the participants could complete the questionnaires in either paper or electronic form and return them by mail or email to the research team. We destroyed all the envelopes and deleted all the email addresses that could identify the participants immediately after the data were saved on a secure computer protected with passwords known only to the primary investigator.
Data Analysis
The data were verified and analyzed using the Statistical Package for the Social Sciences (SPSS) software, version 21.0 for Windows. Descriptive statistics (means and standard deviations) were obtained for each examination tool and analyzed using one-sample or two-sample t-tests, or analysis of variance where appropriate. Frequency tables and percentages were used to present the demographic data; the chi-square test and Spearman’s correlation were used to test the significance of associations between demographic variables and competency levels. Continuous data were tested for normality using the Kolmogorov-Smirnov test and presented as means and standard deviations. The internal consistency of the OSCE stations was tested using Cronbach’s alpha. Agreement between the total scores obtained in the pretest and posttest was analyzed using Bland-Altman analysis, and associations were measured using Pearson’s correlation coefficient. The level of significance for all analyses was set at 5% (p < 0.05).
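Although the analysis was run in SPSS, the Bland-Altman agreement statistics are straightforward to reproduce: the bias is the mean of the paired differences, and the 95% limits of agreement are the bias ± 1.96 standard deviations of those differences. A minimal Python sketch with hypothetical paired totals (not the study data):

```python
from statistics import mean, stdev

def bland_altman(x, y):
    """Bland-Altman agreement between two paired score sets
    (e.g. pretest vs posttest totals): returns the mean difference
    (bias) and the 95% limits of agreement."""
    diffs = [a - b for a, b in zip(x, y)]
    bias = mean(diffs)
    sd = stdev(diffs)                     # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired totals for six participants.
pretest  = [70, 65, 80, 72, 68, 75]
posttest = [74, 70, 82, 75, 73, 78]
bias, lo, hi = bland_altman(pretest, posttest)
```

Points falling outside the limits of agreement flag participants whose pretest-posttest difference is unusually large relative to the group.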