In this study, we assessed clinical pharmacy students' attitudes and perceptions of different assessment methods and their insights into how these methods predict their clinical competencies. We also characterized their preferences and acceptance of different formats of clinical competency assessment instruments for evaluating their theoretical and practical knowledge. Students rated their attitude, perception and understanding of both formative and summative assessment approaches, and rated the applicability and utilization of OSCE assessments. They also rated various clinical competency assessment instruments on their difficulty, fairness, effectiveness for learning and preferred frequency of use.
Students believed that formative assessment (FA) methods could help them identify their strengths and weaknesses in the subject and pinpoint areas requiring more focus. They also perceived that this approach could improve their academic performance, motivation to study, confidence in their competencies, compatibility with the program, and employability within the learning module. This may be partly explained by the continuous and systematic nature of the formative assessment approach, which allows students to evaluate themselves regularly. It also enables educators to identify early on which aspects of the course and teaching process require more attention and which students need academic support. This aligns with existing evidence that FA is designed to monitor student learning, provide continuous feedback, assist students in identifying their strengths and weaknesses, pinpoint target areas that need detailed work, and aid instructors in addressing students’ difficulties immediately (29–31). This approach, known as assessment for learning (AFL), aims to assess learning, inform teaching strategies, and support student learning progress through ongoing processes. Formative assessments are interactive activities forming a continuous process that focuses on providing instant, actionable feedback to both students and teachers during instruction. By engaging students in self-assessment and reflection, FA aims to identify areas for growth and improvement, fostering a more adaptive and responsive educational environment (21–23).
Regarding summative assessments (SA), students reported that their primary strategy for preparing for written exams was question-spotting. Additionally, the majority believed that the study materials required for written exams were excessive, and they felt that module grades often rely too heavily on single, one-time written exams. They also considered continuous assessment fairer than one-off exams for judging academic performance, and perceived the use of negative marking in MCQs as unfair. This perspective concurs with previous studies stating that SA is used to evaluate student learning at the end of an instructional period to determine whether learning objectives have been met. This approach, known as assessment of learning (AoL), aims to assess and measure student learning against predefined standards at the end of an instructional period. SA encompasses final exams, objective structured clinical examinations (OSCE), oral exams and end-of-term projects. This approach is high stakes, used for grading, certification, and accountability purposes, providing a cumulative assessment of what students have learned and can demonstrate (24). While AoL serves as a final measure of educational outcomes, AFL directly influences teaching and learning processes by offering continuous feedback and opportunities for improvement (25). Models for assessing clinical competency are tailored to performance levels, learning stages, the academic organization's capabilities and whether the purpose is formative or summative (5, 26).
The graduating students had positive attitudes towards using OSCE instruments to assess clinical competency. Large proportions of students agreed or strongly agreed that the OSCE could effectively evaluate their clinical knowledge, integrate knowledge across modules and assess the clinical skills required in pharmacy practice. A possible explanation is that an OSCE well integrated into existing assessment methods could validate and enhance the reliability of clinical assessment tools, thereby significantly strengthening theoretical and practical knowledge. Students may therefore perceive OSCEs as accurate assessments of their skills. This view is strongly supported by an earlier study describing the OSCE as a multipurpose clinical assessment tool employed to assess the clinical knowledge and skills of healthcare professionals in clinical settings. The OSCE evaluates competency in a precise, objective, and reproducible manner, allowing uniform testing of students across an extensive range of clinical skills. It is particularly effective in assessing critical competencies of healthcare professionals, such as communication skills and the ability to handle unpredictable patient behavior (32, 33). Furthermore, the OSCE has been increasingly utilized in both undergraduate and graduate programs worldwide. These assessment instruments are also employed in licensure examinations and as feedback instruments in FA approaches (34). Accordingly, the survey findings revealed that students held positive and engaged views of this instrument for evaluating their clinical learning outcomes and had positive expectations of its use in measuring clinical competencies.
This survey also explored students’ perceptions of the difficulty, fairness and effectiveness of various assessment instruments, as well as their preferences in the clinical skill assessment process. Although a high proportion of students did not express a strong opinion or take a definitive stance, about half rated essay-type questions as the most difficult tool, followed by the OSCE. Despite many students (68%) agreeing or strongly agreeing with the utility and objectivity of the OSCE in assessing clinical competencies, they also rated performing each OSCE station as difficult. This may be explained by the OSCE format integrating pharmacotherapeutic knowledge, problem-solving, and interpersonal skills, allowing students to learn from mistakes before real patient encounters (35). This is supported by previous reports stating that, despite various concerns about difficulty, the OSCE has received considerable support from students, who felt it provided significant knowledge across all stations (36). Conversely, portfolios and MCQs were considered the easiest assessment methods. These findings illustrate how learners perceive different assessment methods and indicate areas where educators might need to address perceived difficulties. This can inform modifications and adjustments to better utilize these assessment tools in evaluating clinical competencies. Additionally, approximately 60% of the students reported that they gained significant knowledge from many of these tools and favored their more frequent use.