Our memory is aimed at making decisions and is prepared to forget [13]. Furthermore, without repetition, only 60 % of newly learned material can be recalled after 20 min, and after one hour more than half of it is lost [14]. Only through repetition and/or processing is information transferred from short-term to long-term memory [15]. Additionally, students attend many lectures passively without reaching an active or even interactive state of learning [16]. However, active learning has a high impact on learning success and on performance in assessments [17]. Beyond that, the attention of learners declines strongly after 20-25 min of classical teacher-centred lecturing [18], requiring a change of pace, such as questions posed via an audience response system, to raise attention again.
Moreover, audience response systems serve another important purpose: the questions posed via the audience response system and the subsequent discussion of the wrong and correct answers give students prompt formative feedback during the learning process and on their learning progress [19-21], which particularly enhances the positive effect of multiple-choice testing [22]. It is difficult to measure the effectiveness of feedback accurately [23]. However, one can at least state that feedback has an overall medium-to-high effect on student learning and is especially effective for cognitive outcome measures [24], as it activates both fast and slow learning and memory processes in the brain [25].
For these reasons, we integrated an audience response system into our seminar: audience response systems target the knowledge required for decision making, repeat the content taught, establish interactivity, activate students, and provide individualized formative feedback.
It was therefore fair to assume that audience response systems should also affect assessment results. Previous studies on the impact of audience response systems on assessments showed contradictory results [6, 7, 9]. One needs to consider that it is difficult, if not impossible, to form appropriate control groups when the impact of audience response systems is assessed in plenary lectures.
Here, we conducted a controlled educational research study in a seminar setting, which allowed a direct comparison between topics taught with an audience response system and topics taught without this additional interactive element. Furthermore, the dummy coding procedure allowed us to reach a considerably higher number of study participants compared with other studies [5-7, 9].
Students answered interactive questions during their classes using the audience response system eduVote. Control groups were taught without this interactive element. We analysed the students' results in the final assessment for an effect of the use of the audience response system. However, we could not demonstrate a positive long-term impact of the audience response system on learning or perception.
It is possible that our seminar “Human Genetics” was already sufficiently interactive and that the additional activation through the audience response system had no further effect, as the participants had already reached an interactive state of learning [16]. One major bias may have been overlearning by students in preparation for a summative assessment [26]. This effect may have masked the effect of the teaching methods on the results of the summative assessment.
Due to the nature of this educational research study, students could not be blinded to the fact that they were exposed to the audience response system. However, as different groups were exposed to the audience response system for different topics, students were blinded when they served as a control group. For organizational reasons, we could not randomly assign individual students to study arms, as the groups were precomposed. However, we randomly assigned each group to the study arms. To exclude bias from different teachers, tutors were randomly assigned to the different seminar and control groups. To exclude dropouts, we carefully tracked whether students participated at the right time, in the right group, and in the right room.
Strengths of our study include the large sample size, with clearly defined intervention and control cohorts consisting of multiple independent groups. Importantly, our study was not conducted in a laboratory setting with, e.g., pre-recorded lectures or artificial questions, but during regular classes with real students and tutors. The tutors were highly qualified and experienced. Our results were unaffected by incentives, and we still noted a high motivation of the students to participate in our study and to share their opinions and exam results with us. Furthermore, eduVote turned out to be an easy-to-use and reliable audience response system, and our intervention could easily be adapted to other classes and courses.
After the class, we evaluated the use of eduVote. In our questionnaire, the students gave very positive feedback on the use of the audience response system. The students further stressed that they especially appreciated the anonymity of the audience response system and that they did not feel forced to join the majority.
Interestingly, we observed that by changing the traditional way of asking questions through the use of an audience response system, we could specifically reach and activate students who feel uncomfortable answering questions in front of others. While others have proposed such an effect of audience response systems before [27], we here provide evidence for this assumption.
Our results are in line with previous findings indicating that the main advantage of audience response systems lies more in motivating students and generating a stimulating learning environment than in improving assessment grades [6].