In ‘Reading the Research Part 1’ we examined an interesting paper by Dunnett et al. which explored the effect of Student Response Systems (SRS) on student attendance and achievement in seminar settings. Three groups were taught through seminars: one group had no access to an SRS, one group used an SRS (so every student had a clicker to vote with), and a third, mixed group in which some students had clickers and some did not. The study was unusual and worthy of comment because, although the numbers involved were small (97 students across the three groups), its internal consistency was very high: the same lecturer taught all of the groups and the content was identical in each case. Any variations in attendance and achievement were therefore very likely to have been heavily influenced by the presence or absence of the SRS within the teaching.
In our earlier article we explored the findings of the study in relation to student achievement. Specifically, we pondered the conclusion that it was the mixed-method group, that is, the group which had a mixture of verbal questions and SRS questions, that showed the highest student learning gains. In this article we explore the other side of the Dunnett research: its findings on student attendance at teaching sessions. We will make some focused comments about how an SRS can contribute to improved attendance and why this matters for both students and their university.
In his three-way comparative study, Dunnett found that the use of an SRS in the classroom did not improve attendance. In fact, the group which was not using clickers had a higher attendance rate than the group which did. This finding is at odds with nearly all of the existing research on SRS in lecture settings, where deployment normally leads to a significant increase in attendance; Dunnett et al. cite the work of Preis and Kellar (2007), who found an 87% increase in lecture attendance for marketing students when an SRS was introduced. The group with the best attendance in the Dunnett study was the mixed group. As we noted in the first part of this article, this group not only had the best attainment, they also turned up more frequently to the sessions. The explanations for this finding are complex and nuanced. Dunnett and his co-writers suggest, very plausibly, that the dynamics of a small group (between 18 and 39 students) are very different from those of the large lecture theatre (groups of 100 or even 500). In particular, they argue that the use of clickers may paradoxically have created a situation where students felt only passively involved in the learning, and that this relative passivity caused the drop in attendance for the SRS-only group. Being aware of the size and context of the teaching group, and of how these might shape learners’ perception of their role when using technology, is an important finding and one worth remembering for universities seeking to use an SRS as a teaching and learning tool.
We certainly don’t think this means that using an SRS in smaller groups is to be avoided, but it does mean that pedagogic strategies for smaller groups should build the SRS component into a broader overall teaching strategy. Any successful strategy should give students additional ways to interact beyond the clickers. A period of discussion or reflection after a question or question-set could work well here: the SRS does what it does best, collecting answers quickly and making them available for analysis, while the tutor then leads a discussion to explore the results and gives students another way of developing their ideas.
Our conclusion to this article therefore echoes the conclusion we reached in Part 1: the importance of regarding the SRS as one part of an overall teaching strategy, and of building, where possible, opportunities for discussion and analysis into the sessions. These are often not practical in conventional large lecture settings because of time pressures and the complexity of the results, but with smaller groups the experienced tutor should be able to blend the votes with other teaching and learning strategies to create a coherent experience for the student and enhanced learning outcomes.
If you are thinking of using OMBEA with a group of any size, do get in touch with us as we have lots of useful and practical ideas for implementation based on years of supporting learning in universities with SRS.
Bati, A., Mandiracioglu, A., Orgun, F., & Govsa, F. (2014). Why do students miss lectures? A study of lecture attendance amongst students of health science. Nurse Education Today, 33(6), 596-601.
Dunnett, A., Shannahan, K. J., Shannahan, R. J., & Treholm, B. (2011). Exploring the Impact of Clicker Technology in a Small Classroom Setting on Student Class Attendance and Course Performance. Journal of the Academy of Business Education, 12, 43-56.