Reading the Research, Part 1

Here at Ombea and Reivo we are passionate about student response systems (SRS) and how they can improve teaching and learning. We know from our work with clients that when Ombea is used well in a lecture or seminar setting it increases student engagement and, crucially, facilitates deep thought about the content. We have also noticed definite benefits for lecturers: the process of deciding which questions to ask makes them reflect more deeply on how they structure their teaching, so the use of a response system often begins to improve teaching before any questions are even asked.

Many academic staff in universities have evaluated the impact of SRS in their institutions, and a substantial and growing body of literature shows a positive effect on a whole range of factors, from attendance at lectures (students are more likely to attend a lecture where they can use their clicker) all the way through to attainment. For instance, Edmonds and Edmonds (2008) found that managerial accounting students who had used a response system as part of their teaching achieved higher test scores than those who had not.

Most studies have looked at students using response systems in lectures: typically at least 100 people, often more, seated and listening to a didactic presentation that includes sets of questions. One study caught our eye, though, because it examines the impact of SRS not in lectures but in smaller classroom settings, with groups of 30 to 40 students on a business studies course (Dunnett et al., 2011). The study was also unusual in its design. Three groups of students were involved: one was taught with a strategy that used SRS, one had no SRS, and a mixed group in which some students had clickers and others did not. Crucially, the teaching content was the same for all groups, and all were asked the same sets of questions as part of the teaching, with non-clicker students answering in the traditional way by putting their hands up. The same lecturer taught all three groups, removing any variation in the results due to differences between lecturers.

Two main areas were examined in the study: attendance, and student attainment, that is, how much the different groups learned. We'll look at the attendance results in more detail in part 2 of this blog, so let's turn now to student attainment, as it throws up some interesting results.

The results of the Dunnett study contradicted previous studies of SRS in university settings and did not follow the usual pattern of increased learning when SRS was used. The researchers found that the group with the largest learning gains was the mixed group, where some students used clickers and others did not, while there was no significant difference in attainment between the SRS group and the non-SRS group. That SRS seemed to make no difference whatsoever to attainment could be seen as a harsh blow for a company working with SRS, but there is a lot of complexity here which may help to explain this surprising finding.
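As an aside for readers curious how a 'no significant difference' claim is typically established: attainment comparisons across groups like these are commonly tested with a one-way ANOVA. Below is a minimal, hypothetical sketch in Python (using numpy and scipy); the scores are random placeholders, not the study's data, and we don't know which test Dunnett et al. actually used.

    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(seed=0)

    # Hypothetical scores out of 100 for 35 students per group
    # (illustrative placeholders only, NOT the study's data).
    srs_group    = rng.normal(loc=68, scale=10, size=35)  # every student had a clicker
    no_srs_group = rng.normal(loc=68, scale=10, size=35)  # no clickers; hands up only
    mixed_group  = rng.normal(loc=73, scale=10, size=35)  # some clickers, some not

    # One-way ANOVA: tests whether at least one group mean differs.
    f_stat, p_value = f_oneway(srs_group, no_srs_group, mixed_group)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

A p-value below 0.05 would indicate that at least one group mean differs; pairwise follow-up tests would then be needed to locate which groups drive the difference.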

So what are we to make of the results of studies such as this one? Well, the first conclusion we can draw concerns the inherent complexity of educational research and the measurement of learning. From the very earliest days of technology in teaching, people have grappled with measuring its impact, and the results have frequently been inconclusive or, when studies are compared, contradictory.

The second conclusion we draw concerns what was going on in the mixed group (some with clickers, some without) that made it the group with the highest attainment. In supporting university staff using SRS, we have noticed that a very effective strategy is not simply to ask questions and display the results, but to add the further step of discussing the results and the reasoning that led to the various responses. This discussion allows students to reflect on the question not only in terms of 'did I get it right or wrong', but also in a metacognitive way: they can think about the kind of thinking they did to reach their answer (even if it was wrong). Discussions and reflections of this kind can be very powerful in helping students make sense of complex topics. So, taking what we know about SRS usage, we surmise that the mixed group did more of this metacognitive work than the other two. Some students answered with clickers, others put their hands up and gave verbal answers, and perhaps it was exactly this mixed economy which allowed greater reflection on the content and led to the higher learning gains.

So what does this mean for practice?

Reviewing research is always interesting, but one of the most important questions to ask is, 'what can we learn from this to put into practice?' At first sight the conclusions of this study suggest you get the best results by giving only half of the class a clicker, but this is not really an option, so we shall focus instead on what made the mixed group more effective in its learning. It seems very likely that the mixed group was exposed to more discussion about the results of the questions asked: they saw the usual aggregated responses displayed on the screen, but some of the group also gave their answers verbally and explained the reasoning behind their choices. So an optimal strategy when using SRS in a seminar-sized setting appears to be to meld the poll results with discussion of the answers. Lecturers should use the results as the starting point for a conversation with the group about what the answers are and why; in this way they will activate deeper understanding of and reflection on the topic. If the results are only displayed and never discussed, this deeper understanding is very likely not activated.

Our Skills Workshops are a great place to learn and discuss more about this approach to using clickers.

References

Dunnett, A., Shannahan, K. J., Shannahan, R. J., & Treholm, B. (2011). Exploring the impact of clicker technology in a small classroom setting on student class attendance and course performance. Journal of the Academy of Business Education, 12, 43–56.

Edmonds, C. T., & Edmonds, T. P. (2008). An empirical investigation of the effects of SRS technology on introductory managerial accounting students. Issues in Accounting Education, 23(3), 421–434.
