Interactions through classroom activities are crucial for students’ learning process, as they allow learners to enhance their knowledge either by adding new information or by reorganizing their existing ideas (Walsh, 2011). Students’ engagement in classroom activities therefore transforms learning from a teacher-centred approach into a student-centred deep learning process (Ramsden, 1992), and shifts the learning process from a behaviourist perspective to a cognitive constructivist approach (Stewart, 2012). In conventional teaching, teachers may pose questions to students, which has a two-fold impact: 1) it enhances student engagement in learning, and 2) it gives the teacher a better idea of how well the students have understood a topic. Students may also raise their hands to ask about a part of the lecture they did not understand, or to comment on the delivery of the lecture or on classroom issues. This so-called ‘hand-raising’ method is not practical in classrooms with a large number of students. Even with a small number of students, hand-raising is not very efficient, because students might be shy, reluctant to be singled out, or embarrassed to give a wrong answer.
To enhance interactivity and engagement in the classroom, audience response systems (ARS) have been used in higher education for more than two decades. An early method of this kind was the use of flashcards for multiple-choice questions (MCQ) or true-false questions (Gurrero, 2010; Lasry, 2008). Although this method partially solved the problem of a large audience, the lack of anonymity prevented students from engaging actively with the teaching process. The anonymity issue was overcome by electronic clickers, which allowed students to vote on the answers to a multiple-choice question by pressing the relevant button on a handheld device (Gurrero, 2010; Lasry, 2008; Caldwell, 2007). Both flashcards and electronic clickers have shortcomings: the initial budget to purchase devices for all participants, carrying them to lectures, the time wasted distributing and collecting them, damage or loss, and, more importantly, a lack of flexibility. There have been a few studies on using mobile phone text messaging for audience response (Voelkel & Bennett, 2014; Gomez & Elliot, 2013). This method has advantages over clickers, particularly in allowing open-ended questions; however, students could not remain anonymous, because their phone numbers were revealed to teachers. Furthermore, participants had to use their text message allowance or might be charged by the mobile network provider for sending messages.
A breakthrough for ARS came when online polling software tools were introduced. The advantages are numerous: the audience is not limited to a particular device and can respond from desktop and laptop computers, tablets, and smartphones; anonymity is preserved; questions and students’ responses can be presented through an attractive interface; data collection and analysis are much easier; and the tools are flexible. Different online polling tools have been used in higher education for more than a decade (Noel, Stover, & McNutt, 2015; Shon & Smith, 2011; Abdel Meguid & Collins, 2017; Abdulla, 2018; Méndez Coca & Slisko, 2013; Dakka, 2015; Dervan, 2014). The most important features of polling tools are the audience size they can accommodate, integration with PowerPoint and other presentation tools, the variety of question types, the availability of a mobile phone app, and, of course, the price. Polling tools were initially designed with a focus on audiences in business meetings. Although some developers have tailored their tools to the needs of schools and higher education, few have the features required in engineering and physical science disciplines, such as support for equations and graphs.
There is extensive research on polling tools in higher education (Noel, Stover, & McNutt, 2015; Shon & Smith, 2011; Abdel Meguid & Collins, 2017; Abdulla, 2018; Méndez Coca & Slisko, 2013; Dakka, 2015; Dervan, 2014). The majority of studies have focused on two aspects: 1) how to attract learners’ attention, and 2) how to give the teacher insight into how well learners have understood the topics taught in lectures. One important feature of the conventional ‘hand-raising’ method is that students can ask the teacher a question or comment on the delivery of lectures. A few student feedback systems using online digital tools have been reported in the literature (Bergstrom, Harris, & Karahalios, 2011; Yu, 2017; Dobbins & Denton, 2017); however, research on these is very limited compared with polling tools. Online polling tools are mainly designed as ARS, and although some of them offer question types suitable for acquiring feedback from students, most of the online polling and feedback systems in the literature address one-way communication. Therefore, the research question of this investigation is how two-way live communication between students and teachers, and among students, could be implemented, and how it can enhance students’ engagement in classroom activities.
In this study, a novel approach is used in which the students have two-way online live communication with the classroom and the teacher, allowing them both to respond to teachers’ questions through online ARS tools (hereafter ‘polling’) and to post their own questions or comments to teachers (hereafter ‘feedback’).
The process was implemented in three targeted modules in two disciplines at the University of Liverpool during the second semester of the academic year 2018-2019. The method was evaluated using online surveys, focus groups, and analysis of the students’ responses acquired by the ARS software tools. Ethics approval was obtained from the Faculty of Science and Engineering Ethics Committee, University of Liverpool.
The online polling software tools were used to acquire audience responses to content-related questions. In the initial phase of the project, the available polling software tools were studied and their features compared. The essential features are the audience size, the variety of question types and quality of presentation, the capability of embedding in PowerPoint, the availability of mobile phone apps, participant anonymity, and the price. It was found that the variety of question types and the quality of presentation of questions and answers have a significant impact on the students’ response rate and hence on their engagement with classroom activities. The three most used polling software tools, Poll Everywhere (Deng, 2019; Shon & Smith, 2011; Abdel Meguid & Collins, 2017), Socrative (Abdulla, 2018; Méndez Coca & Slisko, 2013; Dakka, 2015; Dervan, 2014), and Mentimeter (Mayhew, 2019; Wood, 2020), as well as newer rivals, share the essential features, have a comparable quality of presentation of questions and answers, and have competitive prices. All have a free version with limited features, such as a smaller audience size. Another tool, Kahoot, which is categorized as a game-based learning platform, can also be used for polling (Yabuno, Luong, & Shaffer, 2019; Asmali, 2018) and attracts students’ interest through its gamification features.
Polling tools can be used to acquire students’ feedback by posing rating or presentation-feedback question types during or at the end of lectures. This can be done with multiple-choice rating questions, such as ‘How was my lecture today?’, or short-answer questions, such as ‘Is there anything that you did not understand?’ However, this approach does not give students the freedom to ask the questions they want; they merely answer the questions posed by the lecturer or teacher. A platform is needed through which students can ask any question at the time they need to. One way is to use a polling tool to pose an open-ended question such as ‘You can ask a question or give a comment at any time during the lecture’ and keep it active throughout the lecture. Alternatively, a separate online tool can be used to collect students’ questions (feedback). Some online communication tools suitable for this purpose are Textwall (Dobbins & Denton, 2017) and Padlet (Etfita & Wahyuni, 2020); the latter, however, is more suitable for discussion and collaboration between students. There are some reports on using social media and instant messaging mobile phone apps for communication between students and the teacher, and among the students themselves. These were excluded from this research because they usually require registration and cannot fully preserve the anonymity of the participants.
Based on this comparison of features, it was decided to use Poll Everywhere for polling. One of the main reasons for this decision was that, to the best of our knowledge, it is the only polling software that can embed mathematical equations in questions and answer choices, a feature particularly important in engineering and physical science disciplines. Kahoot was also used because of students’ interest in its gamification and competition features. For polling and feedback, three combinations were used:
Poll Everywhere for polling and Textwall for feedback
Poll Everywhere and Kahoot, one for polling and the other one for feedback
Poll Everywhere for both polling and feedback.
The process was implemented in three targeted modules during the second semester of the academic year 2018-2019: Electronic Circuits ELEC104, a year 1 module with about 100 students; Communication Systems ELEC202, a year 2 module with about 200 students; and Biological Psychology PSYC133, a year 1 module with about 300 students. The first two modules were in the Department of Electrical Engineering and Electronics, and the third was in the School of Psychology, University of Liverpool. The process was applied to a larger number of modules in the academic year 2019-2020; however, it could not be evaluated as planned because of the university closure caused by the COVID-19 pandemic.
Question types for polling
A range of question types and delivery formats were used, each with a particular focus, providing varied modes of interactivity that furthered students’ engagement while reducing the monotony and predictability of the delivery. The formats included:
Interactive polls within slides during a lecture or problem class, including multiple-choice or multiple-answer, clickable image, word cloud, rank order, or open-ended question types. The quiz-style competition in Kahoot, like the one in
Polls to stimulate thought on a topic prior to delivery. One of the favourite types is word cloud, where the audience can interactively visualize emerging and growing themes (
Question types in which the students see examples of other students’ work or ideas, typically open-ended or short-answer questions. These activities allow students to read how others deal with the same task, which can be a helpful educational exercise. The lecturer can then offer feedback on particular examples or students’ concerns, providing useful feedforward to better prepare students for future summative assessments.
Light-hearted question types and humour in questions can influence the mood of the room. Some questions or MCQ options are intentionally light-hearted to bring a smile, chuckle, or moan. For example, in the question of
The question types with a high response rate: a) Kahoot quiz, b) word cloud, c) clickable image, and d) multiple-choice with humour.
Question types for feedback
In addition to engaging learners with short assessment tasks, the instructor needs to be able to gauge the mood of the room and the prevalent views, preferences, and concerns. They should also be able to respond to feedback, questions, and immediate concerns raised by students, by empowering the audience with the facility to initiate feedback. This was facilitated via two independent channels in three different ways:
Using polling tools with short-answer or open-ended questions, such as ‘Am I going too fast?’, ‘Would you like a further example?’, or ‘What type of problem class should we have next week?’. Another option is multiple-choice or clickable-image emotion scale question types, such as ‘How was today’s lecture?’ (
A dedicated channel by which the students can freely ask questions and provide feedback and comments. The tool used for this purpose was Textwall, through which students could send messages using the short message service (SMS) on mobile phones or a dedicated message web page. The relevant information was given to students at the beginning of the session, and a visual reminder appeared at the side or bottom of each slide. Depending on the format and style of delivery, these messages can either be shown continuously at the bottom of the screen, visible to the audience, or kept private and visible to the lecturer only, either in a separate browser window or on a separate portable device. Although having to switch between multiple apps or web pages is not ideal, having both channels available simultaneously gives both the lecturer and the students a voice with which to influence the progress of the session. Students who might not otherwise have had the opportunity, courage, or confidence to ask certain questions were able, without interrupting the lecture’s flow, to ask questions like ‘Is today’s lecture included in the class test?’ or ‘Could you please repeat that?’, or to send warnings such as ‘the microphone stopped working’ or ‘Could you turn the lights up please? It’s dark at the back’ (
A polling tool can be used for both polling and feedback at the same time, which is more comfortable for students because it avoids switching between different apps or web pages. An open-ended statement such as ‘You can ask a question or give a comment at any time during the lecture’ is posed and left active during the session. When polling questions are posed, this question is temporarily deactivated. Poll Everywhere now has a ‘pinned question’ feature, which allows a question of this type to remain active even while other questions are posed. This feature was not available at the time of the study, so the feedback question was reactivated manually.
a) Emotions scale question type, and b) typical Textwall questions.
The process was evaluated by online surveys, a focus group, and analysis of the students’ responses gathered during lectures. Studies of surveys in general, and of online surveys in particular, show the impact of the number of questions or the length of the survey on the audience response rate (Liu & Wronski, 2017; Porter, 2004). To keep the survey as short as possible, the questionnaire comprised twenty-five multiple-choice and multiple-answer questions and one open-ended question for comments. The questionnaires were uploaded to Google Forms and the link was provided on the learning management system (LMS) pages of the targeted modules. The students were asked to give their consent before starting the survey.
For the focus groups, the initial plan was to have one focus group per module, with six participants each. However, because there were not enough volunteers, the participants were merged into a single focus group of six. The participants were ordinary students of the targeted modules, and no selection requirements or conditions were applied. Participation was voluntary, and the participants signed a consent form before the start of the session. The facilitator of the focus group was deliberately not one of the students’ lecturers, to give the participants more freedom in expressing their opinions. The facilitator led the discussion through pre-designed thematic questions and made sure that all participants took part in the discussion. The audio recording was transcribed and anonymized by a professional transcriber.
The students’ responses stored in the polling software were also analysed to find correlations between the response rate and parameters such as question type, polling software tool, and implementation method.
Results and discussion
The questionnaires for the survey were put online in the middle of the semester and remained open until the end of the semester. The number of responses was 11 for ELEC104 (around 10%), 25 for ELEC202 (around 10%), and 82 for PSYC133 (around 27%). Since the feedback process was not used regularly in PSYC133, the answers to questions related to feedback were excluded from the students’ responses for this module. The response count is therefore 118 for polling and general questions, and 36 for feedback.
About half of the responses indicate that the students used polling and feedback to a large extent or to some extent. Only 13% of students did not take part in polling activities, whereas this number was zero for feedback. This is in close agreement with the results reported by Voelkel and Bennett (2014). The students who did not take part in live polling, or took part rarely, gave as their main reasons that they did not know the answers or did not have access to the website or app (
Reasons for not taking part in a) live polling and b) feedback activities.
In response to the question of how polling and feedback enhanced their engagement in classroom activities, the most popular answer was that they made the lecture more lively (
Students’ response to question on engagement, a) polling and b) feedback.
The majority of students ranked both polling and feedback four out of five, followed by a large number of ranks five and three (
Ranking of polling and feedback by students
Key: 5: Liked it very much, 1: Did not like it at all.
Students’ opinion on advantages (a, b) and disadvantages (c, d) of polling (a, c) and feedback (b, d).
When the students were asked whether the process should be used in other modules as well, about 40% said it should be used in all modules, and a large number believed it should be used in some modules (37% for polling and 25% for feedback). The study showed no relationship between polling or feedback and students’ attendance at lectures. A few students who responded to the open-ended question for comments expressed how much they enjoyed the activities using Kahoot. It was also suggested that polling should be used in moderation.
Although the students were from two completely different disciplines and responded differently to some questions, there were many similarities in their responses. For example, they all expressed, at almost the same rate, that polling increased their engagement in the lecture, made the lecture more interesting and livelier, and was fun. The psychology students used live polling tools about 10% more than the electrical engineering students, presumably because the method had been widely used in their school, in nearly all modules, whereas in electrical engineering very few lecturers had used the method before this study. For the electrical engineering students, the main reason given for not answering questions was that they ‘did not know the answer’ (about 45%), whereas for the psychology students the main reasons were that they ‘did not have access to the app or website’ (about 30%) or ‘did not like the method’ (23%); only 22% said they ‘did not know the answer’. This could be related to the difference in the nature of the disciplines: in psychology, the topics are more abstract and students usually discuss them with their teachers during the lecture, whereas in electrical engineering the topics are more cognitive and students need to follow more independent learning. However, many other parameters were not the focus of this study and need further investigation.
A thematic analysis was applied to the responses of the students in the focus group. The participants said that the main benefit of polling is that the lecturer of the module knows how well students have understood the lecture content. If many students give a wrong answer to a question, the lecturer finds out immediately and can explain the subject further or in a different way. They emphasized that students rarely raise their hands to ask a question because ‘people don’t want to be the one putting their hand up saying they don’t understand it’. Polling is an easier way to say ‘I’m struggling with this topic because I’m not getting the question right’ or ‘I understand it, let’s move on’.
One student highlighted how students can learn from the answers given by other students and from working together to find the correct answer, which he described as ‘learning by teaching’ and believed reinforces learning. Some students also mentioned that there are sometimes too many questions, particularly when the same question is repeated with different numerical values. They believed that overuse of polling makes it boring and distracting, causes students to lose interest in taking part, and wastes time needed for other activities in the lecture. This was mentioned in particular with reference to tutorials and problem classes, where students have one-to-one interaction with the lecturer or demonstrators. Another student described how similar repetitive questions could backfire: instead of being engaged, students stop caring about the activity and stop paying attention. ‘Too many questions’ was mentioned by a few students as one of the main disadvantages of the method. They suggested that lecturers should plan the activities well and be flexible; they do not need to use all the questions they have designed. When the students clearly understand the subject, it is better to skip the remaining questions on that subject and move on. The students also stated that they prefer questions about basic concepts or ones with a quick solution. Another student described how the method could enhance the interaction between students and the lecturer, overcoming the problem in some conventional lectures where students feel disconnected and are just watching a screen.
A few students indicated their interest in Kahoot and described how its entertaining questions can change the atmosphere of the class and engage all the students. However, there were mixed feelings about Kahoot and about some questions in Poll Everywhere with an entertaining or funny presentation. One student believed they were a waste of time and that he gained nothing from them compared with serious questions; however, he admitted they are useful in early morning classes to change the mood and get the students involved. A few students believed that the emotion scale question with ‘smilies’ (
Most participants in the focus group had a positive opinion of feedback. One student explained the situation as follows: ‘Like they’ll say, “has anyone got any questions?” and you’ll have a burning question, but you won’t ask it because you don’t want to be that guy’. You may think it is a stupid question, but you can simply send it through Poll Everywhere or another tool without worrying about embarrassment. When asked whether they missed the face-to-face interaction, the students said it depends on the type of question. If they want a detailed discussion, they can go and see the lecturer and ask in person; but for the type of question usually asked after the lecture, when both students and lecturer are hurrying to their next class on time, this is the best solution.
The authors acknowledge that a focus group of six participants is a very small sample of the more than 600 students involved in this study, and that a discussion of less than an hour cannot be taken as the opinion of all students. However, even in this short meeting, the participants raised some important points which the authors found to be true when observing students’ reactions during the implementation of the method in later lectures and in short conversations with students. When some of the authors applied the points discussed, such as using polls in moderation, avoiding repetitive questions, and using questions on basic concepts rather than complex questions needing more elaboration, they could see the difference in the students’ response rate and in the mood of the classroom.
Analysis of students’ response
For a precise statistical analysis, the data collected in the polling tools should be studied systematically and across a larger number of modules. The impact on the response rate of the question types, the online tools used, the number of questions in each session, the method of presentation, the year of study, the size of the cohort, and other parameters could then be studied. In this section, a few examples of students’ responses are presented, which can be useful for enhancing the efficiency of the activities.
One of the parameters which has a large impact on the response rate is the question type. The clickable image question in
Questions about basic concepts, and those for which students can find the solution quickly, received more responses than those needing complex calculations. However, in problem classes, even questions that required a few minutes to answer had a good response rate. In general, the students’ response to similar repetitive questions was low. A second clickable image question similar to the one in
In this work, a novel two-way online communication system between students and lecturer, using polling and other digital communication tools, was presented. The system consists of polling, by which the lecturer can attract the students’ attention to the lecture and evaluate their understanding of the subject, and feedback, which provides a platform for students to ask content-related questions or comment on the delivery of the lecture. The details of each activity and the software tools were explained, including a discussion of the question types used in polling. The process was implemented in three targeted modules and was evaluated by online surveys, a focus group, and analysis of the students’ responses. The results of each evaluation method were presented and discussed, and showed an enhancement of students’ engagement in classroom activities. They also reveal that the process should be carefully designed and implemented to achieve the desired outcome.
Findings from the online survey and the focus group show that the majority of students found the method useful for increasing their engagement in classroom activities, making the lecture more lively, and enhancing their learning. More than 40% of students said the method should be used in all modules or in some modules. The students suggested that polling activities should be well designed and used in moderation. The students saw the feedback process as a way to ask questions and voice their concerns freely, without worrying about being singled out or embarrassed. The use of interactive polling and feedback is not without challenges and obstacles, and several of these became apparent in this study. Lecturers must be cautious of overusing polling, as students may grow weary of constant polls, particularly if the activity involves multiple platforms or apps, or if it distracts from or unhelpfully disrupts the flow of the lecture.