Developing Academic Practice

Enhancing the engagement of large cohorts using live interactive polling and feedback

Developing Academic Practice 2021, 31–50.

Abstract

A novel two-way live communication process between teacher and students in large classes has been implemented using Web 2.0 polling platforms, to enhance the engagement of students in classroom activities and to allow the teacher to respond to students’ questions in real time. The aim of this study was to explore and evaluate this two-way live communication between students and teachers, and among students, and to examine how it enhances student engagement. To address this research question, modules from two different disciplines (electrical engineering and electronics, and psychology) adopted the two-way live communication process. A mixed methodology, including online surveys, focus groups, and analysis of students’ responses, was followed to analyse in depth the impact of the method on students’ engagement and learning. The study is supported by a review of the relevant literature, and its outcomes are discussed and compared with previous studies.

Introduction

Interactions through classroom activities are crucial to students’ learning, as they allow learners to enhance their knowledge either by adding new information or by reorganizing their existing ideas (Walsh, 2011). Students’ engagement in classroom activities therefore transforms learning from a teacher-centred approach into a student-centred deep learning process (Ramsden, 1992), and shifts the learning process from a behaviourist perspective to a cognitive constructivist approach (Stewart, 2012). In conventional teaching, teachers may pose questions to the students, which has a two-fold impact: 1) it enhances their engagement in learning, and 2) it gives the teacher a better idea of how well the students have understood a topic. The students may also raise their hands to ask a question about a part of the lecture that they do not understand, or to comment on the delivery of the lecture or on classroom issues. This so-called ‘hand-raising’ method is not practical in classrooms with a large number of students. Even with a small number of students, hand-raising is not very efficient, because students might be shy, might not want to be singled out, or might be embarrassed to give a wrong answer.

To enhance interactivity and engagement in the classroom, audience response systems (ARS) have been used in higher education for more than two decades. An early method of this kind was the use of flashcards for multiple-choice questions (MCQ) or true-false questions (Gurrero, 2010; Lasry, 2008). Although this method partially solved the problem of large audiences, the lack of anonymity prevented students from engaging actively with the teaching process. The anonymity issue was overcome by electronic clickers, which allowed students to vote on the answers to a multiple-choice question by pressing the relevant button on a handheld device (Gurrero, 2010; Lasry, 2008; Caldwell, 2007). Both flashcards and electronic clickers have shortcomings, such as the initial budget to purchase devices for all participants, the need to carry them to lectures, the time wasted distributing and collecting them, damage or loss, and, more importantly, lack of flexibility. There have been a few studies on using mobile phone text messaging for audience response (Voelkel & Bennett, 2014; Gomez & Elliot, 2013). This method has advantages over clickers, particularly in allowing open-ended questions; however, the students could not remain anonymous, since their phone numbers were revealed to teachers. Furthermore, participants had to use their text message allowance, or might be charged for sending text messages by their mobile network provider.

A breakthrough for ARS came when online polling software tools were introduced. Their advantages are numerous: they are not limited to a particular device, so the audience can respond using desktop and laptop computers, tablets, or smartphones; anonymity is preserved; the questions and students’ responses can be presented through an attractive interface; data collection and analysis are much easier; and they are flexible. Different online polling tools have been used in higher education for more than a decade (Noel, Stover, & McNutt, 2015; Shon & Smith, 2011; Abdel Meguid & Collins, 2017; Abdulla, 2018; Méndez Coca & Slisko, 2013; Dakka, 2015; Dervan, 2014). The most important features of polling tools are the audience size they can accommodate, integration with PowerPoint and other presentation tools, the variety of question types, the availability of a mobile phone app, and, of course, the price. Polling tools were initially designed with the audiences of business meetings in mind. Although some developers have tailored their tools to the needs of schools and higher education, not many of them yet have the features required by engineering and physical science disciplines, such as support for equations and graphs.

There is extensive research on polling tools in higher education (Noel, Stover, & McNutt, 2015; Shon & Smith, 2011; Abdel Meguid & Collins, 2017; Abdulla, 2018; Méndez Coca & Slisko, 2013; Dakka, 2015; Dervan, 2014). The majority of studies have focused on two aspects: 1) how to attract learners’ attention, and 2) how to provide the teacher with an insight into how well learners have understood the topics taught in lectures. One important feature of the conventional ‘hand-raising’ method is that students can ask the teacher a question or comment on the delivery of lectures. A few student feedback systems using online digital tools have been reported in the literature (Bergstrom, Harris, & Karahalios, 2011; Yu, 2017; Dobbins & Denton, 2017); however, compared with polling tools, such systems have been studied very little. Online polling tools are mainly designed as ARS, and although some of them offer question types suitable for acquiring feedback from students, most of the online polling and feedback systems in the literature address only one-way communication. The research question of this investigation is therefore how a two-way live communication between students and teachers, and among students, can be implemented, and how it can enhance students’ engagement in classroom activities.

In this study, a novel approach is used in which students have a two-way online live communication with the classroom and the teacher, allowing them both to respond to the teacher’s questions through online ARS tools, referred to hereafter as ‘polling’, and to post their own questions or comments to the teacher, referred to hereafter as ‘feedback’.

The process was implemented in three targeted modules in two disciplines at the University of Liverpool during the second semester of the academic year 2018-2019. The method was evaluated using online surveys, focus groups, and analysis of the students’ responses acquired by the ARS software tools. Ethics approval was obtained from the Faculty of Science and Engineering Ethics Committee, University of Liverpool.

Method

The online polling software tools were used to acquire the audience response to content-related questions. At the start of the project, the various polling software tools available were studied and their features compared. The essential features are the audience size, the variety of question types and quality of presentation, the capability of embedding in PowerPoint, the availability of mobile phone apps, participant anonymity, and the price. It was found that the variety of question types and the quality of presentation of questions and answers have a significant impact on the students’ response rate, and hence on their engagement with classroom activities. The three most widely used polling software tools, Poll Everywhere (Deng, 2019; Shon & Smith, 2011; Abdel Meguid & Collins, 2017), Socrative (Abdulla, 2018; Méndez Coca & Slisko, 2013; Dakka, 2015; Dervan, 2014), and Mentimeter (Mayhew, 2019; Wood, 2020), as well as newer rivals, share the essential features, have a comparable quality of presentation of questions and answers, and have competitive prices. All have a free version with limited features, such as a smaller audience size. Another tool, Kahoot, which is categorized as a game-based learning platform, can also be used as a polling tool (Yabuno, Luong, & Shaffer, 2019; Asmali, 2018); it attracts students’ interest through its gamification features.

The polling tools can be used to acquire the students’ feedback by posing rating or presentation-feedback question types during or at the end of lectures. This could be done with multiple-choice rating questions, such as ‘How was my lecture today?’, or short-answer questions, such as ‘Is there anything that you did not understand?’ However, this does not give students the freedom to ask the questions they want; they merely answer the questions posed by the lecturer or teacher. A platform is needed through which students can ask any question at the time they need to ask it. One way is to use the polling tool to pose an open-ended question such as ‘You can ask a question or give a comment at any time during the lecture’, and keep that question active throughout the lecture. Alternatively, another online tool can be used to collect students’ questions (feedback). Some online communication tools suitable for this purpose are Textwall (Dobbins & Denton, 2017) and Padlet (Etfita & Wahyuni, 2020); the latter, however, is more suitable for discussions and collaboration between students. There are some reports on using social media and instant messaging mobile phone apps for communication between students and teacher, and among students themselves. These were excluded from this research because they usually require registration and cannot fully preserve the anonymity of the participants.
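
To make the idea of a dedicated, always-open feedback channel concrete, the following minimal sketch shows an anonymous question wall written from scratch in Python with Flask. It is purely illustrative: it is not the API of Textwall, Padlet, or any other tool named above, and the route names and message limit are hypothetical choices.

```python
# Minimal sketch of an anonymous feedback wall (illustrative only;
# not the Textwall or Padlet API). Students POST short messages with
# no login or identifying data; the lecturer reads GET /messages.
from flask import Flask, request, jsonify

app = Flask(__name__)
messages: list[str] = []  # in-memory store; a real deployment would persist this


@app.post("/ask")
def ask():
    """Accept an anonymous question or comment from a student."""
    text = (request.get_json(silent=True) or {}).get("text", "").strip()
    if not text or len(text) > 280:
        return jsonify(error="message must be 1-280 characters"), 400
    messages.append(text)  # no user ID is stored, preserving anonymity
    return jsonify(ok=True), 201


@app.get("/messages")
def list_messages():
    """Lecturer view: all questions received so far, oldest first."""
    return jsonify(messages=messages)


if __name__ == "__main__":
    app.run(port=5000)
```

The essential design point is that no identifier is ever attached to a message, so anonymity is preserved by construction rather than by policy; the study used existing tools precisely to avoid building and maintaining such infrastructure.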

Based on this comparison of features, it was decided to use Poll Everywhere for polling. One of the main reasons for this decision was that, to the best of our knowledge, it is the only polling software that can embed mathematical equations in questions and answer choices, a feature particularly important in engineering and physical science disciplines. Kahoot was also used because of students’ interest in its gamification and competition features. For polling and feedback, three combinations were used:

  • Poll Everywhere for polling and Textwall for feedback

  • Poll Everywhere and Kahoot, one for polling and the other one for feedback

  • Poll Everywhere for both polling and feedback.

The process was implemented in three targeted modules, namely Electronic Circuits ELEC104, a year 1 module with about 100 students; Communication Systems ELEC202, a year 2 module with about 200 students; and Biological Psychology PSYC133, a year 1 module with about 300 students, during the second semester of the academic year 2018-2019. The first two modules were in the Department of Electrical Engineering and Electronics, and the third was in the School of Psychology, University of Liverpool. The process was applied to a larger number of modules in the academic year 2019-2020; however, it could not be evaluated as planned because of the closure of the university during the COVID-19 pandemic.

Question types for polling

A range of question types and delivery formats was used, each intended for a particular focus, allowing a variety of modes of interactivity that furthered the engagement of students while reducing the monotony and predictability of delivery styles. The formats included:

• Interactive polls within slides during a lecture or problem class, including multiple-choice or multiple-answer, clickable image, word cloud, rank order, or open-ended question types. The quiz-style competition in Kahoot, such as the one in Figure 1a, creates a competitive atmosphere, particularly in problem classes, and the rather childish background music and visual effects were popular with a large segment of the student cohort in this study. Kahoot Premium allows explanatory slides to be introduced into the sequence of questions, and even YouTube videos to be embedded for further clarification of a question. Another unique advantage of Kahoot quizzes is that they can be made available for subsequent asynchronous use. Lecturers can monitor student progress, and as the software automatically flags questions that students found difficult, these areas can be given extra attention in subsequent teaching sessions.

• Polls to stimulate thought on a topic before its delivery. One of the favourite types is the word cloud, where the audience can interactively visualize emerging and growing themes (Figure 1b); it can be entertaining as students seek to influence the evolution of the word cloud. The word cloud can provide the lecturer with useful prompts for talking points, to address misconceptions, and to introduce future topics. Students may see this as a way to take part in a larger conversation. Another popular type is the clickable image, in which students click on the correct area of an image (Figure 1c); it also shows how students can influence each other as the number of clicks in a particular area grows.

• Question types in which students see examples of other students’ work or ideas, typically open-ended or short-answer questions. These activities allow students to read how others deal with the same task, which can be a helpful educational exercise. The lecturer can then offer feedback on examples of particular subjects or students’ concerns, and provide useful feedforward to better prepare the students for future summative assessments.

• Light-hearted question types: humour in questions can influence the mood of the room. Some questions or options in MCQ are intentionally light-hearted, to bring a smile, chuckle, or moan. For example, in the question of Figure 1d, the 33% of students who chose ‘Coffee filter’ knew very well that the answer was irrelevant, and probably knew that the correct answer was ‘A’; they simply decided to respond to the lecturer’s humour.

Figure 1. Question types with a high response rate: a) Kahoot quiz, b) word cloud, c) clickable image, and d) multiple choice with humour.

Question types for feedback

In addition to engaging learners with short assessment tasks, the instructor needs to be able to gauge the mood of the room and the prevalent views, preferences, and concerns. The instructor should also be able to respond to feedback, questions, and immediate concerns raised by students, which requires empowering the audience with the facility to initiate feedback. This was facilitated via two independent channels, in three different ways:

• Using polling tools with short-answer or open-ended questions, such as ‘Am I going too fast?’, ‘Would you like a further example?’, or ‘What type of problem class should we have next week?’. Another option is multiple-choice or clickable-image emotion-scale question types, such as ‘How was today’s lecture?’ (Figure 2a).

• A dedicated channel through which students can freely ask questions and provide feedback and comments. The tool used for this purpose was Textwall, by which students could send messages using the short message service (SMS) on their mobile phones or via a dedicated message web page. The instructions could be given to students at the beginning of the session, and a visual reminder can appear at the side or bottom of each slide. Depending on the format and style of delivery, these messages can either be shown continuously at the bottom of the screen, visible to the audience, or be made private and visible to the lecturer only, either in a separate browser window or on a separate portable device.

Although having to switch between multiple apps or web pages is not ideal, having both these channels available simultaneously gives both the lecturer and the students a voice with which to influence the progress of the session. Students who might not otherwise have had the opportunity, courage, or confidence to ask certain questions were able, without interrupting the flow of the lecture, to ask questions such as ‘Is today’s lecture included in the class test?’ or ‘Could you please repeat that?’, or to send warnings such as ‘The microphone stopped working’ or ‘Could you turn the lights up please? It’s dark at the back’ (Figure 2b).

• A single polling tool can be used for both polling and feedback at the same time, which is more comfortable for students because it avoids switching between different apps or web pages. An open-ended statement such as ‘You can ask a question or give a comment at any time during the lecture’ is posed and left active during the session. When polling questions are posed, this question becomes temporarily deactivated. Poll Everywhere has since added a ‘pinned question’ feature, which allows a question of this type to remain active even while other questions are posed. This feature was not available at the time of the study, but it was possible to reactivate the feedback question manually.

Figure 2. a) Emotion-scale question type, and b) typical Textwall questions.

Evaluation

The process was evaluated by online surveys, a focus group, and analysis of the students’ responses gathered during lectures. Studies of surveys in general, and of online surveys in particular, show the impact of the number of questions or the length of the survey on the response rate (Liu & Wronski, 2017; Porter, 2004). To keep the survey as short as possible, the questionnaire used twenty-five multiple-choice and multiple-answer questions and one open-ended question for comments. The questionnaires were uploaded to Google Forms and the link was provided on the learning management system (LMS) pages of the targeted modules. The students were asked to give their consent before starting the survey.

For the focus groups, the initial plan was to have one focus group per module, each with six participants. However, because there were not enough volunteers, the participants were merged into a single focus group of six. The participants were ordinary students of the targeted modules, and no requirements or conditions were applied. Participation was voluntary, and the participants signed a consent form before the start of the session. The facilitator of the focus group was deliberately not one of the students’ lecturers, to give the participants more freedom in expressing their opinions. The facilitator led the discussion through pre-designed thematic questions and made sure that all participants took part in the discussion. The audio recording was transcribed and anonymized by a professional transcriber.

The students’ responses stored in the polling software were also analysed to find the correlation between the response rate and parameters such as the question type, the polling software tool, and the implementation method.
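
As an illustration of the kind of analysis this enables, the following sketch computes response rates by question type and by tool with pandas. It assumes a hypothetical CSV export with columns named session_id, question_id, question_type, tool, cohort_size, and n_responses; real exports from the polling tools will have different layouts.

```python
# Minimal analysis sketch (illustrative only). Assumes responses exported
# from the polling tool as a CSV with hypothetical columns:
#   session_id, question_id, question_type, tool, cohort_size, n_responses
import pandas as pd

df = pd.read_csv("poll_responses.csv")  # hypothetical export file

# Response rate per question, relative to the cohort size of the session.
df["response_rate"] = df["n_responses"] / df["cohort_size"]

# Mean response rate broken down by question type and by tool, to see
# which combinations engage students best.
by_type = df.groupby("question_type")["response_rate"].mean().sort_values(ascending=False)
by_tool = df.groupby("tool")["response_rate"].mean().sort_values(ascending=False)

print(by_type)
print(by_tool)
```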

Results and discussion

Online survey

The questionnaires for the survey were put online in the middle of the semester and remained open until the end of the semester. The number of responses was 11 for ELEC104 (around 10%), 25 for ELEC202 (around 10%), and 82 for PSYC133 (around 27%). Since the feedback process was not used regularly in PSYC133, the answers to questions related to feedback were excluded from the students’ responses for this module. The response count for polling and general questions is therefore 118, and 36 for feedback.

About half of the responses indicate that the students used polling and feedback to a large extent or to some extent. Only 13% of students did not take part in polling activities at all, whereas this number was zero for feedback. This is in close agreement with the results reported by Voelkel and Bennett (2014). The students who did not take part in live polling, or took part rarely, gave as their main reasons that they did not know the answers or did not have access to the website or app (Figure 3a). For feedback, the main reason was ‘I usually do not ask questions in class’, followed closely by ‘I understood the lecture and did not have any question’ (Figure 3b). Interestingly, embarrassment, which is one of the main reasons that students do not ask questions in conventional lectures without live feedback tools, did not have a significant effect on the students’ responses (Filer, 2010; Florenthal, 2018). Other studies show that anonymity and embarrassment about speaking up are highly correlated and have been students’ main motivation for using polling software tools (Abdel Meguid & Collins, 2017).

Figure 3. The reasons for not taking part in a) live polling and b) feedback activities.

In response to the question of how polling and feedback enhanced their engagement in classroom activities, the most popular answer was that they made the lecture more lively (Figure 4). The other highly rated responses were: paying more attention to the lecture, answering or asking questions without feeling embarrassed, learning from other students’ answers in polling, and being able to comment on the lecture content and pace in feedback. Some of these answers were rated highly in other studies as well (Abdel Meguid & Collins, 2017; Abdulla, 2018).

Figure 4. Students’ responses to the question on engagement: a) polling and b) feedback.

The majority of students ranked both polling and feedback four out of five, followed by large numbers of ranks five and three (Table 1), where five means they liked it very much and one means they did not like it at all. The biggest advantage of polling was voted to be engagement in class activities and more lively lectures (Figure 5a), while the advantages of feedback were given as: better understanding of the subject, asking questions easily, and engagement in class activities (Figure 5b). The biggest disadvantage of polling was voted to be wasting lecture time, followed by distraction from the lecture (Figure 5c), whereas for feedback the majority voted for no disadvantage (Figure 5d). The authors were concerned about this anomaly, and when the results of each module were analysed, it was found that the majority of ‘wasting time’ votes (40%) came from students of PSYC133, whereas the proportion was substantially lower in the other modules. Further investigation showed that the reason for the difference was that in PSYC133 too many questions were asked in each session; students respond better when questions are used in moderation. In the focus group, discussed in the next section, the students expressed the same concern.

Table 1. Ranking of polling and feedback by students

Rating      5     4     3     2     1
Polling     23%   36%   23%   10%   8%
Feedback    25%   31%   25%   8%    0%

Key: 5 = liked it very much; 1 = did not like it at all.
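
As a quick worked reading of Table 1, the mean rating on the 1-5 scale can be recovered from the reported distributions, as in the short Python snippet below. Note that the feedback row as printed sums to 89%, so its normalized mean should be treated as indicative only.

```python
# Mean rating on the 1-5 scale, computed from the Table 1 distributions.
ranks = [5, 4, 3, 2, 1]
polling = [0.23, 0.36, 0.23, 0.10, 0.08]   # sums to 1.00
feedback = [0.25, 0.31, 0.25, 0.08, 0.00]  # sums to 0.89 as printed

def mean_rating(weights):
    # Normalize by the total weight in case the distribution is incomplete.
    return sum(r * w for r, w in zip(ranks, weights)) / sum(weights)

print(f"polling:  {mean_rating(polling):.2f}")   # ~3.56
print(f"feedback: {mean_rating(feedback):.2f}")  # ~3.82 after normalization
```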

Figure 5. Students’ opinions on advantages (a, b) and disadvantages (c, d) of polling (a, c) and feedback (b, d).

When the students were asked whether the process should be used in other modules as well, about 40% said that it should be used in all modules, and a large proportion believed that it should be used in some modules (37% for polling and 25% for feedback). The study showed no relationship between the use of polling or feedback and students’ attendance at lectures. A few students who responded to the open-ended question for comments expressed how much they enjoyed activities using Kahoot. It was also suggested that polling should be used in moderation.

Although the students were from two completely different disciplines and responded differently to some questions, there were many similarities in their responses. For example, they all expressed, at almost the same rate, that polling had increased their engagement in the lecture, had made the lecture more interesting and livelier, and had been fun. The psychology students used live polling tools about 10% more than the electrical engineering students, presumably because the method had been widely used in their school, in nearly all modules, whereas in electrical engineering very few lecturers had used the method before this study. For the electrical engineering students, the main reason given for not answering questions was that they ‘did not know the answer’ (about 45%), whereas for the psychology students the main reasons were that they ‘did not have access to the app or website’ (about 30%) or ‘did not like the method’ (23%); only 22% said that they ‘did not know the answer’. This could be related to the different natures of the disciplines: in psychology the topics are more abstract and the students usually discuss them with their teachers during the lecture, whereas in electrical engineering the topics are more cognitive, and students need to follow more independent learning based on a cognitive approach. However, there are many other parameters which were not the focus of this study and need further investigation.

Focus group

A thematic analysis was used for the students’ responses in the focus group. The participants said that the main benefit of polling is that the lecturer of the module knows how well students have understood the lecture content: if many students give a wrong answer to a question, the lecturer finds out immediately and can explain the subject further or in a different way. They emphasized that students rarely raise their hands to ask a question because ‘people don’t want to be the one putting their hand up saying they don’t understand it’. Polling is an easier way to say ‘I’m struggling with this topic because I’m not getting the question right’ or ‘I understand it, let’s move on’.

One student spoke of the importance of how students can learn from the answers given by other students, and of when they work with each other to find the correct answer, which he described as ‘learning by teaching’ and believed reinforces learning. Some students also mentioned that sometimes there are too many questions, particularly when the same question is repeated with different numerical values. They believed that overdoing polling makes it boring and distracting, that students lose interest in taking part, and that it wastes time that could be spent on other activities in the lecture. This was mentioned in particular with reference to tutorials or problem classes, where the students have one-to-one interaction with the lecturer or demonstrators. Another student described how similar repetitive questions could backfire: instead of being engaged, students stop caring about the activity and stop paying attention. ‘Too many questions’ was mentioned as one of the main disadvantages of the method by a few students. They suggested that lecturers should plan the activities well and be flexible; they do not need to use all the questions they have designed. When the lecturer sees that the students understand the subject, it is better to skip the remaining questions on that subject and move on. The students also stated that they prefer questions about basic concepts or ones with a quick solution. Another student described how the method could enhance the interaction between students and the lecturer, and how it could overcome the problem in some conventional lectures of students feeling that they are disconnected and just watching a screen.

A few students expressed their interest in Kahoot and described how its entertaining questions can change the atmosphere of the class and get all the students engaged. However, there were mixed feelings about Kahoot and about some questions in Poll Everywhere with an entertaining or funny presentation. One student believed that they were a waste of time and that he gained nothing from them compared with serious questions, although he admitted that they are useful in early morning classes to change the mood of the class and get the students involved. A few students thought that the emotion-scale question with ‘smilies’ (Figure 2a) that some lecturers show at the end of lectures is pointless: the lecturers may get a good or bad feeling about their lecture, but it does not give them any hints on how to improve it.

Most participants of the focus group had a positive opinion of feedback. A student explained the situation thus: ‘Like they’ll say, “has anyone got any questions?” and you’ll have a burning question, but you won’t ask it because you don’t want to be that guy.’ A student may think it is a stupid question, but they can just send it through Poll Everywhere or other tools without worrying about embarrassment. When asked whether they missed the face-to-face interaction, the students stated that it depends on the type of question they want to ask. If they want a detailed discussion with the lecturer, they can go and see the lecturer and ask the question in person; but for the type of question usually asked after the lecture, when both students and lecturer are hurrying to arrive on time at the next lecture, it is the best solution.

The authors recognize that a focus group of six participants is a very small sample of the more than 600 students involved in this study, and that a discussion of less than an hour cannot be taken as the opinion of all students. However, even in this short meeting the participants raised some important points, which the authors found to hold true when observing the reactions of students during the implementation of the method in later lectures and in short conversations with students. When some of the authors put the points discussed into practice in their lectures, such as using polls in moderation, avoiding repetitive questions, and preferring questions on basic concepts over complex questions that need more elaboration, they could see the difference in the students’ response rate and in the mood of the classroom.

Analysis of students’ responses

For a precise statistical analysis, the data collected in the polling tools should be studied systematically and across a larger number of modules. The impact on the response rate of the question type, the online tool used for the activity, the number of questions in each session, the method of presentation, the year of study, the size of the cohort, and other parameters could then be studied. In this section, a few examples of students’ responses are presented, which can be useful for enhancing the efficiency of the activities.

One of the parameters with a large impact on the response rate is the question type. The clickable image question in Figure 1c received 321 responses from a cohort of around 100 students, whereas some MCQs in the same session received fewer than ten responses. The word cloud question type shown in Figure 1b also attracts a large number of responses, as the students see how their favourite answers are evolving. Quizzes and competitions are also very popular, particularly in Kahoot (Figure 1a).

Questions about basic concepts, and those for which students can find the solution quickly, receive more responses than those needing complex calculations. However, in problem classes, even questions that required a few minutes to answer had a good response rate. In general, the students’ response to similar repetitive questions was low. A second clickable image question, similar to the one in Figure 1c and with the same level of complexity, received a slightly lower 259 responses, and a third question of this type attracted only 39. Even entertaining questions lose the attention of students when they are repeated. As with any other learning material, the teachers of the module need to spend time designing questions that are attractive in both content and appearance; eye-catching questions containing images, and intriguing questions that make students curious about the correct answer, attract more attention.
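
The drop-off across repetitions quoted above can be expressed as percentage falls, as the short snippet below shows using the response counts reported in this section.

```python
# Retention of responses across repetitions of a similar clickable-image
# question (counts taken from the text: 321, 259, 39 responses).
counts = [321, 259, 39]
for i, (prev, curr) in enumerate(zip(counts, counts[1:]), start=2):
    drop = 100 * (1 - curr / prev)
    print(f"repetition {i}: {curr} responses ({drop:.0f}% drop from previous)")
# repetition 2: 259 responses (19% drop from previous)
# repetition 3: 39 responses (85% drop from previous)
```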

Conclusions

In this work, a novel two-way online communication system between students and lecturer, using polling and other types of digital communication tools, has been presented. The system consists of polling, by which the lecturer can attract the students’ attention to the lecture and evaluate their understanding of the subject, and feedback, which provides a platform for students to ask content-related questions or comment on the delivery of the lecture. The details of each activity and of the software tools were explained, including a discussion of the question types used in polling. The process was implemented in three targeted modules and was evaluated by online surveys, a focus group, and analysis of the students’ responses. The results of each evaluation method were presented and discussed, and showed an enhancement of students’ engagement in classroom activities. They also reveal that the process must be carefully designed and implemented to achieve the desired outcome.

Findings from the online survey and the focus group show that the majority of students found the method useful for increasing their engagement in classroom activities, making the lecture more lively, and enhancing their learning. More than 40% of students said that the method should be used in all modules, and many more that it should be used in some modules. The students suggested that polling activities should be well designed and used in moderation. The students saw the feedback process as a way to ask questions and voice their concerns freely, without worrying about being singled out or embarrassed. The use of interactive polling and feedback is not without challenges and obstacles, several of which became apparent in this study. Lecturers must be cautious about overusing polling, as students may grow weary of constant polling, particularly if the activity involves multiple platforms or apps, or if it distracts from or unhelpfully disrupts the flow of the lecture.

References

Abdel Meguid, E., & Collins, M. (2017). Students’ perceptions of lecturing approaches: Traditional versus interactive teaching. Advances in Medical Education and Practice, 8, 229-241.

Abdulla, M. H. (2018). The use of an online student response system to support learning of physiology during lectures to medical students. Education and Information Technologies, 23(6), 2931-2946.

Asmali, M. (2018). Integrating technology into ESP classes: Use of student response system in English for specific purposes instruction. Teaching English with Technology, 18(3), 86-104.

Bergstrom, T., Harris, A., & Karahalios, K. (2011, September 5-9). Encouraging initiative in the classroom with anonymous feedback [Conference presentation]. 13th International Conference on Human-Computer Interaction (INTERACT), Lisbon, Portugal.

Caldwell, J. E. (2007). Clickers in the classroom: Current research and best-practice tips. CBE: Life Sciences Education, 6(1), 9-20.

Dakka, S. M. (2015). Using Socrative to enhance in-class student engagement and collaboration. International Journal on Integrating Technology in Education (IJITE), 4(3), 13-19.

Deng, L. (2019). Assess and engage: How Poll Everywhere can make learning meaningful again for millennial library users. Journal of Electronic Resources Librarianship, 31(2), 55-65.

Dervan, P. (2014). Enhancing in-class student engagement using Socrative (an online student response system): A report. All Ireland Journal of Teaching and Learning in Higher Education, 6(3), 1801-1813.

Dobbins, C., & Denton, P. (2017). MyWallMate: An investigation into the use of mobile technology in enhancing student engagement. TechTrends: Linking Research & Practice to Improve Learning, 61(6), 541-549.

Etfita, F., & Wahyuni, S. (2020). Developing English learning materials for mechanical engineering students using Padlet. International Journal of Interactive Mobile Technologies, 14(4), 166-181.

Filer, D. (2010). Everyone’s answering: Using technology to increase classroom participation. Nursing Education Perspectives, 31(4), 247-250.

Florenthal, B. (2018). Students’ motivation to participate via mobile technology in the classroom: A uses and gratifications approach. Journal of Marketing Education, 41(3), 234-253.

Gomez, E. A., & Elliot, N. (2013). Short-message performance assessment in emergency response settings. IEEE Transactions on Professional Communication, 56(1), 16-32.

Gurrero, C. (2010). Fostering active learning through the use of feedback technologies and collaborative activities in a postsecondary setting (Publication No. 847278439) [Doctoral dissertation]. University of Texas at Austin.

Lasry, N. (2008). Clickers or flashcards: Is there really a difference? The Physics Teacher, 46(4), 242-244.

Liu, M., & Wronski, L. (2017). Examining completion rates in web surveys via over 25,000 real-world surveys. Social Science Computer Review, 36(1), 116-124.

Mayhew, E. (2019). No longer a silent partner: How Mentimeter can enhance teaching and learning within political science. Journal of Political Science Education, 15(4), 546-551.

Méndez Coca, D., & Slisko, J. (2013). Software Socrative and smartphones as tools for implementation of basic processes of active physics learning in classroom: An initial feasibility study with prospective teachers. European Journal of Physics Education, 4(2), 17-24.

Noel, D., Stover, S., & McNutt, M. (2015). Student perceptions of engagement using mobile-based polling as an audience response system: Implications for leadership studies. Journal of Leadership Education, 14(3), 53-70.

Porter, S. R. (2004). Overcoming survey research problems: New directions for institutional research. London: Jossey-Bass.

Ramsden, P. (1992). Learning to teach in higher education. London: Routledge.

Shon, H., & Smith, L. (2011). A review of Poll Everywhere audience response system. Journal of Technology in Human Services, 29(3), 236-245.

Stewart, M. (2012). Effective classroom teaching. In L. Hunt & D. Chalmers (Eds.), University teaching in focus: A learning-centred approach (pp. 21-37). London: Routledge.

Voelkel, S., & Bennett, D. (2014). New use for a familiar technology: Introducing mobile phone polling in large classes. Innovations in Education and Teaching International, 51(1), 46-58.

Walsh, S. (2011). Exploring classroom discourse: Language in action. Oxon: Routledge.

Wood, A. (2020). Utilizing technology-enhanced learning in geography: Testing student response systems in large lectures. Journal of Geography in Higher Education, 44(1), 160-170.

Yabuno, K., Luong, E., & Shaffer, J. F. (2019). Comparison of traditional and gamified student response systems in an undergraduate human anatomy course. HAPS Educator, 23(1), 29-36.

Yu, U-C. (2017). Teaching with a dual-channel classroom feedback system in the digital classroom environment. IEEE Transactions on Learning Technologies, 10(3), 391-402.
