
Cl-I-C-K-Er-S 1

Cl-I-C-K-Er-S:

The Effect of Clickers on Student Engagement in the High School Chemistry Classroom

Sarah Kitzan

University of Alaska Southeast



Introduction

For most students, high school chemistry is not an easy subject. Students look forward to

doing exciting labs and getting to blow things up, but unfortunately the class is not all fun and

games. Yes, there are labs and fun activities, but we spend a majority of class time taking notes

and learning about concepts that are not always exciting, so keeping students engaged is a

constant struggle. In an effort to make note-taking more interesting and to get students more

involved during notes, I started looking into different ways that students can actively participate

while they take notes. At my school, quite a few teachers use clickers in their classrooms, and as

I do not have much experience with them, I thought this would be the perfect opportunity to try

them out. For my action research project, the question I chose to research is, "Does the use of clickers increase student engagement in the high school chemistry classroom?"

Literature Review

Many teachers are constantly working to change how they present material in their classrooms to engage students and have them take a more active role in their learning. Many classes use a

traditional lecture-driven model, but there is a lack of evidence that lecture-driven classes should

be the only instruction approach used in classrooms (Lumpkin, Achen, & Dodd, 2015, p.121).

Classroom communication systems, which include the use of clickers, or Audience Response

Systems (ARS), are one method that is used to break out of the lecture-based teaching style.

Beatty (2004) defines classroom communication systems (CCSs) as "technology products designed to support communication and interactivity in classes" (p.2). Moore (2007), as cited by Devlin, Feldhaus, and Bentrem (2013), says that learners have grown "accustomed to acquiring information and communicating by utilizing technology-based methods" (p.1). Students are rarely seen

without some kind of technology in their hand, so why not add that to the normal classroom

routine? Multiple research studies have found a strong correlation between the use of an ARS

and increased student participation and engagement (Barnes, 2008, p.535; Beatty, 2004, p.5; Dunn, Richardson, Oprescu, & McDonald, 2013, p.1173; Gunn, 2014, p.1; McNabb, 2009, p.20; Micheletto, 2011, p.10; Terrion & Aceti, 2012).

One benefit of using clickers as an educational tool is that they can create an active learning environment, which allows students to learn by doing activities and working with other students instead of just listening to a teacher lecture. Lumpkin et al. (2015) found that active,

collaborative activities engaged students and positively impacted learning (p.129). Barnes

(2008) used both lecture-free and lecture-based methods with four different high school biology

classes and found that a majority of students preferred the lecture-free method that utilized an

ARS, saying they learned more and had to use their brains more than in the lecture-based method

(p.534).

However, it is not a simple process to begin using an ARS in the classroom. "Using new technologies in teaching brings with it the risk of mis-using technology or not using technology in a pedagogically effective way" (Chittleborough, 2014, p.391). It would be very easy to start

using clickers without carefully planning how to use them effectively. Shirley, Irving, Sanalan,

Pape, and Owens (2010) also touch on an important aspect of using connected classroom

technology (CCT), and that is how practical it is to use. Practicality consists of three constructs: congruence with teachers' values and practice; instrumentality, or compatibility with the existing school structures; and cost/benefits, or whether the reward is worth the effort (Shirley et al., 2010, p.459). It is possible that implementing a certain technology will not fit well in certain

schools or with all teachers. Penuel, Boscardin, Masyn, and Crawford (2006) mention a need for

teachers to have adequate training in how to use student response systems effectively in the

classroom (p.342). Most teachers do not have a background in the different technologies

available, so extra training would be beneficial before using them in the classroom.

"With the ubiquitous use of computers in society there is an increasing need for computer technology to be integrated into teaching" (Chittleborough, 2014, p.374). Utilizing clicker

technology is one way of integrating a computer technology into the classroom, and hopefully

that integration will increase student engagement and offer students an active learning

environment. "Technology may offer a means to enhance student engagement" (Terrion & Aceti, 2012), so trying to increase student engagement by using clickers appears to be an attainable

goal.

Methodology

I conducted this action research project in five chemistry classes at Palmer High School,

in Palmer, AK, over a five-week period. In my proposal, I had originally planned on only using

two of my five class periods, but this made planning and scheduling difficult, so I opted to use all

five classes instead. Classes at Palmer High School operate on a modified block schedule, so in

a normal five-day school week, I see each class four out of five days. Two of the days are forty-eight-minute periods, and two of the days are sixty-seven-minute periods. Students enrolled in

these five classes range from tenth grade through twelfth grade, with 49 sophomores, 52 juniors,

and 7 seniors. Of the 108 students in the study, 48% are female and 52% are male. I did not

inform students of the research project because student privacy was completely protected due to the anonymous nature of the data collection, and because I did not want knowledge of the research to affect their behavior in class. Students learned about two different topics, electrons

and the periodic table, over the five-week period of data collection.

In order to conduct my project, I needed my students to use clickers, but I did not have

access to a classroom set of clickers I could use on a daily basis. I had heard of a site called

Poll Everywhere (Poll Everywhere, n.d.) that allows people to respond to polls just by using their

cell phones. Since it seems students always have their phones, I thought this might just be my

solution to the clicker problem. I made a free account with Poll Everywhere, and then decided

that it would be more useful for me to upgrade to the premium K-12 plan, which allowed me to

create gradebook reports to keep track of which students were responding and gave me an option

to grade students responses as well. I gave the students in my five classes a URL with which

they could create a free account that linked them to mine. Students had the option of

downloading the Poll Everywhere app to respond in class, or simply texting their responses with

a keyword. Students who did not have a cell phone or other technology to use were limited to

responding to the polls with paper and pencil.

I use Microsoft PowerPoint to deliver my notes in class, so another handy feature of Poll

Everywhere is the PowerPoint add-on. I was able to create polls on the website, and then insert

them into my note presentations so the delivery of polls to students was seamless with the rest of

the class. When I switched to a poll slide, the poll would automatically activate, allowing

students to respond immediately. As students responded, the results displayed in real-time, so

students could see how their answer compared with the rest of the class. All responses were

anonymous, so students did not need to worry about being embarrassed if they chose the wrong

answer. After everyone had voted, I would discuss the results with the class, and elaborate

further if the responses showed a lack of understanding. Once I moved on from that poll, it

would not be active anymore, so no more students would be able to respond. Depending on the class period and the day, the number of polls varied significantly.

I used four different methods of data collection throughout my action research project.

The first method I used was giving the students a ten-question, one-to-five-scale Likert survey,

administered on the first and last days of data collection. My second method of data collection

was a nine-question, open-ended survey, administered on the last day of data collection. In my

proposal, I had intended to perform focus groups and interviews, but opted for the open-ended

surveys due to time constraints. A third data collection method was the creation of gradebook

reports in Poll Everywhere, showing how many questions each student responded to, how they

responded to each question, and an overall participation percentage for the class as a whole. My

final data collection method was observing during class sessions when we used Poll Everywhere.

Data Analysis

The goal of this action research project was to answer the research question, "Does the use of clickers increase student engagement in the high school chemistry classroom?" I kept this

question in the forefront as I analyzed the data that I collected.

The Likert surveys (Appendix A) consisted of 10 statements, and students were asked to pick a number from one to five to represent how much they agreed or disagreed with each statement, where 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, and 5 = strongly

agree. Out of my 108 students, 93 (86.1%) took the survey at the beginning of data collection

and 92 (85.2%) took the survey at the end of data collection. I calculated the average for each

class period for the surveys given before we started using Poll Everywhere and after we had used

it for five weeks. Of the ten statements students responded to, five relate directly to student

engagement and five relate to the general classroom environment. The statements relating to

engagement had to do with participation, attentiveness, on-task behavior, effort, and

preparedness. Data analysis showed that students perceived a decrease in their participation,

attentiveness, and on-task behavior in class, and an increase in their perceived effort and

preparedness for class.
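As a sketch of how such pre/post averages can be compared, the snippet below works through one statement; the response lists are hypothetical illustrations, not my students' actual data:

```python
from statistics import mean

# Hypothetical 1-5 Likert responses to one engagement statement,
# collected before and after the five weeks of Poll Everywhere use.
pre = [4, 3, 5, 4, 2, 4, 3]
post = [3, 3, 4, 4, 2, 4, 3]

# A negative change mirrors a perceived decrease for that statement.
change = mean(post) - mean(pre)
print(f"pre {mean(pre):.2f}, post {mean(post):.2f}, change {change:+.2f}")
# -> pre 3.57, post 3.29, change -0.29
```

Repeating this per statement and per class period gives the averages compared above.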

At the conclusion of the research project, in addition to administering the Likert surveys

one more time, I also gave students a nine-question, open-ended survey (Appendix B). Out of

my five chemistry classes, 89 students (82.4%) completed the open-ended survey. I asked

students five questions relating specifically to the use of Poll Everywhere in class, and four

questions about student participation and engagement in class. One question that helps answer

my research question is, "How has Poll Everywhere changed your participation in class?" I coded the responses as "no change/unsure," "increase/positive," and "decrease/negative." Of the

89 students that responded to the surveys, 87 answered the question. Of those, 22 students

(25.3%) said Poll Everywhere had no effect or they were unsure of its effects on

participation, 63 students (72.4%) made statements indicating an increase or positive effect on

their participation, and two students (2.3%) made negative comments concerning participation

with Poll Everywhere (Figure 1).

Figure 1: Responses to "How has Poll Everywhere changed your participation in class?" Positive 72.4%; No Change 25.3%; Negative 2.3%.
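The percentage tally for this question can be reproduced with a short script; the list below simply recreates the coded counts reported above rather than raw survey data:

```python
from collections import Counter

# Coded responses from the 87 students who answered the question,
# reconstructed from the reported counts (not the raw survey text).
coded = (["increase/positive"] * 63
         + ["no change/unsure"] * 22
         + ["decrease/negative"] * 2)

counts = Counter(coded)
total = sum(counts.values())
for category, n in counts.most_common():
    print(f"{category}: {n} ({100 * n / total:.1f}%)")
# -> increase/positive: 63 (72.4%)
#    no change/unsure: 22 (25.3%)
#    decrease/negative: 2 (2.3%)
```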

Another question I asked students on this survey was, "Do you want to continue using Poll Everywhere in class?" Of the 89 students that responded, 82 (92.1%) of them said yes. An

overwhelming majority of students showing interest in continuing the use of Poll Everywhere

suggests that students enjoyed using Poll Everywhere in class and that it had a positive effect on

the learning environment.

My third method of data collection comes from gradebook reports created at the

conclusion of each class period in which we used Poll Everywhere. I organized the gradebook

reports by class period and date, and looked at three specific things within each report: number

of participants responding, number of polls in the session, and overall participation percentage. I multiplied the number of participants by the number of polls to get the total number of questions asked in each session, and used the participation percentage to calculate how many of those questions were answered. I found that many class sessions using Poll Everywhere had 100% participation, meaning every student answered every single question. In cases where the participation was less than 100%, one or more students may not

have answered one or more of the questions, but participation was never lower than 89% in any

of the Poll Everywhere sessions. Figure 2 shows the data from my first period class.

Figure 2: Period 1 Poll Everywhere participation by session

Session                    Unit 4 #1  Unit 4 #2  Unit 4 #3  Unit 5 #1  Unit 5 #2  Unit 5 #3
Total Questions Answered       96        116         57         76         70         36
Total Questions Asked          96        120         57         76         72         36
Participation (%)             100         97        100        100         97        100
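The arithmetic behind these figures can be sketched as follows; the participant and poll counts are hypothetical illustrations (Figure 2 reports only the totals):

```python
# Participation arithmetic for one Poll Everywhere session:
# total questions asked = participants * polls in the session;
# participation percent = answered / asked.
def session_participation(participants, polls, answered):
    asked = participants * polls
    return asked, round(100 * answered / asked)

# For example, 24 participants and 5 polls would give the 120
# questions asked in Unit 4 #2; with 116 answered, participation
# rounds to 97%, matching the figure.
asked, pct = session_participation(24, 5, 116)
print(asked, pct)  # -> 120 97
```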

My final method of data collection is the observations I made during class periods when

we used Poll Everywhere. During my observations, I tried to keep track of how many students

did not appear engaged in class and made note of how many students did not have cell phones to

participate in Poll Everywhere. My original plan was to arrange for someone to come in and

make observations for me, but this did not end up being possible, so I did my own observations.

I had also intended to observe during a non-clicker class period, but that also did not occur. I

made 16 different observations throughout the five weeks of data collection. Of the 16

observations, 11 specifically mention at least one student without a phone. Nine of the

observations mention one or more students exhibiting off-task behavior at some point during a

class period when we used Poll Everywhere.

I triangulated all of my data, comparing the results from each of the methods together.

Even though the Likert survey results appear to be inconclusive, the open-ended surveys show

very clearly that student engagement increased overall, at least in terms of participation. Students' ability to explain their change in participation due to Poll Everywhere makes up for the lack of explanation the Likert surveys allowed. The open-ended surveys also help me

interpret the observation notes. Even though I did not have any non-clicker classes to compare

my notes with, the students felt that using Poll Everywhere had a positive effect on their

engagement in class. This suggests that if I were to compare my observations with a non-clicker

class, I would find that there are fewer students off-task when we are using Poll Everywhere than

when we are not using Poll Everywhere. Combining the survey results with the Poll Everywhere gradebook data showing near-100% participation and with my observation notes, I am confident that, together, my data provide a clear answer to my research question.

Discussion

The question I set out to answer during this project was, "Does the use of clickers increase student engagement in the high school chemistry classroom?" Based on my data analysis and

triangulation, I conclude that using clickers, specifically Poll Everywhere, has increased the

engagement of students in my five chemistry classes.

While I have answered my research question, I feel there is more to learn about how, and

to what degree, clickers increase student engagement. One of the drawbacks to the design of my

study is that I did not have a way to measure how much the engagement was increasing. I plan

to continue this action research later on, and one of the things I will think about when designing

my research plan is how I can measure the change in my students' engagement.

Another limitation to my research is how I chose to define student engagement. I focused

a majority of my data collection on student participation, which represents only one aspect of

student engagement. Taylor and Parsons (2011) note that a majority of research concerning

student engagement focuses on achievement data, such as test scores and attendance, instead of

levels of student engagement, such as interest in classes, time spent on task, and an enjoyment in

learning (p. 5). As I work on continuing this research, I want to look more in depth at other

levels of student engagement, like how interested students are in my class and if they enjoy

learning about chemistry.

A limitation in the Likert surveys is that even though the number of students taking the

survey before and after was almost the same, it does not mean all of the same students took the

survey twice. It is possible some students took the survey only once, either at the beginning or at

the end, and the results do not account for that possibility.

Another issue I discovered had to do with the open-ended surveys. As I read student

responses, it became evident that not all students understood what each of the questions was

asking. There was an issue, for example, with the word "engagement." Some students were not

familiar with the educational definition of the word. If I continue this research further, I will be

sure to explain each question before students respond so that everyone can answer as accurately

as possible.

For future studies, I plan to change a few other things about my data collection methods.

One change I would make to the study is the use of Likert surveys. I feel that the results obtained from Likert surveys are very limited for determining changes in student perceptions.

Instead of using Likert surveys, I will likely conduct individual interviews with a small number

of students throughout the research process. I feel that individual interviews will provide more

insight into how the clickers are affecting the students.

My observation data from this study would be much more insightful if I had also

observed normal lecture-based class periods, excluding the use of clickers. In the future, I would

make sure to observe classes without clickers in order to provide a comparison for classes when

clickers are used.

Based on the outcome of this study, I plan to continue using Poll Everywhere in my

chemistry classes, and possibly expand its use to my freshman-level physical science class.

Using Poll Everywhere is going to be an ongoing research project for me, and I look forward to

learning more about how I can continue to increase student engagement.



References

Barnes, L. (2008). Lecture-Free High School Biology Using an Audience Response System. The
American Biology Teacher, 70(9), 531-536.

Beatty, I. (2004). Transforming Student Learning with Classroom Communication Systems.


EDUCAUSE Center for Applied Research, 2004(3).

Chittleborough, G. (2014). Learning How to Teach Chemistry with Technology: Pre-Service Teachers' Experiences with Integrating Technology into Their Learning and Teaching. Journal of Science Teacher Education, 25, 373-393.

Devlin, T., Feldhaus, C., & Bentrem, K. (2013). The Evolving Classroom: A Study of
Traditional and Technology-Based Instruction in a STEM Classroom. Journal of
Technology Education, 25(1).

Dunn, P., Richardson, A., Oprescu, F., & McDonald, C. (2013). Mobile-phone-based classroom response systems: Students' perceptions of engagement and learning in a large undergraduate course. International Journal of Mathematical Education in Science and Technology, 44(8), 1160-1174.

Gunn, E. (2014). Using Clickers to Collect Formative Feedback on Teaching: A Tool for Faculty
Development. International Journal for the Scholarship of Teaching and Learning, 8(1).

Lumpkin, A., Achen, R., & Dodd, R. (2015). Student Perceptions of Active Learning. College
Student Journal, 49(1).

McNabb, K. (2009). Use of an Audience Response System to Evaluate and Streamline a General
Chemistry Class.

Micheletto, M. (2011). Conducting A Classroom Mini-Experiment Using An Audience


Response System: Demonstrating the Isolation Effect. Journal of College Teaching and
Learning, 8(8).

Penuel, W., Boscardin, C., Masyn, K., & Crawford, V. (2006). Teaching with student response systems in elementary and secondary education settings: A survey study. Educational Technology Research and Development, 55, 315-346.

Poll Everywhere. (n.d.). https://www.polleverywhere.com/

Shirley, M., Irving, K., Sanalan, V., Pape, S., & Owens, D. (2010). The Practicality of Implementing Connected Classroom Technology in Secondary Mathematics and Science Classrooms. International Journal of Science and Mathematics Education, 9, 459-481.

Taylor, L. & Parsons, J. (2011). Improving Student Engagement. Current Issues in Education,
14(1). Retrieved December 8, 2015 from http://cie.asu.edu/

Terrion, J., & Aceti, V. (2012). Perceptions of the effects of clicker technology on student
learning and engagement: A study of freshmen Chemistry students. Research in Learning
Technology, 20.

Appendix A

Likert Survey

Answer each of the following questions by circling the number that best describes how you feel:

Strongly Disagree = 1 Disagree = 2 Neutral = 3 Agree = 4 Strongly Agree = 5

1. I participate in class every day 1 2 3 4 5

2. I pay attention during class 1 2 3 4 5

3. I give my best effort when I am in class 1 2 3 4 5

4. I am happy with the work I do in class 1 2 3 4 5

5. I stay on-task most of the time in class 1 2 3 4 5

6. I am in control of what I learn in class 1 2 3 4 5

7. I come to class prepared to learn every day 1 2 3 4 5

8. The work I do in this class challenges me to think 1 2 3 4 5

9. I am encouraged to participate in class 1 2 3 4 5

10. I am encouraged to work hard in class 1 2 3 4 5



Appendix B
Open-Ended Survey

1. What would you change about the way notes are delivered in class?

2. How involved in class do you feel on a regular basis?

3. What does engagement look like to you?

4. How does engagement affect your performance in class?

5. What do you like about using Poll Everywhere in class?

6. What do you dislike about using Poll Everywhere in class?

7. How has Poll Everywhere changed your participation in class?

8. How would you change the way we use Poll Everywhere in class?

9. Do you want to continue using Poll Everywhere in class?
