
Introduction
On November 9, video game publisher Activision released the long-awaited game "Call of Duty: Black Ops." The release coincided with our AP Statistics survey project, so we built our survey around video games. We surveyed Gunn students from all grades to find out about their video game habits, such as their favorite consoles, most-played games, and how often they play. We also looked at whether the release of a popular video game would increase students' use of their PlayStation 3s or Xbox 360s.
Purpose
The purpose of this survey was to see whether a person's video game preference was influenced by the type of console that person purchased, and to identify the video game habits of Gunn High School students. We also wanted to compare the data produced by an unbiased survey with the data produced by a biased one.
Hypothesis
Our hypothesis was based on social stigma. We thought that boys would own more video games than girls and would play them more often each week. Also, since we developed the survey around the time the popular shooting game Black Ops came out, we expected to see a rise in playing time and increased use of the PlayStation 3 and Xbox 360, specifically among boys.
Abstract
Our survey tried to identify whether there was any correlation between console and video game preference, and which groups played most heavily. We compared data collected with different sampling methods (SRS vs. biased) to see whether the difference in approach skewed the responses. We learned that by using an SRS we significantly limited the chance of errors in our data collection compared with the biased version. In the end we found that….
Biased Survey Request Message
(Although the biased messages varied in wording, the typical request looked something like the following:)
Wazzap guys!!!!
Guess what?! you have all been selected to take part in
the biased section of my AP stat survey! what an excellent
opportunity to contribute to my project taking this survey!
PLEASE TAKE THIS!
just type the results back to me. each spacing of the line
means the next question
thanks kids!
(survey questions copied and pasted tomorrow)
Non-Biased Survey Request Message
Hello
This is Tina, Emily, Amrita, and Eamon. As one of our AP Statistics projects, we need to survey people. You were one of the people who were randomly selected. It would be greatly appreciated if you could take some time to fill out this survey. Thanks, and here's the URL.

https://spreadsheets.google.com/viewform?formkey=dE44cDNJT0prX0hKUTd1OTJRMkFBZmc6MQ

(As an error, we had to have everybody retake the survey because we didn't know how to sort the data into biased and non-biased, since we had sent everybody to the same link.)

So sorry. But could u just answer the survey through here. Thanks.
How the SRS Sample Was Collected
1. Using the master list of all Gunn High School students, we first blocked by grade and gender.
2. We used simple random sampling to select the participants: 6 random males and 6 random females (12 people) from each grade, so that each grade was equally represented.
3. We found every selected person on Facebook and sent them the survey along with a letter explaining who we were and asking them to spend about 5 minutes answering the survey.
4. We sent each message using the exact same template.
5. After collecting a fairly equal amount of data for each grade, we combined the data to represent all of Gunn High School rather than individual grades.
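The blocking-then-sampling procedure above can be sketched in Python. The roster below is made up for illustration (the project used the actual Gunn master list), but the quota of 6 males and 6 females per grade comes straight from the steps:

```python
import random

# Hypothetical roster standing in for the school master list.
roster = [
    {"name": f"Student{i}", "grade": grade, "gender": gender}
    for i, (grade, gender) in enumerate(
        (g, s) for g in (9, 10, 11, 12) for s in ("M", "F") for _ in range(50)
    )
]

def stratified_srs(roster, per_stratum=6, seed=None):
    """Block by grade and gender, then draw a simple random sample of
    `per_stratum` students from each block (6 M + 6 F per grade)."""
    rng = random.Random(seed)
    # Step 1: block (stratify) the roster by (grade, gender).
    strata = {}
    for student in roster:
        strata.setdefault((student["grade"], student["gender"]), []).append(student)
    # Step 2: SRS within each block.
    sample = []
    for block in strata.values():
        sample.extend(rng.sample(block, per_stratum))
    return sample

sample = stratified_srs(roster, per_stratum=6, seed=42)
print(len(sample))  # 4 grades x 2 genders x 6 students = 48
```

Blocking first guarantees each grade-and-gender group is equally represented, which a plain SRS over the whole roster would not.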
How the Biased Sample Was Collected
For our biased survey we purposely did not use an SRS to find the people to be surveyed; instead, we used multiple types of convenience sampling.
1. We sent the survey request message to our friends for guaranteed quick responses.
2. Our group also tagged the messages to the recipients with funny, snarky comments in order to catch their attention.
3. We did not block by gender.
4. Since everyone in our group was a junior and we sent the biased messages to our friends, only the upperclassmen at Gunn were represented, not all grades.
Non-Biased Data Analysis
Biased Data Analysis
Non-biased Errors and Possible Lurking/Confounding Variables
1. We realized, after we had already done so, that we should not have sent out the shared link, so we sent out another message with the actual survey questions and asked people to take the survey again.
- People could have been frustrated and may have been more careless about the survey the second time they took it.

2. In addition, because we had people respond to the survey through Facebook message, they knew that the survey was not completely anonymous.
- People may have answered the surveys a little differently than they would have in an anonymous survey, due to pressure.
Non-biased Errors and Possible Lurking/Confounding Variables (continued)

3. After collecting data, we saw that some people didn't respond to the survey.
- This could have been because they were not frequent Facebook users and did not see the message, or because they thought the survey was junk mail or spam.

4. We also limited communication to simple channels such as Facebook. In our SRS sampling, some of the people we randomly selected did not have a Facebook account, or we did not have their email.
- In this case, we had to skip that person and SRS for another person, because there was no way to contact them.
Biased Error Analysis
1. Since all four members of the group were juniors and we all distributed the messages to our friends, only the upperclassmen were represented.
- By not SRSing, we overrepresented some groups and underrepresented others.

2. Interviewer bias was introduced when we sent the messages specifically to our closest friends with funny or personalized survey request messages.
- This put pressure on our friends to take part in the survey and to answer the questions in a way that fit how we saw them.
Biased Error Analysis (continued)
3. We copied and pasted the survey into the biased message as well, so the recipient had to type out their responses rather than clicking a button.
- Typing answers took more effort than the online form, which could have discouraged careful responses.

4. We also let the message recipients know they had been selected to participate in the biased part of our survey.
- Letting them know could have influenced their answers as well. Some might have felt pressured, wondering how we would use their biased data, and others might not have taken the survey seriously.
Conclusion
Conclusion Analysis