
Med8 - Group 10ml883 May 2010

Preface

Motivation – in this chapter we introduce the project and present our motivation

Immersion, Presence, Flow and Engagement Theories – in this chapter we investigate the
titular theory subjects and reach a conclusion on which theoretical framework to use for this
project. Based on this we also present our initial problem statement.

Pre-Analysis – in this chapter we further specify how to conduct our project, selecting test
setup interfaces, delimiting ourselves and presenting our final problem statement.

Synthesis on Engagement Theory – in this chapter we come to a consensus on which
parameters to use for the testing, as well as how to define them.

Methodology – in this chapter we present our methodology for the testing procedures,
designing the test setups and questionnaires, as well as present our result hypotheses.

Test Results – in this chapter we present the test results, followed by the data analysis of
these results, showing which parts of our result hypotheses are correct.

Discussion – in this chapter we discuss the merits of the results and reflect on the analysis,
attempting to reach a consensus on the overall validity of our method.

Conclusion – in this chapter we summarize all the conclusions drawn in the previous
chapters.

Future Perspectives – in this chapter we present our thoughts on how to further develop our
method, as well as what our method could be used for in the future.


Acknowledgements

We would like to thank Henrik Schønau-Fog and Thomas Bjørner for giving us access to
their research in engagement theory, which was unpublished at the time of writing.
We would also like to thank all who helped us by volunteering themselves to the rigors of
playing Portal for thirty minutes as part of our testing.


Table of Contents

Preface........................................................................................................................................ 1
Acknowledgements .................................................................................................................... 2
Table of Contents ...................................................................................................................... 3
1 Motivation .......................................................................................................................... 5
2 Immersion, Presence, Flow and Engagement Theories ...................................................... 6
2.1 Theories ....................................................................................................................... 6
2.2 Methods ....................................................................................................................... 7
2.3 Engagement Theory and Definitions........................................................................... 9
2.4 New Engagement Parameters.................................................................................... 10
2.5 Initial Problem Statement .......................................................................................... 12
3 Pre-Analysis...................................................................................................................... 13
3.1 Choice of Interfaces .................................................................................................. 13
3.1.1 Standard PC Interface ........................................................................................ 13
3.1.2 Nintendo Wii Interface ...................................................................................... 14
3.1.3 Head Mounted Display Interface ....................................................................... 16
3.2 Target Group ............................................................................................................. 18
3.3 Choice of Game for Testing ...................................................................................... 19
3.3.1 Examples of Engagement from Portal ............................................................... 20
3.4 Delimitation............................................................................................................... 21
3.5 Final Problem Statement ........................................................................................... 22
4 Synthesis on Engagement Theory .................................................................................... 23
4.1 Engagement Parameter Comparison ......................................................................... 24
4.2 Final Choice of Engagement Parameters .................................................................. 27
5 Methodology..................................................................................................................... 30
5.1 Test Design................................................................................................................ 31
5.1.1 Interface Setup Designs ..................................................................................... 33
5.1.2 Final Test Requirements .................................................................................... 34
5.2 Questionnaire ............................................................................................................ 35
5.2.1 The Qualitative Test Content ............................................................................. 36
5.3 Pilot Test ................................................................................................................... 37
5.4 Test Implementation.................................................................................................. 38
5.4.1 Baseline Interface............................................................................................... 38


5.4.2 HMD Interface ................................................... 39
5.4.3 Wiimote Interface .............................................. 40
5.5 Result Hypothesis ...................................................................................................... 42
6 Test Results....................................................................................................................... 45
6.1 Standard PC Interface Test........................................................................................ 45
6.2 HMD Interface Test .................................................................................................. 47
6.3 Wiimote Interface Test .............................................................................................. 50
6.4 Test Result Analysis .................................................................................................. 52
6.4.1 Baseline and HMD Interface Comparison ......................................................... 52
6.4.2 Baseline and Wiimote Interface Comparison .................................................... 55
6.4.3 Experienced and Inexperienced Player Data Comparisons ............................... 57
7 Discussion......................................................................................................................... 60
7.1 HMD Interface Discussion ........................................................................................ 60
7.2 Wiimote Interface Discussion ................................................................................... 61
7.3 Experienced and Inexperienced Player Discussion ................................................... 62
7.4 Overall Evaluation..................................................................................................... 63
8 Conclusion ........................................................................................................................ 65
9 Future Perspectives ........................................................................................................... 66
9.1 HMD Specific Future Perspectives ........................................................................... 66
9.2 Future Theoretical and Practical Applications .......................................................... 67
10 Bibliography ................................................................................................................. 69
I Appendix .......................................................................................................................... 73
I.I Installation Guide – Wiimote .................................................................................... 73
I.II GlovePIE Script......................................................................................................... 74
I.III Installation Guide – Vuzix iWear VR920 ................................................................. 78
I.IV Questionnaire ............................................................................................................ 78
II CD Content ....................................................................................................................... 80
III Test Data ........................................................................................................................... 81


1 Motivation

Our original intention for this study was a search for uniformity in the academic
understanding of game experiences. We find this field of study interesting because there is
no consensus on the definitions of the taxonomy and vocabulary commonly used to describe
game experiences. Our goal with this study is to contribute to a comprehensible and
understandable consensus on the terminology used within this field.
We intend to investigate the current methodologies by which game experiences are
understood and measured, and from that attempt to synthesize our own method to do the
same. Specifically, we are interested in measuring experiences during gameplay, as many
existing methodologies focus on collecting data after one has played a game.

We consider data collection on game experiences during gameplay interesting for this project
because it poses an academic challenge to develop a method to map a player’s experience
over time, and because it fits well with the semester theme. We intend to do this by mapping
similar game experiences through different perspectives, allowing us to compare data and
find possible differences in the patterns of how players experience the same game.

For the purpose of achieving these different perspectives in gameplay, we envision players
experiencing the same game through different interfaces. This would allow for comparing
experience maps to see if the mapping method is sensitive enough to detect whether the
same game is being played through different means.


2 Immersion, Presence, Flow and Engagement Theories

With the motivation for this project declared in the previous section, a proper understanding
of the topic at hand is needed. The overall topic of how to understand the changes in
perceived gameplay experiences between different combinations of input and output devices
encompasses two main issues: first, how one can understand a game experience, and second,
how to measure it. In other words, how to define and understand the fun one might have
playing a computer game, and how one might measure or track it.
This section will present our research into current trends and theories in how experiences are
understood, specifically related to games, and also how these experiences are measured. This
is done to find inspiration and sources on how to create the framework to perform our desired
experiments.

The different components of a gameplay experience must first be understood, as some
theories focus on what leads up to an experience, such as IJsselsteijn et al. [1], and how
that affects the experience, while other theories, such as Sweetser and Wyeth’s [2], reflect on
the sum of the experience or the results after the fact. Equally, one should be careful,
since there is no uniform or standardized terminology within this field. Certain concepts and
definitions are less contested than others, but many sources use similar vocabularies and
describe similar effects with entirely different taxonomies, potentially leading to much
confusion when comparing theories.

2.1 Theories

Among the concepts most commonly used in the fields of understanding experiences,
game experiences and user experiences are flow [1] [2] [3] [4] [5] [6], immersion [7] [8] [1]
[2] [9] [10] [11] [12] [4] [5] [6] [13], presence [7] [14] [15] [5] [6] [13] and engagement
[10] [14] [5] [6] [16], but as mentioned before this by no means indicates any form of
consensus.
Presence is a good example of how many of the proposed definitions overlap or contradict
each other, with one author using immersion to describe what another calls presence. For
instance, Slater [15] and Brockmyer et al. [6] state that presence is a result of immersion,
while Ermi and Mäyrä [9] state that presence and immersion can be used as synonyms
despite the different origins of the terms, which is but one example of the lack of consensus
in the area. There are, however, also a number of sources that corroborate and support each
other’s claims. Slater states [12] that “Presence is a state of consciousness that may be
concomitant with immersion, and is related to a sense of being in a place”, which is quite
similar to Lombard and Ditton’s notion of psychological immersion [13].

Slater furthermore states that immersion is a more objectively quantifiable concept: “We call
a computer system that supports such experience an "immersive virtual environment". It is
immersive since it immerses a representation of the person's body in the computer generated
environment.” [15] which is similar to Lombard and Ditton’s perceptual immersion [13]. This
is supported by Ermi and Mäyrä [9] when referring to immersion as “The sensation of being
surrounded by a completely other reality [...] that takes over all of our attention, our whole
perceptual apparatus”, but they also state that this supposed similarity between perceptual and
psychological immersion means that immersion and presence are often used as synonyms in
some cases. This difference in definition of the term shows that while similarities and some
levels of consensus exist, they are not yet universal.

Engagement is also a term with many definitions. Dow et al. [14] define engagement as
something that “... refers to a person’s involvement or interest in the content or activity of an
experience, regardless of the medium”, while Lindley [10] says “Engagement in that case
facilitates the discovery of schemas of game or narrative form (i.e. for specific genres of
games or narratives) that provide criteria for the development and/or selection of schemas of
play or viewing.” These are very different understandings of what engagement is. O’Brien
and Toms [17] posit that “…engagement may share some attributes with flow, such as
focused attention, feedback, control, activity orientation (i.e., interactivity), and intrinsic
motivation medium.”, specifying that “…it is the interaction between users and systems
operating within a specific context that facilitates an engaging experience.”, which, while
similar to Dow et al.’s definition, is not completely identical.

Flow is the one concept that is not disputed. It was originally coined by Csikszentmihalyi
[18] and later adapted for game theory use by Sweetser and Wyeth [2]. Csikszentmihalyi
defines flow as a list of seven points that, when combined, facilitate a state of flow in a
person [18]:

1. Tasks with a reasonable chance of completion
2. Clear goals
3. Immediate feedback
4. Deep but effortless involvement that removes from awareness the frustrations and
worries of everyday life
5. Sense of control over our actions
6. No concern for the self
7. Alteration of the concept of time, hours can pass in minutes and minutes can look
like hours

2.2 Methods

The difference in theoretical approaches to understanding what and how people experience
things such as immersion and presence continues in the more practical methodologies used to
test, chart and rate these experiences. Brown and Cairns [7] take a novel approach, using
grounded theory to come to their theoretical conclusions on engagement and immersion,
while Nacke and Lindley [8] used self-report questionnaires, acquiring data before and after
the game experience. This approach using questionnaires is very common for such data
collection [8] [6], while Sweetser and Wyeth rate game experience potential through
investigative observation of the game without involving test subjects [2].


Ultimately for this project a method must be found that can reflect on how a gameplay
experience changes during play. That is, how the experience develops from the first few
minutes of uncertainty when a player is new and unfamiliar with the gameplay environment,
to later when routine settles in and focus shifts from learning to doing.

In their work on analysing fundamental components of the gameplay experience, Ermi and
Mäyrä [9] used self-evaluation questionnaires in which test subjects rated their experiences
on a series of questions, scoring each from one to five.

Figure 1: The GEQ table and question example list shown in Brockmyer et al.’s work [6]

This approach of describing gameplay experiences and having people choose which
statement best fits what they felt when playing is also used by Brockmyer et al. [6]. Their
development of the Game Engagement Questionnaire (GEQ) led to a method with a limited
set of multiple-choice answers for each of the nineteen descriptive statements. It is debatable
how accurate their method was, because each statement could only be answered with “Yes”,
“No” or “Maybe”. A sample of the GEQ questions can be seen in figure 1. To this we
would also argue that statements such as “I lose track of time” and “I play longer than I
meant to” are far too similar.

When dealing with the very absolute form of a multiple choice questionnaire there is little
room for subtlety and uncertainty, making results polarized and potentially confusing test
subjects if their experiences do not match up to the presented answer options – for example,
“Yes”, “No” and “Maybe” do not account for the differences between “Yes, but only a
little” and “Yes, I completely agree”, which can be important when rating something as
personal and individual as a gameplay experience. For this an interval scale such as Ermi and
Mäyrä’s would be more efficient at showing subtle differences. When using a quantitative
interval scale to rate a game experience, it must be broad enough to encompass subtle
differences, otherwise results may be inaccurate.

2.3 Engagement Theory and Definitions

With the outlined need for a theoretical framework that can describe a gameplay experience
both as it unfolds and in varying intervals instead of absolutes, we have decided that the
previously described approaches simply will not suffice. As a solution, we have found the
work of Schønau-Fog and Bjørner [16] to be very close to what we want to use.

Schønau-Fog and Bjørner propose in their engagement theory for computer games [16] a
framework consisting of six engagement parameters rated from 0 to 5. The six engagement
parameters do not describe any kind of overall game experience, but independent aspects of
what a person focuses on during gameplay. Schønau-Fog and Bjørner posit that if high
enough engagement is achieved, through any combination of the six parameters, then a player
will get a positive gameplay experience and return to the game again.

The six parameters are as follows:

Intellectual engagement – focus on puzzle solving, creative thinking, overcoming
challenges and pattern finding.

Physical engagement – focus on physical movement and control of the user’s body, such as
hand-eye coordination or rapidly pressing different buttons.

Sensory engagement – focus on perceiving game surroundings, or specific elements in
gameplay, such as visual or auditory stimuli, or observing game aesthetics.

Social engagement – focus on social contact and interaction in multiplayer, be it competitive
or cooperative, both in game and in real life.

Dramatic engagement – focus on story elements, narratives, dramatic elements and events.

Emotional engagement – focus on emotional connections to the gameplay, characters,
events in the game and co-players during multiplayer, both positive and negative.
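As a sketch of how ratings on these six parameters might be logged over a play session (our own illustrative data structure; it is not part of Schønau-Fog and Bjørner's framework), each sample could hold one 0-5 score per parameter:

```python
from dataclasses import dataclass

# The six engagement parameters of Schønau-Fog and Bjørner's framework,
# each rated 0-5 as described in the text.
PARAMETERS = ("intellectual", "physical", "sensory", "social", "dramatic", "emotional")

@dataclass
class EngagementSample:
    minute: int   # minutes into the session when the rating was taken
    scores: dict  # parameter name -> rating, each 0..5

    def __post_init__(self):
        for name in PARAMETERS:
            rating = self.scores.get(name, 0)
            assert 0 <= rating <= 5, f"{name} rating out of range"

# A hypothetical session: sensory focus early on, shifting toward
# intellectual focus once the controls have been learnt.
session = [
    EngagementSample(minute=5, scores={"intellectual": 2, "physical": 3, "sensory": 5,
                                       "social": 0, "dramatic": 1, "emotional": 1}),
    EngagementSample(minute=25, scores={"intellectual": 5, "physical": 2, "sensory": 2,
                                        "social": 0, "dramatic": 2, "emotional": 2}),
]

# The dominant parameter at each sample point:
peaks = [max(s.scores, key=s.scores.get) for s in session]
print(peaks)  # ['sensory', 'intellectual']
```

Recording samples at intervals like this is what makes it possible to plot each parameter over time rather than summarizing the whole session in a single score.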

The framework that Schønau-Fog and Bjørner proposed [16] also describes four phases to a
gameplay experience. These four phases of engagement map a person’s level of engagement
in the six parameters starting at the initial encounter with the game, all the way towards the
end when the player stops playing that specific game.

Phase one - Attention: The “attention” phase is described as the first encounter with the
game, where the players’ curiosity is sparked, for instance via an advertisement or through
word of mouth. In this stage players are engaged intellectually, physically and most of all on
a sensory level while they explore a new world and learn the basic controls of the game. The
remaining three engagement parameters have not yet developed in this phase.

Phase two – Commitment and Participation: The second phase, “commitment and
participation”, occurs when a player has spent some time in-game and become more familiar
with it. He is no longer engaged to such a high degree by the sensory and physical elements,
as he has seen many of the new places the game has to offer and has learnt the basic controls
of the game. However, social and emotional engagement increase in this phase, as the player
becomes attached to his character and builds friendships in-game as well as outside of the
game with other players.

Phase three – Absorption: “Absorption” is reached when a player becomes increasingly
engaged socially, emotionally and dramatically in the game, while the remaining three
engagement parameters are now far less relevant to him. Flow is frequently experienced by
players in this phase, as they end up spending more time playing than they initially wanted to
or thought they had.

Phase four – Disengagement: The last phase, “disengagement”, occurs once the player has
finished the game, or simply loses interest. At this point only social engagement can keep the
player involved in the game. Alternatively, the disengagement phase is also reached if
challenges within the game prove too difficult or too easy for the player, which in turn does
not lead to a state of flow.

These four phases are described as very dynamic: a seasoned player of a certain
game genre might have a very brief commitment and participation phase and move
quickly into the absorption phase, while an inexperienced player might have a long second
phase and a short or nonexistent absorption phase, going straight to the disengagement
phase because inexperience prevents him from playing the game. It should also be
understood that these phases can span months or even years, as they describe the entire
length of time from the acquisition of a game until one tires of it and plays it no more.

2.4 New Engagement Parameters

Further work in this field led to a new engagement model by Schønau-Fog [19], in which he
presents ten new engagement classifications defined through grounded theory. These
engagement parameters are: Advancement, Completion, Exploration, Sharing, Intellectual,
Modification, Interfacing, Emotions, Psychological Reactions and Absorption.
Some of these engagement parameters are the same as the parameters defined by
Schønau-Fog and Bjørner [16], but what makes the ten parameters different from the
original six is that they have all been determined via a grounded investigation into what made
people feel engaged, without reference to Schønau-Fog’s earlier work on engagement with
Bjørner.

Below is a brief summary of the 10 types.


Advancement – this parameter refers to the player’s desire to constantly improve, whether by
learning the rules and possibilities of the game, or by acquiring items within the game to
advance a character or get to the next level. The desire to become better is what keeps the
player engaged.

Completion – a player will be engaged and keep playing as long as there are objectives
to complete. If there are challenges to overcome or levels to finish, the player will not be
satisfied until he has completed the game fully.

Exploration – the exploration type of engagement can be either sensory or story related.
The player enjoys seeing and knowing everything about the in-game world he is in. This
type of player actively looks for hidden items or objectives, or discovers alternate paths to
the end of a level.

Sharing – this social engagement aspect focuses on sharing the experience of playing the
game with others. This can be co-operative or competitive play, as long as the player feels
he is part of a group or community.

Intellectual – developing strategies and problem solving are what engage the player here.

Modification – the player is engaged if he is able to customize certain aspects of the game or
his character. User-generated content, such as the ability to create levels himself, is a key
factor.

Interfacing – interfacing requires that the player carries out physical actions to provide input
to the game. These actions must be carried out for the player to reach the goal.

Emotions – this engagement parameter covers emotional reactions to in-game events, such
as frustration and joy when met with a difficult task and subsequently overcoming it.
However, it is also possible for players to develop emotions towards in-game characters.

Psychological Reactions – this parameter covers players who seek out games for the
psychological reactions they get from them, e.g. an adrenaline rush.

Absorption – when the player feels like he is in another world, and is totally absorbed by the
whole experience. This engagement type also acts as an escape from the real world for the
player.

Concepts such as flow, presence, immersion and engagement are of great importance to
researchers within the field of game and user experiences, and understanding them in depth
can give a new perspective on the development of games. However, we chose to focus on
engagement theory as our primary framework, to work out how gameplay experiences are
affected by different interface combinations.

To conclude, we find that Schønau-Fog and Bjørner’s original engagement theory fits our
work, in that their engagement framework is specifically set up to reflect on changes in levels
of engagement throughout the entirety of a gameplay experience. Equally, we find the
concept of specific engagement parameters easier to digest, compared to the more abstract
definitions of terms such as immersion or presence, where one can only really argue in ‘yes’
or ‘no’ terms whether one has experienced immersion and so on. Further analysis and direct
application of Schønau-Fog and Bjørner’s engagement theories can be found in the Synthesis
on Engagement Theory chapter.

2.5 Initial Problem Statement

With the choice of which engagement theory to use settled, the initial problem statement for
this project is as follows:
“How is a player’s engagement in a video game affected by change in user interface?”

Having reached the conclusion in the previous section that Schønau-Fog and Bjørner’s
engagement theory framework best describes the mid-gameplay focus of a computer game
player, we state with our initial problem statement that we wish to understand how a player’s
engagement changes. Tying this in with our original motivation to observe and try to
understand the differences in engagement parameters within the same game, but through
different interfaces, we include in the initial problem statement that we want to know how
different interfaces affect this.

How to specifically chart the differences using engagement theory is still undetermined, as
is what to test with, but this will be settled in the following sections of the report.


3 Pre-Analysis

In this chapter we will present our initial ideas for the theories and different interface
configurations for the tests. We specify what game to use as content for the tests, as well as
define the limits of the scope of the project.

3.1 Choice of Interfaces

As stated in the motivation, the intended goal of this project is to examine the differences in
game experiences derived from variations in game interfaces. In this section we will discuss
our choices of these interfaces.
For the purpose of comparative analysis of gameplay experiences we considered both the
more popular and the more uncommon interface types available. Most consoles offer
variations on the theme of gamepads, and computers have a large variety of keyboards and
mouse devices, but relative to their system they all do roughly the same things. Then there is
the Nintendo Wii, which requires more physical effort, as the Wiimote controller uses tilt and
motion sensors, which in turn means that games for the Wii tend to require a lot of physical
motion.

Of the more uncommon variations of interfaces there are those that deal with the visual
output of the game system. All computers have monitors, and all consoles require a TV or
something similar to display their content. To this end, an early idea of ours was to also
work with a Head Mounted Display (HMD).

As stated initially, we intend to combine all of this to test how a game experience is
affected, by comparing player data from the same game played through three different
interfaces: a standard computer setup, a Wiimote setup, and an HMD setup.

3.1.1 Standard PC Interface

The baseline for comparison will be a laptop with a mouse, connected to a video
projector projecting the visual interface onto a larger surface.
We chose this as the standard setup because this is an interface that most people, if not
everybody, are familiar with, regardless of the operating system being used. The setup is
compatible with most if not all games, and it will be compatible with the subsequent altered
interfaces – or to put it another way: all subsequent alternate interface setups will be based
on this interface.


3.1.2 Nintendo Wii Interface

We originally considered the use of a Wiimote as one of the interface variations. In this
section we will explain why we find this a valid interface alternative. The Nintendo Wii
console itself is far more popular than its rival consoles, with sales roughly twice those of
the Playstation 3, for instance:
Figure 2: Console sales, showing the Wii to outsell its main competitors [20] [21] [22]

The Wiimote offers unique input and interfacing possibilities not found on any other
commercial game system at this time. This is part of why we have chosen the Wiimote as one
of the interface types for this project.

We also note that the games most commonly associated with, and played on, the Wii are
casual mini-game compilations, which also focus on local (same room) multiplayer
support. This indicates an overall higher physical engagement potential, as well as a strong
social engagement component due to the prevalence of multiplayer games for the console.

This is backed up by looking at the best selling games for each of the three consoles.


Figure 3: Console game sales, comparing the sales (in millions) of the three most popular games for each of the
three consoles, showing that the top Wii games outsell their competitors [23] [24] [25]

The three top selling Wii titles are [26]:

• Wii Sports, with 60.69 million copies sold
• Wii Play, with 26.71 million copies sold
• Wii Fit, with 22.56 million copies sold

All of these games are mini-game compilations, focusing on short but intense spurts of
physical activity as part of the gameplay.

Looking at the top 10 lists for each console reveals similar trends [23] [24] [25]. The same
type of casual mini-game titles populate the Wii game library, whereas the two other
platforms largely sport games focused on longer play time intervals.

We believe that the Wiimote input interface is highly physically engaging to a player at first,
but loses its appeal after some time (~30 minutes) due to fatigue. This is supported by our
own observations of the type of games most common to the Wii, as well as testimony from
actual Wii players [27] [28]. The Wiimote thus appears best suited for short periods of play
(~15 minutes) due to the physically active play style.

Because of these differences, and because of its use of infrared (IR) motion detection and
built-in accelerometers, we find the Wiimote sufficiently different from the traditional
mouse and keyboard input interface to justify comparative testing of its effect on a user's
gameplay experience.

To specify how this interface will differ from the baseline interface: the Wiimote and the
accompanying handheld Nunchuck device will substitute for the mouse and keyboard input
devices during the tests. They will be configured so that the devices respond in a fashion
similar to a mouse and the keys needed to play a simple game. We recognize that this limits
the complexity of the possible game interface, but it should be possible to find a game that
suits our needs within these limits.
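The kind of mapping we have in mind can be illustrated with a small sketch (our own
illustration, not part of any actual test software): the core of what a Wiimote-to-mouse
mapping tool such as GlovePIE evaluates, converting the Wiimote's normalized IR pointer
reading into a cursor position on the screen. The mirrored x-axis and the default screen
resolution are assumptions made for the example.

```python
# Hypothetical sketch of Wiimote pointer-to-cursor mapping: the IR
# camera reports a normalized (0.0-1.0) position of the sensor bar,
# which is translated into screen pixel coordinates. The horizontal
# inversion reflects that the camera sees a mirrored image of the
# sensor bar; resolution values are illustrative assumptions.

def ir_to_cursor(ir_x, ir_y, screen_w=1024, screen_h=768):
    """Map a normalized IR reading to pixel coordinates."""
    # The camera image is mirrored horizontally relative to the screen.
    x = (1.0 - ir_x) * screen_w
    y = ir_y * screen_h
    # Clamp to the visible screen area.
    x = max(0, min(screen_w - 1, int(x)))
    y = max(0, min(screen_h - 1, int(y)))
    return x, y
```

Pointing at the center of the sensor bar would thus place the cursor at the center of the
screen, with readings at the edges of the camera's view clamped to the screen borders.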

3.1.3 Head Mounted Display Interface

We also wish to test on an interface centered around a HMD, that is, a stereo-optic visual
output device worn on the head. As with the choice of the Wiimote, we choose this because
we believe that a visual output device such as a HMD is sufficiently different, although not
unheard of, for it to be valid as a truly alternate interface device.
HMD’s have several applications in areas such as entertainment - for gaming and movies,
military - for aviation and tactical purposes and in areas such as medicine, engineering and
science. [29]

HMDs were first developed in the late 1960s by Ivan Sutherland [30]. They first appeared on
the market in the 1990s, from companies such as Olympus and Sony. The early models were
bulky, expensive and looked ridiculous when worn, a factor referred to as the "dork-look
factor" [31]. These models were also heavy, with low quality displays limited by the
miniaturization technology of the time. Today's technology has improved, and vendors
constantly develop models with fewer shortcomings than previous ones. The latest models
take the form of eyeglasses and are sometimes referred to as personal media viewers [32]
or video glasses.

A typical device has two small displays, one in front of each eye (a binocular HMD). The
displays can be Liquid Crystal Display (LCD), Cathode-ray Tube (CRT), Liquid Crystal
on Silicon (LCoS) or Organic Light-Emitting Diode (OLED) panels [33]. Some HMD lenses
magnify the displays in a way that gives the viewer the feeling of looking at a large screen
from a certain distance. Most of today's models beam a slightly different image to each eye,
which gives a sensation of depth and allows for 3-D imagery. This effect is called
stereopsis, and it could in theory increase the sensory engagement of a user.
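As an aside, the geometry behind this depth effect can be sketched with a simple
similar-triangles calculation (our own illustrative example; the formula and numbers are not
taken from the cited sources). Given an interpupillary distance, a perceived screen plane at
some distance, and a desired virtual object depth, the horizontal separation between the two
eye images follows directly:

```python
# Illustrative stereopsis geometry (an assumption-based sketch, not a
# formula from the report's sources): the horizontal separation between
# the left- and right-eye images needed to make an object appear at a
# chosen depth, derived from similar triangles between the two eyes
# and the screen plane.

def stereo_parallax(ipd_mm, screen_dist_mm, object_dist_mm):
    """On-screen horizontal image separation (mm) for a target depth."""
    return ipd_mm * (object_dist_mm - screen_dist_mm) / object_dist_mm
```

An object rendered with zero separation appears at the screen plane itself; as the separation
approaches the full interpupillary distance, the object recedes toward infinity.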


Figure 4: Example of stereopsis [34]

An important element in the development of an HMD is the Field of View (FOV). A television
display takes up approximately 10-15 degrees of our FOV, while the normal human eye can
span between 180 and 200 degrees. Some HMDs allow for head tracking, so with additional
head rotation on the horizontal axis the reachable FOV extends up to 270 degrees [30].
Even a very large flat display would fill only up to 180 degrees of FOV, unless a curved
surface is used. This could potentially affect a user's sensory and physical engagement, in
that increased attention would have to be given to looking in specific directions and to
cerebrally compensating for the changing point of view, compared to using a fixed monitor.
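The FOV figures above follow from basic trigonometry. As a quick sanity check (our own
example, with assumed screen sizes and viewing distances), the horizontal angle a flat
display subtends is 2·atan(w / 2d) for a display of width w viewed from distance d:

```python
import math

# Horizontal field of view subtended by a flat display of a given
# width viewed from a given distance: fov = 2 * atan(w / (2 * d)).
# The example figures below are assumptions chosen for illustration.

def display_fov_degrees(width_cm, distance_cm):
    """Horizontal FOV in degrees for a flat display."""
    return math.degrees(2 * math.atan(width_cm / (2 * distance_cm)))

# A 100 cm wide television viewed from 4 m subtends about 14 degrees,
# in line with the 10-15 degree range quoted above.
```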

Another important factor in the development of HMDs is the resolution of the displays:
higher resolution is required for better quality video. Vendors are aware of this, as they
strive to develop devices with higher resolution and a wider FOV. However, current
HMDs cover only a small range of the human FOV because of the limitations of the display
technology used in their development [35]; these issues should diminish as the
technology behind HMDs evolves. The fact that HMDs today cannot match the resolution
of computer monitors does present potential problems relating to sensory engagement.
How this will affect the test results is unknown, as it could have a negative impact on the
engagement experience or no effect at all.

Even though we live in a constantly evolving technological era where video glasses become
lighter and less expensive, their usage still imposes constraints on the viewer. One of these
is that they can become uncomfortable after prolonged viewing. Because the displays are so
close to the eyes, the viewer may have to focus closer than is comfortable [31]. If the lenses
are not adjusted to the user's individual interpupillary distance (the distance between the
eyes), the constant refocusing will tire the user quickly. In some cases these issues cause a
condition referred to as cyber stress [32], whose symptoms are reported to be dizziness,
nausea, headache and eyestrain [36]. There is, however, little research showing that these
devices could harm a user's vision, although vendors warn that viewers should take breaks,
as the devices become uncomfortable after half an hour to one hour of use [31].

Some HMD devices also feature head tracking: built-in sensors allow the displayed images
to change with head rotation. In applications where such functionality is relevant, such as
First Person Shooter (FPS) games where the viewer explores the virtual environment from a
first person view, the delay between the head movement and the image beamed onto the
eyes should be minimal. Otherwise the resulting lag or latency may evoke motion sickness,
as it creates a disruption between one's vision and sense of balance [37], which can
influence a person's proprioception. A feature such as head tracking, unless built into the
device, requires an external motion tracking setup beyond the intended technical scope of
this project, and will thus not be implemented.

Another cause of motion sickness can be immersive HMDs [32] [31], which have a rubber
shield that covers the eyes' entire field of view and excludes the view of the surroundings.
A proposed solution is video glasses with small openings that allow the viewer to see the
surrounding environment at all times; these are known as non-immersive HMDs.

There are, of course, many choices to be made when using an HMD, based not just on which
brand or specific HMD goggle to use, but on how it is to be used. This last part relates to
affordance, a term defined in relation to Human Computer Interaction (HCI) by Donald
Norman [38], who states that affordance is what people expect of things, and expect things
to be able to do.

Another problem is quite simply how media producers like Hollywood portray HMDs and
virtual reality related technology, such as in the movie The Matrix [39]. HMD goggles do
not put one into a virtual reality; they simply put the computer's monitor up to one's face.

Whatever user expectations for the HMD may be, we still argue that the interface is
different enough from the baseline interface that a gameplay experience with an HMD will
differ noticeably. Provided that we use immersive HMDs, sensory engagement should be
much easier to achieve, since there would only be the visual and audio content of the game
to perceive.

3.2 Target Group

The concept of our project relies on a game where a variety of interface combinations can be
used without complications. The same flexibility applies to the people we intend to test with.
The users we aim for do not fall under particular categories of game players. They can be
casual players who spend a few hours a week playing games, hardcore gamers who play
several hours a day, or people who do not play computer games at all. Nor does it matter
whether the test subjects have played the game before. Since our project is about mapping
different game experiences, comparing the data from different types of players could in fact
be an interesting way to test the mapping method.

Age and gender are not critical either; however, it will be of interest to look at that data
when evaluating the results, to see if any patterns emerge.

3.3 Choice of Game for Testing

Using the Wiimote as an alternative interface might be problematic, because most games are
not meant to be played in the short, roughly 15 minute spurts typical of most Wii games.
Despite this disadvantage, we believe the Wiimote will still feel natural to the player if we
choose a suitable game for it. The trigger on its back mimics a gun trigger, while its IR
capabilities let it reproduce the act of pointing a gun and moving the crosshair around
quite well, making it a well suited interface device for FPS games.
With this limitation imposed by the Wiimote, we do not find any other genre to be a viable
alternative. Other genres can be played from a first person viewpoint, such as flight
simulators or racing games, but the games in these genres that we considered failed our
subsequent requirements: they did not have interfaces simple enough to be viable to play
with a Wiimote, or enough content to engage a player in all of the engagement parameters.

While we do not have a particular target group in mind when selecting test subjects, we
believe the FPS genre in itself should be familiar even to casual or non-gamers.

While the roots of the genre can be traced back as far as the 1970s, the 1992 release of id
Software’s PC game “Wolfenstein 3D” still serves as a blueprint for modern FPS games [40].
Since then, games of the genre have been predominantly developed for the PC, where a
mouse and keyboard input interface is typically used.

Having decided on a genre, the next challenge was to find a game whose controls could be
fully mapped to the Wiimote, yet still be intuitive to the player. This ruled out most
common FPS games, as they often require numerous buttons for a multitude of in-game
actions.

Another concern was finding a game that would to some degree exhibit as many of the six
engagement factors as possible, as defined earlier in the report. Generally, FPS games are not
known for their storylines, characters or puzzles. It was therefore crucial to find a game
that offered at least some dramatic, emotional and intellectual engagement we could
measure during testing.

In addition, we needed a game genre that would allow us to utilize an HMD to its fullest.
While many genres could in theory benefit from an HMD setup, such as racing games or
flight simulators, the FPS genre is the logical choice. The list of games supported by the
particular HMDs available for use at AAU-Copenhagen also played a part in this decision,
as choosing an unsupported game would require additional setup time compared to one
that works with the HMDs right out of the box.

The game ultimately chosen was "Portal" [41], which fulfils all our needs. The game was
developed by Valve Corporation and released in 2007 for the PC, Playstation 3 and
Xbox 360.

First of all, it has simple controls which can be fully mapped to the Wiimote. The level
structure of the game also lends itself to the Wiimote, as it is quite similar to that of
traditional Wii mini-games in the sense that the game is split into relatively short levels.
And while the storyline and characters in the game are fairly minimalistic, we hope what
little there is will provoke some curiosity in the player, making them want to explore and
overcome the challenges and puzzles to reach the next level and find out what is going on
in the story.

The game also differs from most FPS games in that you do not go around shooting enemies;
the gameplay is entirely focused on puzzle solving. This aspect should provide at least some
intellectual engagement for the player. As a result, the pace of the game is slower than that
of typical FPS games to accommodate the puzzle elements, which should allow for easier
gameplay with a Wiimote, even if players find using a Wiimote difficult.

3.3.1 Examples of Engagement from Portal

In this subsection we briefly give a series of examples of how a player of the game Portal
can become engaged while playing. These examples are based on our own observations
from playing the game.

Intellectual engagement: The puzzle solving in the game provides an obvious source of
creative thinking.

Physical engagement: Careful hand-eye coordination is needed in some places to solve
certain puzzles. When testing with the handheld Wiimote device, there is also a great
chance that this will affect the physical engagement of the player.

Sensory engagement: There are many subtle details in the game that one has to become
aware of to successfully navigate the levels. This can be noticing which kinds of surfaces
one can project portals onto, or paying attention to spatial geometry when bouncing things
around to hit special objectives.

Dramatic engagement: The story of the game is that you are a test subject in a portal testing
facility… or are you? While the details of the story are sketchy at first, there are many subtle
clues that something is wrong, which is particularly apparent through the AI narrator that
guides you through the maps.

Emotional engagement: As with any puzzle game, great frustration can arise if one gets
stuck. Equally, the rewarding feeling of completing such tricky puzzles can yield positive
emotional engagement. There are also elements in the game where one is supposed to
destroy certain companion objects, which are designed to evoke emotions in the players.

Social engagement: The game is not a multiplayer game, so social engagement is not
applicable. We find this an acceptable trade-off, as the game meets all our other
requirements.

With these examples we demonstrate that it should theoretically be possible to experience
engagement in all the relevant parameters in the game, making it a valid choice for testing.

3.4 Delimitation

In this study we want to investigate engagement using only the engagement parameters
defined by Schønau-Fog and Bjørner [16]. It is thus outside the scope of this project to work
with immersion and presence, as these concepts are often presented in very abstract forms,
while the engagement theory we wish to work with proposes very definite parameters that
can be tested and gauged even by test subjects unfamiliar with the topic. Equally, our
chosen notion of engagement deals with what a player is engaged in during gameplay, while
the concepts of immersion, flow and presence are typically only measured after the game
experience. To this end, the concepts of immersion, presence and flow will not be used in
the test evaluations.

The concept of the four phases of engagement will not be used in this study either, because
the time spans of the different phases are too broad for a single-semester study. However,
we do take into account that the phases show the engagement framework can reflect change
over time, and this aspect of engagement changing over time will be worked into the tests.

Another delimitation of this project concerns the choice of game. As stated in the Choice of
Game for Testing section, we have chosen a simple FPS game with a minimalistic interface,
since it can be fully mapped to a Wiimote and Nunchuck. Equally, if not more importantly,
the game allows a player to experience gameplay relating to all of the engagement
parameters. It should be noted that this excludes the social engagement parameter, as we do
not intend to test with any form of multiplayer gameplay, which the game does not support
anyway.


3.5 Final Problem Statement

With the delimitations, the selected interfaces and the choice of game in place, we expand
our problem statement to include the following:

“How can one map player engagement in the game Portal, and how is that engagement
affected by playing it with a Wiimote and Nunchuck as input devices, or with an HMD
as an output device, compared to playing it with a mouse & keyboard using a video
projector?”

While similar to the initial problem statement, this final problem statement specifies what
variable parameters are to be introduced in the testing, and how the data is to be analysed.


4 Synthesis on Engagement Theory

In this section the material relating to Schønau-Fog and Bjørner’s engagement theory,
covered in the Immersion, Presence, Flow and Engagement Theories chapter, will be
analysed and the different versions compared. A synthesis will be made from this which we
will use later to help formulate a specific questionnaire for our tests.

Figure 5: The causal relations between engagement, flow, immersion and presence

As mentioned in the Immersion, Presence, Flow and Engagement Theories section, there are
different definitions of immersion, engagement and presence. The figure above shows how
we have come to understand the causal relationship between engagement, immersion,
presence and flow for the purpose of this project. Engagement, as defined by Schønau-Fog
and Bjørner [16], is any combination of the engagement parameters. At high enough
levels, engagement can evoke any combination of the three following states of mind.
Presence represents the feeling of non-mediation of the medium one is engaged in [13]: the
feeling that one is not interacting through a medium, but directly with the virtual content.
Flow represents the ease of use and positive feedback of the interaction [2]. Immersion we
understand as perceptual immersion as defined by Lombard and Ditton [13]. This is
explained in gameflow theory [2], which states that one becomes so immersed in a game
that one ignores the real world in favor of the virtual.

To further explain the figure, the three effects of engagement (immersion, presence and
flow) can occur individually or at the same time. Consider that flow deals with positive
experiences, and not everything falls under that category. Non-mediation can happen simply
if a media experience is detailed enough for a person's imagination to kick in, making one
momentarily think that one is there. Alternately, an interface can be so intuitive that a
person does not become consciously aware of the mediation between himself and the
virtual content. Immersion can happen any time one is so engaged in something that one
forgets about time, be it work, play, or anything else. Gameflow theory [2] does link
immersion to flow, but there one has to enjoy oneself to lose track of time. All of these
concepts require some level of engagement in something, be it a game or almost anything
else; one cannot experience something without engaging with it on some level, be it through
conscious effort, subconscious effort, or by accident.

4.1 Engagement Parameter Comparison

Regarding the ten ‘new’ parameters of engagement described by Schønau-Fog [19] and
mentioned in the Immersion, Presence, Flow and Engagement Theories section, we have
come to understand that the initial six types of engagement defined by Schønau-Fog and
Bjørner [16] relate more to understanding what people think of when playing a game, while
the new parameters seem to describe the players' motivation for playing.

Most of the aspects covered by the ten new types of engagement are already present in the
old definitions in Schønau-Fog and Bjørner's work [16]. We feel that the engagement
concepts applied in their initial research give a better overview of the engagement topic,
relating to what people busy their minds with when engaged with something. The new
concepts seem to describe not what people think about when playing, but why they play,
making them less relevant for this study. Some of the concepts are nonetheless useful, so we
will conduct a brief analysis of the ‘new’ parameters to judge their potential for this study.

Advancement – this parameter refers to the player's desire to constantly improve, whether
by learning the rules and possibilities of the game, by acquiring items within the game to
advance a character, or by getting to the next level. The desire to become better is what
keeps the player engaged.

Completion – a player will be engaged and keep playing as long as there are objectives to
complete. If there are challenges to overcome or levels to complete, the player will not be
satisfied until he has completed the game fully.

These two parameters seem oddly similar. Both relate to pursuing or finishing goals in a
game, to the point that we would question why they are not merged into a single concept
encompassing the player's drive to complete goals in a game setting in order to improve and
advance himself. After all, a common goal in many games is to improve and advance one's
character.

Exploration – this type of engagement can be either sensory or story related. The player
enjoys seeing everything and knowing everything about the in-game world he is in. This
type of player actively looks for hidden items or objectives, or discovers alternate paths to
the end of a level.

The description of this form of engagement highlights that it relates less to what a person
focuses on mentally during gameplay and more to overall play style and motivation for
doing certain things in a game; this also applies to some extent to advancement
engagement. However, the core of exploration, focusing on exploring game content and
figuring out all possible ways of doing things in a game, is not covered by the six initial
engagement parameters. To this end, a revised version of this engagement parameter could
be useful for this study.

Sharing – this social engagement aspect focuses on sharing the experience of playing the
game with others. This can be either co-operative or competitive play, as long as the player
feels like part of a group or community.

As an engagement parameter, this is identical to the initial ‘social engagement’
classification in Schønau-Fog and Bjørner's work [16], making us question why a new term
had to be coined for it.

Intellectual – developing strategies and problem solving are what engage the player here.

Since intellectual engagement is used in the same way as described in the initial definition,
we find no flaw in having it in this new list of engagement parameters.

Modification – the player is engaged if he is able to customize certain aspects of the game
or his character. User generated content, such as the ability to create levels himself, is a
key factor.

The modification factor is described as the player being engaged by the ability to change or
modify game content, be it by contributing user generated content, or simply through the
options for interaction in the game itself.

To elaborate: this parameter is defined by the possibility of working with user-generated
content. However, it is also stated that if a game simply has different built-in ways of doing
similar things, or ways to customize things, that constitutes a form of modification as well.
We find that something as simple as changing a character's hair color is not a valid reason
to define a new parameter of engagement. Most games offer ways to customize one's game
experience, be it by making a personalized game character or choosing different weapons
for a mission, but the end result is a pre-defined outcome in pretty much all games: you
defeat the final boss no matter what your game character is customized to look like.

Equally, the part of the definition relating to having fun is already covered by another
parameter. It is stated that a person would only modify things in a game, or create his own
levels, if it is fun, but that is covered specifically by the psychological reactions parameter.
On the other hand, if the player has finished all the levels in the game and wants new and
more challenging ones, he could make his own levels to complete, which would then be
covered by the completion factor. In neither case do we see a need for a separate
modification category. In essence, if the game includes the option to create additional game
content, then almost any reason to do so can be attributed to the other engagement
parameters; if you make game content better suited for multiplayer, that relates to sharing,
and so on.

Interfacing – interfacing requires that the player carries out physical actions to provide
input to the game. These actions must be carried out for the player to reach the goal.

We find this engagement concept redundant. A person can be engaged in a game while
watching others play it, or while thinking about how to solve a puzzle encountered in a
game without playing it. Giving the concept of ‘are you actually playing the game or not’
its own parameter adds little, as it can be summed up as a simple yes or no question.
Moreover, this parameter is derived from answers relating to being physically active while
playing games, such as games on a Wii, which relates far more to physical engagement as
defined in the initial study [16].

Emotions – this engagement parameter covers emotional reactions to in-game events, such
as frustration and joy when met with a difficult task and subsequently overcoming it. It is
also possible for players to develop emotions towards in-game characters.

Psychological Reactions – players seek out games based on the psychological reactions
they get from them, e.g. an adrenaline rush.

Common sense dictates that feeling an emotion is a psychological reaction. Players seeking
out specific psychological reactions, in the sense of wanting to experience specific
emotions, such as playing a horror game to experience fear, thrill or an adrenaline rush,
would be engaged in both of these engagement concepts at the same time. We therefore
believe that the more general term “emotional engagement”, as defined in the original study
[16], is more applicable here.

Absorption – when the player feels like he is in another world, and is totally absorbed by the
whole experience. This engagement type also acts as an escape from the real world for the
player.

This concept is essentially a mix of presence as defined by Lombard and Ditton [13] and the
element of flow describing “a deep but effortless involvement that removes awareness of
the frustrations of everyday life” [2]. We find that the notion of absorption, used here to
describe a form of escapism, says less about the actual thought processes during a gameplay
experience than about a player's motivation for playing a game in the first place, similar to
exploration engagement. Since this parameter is nearly identical to immersion and presence,
it almost falls outside the scope of engagement theory, making us question its validity as a
parameter.

To conclude, we find that some of these concepts seem somewhat superfluous, although the
reason for this could also be that the basic definition of engagement used in the initial study
by Schønau-Fog and Bjørner [16] differs slightly from Schønau-Fog's later work on
engagement [19]. It should be mentioned that Schønau-Fog and Bjørner do specifically state
that their definition of engagement also relates to the motivation of a player, but
Schønau-Fog's later work does seem to have a greater focus on motivation than his
cooperative work with Bjørner. It should also be noted that all of the ten parameters were
defined purely through the grounded theory approach, with no regard for the earlier work
Schønau-Fog had done with Bjørner. We find this lack of comparative analysis on
Schønau-Fog's part questionable, as the parameters he defines are so similar to the original
six.

4.2 Final Choice of Engagement Parameters

For this study, and based on the above examination of the engagement parameters, we must
choose and justify our own understanding of engagement. This field is still new and
consensus might be years away, but we will choose the combination of engagement factors
that we believe best suits our intended goal.
It must be understood that while the initial six engagement parameters were very broad, we
intend to narrow down the scope of each parameter, the goal being to allow test subjects to
better understand each parameter when asked about their state of mind relating to it.
Equally, since our test will not utilize any form of multiplayer content, we will not include
social engagement in any form. This is not to exclude the concept from our overall final
definition of engagement, as we find that it would be an integral part of an overall
framework for understanding engagement, but for the purposes of this study we have no
desire to have test subjects play against or with each other.

In the end we have defined seven parameters of engagement that will be used for this study,
five from the initial study [16] and two from Schønau-Fog’s later work on the same subject
[19]. All have been revised and reworded for the purpose of defining them as ‘what people
think of/focus on during gameplay’.

Intellectual engagement – Focus on intellectual challenges which encourage creativity and
thinking, for instance solving puzzles or creating strategies.

This concept of intellectual engagement differs very little from its original form. The concept
is unique and relates very little to the other parameters, making it valid as a non-redundant
parameter.

Physical engagement – Focus on physical actions carried out with the use of input devices,
hand to eye coordination and similar physical activity to play the game.

While the original definition [16] explains that physical engagement relates to the positive
feedback of physical interaction with a game or game system, we have simplified it so that it
focuses not on the physical interaction during gameplay, but on the attention given to
coordinating physical actions to facilitate gameplay. Essentially this covers the same subject
area, but the focus is different: stating that simply interacting with a game is engaging is
redundant, as it is required no matter what. The level of focus required to interact can vary
greatly, be it because a game system requires greater physical control, such as with the Wii, or


because one is inexperienced with an interface and thus must be cautious and conscious of his
actions to ensure that the right buttons are pressed.

An example of high physical engagement in this form would be someone who types on a
keyboard very slowly, pressing one key at a time with his index finger and constantly having
to focus on finding keys and coordinating his hands, while a person who types very quickly
would have very low physical engagement in the act of writing.

Dramatic engagement – Focus on the story experienced while playing the game, much the
same as it is in books or movies.

As with our definition of intellectual engagement, this definition does not differ greatly from
the original definition in [16]. It is sufficiently unique to be valid and non-redundant.

Emotional engagement – Focus on the player’s own emotions during gameplay such as
frustration over hard challenges, excitement/joy when overcoming a difficult challenge as
well as feelings towards game characters and non-player characters. This includes both
positive and negative emotions – any emotional connection to the game.

Again as with dramatic and intellectual engagement, this engagement parameter is very
similar to its original definition in [16]. It is sufficiently unique to be valid and non-
redundant.

Sensory engagement – Focus on various visual and auditory cues from the game, such as the
graphics, animations, visual effects, sound effects, music and dialogue; in short, how much
attention one pays to what one sees and hears.

Sensory engagement was originally defined to include aspects of what Schønau-Fog later
split off into exploration engagement [19], to which end we define sensory engagement
without this, defining it as how much attention is given to the sensory stimuli, such as audio
or graphics, available in the game one is engaged in.

Of the ten new engagement concepts proposed by Schønau-Fog [19], two turned out to be
very interesting, namely completion and exploration. While both are similar to the original
definition of the sensory engagement parameter, we feel the specific distinction between the
three types of engagement is important in this case, as they reflect three very distinct player
experiences. Sensory engagement described the general desire of the player to experience the
game world and see new things; however, by that same definition the player does not
necessarily wish to advance or finish the level.

While a player who is engaged by completion might strive to take the quickest path to the end
of a level, a player engaged by exploration would instead search every nook and cranny to
see if there are any hidden paths towards the goal. Previously sensory engagement would
have put both types of players into one category, as they were both interested in exploring the
whole game, with one type wanting to explore each level thoroughly, while the other type
would instead rush through every level to get to the next one to completely explore the whole
game world. With this we arrive at the following definitions:


Completion engagement – Focus on completing available tasks and levels in the game.
Trying again and again even in the face of impossible odds.

Exploration engagement – Focus on exploring everything that is possible to do or explore in
the game, be it a level's layout, a game's lore, possible character interaction or anything else
in the game.

It is with this list of engagement parameters that we will conduct our tests, as we reason that
with these parameters we can map the particular game experience we plan to question our test
subjects about. If anyone were to conduct a test that involved aspects of multiplayer gameplay,
then social engagement in its original form as defined in [16] should be added to the list.

As we gave examples of the original engagement parameters in the Examples of Engagement
from Portal subsection, we end this chapter of the report with two examples of potential
engagement from the game Portal relating to the new additions to the parameter list:

Completion engagement: The game Portal offers many ways to have fun with the portal
mechanics; this can be a great distraction from actually focusing on completing the game. To
this end the amount of effort given to completing content in the game can vary a lot.

Exploration engagement: Portal can give inexperienced players new things to look at and
try, although how much they focus on this can vary. Experienced players can equally focus
on trying out new tricks and stunts with the portal mechanics.


5 Methodology

For this project we have decided to use a triangulation testing method [42], using both
qualitative and quantitative elements, based on previous studies on game experience and
enjoyment in games, such as Ermi and Mäyrä [9] and Brockmyer et al. [6], mixed with
inspiration taken from Schønau-Fog and Bjørner’s combined and respective works on
engagement [16] [19]. As stated in the Immersion, Presence, Flow and Engagement Theories
section we argue that absolute statements in multiple choice questionnaires are insufficient to
describe the subtlety in personal gameplay experiences. Therefore we have based our
methodology on a refined and updated version of these approaches, so that the end result fits
our goals.
The aim of this study is to find a way to map how player engagement, as defined by Schønau-
Fog and Bjørner [16], is affected by different interfaces during a gameplay experience. To do
that, we will create a questionnaire for quantitative data collection and a structured interview
setup for qualitative questions. To this end, the goal of the tests is to map the experience of
the user and his engagement while playing the game Portal, and through that investigate how
the different types of engagement parameters are affected by the different interfaces. We will
collect quantitative data to allow us to map user engagement over a period of time, using a
modified approach based on Ermi and Mäyrä [9] and Brockmyer et al. [6], asking test
subjects to rate the seven selected engagement parameters from 1 to 5, with 5 being the
highest rating. For qualitative data we will inquire into the player's overall enjoyment of the
game, which will allow us to put the engagement data into context. We will also inquire
about each test subject's age and average weekly hours spent playing computer games, and
finally we will write down observations relating to player experiences, such as frustration
from getting stuck in the game or difficulty using the Wiimote.
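The quantitative side of this scheme can be sketched as a simple data structure. The sketch below is our own illustration (the field and function names are hypothetical, not part of the cited methods); it only shows the shape of the data: seven parameters, rated 1 to 5, at three points in time.

```python
# Hypothetical sketch of how one test subject's quantitative data could be
# recorded: seven engagement parameters rated 1-5 at three minute marks.
PARAMETERS = ["intellectual", "physical", "dramatic", "emotional",
              "sensory", "completion", "exploration"]

def make_session(interface, ratings_by_minute):
    """Bundle the rating rounds. ratings_by_minute maps a minute mark
    (5, 15 or 30) to a list of seven 1-5 scores, in PARAMETERS order."""
    for minute, scores in ratings_by_minute.items():
        assert minute in (5, 15, 30), "samples are taken at 5, 15 and 30 min"
        assert len(scores) == len(PARAMETERS)
        assert all(1 <= s <= 5 for s in scores)
    return {"interface": interface,
            "ratings": {m: dict(zip(PARAMETERS, s))
                        for m, s in ratings_by_minute.items()}}
```

One such record per test subject would then feed both the per-interface comparison and the over-time engagement maps.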

It should be noted that in this study we are not using any HCI development or software
development methods as they focus on the development of a product which is outside the
scope of this project. As we are conducting a study using an existing game that we use for
comparison of different interfaces’ effect on gameplay experiences, our methodology’s focus
is only on the empirical data and evaluation.

As mentioned in the Target Group subsection, we are not aiming to find test subjects of a
specific age, gender or certain game experience. Consequently other Medialogy students at
AAU-Copenhagen, as well as the engineering students at IHK should meet our requirements.

Age, gender or game experience is not crucial when choosing test participants; but it would
still be interesting to compare the results gathered from e.g. hardcore gamers and casual
gamers, or male and female players. However, the primary goal of the test will be to collect
data to analyse any variations in their responses and ratings of the seven engagement factors,
to map player engagement with a mix of player types, not for a specific target group.

Finally, it should be taken into consideration that the tests themselves do not seek to
determine the “best” possible combination of devices, e.g. HMD with Wiimote, for


playing Portal. This is not the goal of the project, and it would require an even greater
number of test subjects and more resources than are within the scope of this project.

5.1 Test Design

The tests will be done on three groups, each comprised of 20 participants. The first group will
be tested on the baseline interface, the second on the Wiimote interface and the third will be
tested on the HMD interface. Different test subjects will be used for all tests. This between-
subjects design [42] is to ensure unbiased results.
The data from the baseline test will then be compared to the HMD and the Wiimote interface
data, respectively. For both comparative analyses an equal number of test subjects will be
used, allowing us to use a t-test to show whether there is a significant difference between the
engagement ratings of the Wiimote and HMD interfaces versus the baseline. We will not,
however, make any direct comparison between the Wiimote and the HMD interfaces.
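As a sketch of the planned comparison, the independent two-sample t-statistic for equal-sized groups can be computed as below. In practice a library routine such as `scipy.stats.ttest_ind` would be used; the ratings here are invented placeholders, not collected data.

```python
import math
from statistics import mean, variance

def t_statistic(a, b):
    """Student's t for two independent samples of equal size n,
    using the pooled sample variance (between-subjects design)."""
    n = len(a)
    assert n == len(b), "the comparative analyses use equal group sizes"
    pooled = ((n - 1) * variance(a) + (n - 1) * variance(b)) / (2 * n - 2)
    return (mean(a) - mean(b)) / math.sqrt(pooled * 2 / n)

# Hypothetical physical-engagement ratings for two groups of test subjects:
baseline = [2, 1, 2, 3, 2, 1, 2, 2, 3, 2]
wiimote = [4, 3, 5, 4, 3, 4, 5, 4, 3, 4]
t = t_statistic(baseline, wiimote)  # compared against a critical t value
```

The resulting t value would then be compared against the critical value for the chosen significance level and degrees of freedom to decide whether the difference is significant.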

Each participant will be tested for 30 minutes using one of the described interface setups.
This timeframe is meant to give the player enough time to become engaged and to allow a
change of focus in engagement during gameplay.

Figure 6: An example of how data from one parameter, intellectual engagement, can appear

The test will consist of two parts: first a gameplay test with rating questions, then a
structured interview.

The first part will be during the gameplay test for each test subject. After the first 5 minutes
of gameplay the player will be prompted to rate his engagement, relating to the seven defined
parameters, as well as after the 15th and the 30th minute of gameplay. For each round of
questions the player will be cued to pause the game. We base these three interrupts for data


collection on the method used by So, Lo and Ho [43] who tested on navigation speed in
virtual environments and its effect on vection, where they would sample data in intervals of 5
minutes. We argue that sampling after five, fifteen and thirty minutes of total playtime gives
the test subject enough gameplay experience to form an opinion relevant to the study, which
we can then transform into a graph showing how the test subject's gameplay experience has
evolved.

We are aware of the fact that this will interrupt the player’s actions and concentration in the
game. However the purpose of doing so is to probe fresh memories of gameplay from the test
subject after periods of play. Examining if the user’s engagement has dropped or increased
after a certain time period can give us an idea of how and why this happens.

The second part will be after the test, when the participant is done playing the game. He or
she will be given a structured interview where they can describe their experience and give
any feedback, as well as answer our prepared questions. This data will be used to understand
the context in which the test participant gave his or her answers during the test, for example if
the person is a hardcore gamer or not, if they had fun with the game, or if they would play
more if given the opportunity.

Combining qualitative and quantitative measurements will give us more reliable results,
which can lead to a concrete conclusion about what the data indicates. However, it is evident
that engagement is a subjective concept which is difficult to measure. There are
alternate, more objective, methods that can measure game experience, which includes
psychophysiological recordings such as brain activity (EEG), facial expressions (facial EMG)
or EDA (Electrodermal activity) [44]. Such methods could yield data which is triggered
unconsciously that could describe a test subject’s physical state more factually than a test
subject answering questions. Findings based on such methods could give an interesting
perspective on the results from the questionnaires.

However, these methods are time consuming and require a level of expertise in the use of
such equipment and the analysis of its data. Equally, such data would only reflect a test
subject's overall state of mind and body during a test, not what engages them. Analysing
psychophysiological recordings would require comparing them to video recordings of each
test subject's gameplay, to match the subjects' actions in the game to peaks and valleys in the
recordings. Another aspect to consider is that hooking a test subject up to such a rig of
sensors can compromise the test subject's natural gaming experience, restricting movement
and corrupting any data gathered, as stated by Ganglbauer et al. [45].

We argue that the data we seek using engagement theory and carefully worded
questionnaires should yield similar and ample results using far fewer resources and less
equipment. As a result, psychophysiological recordings are not part of this study.

The PC game ‘Portal’ will be used as the basis for the tests as justified in the Choice of Game
for Testing section.


5.1.1 Interface Setup Designs

Figure 7: The baseline PC interface setup design, showing a test subject at a laptop, with a video projector

For the mouse and keyboard setup and the Wiimote setup, we will have the game running on
a projector, giving the tester a more open environment in which he will be able to stand up
and use the Wiimote freely. This ensures that both test setups have the same visual output
device, making the only differences the input devices and the fact that the Wiimote users
stand up while the others sit down.

Figure 8: The Wiimote interface setup design

The visual output device test will compare a projector versus an HMD setup, both controlled
with mouse and keyboard, to maintain the same input format for both tests. The same aspects
will be taken into consideration here as with the input device comparison.


Figure 9: The HMD interface setup design

The reason we do not intend to test a combination of the HMD and Wiimote interfaces is
that coordinating a Wiimote when one cannot see one's own arms and hands, due to wearing
an HMD, could create a problem due to the link between visual and proprioceptive
perception [46], which could cause nausea. Proprioception is the ability to sense the position
of parts of your body relative to one another, such as touching your nose with your eyes
closed, because you know where your hand is relative to your nose. Disrupting this could be
detrimental to using a Wiimote accurately. Equally, not being able to aim the Wiimote at the
IR source would make it difficult, if not impossible, to use, as the Wiimote requires this for
motion detection.

5.1.2 Final Test Requirements

With the design specified, we list the following as a simple and comprehensible list of
requirements for the test:

• The players should have enough time to get engaged.

• It should be possible for the player's engagement to change during gameplay.

• The player must be able to exhibit different kinds of engagement while playing
without being distracted.

With 30 minutes of playtime, it should be more than possible to become engaged in the game,
fulfilling the first requirement. The game Portal can be completed in less than one and a half
hours, allowing a player to advance fairly far in just thirty minutes. With the time available,
and the increasingly difficult puzzles in the game, it should also allow people to shift their
engagement focus many times over to address the different challenges and events in the
game. To be able to relate one's engagement experiences, an understandable and
comprehensive explanation of the seven engagement parameters must be made available. The
following subsection on the creation of the questionnaire addresses the issue of making the
concepts understandable. When combined with a thorough


explanation of the concepts to the test subjects before each test, this should fulfil the final
requirement.

5.2 Questionnaire

When creating a questionnaire several factors have to be accounted for. First of all the goal of
the questionnaire, or more specifically what the questionnaire is to collect data on, must be
known. It is also important to consider the length of the questionnaire given to each tester:
with fewer parameters to consider, the questionnaire portion of the test should prove less of a
distraction and let the player return to the game more quickly.
previous section and can be outlined as the following:
- To accurately gather quantitative data to map changes in gameplay experience using
the engagement framework outlined in [16] and further specified by us in the
Synthesis on Engagement Theory chapter.

- To gather a qualitative post-experience evaluation of the overall experience to put the
quantitative data into context.

For a copy of the questionnaire, see appendix section I.III.

For each of the engagement parameters there is a description of the values one and five on
the questionnaire. These descriptions are meant both to explain to the test subject what the
various parameters represent, and what either extreme of each parameter's spectrum
represents. We chose this as a better alternative to the GEQ method used by Brockmyer et al.
[6], as we reason that asking fewer questions is less interrupting to the game experience; the
way the GEQ's in-game questions are set up, a test subject would have to answer twice the
number of questions, and the questions can seem very arbitrary (see figure 1) and similar to
each other, which can confuse or annoy a test subject. If a test subject has to answer five
questions which are almost the same, it is not a great stretch of the imagination to see such a
person arbitrarily giving the same answer to all of them. This is the kind of inaccuracy we
attempt to avoid by having fewer but more specific and clearly worded questions.

Another thing to consider is that the GEQ questions that are to be answered during a game
experience do not elaborate much on what exactly they are referring to. We reason that for
accurate data collection it is paramount that a test subject understands what he is being asked
to answer. This is one of the main reasons we have descriptions accompanying the possible
values, especially considering how abstract the concept of engagement is, particularly to the
uninitiated.


5.2.1 The Qualitative Test Content

The test will be in two segments: first the quantitative questions relating to engagement,
followed by the qualitative questions relating to the overall evaluation of the test experience.
This qualitative segment has to be properly defined and outlined.
The qualitative questions to be asked after gameplay are as follows:

Did you have fun while playing the game?

While the concept of fun can be very arbitrary, this question is meant to put into context the
engagement data collected, since the engagement parameters do not in any way display if the
test subject’s experience was positive or negative.

If the data shows that all test subjects who enjoyed playing the game also experienced similar
engagement patterns, then it could be compared against the engagement maps of test subjects
who did not enjoy the experience. This will allow us to examine if any connection between
enjoyment and specific engagement patterns exists.

Have you played the game before? (If yes, how much)

This question puts the test subject's quantitative answers into context, in that prior
experience with the game could reduce the challenge it presents, shifting focus from learning
to use the game controls to simply completing the game more rapidly than a test subject new
to it would.

How often do you play computer games on a weekly basis? (Average, in hours)

This question is to help identify the test subjects as seasoned players, or non-gamers, which
could very well reflect on their game experience, making it an important factor to take into
account.

Specifically for the Wiimote interface test, we will also ask if the test subject has used a
Wiimote before. This is both to ensure that the test subject is familiar with the use of the
device, and to help put the player's engagement ratings into context. High physical
engagement during such a test could indicate unfamiliarity with the use of the Wiimote, and
this information would help explain the phenomenon.

Finally, we will make observations during the tests on all test subjects, noting down if there
are specific events or details that clearly influence their experience during the test. This could
be the inability to figure out a puzzle in the game, thus getting stuck, or how far a test subject
got in the game, as both can reflect on player skill, which in turn can help explain given
engagement ratings.


5.3 Pilot Test

The test methodology was initially tested through a simple pilot test. The intended goal of the
test was not to go through all the motions outlined by the method, but to see if the
interruptions that each round of questioning posed were too disruptive for the test subject.
The test was conducted on a PC, using the online flash puzzle game Bejeweled 2 [47] and an
interface identical to that outlined as the baseline interface. The game was chosen for its
simplicity and because it is easy to resume after a pause in gameplay. While Portal is an FPS
puzzle game, Bejeweled 2 has a similarly slow pace where players can pause for thought to
figure out what to do next. Based on this we argue that the games have enough features in
common for the pilot test of our methodology to be valid.

The results of the test were as follows:

                5 Min   15 Min   30 Min
Intellectual      4       4        3
Physical          2       2        2
Sensory           2       2        1
Dramatic          1       1        1
Emotional         1       2        2
Completion        5       3        2
Exploration       1       1        1

Table 1: The results of the pilot test, showing the ratings for the engagement parameters at the five, fifteen and thirty
minute marks

Figure 10: The radar/spiderweb model used by Schønau-Fog and Bjørner to display the engagement map [16], with
the three data sets overlapping


The pilot test subject was a 20 year old female student with limited experience with the
game. After the test the subject confirmed that it was enjoyable, and upon further inquiry she
confirmed that the questions did not disrupt her gameplay to the point that it was difficult to
resume the game. This can partially be attributed to the simplicity of the game chosen for the
test, but also to the fact that each round of questions took less than sixty seconds. However,
this did not mean that there were no issues with the setup; the pilot test revealed the
importance of having the engagement parameter information readily available, as the test
subject had to review the engagement definitions in the questionnaire multiple times during
each round of questioning, despite having been briefed on the concepts prior to the test.

The conclusion is that a list describing each parameter should be visible to the test subject at
all times, so that the meaning of the parameters is not forgotten while playing the game.

To analyse the gathered data we would observe patterns and trends. Most noticeable in the
pilot data is the decrease of the completion engagement parameter. This can be attributed to
the player familiarizing herself with the game interface, allowing for a lower conscious focus
on doing what the game asks of the player. It coincides with the decrease in intellectual
engagement, as the test subject found it easier and less challenging to play the game over
time. If additional data sets were available, the goal would be to examine them for overall
patterns in the engagement values. Is a specific pattern only present for one interface setup?
If so, it would be paramount to explain it. The accuracy and validity of the mapping depends
on its ability to encompass and display such differences.
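The kind of trend inspection described above can be sketched using the pilot data from Table 1; the helper and variable names below are our own illustration.

```python
# Pilot-test ratings from Table 1: each parameter's values at the
# 5-, 15- and 30-minute marks.
pilot = {
    "Intellectual": [4, 4, 3],
    "Physical":     [2, 2, 2],
    "Sensory":      [2, 2, 1],
    "Dramatic":     [1, 1, 1],
    "Emotional":    [1, 2, 2],
    "Completion":   [5, 3, 2],
    "Exploration":  [1, 1, 1],
}

# Net change from the first to the last sample, per parameter.
trends = {name: values[-1] - values[0] for name, values in pilot.items()}

# The parameter with the largest net drop over the session.
largest_drop = min(trends, key=trends.get)
```

On the pilot data this singles out completion engagement (a net change of -3) as the most pronounced trend, matching the observation made above.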

To conclude on the pilot test, the method appears valid with each round of questions being
sufficiently brief, but the test subject requires easy and constant access to information on the
engagement parameters.

5.4 Test Implementation

In this section we will briefly explain how the different interface setups were implemented.
The interfaces are defined in the Choice of Interfaces section, and the final setups in the Test
Design section. These are the setups that will be used for the actual testing.

5.4.1 Baseline Interface

As described in the Choice of Interfaces section, the baseline interface will be that of a
laptop, using a generic mouse device, coupled with a video projector and a set of headphones.
Prior to the test, the test subject will have the engagement parameters thoroughly explained,
as described in the Final Test Requirements section.


The test subjects will be seated in front of a table with a laptop, approximately 2.5 to 3
meters from the projected image of the video projector. The range in distance accounts for
the test subject's personal preference, as some might sit close to the screen and some further
away. Next to the laptop will be a printed questionnaire describing the engagement
parameters, on which the test subject is to rate his or her experience during the test. To
attract and reward test subjects, candy will be available during the test.

There were no special technical requirements for this setup. All components were plug and
play, and either supplied by group members or the university.

5.4.2 HMD Interface

The only difference between this interface and the baseline interface is the use of a HMD,
instead of the video projector and headphones.

Figure 11: The Vuzix iWear VR920 [48]

For this project a Vuzix iWear VR920 HMD [49] was used, replacing the video projector
and the headphones from the baseline interface, as headphones are built into the device. This
particular HMD model was used due to its availability and its out-of-the-box compatibility
with Portal, allowing the game to be viewed with stereoscopic graphics through the HMD.
The concept of stereoscopic vision is described in the Choice of Interfaces section as a
common feature in modern HMDs.


Figure 12: The HMD interface setup during testing

The Vuzix iWear VR920 also supports head tilt and rotation tracking, but this functionality
is limited to a select number of games and programs. Our chosen game for the tests, Portal,
does not support it.

The only technical requirement for the Vuzix iWear VR920 is the installation of drivers and
control software for the device. All of this can be downloaded from the manufacturer's
website, as described in the Installation Guide in the appendix.

5.4.3 Wiimote Interface

The Wiimote interface was the most technically challenging interface to set up; this section
details how it was done. Beyond that, the only differences between this interface and the
baseline are the use of a Wiimote and Nunchuk instead of the laptop's mouse and keyboard,
and the fact that the test subjects are not seated but stand while using the Wiimote during the
test. This is done to emulate normal Wiimote use and to give enough room to use the device.
To ensure that test subjects use the Wiimote correctly, we will also inquire whether they are
familiar with the use of such a device prior to the test. Based on this answer, more
instructions on correct Wiimote use may follow.


Figure 13: The hardware and software configuration for the Wiimote interface

The figure above illustrates the hardware and software needed for the test setup featuring the
Wiimote.

Communication between the Wii and its controller, the Wiimote, is done through a wireless
Bluetooth link. While most modern PCs come with a built-in Bluetooth adaptor, the default
Microsoft Windows protocol stack (driver) does not support the Wiimote out of the box. The
user is therefore required to install an alternative Bluetooth stack, such as the one offered by
BlueSoleil [50].

The choice of alternative Bluetooth stack is largely irrelevant, but we found BlueSoleil
recommended by the majority of guides we looked at. It is also available as a 30-day trial
version, making it an ideal choice for this project.

Located at the front of the Wiimote is an IR camera, which detects the IR light emitted by
the Sensor Bar connected to the Wii. The official Nintendo Sensor Bar consists of nothing
more than 5 IR diodes on either side, with a 20cm gap between them. It is therefore relatively
simple to build your own Sensor Bar following any of the tutorials found online [51].
Alternatively it is possible to use a pair of candles as a substitute for the IR diodes, since a
flame also emits IR light. We initially made a device following the linked tutorial, but it
failed shortly after testing started, after which we switched to candles for the remainder of
the Wiimote interface tests. There was no noticeable difference in Wiimote performance
between the two IR light sources.

The IR functionality allows us to map the Wiimote's left-right and up-down motion to the
corresponding motion of a mouse cursor.

The GlovePIE [52] application is used to emulate keyboard and mouse input on an external
device. Initially developed with virtual reality gloves in mind, it has since been updated to
support all kinds of devices, among them the Wiimote.

While there are numerous alternatives to GlovePIE, such as AutoHotkey [53] and Xpadder
[54], their functionality is virtually identical. Our choice to use GlovePIE over these


alternatives was in part due to its active forum community which provided valuable
assistance, as well as extensive documentation explaining how to set it up with the Wiimote.

A GlovePIE script is then needed to interpret the incoming Wiimote signals over Bluetooth
and map them to the corresponding keyboard or mouse keys and commands. GlovePIE
scripts use a simple syntax which should be familiar to Java programmers. For instance, if
we want to assign the Wiimote trigger (the “B” button) to the left mouse button, the syntax
would look as follows:

• Mouse.LeftButton = Wiimote.B

Figure 14: Wiimote button mapping in GlovePIE

Similarly, we can map the spacebar, which acts as the default "jump" hotkey in Portal, to the
Nunchuk's "Z" button:

• Key.Space = Nunchuk.Z

To summarize, the test setup requires a functional Bluetooth connection between the PC and
the Wiimote, accomplished by installing an alternative Bluetooth stack. Next, a replacement
for the Sensor Bar is needed, in the form of a pair of candles or a homemade IR diode
cluster.

Finally, GlovePIE picks up the incoming signals from the Wiimote and, with the proper
script, maps them to the keyboard and mouse buttons or commands we want to emulate in
Portal.

5.5 Result Hypothesis

In this subsection we will outline our expectations for the test results.
The success criterion for the mapping method is that it is sensitive enough to detect
variations in gameplay experiences within the same game.

At the start of the project we had a very limited understanding of the concept of engagement,
leading us to expect test results such as the following:

Figure 15: A representation of our early hypothesis that engagement over time will differ greatly between the
baseline interface and the Wiimote interface

Essentially, the above figure shows our initial expectation for the engagement data from the
baseline interface and the Wiimote interface. With the Wiimote we expected people to
become highly engaged very quickly, but also that engagement would rapidly decrease again
due to fatigue from using the Wiimote. The baseline interface was not expected to engage
people to begin with, but to lead to increasing engagement over time.

The above hypothesis was formulated long before we properly understood the core
mechanics of engagement theory.

Now, with a proper understanding of engagement theory in place, the following is our actual
result hypothesis for the data we will collect from the tests:

• The Wiimote interface might yield higher overall physical engagement than the
baseline interface.
o We expect this because the Wiimote interface registers haptic input from the
user, making it more sensitive to small movements, which requires people to
be more aware of their actions.

• We expect the HMD, with its ability to monopolize audio and visual stimuli, to
yield higher sensory engagement.
o The Vuzix HMD goggles are fully immersive and, combined with the built-in
earphones, ensure that a test subject will not see or hear anything outside of
the game.

Aside from these hypotheses, we have also speculated on another aspect of the test subjects'
performance:

• We expect a difference in engagement between players that have played the game
before, and those who have not. While we are uncertain of what specific overall
reactions we will get, we find it highly plausible that a seasoned player will pay
attention to different details and aspects of the gameplay than a first time player.

o A seasoned player could end up much less intellectually engaged due to
familiarity with the puzzles
o A new player could be more engaged in exploration because the game
mechanics and levels are unknown, requiring exploration
o A seasoned player could be much less dramatically engaged, knowing the
story already
o New players could be more focused on sensory engagement, as the game is
unknown, meaning that more attention is needed to experience everything in it
o Both could be highly emotionally engaged, but for different reasons; the
seasoned player because the game is fun to play through, the new player
because he might get stuck at some point and get frustrated
o A new player could be more engaged in completion because he hasn’t
completed it before, while a seasoned player might screw around more and
just have fun with the game
o A player unfamiliar with the controls of a game should logically be more
physically engaged in learning to use the controls

6 Test Results

In this section we will present and subsequently analyse the data collected during the tests.
All the results are available in raw Excel sheet format on the CD, as well as in printed form
attached at the end of the report.
First we will display all of the results as charts of each interface test's engagement
parameters. For the pilot test we used a radar chart to display the data, but because the actual
test data occasionally shows little variation over time, the radar charts overlap. This makes
them difficult to read, and thus unsuited for large-scale statistics with multiple test subjects.
For this reason we will display the engagement for each test on individual charts.

The data will be displayed as the average for each of the question rounds, followed by a brief
description which will include the standard deviation for each of the three intervals. The
horizontal axis will show time and the vertical axis will display the engagement rating given
by the test subjects.
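As a sketch of how these per-round figures are derived (the ratings below are hypothetical examples, not our actual data; Python is used purely for illustration):

```python
import statistics

def round_summary(ratings_by_round):
    """Average and sample standard deviation of the engagement
    ratings (1-5) given at each question round."""
    return {label: (statistics.mean(vals), statistics.stdev(vals))
            for label, vals in ratings_by_round.items()}

# Hypothetical ratings from five test subjects:
summary = round_summary({"5 min":  [2, 3, 3, 4, 3],
                         "15 min": [3, 4, 4, 4, 5]})
```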

A total of 57 individuals were tested, 19 on each interface setup. Of those, 10 were female,
making up 17.5 percent of the test group. The average age of the individuals tested was 24,
with a standard deviation of 3.9 years.

6.1 Standard PC Interface Test

The following charts display the averages for the baseline test, in which a laptop, mouse,
keyboard, headphones and a video projector were used.

[Charts: baseline test average ratings (1-5) for Intellectual Engagement and Physical Engagement at the 5, 15 and 30 minute question rounds]

The intellectual engagement data from the baseline test shows an increase in engagement
over time. The standard deviation for the 5 minute question round is 1.2, 0.8 for the 15
minute round and 0.6 for the final round. This shows that while the spread of initial
engagement ratings was very broad, the subsequent question rounds on average had either
higher or equally high ratings, indicating either an increase in intellectual engagement or a
steady level of high intellectual engagement throughout the test.

The physical engagement data from the baseline test shows a steady low rating throughout
the test. The standard deviation is 0.8 for both the 5 and 15 minute question rounds, and 1.1
for the final round. With the ratings moving between 1 and 2 throughout the test, this shows a
clearly low physical engagement during the baseline test.

[Charts: baseline test average ratings (1-5) for Sensory Engagement and Dramatic Engagement at the 5, 15 and 30 minute question rounds]

The sensory engagement data from the baseline test shows a steady above-medium rating
throughout the test. The standard deviation is 0.8 for both the 5 and 15 minute question
rounds, and 1 for the final round. The numbers indicate a central clustering of the ratings,
with the standard deviations highlighting ratings between 3 and 4 as the most common in the
test.

The dramatic engagement data from the baseline test shows a steady low rating throughout
the test. The standard deviation for the 5 minute question round is 1.3, 1.1 for the 15 minute
round and 1.1 for the final round. As with the averaged sensory engagement data, this graph
does not show any conclusive pattern. The averages hover between 2 and 3, showing a mid-
to-low engagement across the board.

[Charts: baseline test average ratings (1-5) for Emotional Engagement and Completion Engagement at the 5, 15 and 30 minute question rounds]

The emotional engagement data from the baseline test shows a steady medium rating
throughout the test. The standard deviation for all three graph points is 1. The numbers show
a steady mid-range rating for all question rounds, focused around a rating of 3.

The completion engagement data from the baseline test shows a steady high rating
throughout the test. The standard deviation for the 5 minute question round is 0.9, 0.8 for the
15 minute round and 0.7 for the final round. With the high averages, despite the spread
indicated by the standard deviations, the numbers show a clearly high completion
engagement throughout the test.

[Chart: baseline test average ratings (1-5) for Exploration Engagement at the 5, 15 and 30 minute question rounds]

The exploration engagement data from the baseline test shows an increasingly spread-out
medium rating throughout the test. The standard deviation for the 5 minute question round is
1, 1.2 for the 15 minute round and 1.4 for the final round. With the increasing standard
deviation, it is clear that the exploration engagement ratings varied widely by the end of the
test.

6.2 HMD Interface Test

The following charts display the averages for the HMD interface test, in which only a laptop,
mouse, keyboard and an HMD with built-in earphones were used.

[Charts: HMD test average ratings (1-5) for Intellectual Engagement and Physical Engagement at the 5, 15 and 30 minute question rounds]

The intellectual engagement data from the HMD test shows a minor increase in engagement
over time. The standard deviation for the 5 minute question round is 0.8, 0.6 for the 15
minute round and 0.7 for the final round. This shows that while the initial engagement ratings
range primarily between 3 and 4, the subsequent question rounds on average had either higher
or equally high ratings, indicating either an increase in intellectual engagement, or a steady
level of high intellectual engagement throughout the test. These results have slightly higher
averages than those of the baseline test, but the same overall pattern.

The physical engagement data from the HMD test shows a medium-to-low engagement over
time with a broad standard deviation. The standard deviation for the 5 minute question round
is 1.2, 1.2 for the 15 minute round and 1.1 for the final round. As stated previously, the
spectrum of ratings for all of the physical engagement data is very broad, as reflected in the
high standard deviations. Compared to the baseline test the average is higher, but no
conclusive statements can be made until a t-test is performed.

[Charts: HMD test average ratings (1-5) for Sensory Engagement and Dramatic Engagement at the 5, 15 and 30 minute question rounds]

The sensory engagement data from the HMD test shows an above-medium sensory
engagement over time. The standard deviation for the 5 minute question round is 0.9, 0.9 for
the 15 minute round and 1 for the final round. Compared to the baseline test, the difference
in the sensory engagement data is not particularly significant.

The dramatic engagement data from the HMD test shows a below-medium dramatic
engagement, increasing over time. The standard deviation for the 5 minute question round is
1, 1 for the 15 minute round and 1.1 for the final round. Compared to the baseline test, the
dramatic engagement data is not particularly different, with a similar standard deviation,
although the initial dramatic engagement is noticeably lower.

[Charts: HMD test average ratings (1-5) for Emotional Engagement and Completion Engagement at the 5, 15 and 30 minute question rounds]

The emotional engagement data from the HMD test shows an increasing medium-centred
focus on emotional engagement over time. The standard deviation for the 5 minute question
round is 0.9, 1 for the 15 minute round and 0.8 for the final round. Compared to the baseline
test, the emotional engagement data shows an increasing trend with a slightly lower standard
deviation, but roughly the same averages.

The completion engagement data from the HMD test shows a steady high rating of
completion engagement over time. The standard deviation for the 5 minute question round is
0.7, 0.7 for the 15 minute round and 0.8 for the final round. Compared to the baseline test, the
difference in the completion engagement data is very small, with similar standard deviations
and slightly higher averages.

[Chart: HMD test average ratings (1-5) for Exploration Engagement at the 5, 15 and 30 minute question rounds]

The exploration engagement data from the HMD test shows a broad medium-centred focus
on exploration engagement over time. The standard deviation for the 5 minute question round
is 1, 1.3 for the 15 minute round and 1.2 for the final round. Compared to the baseline test the
averages are slightly lower, but the wider standard deviation encompasses the same spectrum
as the baseline.

6.3 Wiimote Interface Test

The following charts display the averages for the Wiimote interface test, in which only a
laptop, Wiimote, Nunchuk, headphones and a video projector were used.

[Charts: Wiimote test average ratings (1-5) for Intellectual Engagement and Physical Engagement at the 5, 15 and 30 minute question rounds]

The intellectual engagement data from the Wiimote test shows a sharp increase in
engagement late in the test. The standard deviation for the 5 minute question round is 0.8, 0.8
for the 15 minute round and 0.7 for the final round. Compared to the baseline test, the
intellectual engagement ratings start out slightly lower and remain in that area for the first
half of the test, only to double for the final round of questions.

The physical engagement data from the Wiimote test shows a steady mid-ranged rating of
engagement over time. The standard deviation for the 5 minute question round is 1, 0.9 for
the 15 minute round and 1 for the final round. The ratings cluster between 3 and 4, making
for a much higher physical engagement rating than that of the baseline test.

[Charts: Wiimote test average ratings (1-5) for Sensory Engagement and Dramatic Engagement at the 5, 15 and 30 minute question rounds]

The sensory engagement data from the Wiimote test shows a steady mid-ranged rating of
engagement over time. The standard deviation for the 5 minute question round is 0.9, 0.7 for
the 15 minute round and 0.9 for the final round. The ratings cluster between 3 and 4, which is
within the same range as the baseline test.

The dramatic engagement data from the Wiimote test shows a narrow, below-medium rating
of engagement over time. The standard deviation for the 5 minute question round is 0.9, 0.7
for the 15 minute round and 1 for the final round. The ratings cluster between 2 and 3, with a
smaller standard deviation than the baseline test, but otherwise in the same spectrum as the
baseline data.

[Charts: Wiimote test average ratings (1-5) for Emotional Engagement and Completion Engagement at the 5, 15 and 30 minute question rounds]

The emotional engagement data from the Wiimote test shows an increase in engagement
over time. The standard deviation for the 5 minute question round is 0.8, 0.9 for the 15
minute round and 1.1 for the final round. The ratings increase in value over time, but so does
the standard deviation, showing a larger spread in ratings at the end of the test; the average of
the final question round equals the upper bound of the first round's standard deviation.
Compared to the baseline, which centred on a rating of 3, this shows a different pattern of
engagement over time.

The completion engagement data from the Wiimote test shows a steady high rating of
engagement over time. The standard deviation for the 5 minute question round is 0.7, 0.5 for
the 15 minute round and 0.6 for the final round. The ratings cluster between 4 and 5, which is
within the same range as the baseline test.

[Chart: Wiimote test average ratings (1-5) for Exploration Engagement at the 5, 15 and 30 minute question rounds]

The exploration engagement data from the Wiimote test shows a steady mid-ranged rating
of engagement over time. The standard deviation for the 5 minute question round is 0.9, 1 for
the 15 minute round and 1.1 for the final round. The ratings cluster between 2 and 4, which is
slightly lower than the baseline test, with a correspondingly slightly smaller standard
deviation.

6.4 Test Result Analysis

This section will revisit the statements made in the Result Hypothesis section, to determine
whether the test data supports or disproves the stated hypotheses. Additionally, we will
compare each parameter to check for statistically significant differences between the baseline
test data and the HMD and Wiimote interface data, respectively. Actual discussion of the
results and any possible implications will be covered in the Discussion chapter.

6.4.1 Baseline and HMD Interface Comparison

As stated in the Result Hypothesis section, we expected the HMD to yield higher sensory
engagement than the baseline test.
To ascertain whether there is a statistically significant difference, we conduct a t-test on the
two data sets from the baseline and the HMD test, comparing the 5, 15 and 30 minute ratings
for sensory engagement between the two.

To conduct the t-test we first examine the variance of the data sets with an f-test, to
determine whether to use an equal-variance or unequal-variance t-test.
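Concretely, the f-test compares the ratio of the two sample variances against an F critical value, and the t statistic is then computed with either a pooled or an unpooled variance. The sketch below shows the equal-variance case (illustrative only, not our actual analysis code; the function names are our own):

```python
import statistics

def f_ratio(a, b):
    """Variance ratio for the f-test; compared against an F critical
    value to choose between equal- and unequal-variance t-tests."""
    return statistics.variance(a) / statistics.variance(b)

def pooled_t(a, b):
    """Equal-variance (pooled) two-sample t statistic."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * statistics.variance(a) +
           (nb - 1) * statistics.variance(b)) / (na + nb - 2)  # pooled variance
    return ((statistics.mean(a) - statistics.mean(b)) /
            (sp2 * (1 / na + 1 / nb)) ** 0.5)
```

The resulting t statistic is then converted to a p-value via the t distribution with na + nb - 2 degrees of freedom and compared against the alpha value of 0.05.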

[Charts: baseline vs. HMD Sensory Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values are all above the alpha value of 0.05, being
0.85, 0.27 and 0.43 for the 5, 15 and 30 minute data sets respectively. This shows that there is
no statistically significant difference in the numbers, indicating that our hypothesis of such a
difference is incorrect.

A statistically significant difference between the HMD and baseline interface data can,
however, be seen in the physical engagement parameter:

[Charts: baseline vs. HMD Physical Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each physical engagement data pair are
0.04, 0.02 and 0.31 for the 5, 15 and 30 minute data sets respectively. This shows a
statistically significant difference for the first two data sets, but not the third.

[Charts: baseline vs. HMD Emotional Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each emotional engagement data pair are
0.02, 0.71 and 0.24 for the 5, 15 and 30 minute data sets respectively. This shows a
statistically significant difference for the first data set, but not for the rest.

[Charts: baseline vs. HMD Intellectual Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each intellectual engagement data pair
are 0.43, 0.40 and 0.50 for the 5, 15 and 30 minute data sets respectively. This shows a
clear lack of statistically significant differences for the data sets.

[Charts: baseline vs. HMD Dramatic Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each dramatic engagement data pair are
0.04, 0.77 and 0.27 for the 5, 15 and 30 minute data sets respectively. This shows that the
rating difference at the five minute mark is statistically significant, while the rest are not.

[Charts: baseline vs. HMD Completion Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each completion engagement data pair
are 0.55, 0.64 and 0.84 for the 5, 15 and 30 minute data sets respectively. This shows a
clear lack of statistically significant differences for the data sets.

[Charts: baseline vs. HMD Exploration Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each exploration engagement data pair
are 0.75, 0.37 and 0.71 for the 5, 15 and 30 minute data sets respectively. This shows a
clear lack of statistically significant differences for the data sets.

6.4.2 Baseline and Wiimote Interface Comparison

As stated in the Result Hypothesis section, we expected the Wiimote interface to yield higher
overall physical engagement than the baseline interface.
To find out, we proceeded as with the sensory engagement analysis for the HMD interface:
we first conducted an f-test to check for equal variances between the Wiimote physical
engagement data and the baseline data, then performed the appropriate t-test.

[Charts: baseline vs. Wiimote Physical Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show a statistically significant difference in the data. The p-values
are 9.21E-08, 1.62E-06 and 5.34E-04 for the 5, 15 and 30 minute data sets respectively, all
below 0.05. This supports our hypothesis that the Wiimote interface would be more
physically engaging.

Another parameter whose chart looks noticeably different is that of the intellectual
engagement parameter:

[Charts: baseline vs. Wiimote Intellectual Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each intellectual engagement data pair
are 0.44, 0.43 and 0.65 for the 5, 15 and 30 minute data sets respectively. This shows a
clear lack of statistically significant differences for the data sets.

[Charts: baseline vs. Wiimote Sensory Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each sensory engagement data pair are
0.85, 0.54 and 0.24 for the 5, 15 and 30 minute data sets respectively. This shows a clear
lack of statistically significant differences for the data sets.

[Charts: baseline vs. Wiimote Dramatic Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each dramatic engagement data pair are
0.33, 0.62 and 0.13 for the 5, 15 and 30 minute data sets respectively. This shows a clear
lack of statistically significant differences for the data sets.

[Charts: baseline vs. Wiimote Emotional Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each emotional engagement data pair are
0.30, 0.86 and 0.18 for the 5, 15 and 30 minute data sets respectively. This shows a clear
lack of statistically significant differences for the data sets.

[Charts: baseline vs. Wiimote Completion Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each completion engagement data pair
are 0.57, 0.17 and 0.82 for the 5, 15 and 30 minute data sets respectively. This shows a
clear lack of statistically significant differences for the data sets.

[Charts: baseline vs. Wiimote Exploration Engagement average ratings (1-5) at the 5, 15 and 30 minute question rounds]

The results of the t-tests show that the p-values for each exploration engagement data pair
are 0.51, 0.12 and 0.45 for the 5, 15 and 30 minute data sets respectively. This shows a
clear lack of statistically significant differences for the data sets.

6.4.3 Experienced and Inexperienced Player Data Comparisons

During testing, data was collected on how experienced the test subjects were with the game
Portal, as well as how much they play computer games on average per week.
The data collected from the 57 test subjects shows an average of 5.33 hours of gaming per
week, with a standard deviation of 6.4. The highest weekly average recorded was 30 hours,
corresponding to more than 4 hours of gaming per day. At the other end of the scale, eleven
test subjects reported playing less than an hour a week, if at all.

As stated in the Result Hypothesis section, we had several hypotheses on the potential
difference between experienced Portal players and new Portal players. However, in the data it
is obvious that this definition of experienced and inexperienced players is invalid.

The data shows several test subjects who had never played the game before, but who had a
high weekly gaming average, performing on par with experienced Portal players. This
invalidates the assumption on which the original hypotheses were made. However, if we
redefine the criteria for experienced players to reflect weekly gaming averages instead,
disregarding previous experience with Portal, we argue that we can make better comparisons
between the two player types and their engagement ratings.

There exists no consensus on the definition of an experienced or inexperienced gamer, just as
there is no agreed definition of a hardcore or casual gamer. We have therefore defined the
two player types based on the average of our test subjects' weekly time spent gaming:
everyone above the average is considered experienced, and everyone below it is considered
inexperienced.

With these definitions our test subjects divide into 16 experienced players who all play six
hours or more per week, and 41 inexperienced players who play less than that.
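The grouping rule can be expressed as a short sketch (the hours below are hypothetical; the real split used our 57 subjects' reported weekly hours and the 5.33 hour mean):

```python
def split_by_experience(weekly_hours):
    """Split subjects relative to the group's mean weekly gaming
    time: above the mean is experienced, at or below is not."""
    mean = sum(weekly_hours) / len(weekly_hours)
    experienced = [h for h in weekly_hours if h > mean]
    inexperienced = [h for h in weekly_hours if h <= mean]
    return experienced, inexperienced

# Hypothetical weekly hours for four subjects (mean = 5):
exp, inexp = split_by_experience([2, 4, 6, 8])
```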

It should be noted that with these numbers it is not possible to create accurately
representative statistics for the two groups; the number of experienced gamers is simply too
low. We can still compare the two groups, but it should be taken into account that the
numbers are not statistically representative in the same way that the overall data is.

The original hypothesis examples showed how all of the engagement ratings could differ
between the two groups. However, rather than comparing all of the engagement data, we will
only show, and later discuss, noteworthy differences between the two groups. It can safely be
assumed that any engagement data not mentioned or compared is similar enough not to differ
significantly.

The most noteworthy difference is in the 5 minute round of dramatic engagement data from
the baseline test, where the experienced players, totalling 5, gave an average rating of 4 with
a standard deviation of zero. For the inexperienced players, totalling 14, the average
was 2.7 with a standard deviation of 1.4. We confirmed this to be a statistically significant
difference, but the same cannot be said for the same data points in the HMD interface data or
Wiimote interface data, where the experienced players' dramatic engagement did not differ
much from that of the inexperienced players. On the HMD interface the experienced players,
totalling 3, had an average dramatic engagement rating of 2, with a standard deviation of 1 at
the 5 minute question round. The inexperienced players had an average of 2.2 with a standard
deviation of 1.1. On the Wiimote interface the difference is not that great either: the
experienced players, totalling 7, had an average dramatic engagement rating of 3.1 with a
standard deviation of 0.6 at the 5 minute question round, while the inexperienced players had
an average of 2.3 with a standard deviation of 1.

Aside from this, among the other notable differences between the two groups, the HMD
interface exploration engagement data shows that the experienced players rated on average
one point lower throughout the test than the inexperienced players.

7 Discussion

In this chapter we will discuss the results of the tests, their implications, and how that fits into
the theoretical framework we developed for testing. We will evaluate the overall project and
reflect on what we have learnt, and discuss if the mapping method we have come up with is
valid.

7.1 HMD Interface Discussion

As determined in the Result Analysis section, there was no statistically significant
difference between the HMD interface test's sensory engagement ratings and those of the
baseline test. Having considered at length why this could be, we have come to the realization
that there simply was not that great a difference between the HMD interface and the baseline
test interface.
Without a feature such as head-tracking, wearing an iWear VR920 HMD is not that different
from having a small computer monitor very close to your eyes, giving the illusion of looking
at a large monitor. This may simply have been too similar to playing the game on the baseline
test's video projector. If the baseline test had used a laptop screen instead of a video
projector, there might have been a greater difference, or likewise if the game had supported
the iWear's head-tracking functionality. However, that would have conflicted with our design
of the Wiimote interface, which would then also have needed to be altered to be played on a
regular laptop screen, to allow proper comparison between the baseline and Wiimote
interfaces.

Another factor to consider is that the iWear VR920 HMD features stereoscopic graphics
support, allowing for 3D graphics. We had expected this to affect the test subjects'
experience more, but most test subjects actually reported that they did not notice the
stereoscopic effect at all.

When we chose our hardware for the tests, we had to balance several tricky parameters. The
game had to be simple enough that we could map all of the user interface keys to a Wiimote,
yet varied enough to allow for engagement in as many of the engagement parameters as
possible. Equally, when we originally chose to use an HMD, before selecting a specific
model, we did not expect to have head tracking, so when it turned out that our choice of
HMD did support it, we were quite satisfied. When we then discovered that our choice of
game was not compatible with the HMD's head tracking, we chose not to search for a new
game that supported it.

As an alternative, the game Half-Life 2, developed by Valve, the same company that made
Portal, was briefly considered. It supported our HMD's head tracking, but was built around a
keyboard with many buttons, used to cycle through different options and selections, making
it very difficult if not impossible to play with a Wiimote. Equally, Half-Life 2 is a sequel to
Half-Life and refers heavily to events in the original game, making dramatic engagement
difficult for those who have not played it. Portal does not have this limitation, allowing new
players to get into the story straight away, although the majority of the narrative twists do not
occur until later in the
game, a point which few test subjects managed to reach during the timeframe of the testing.
The testing did, however, highlight a difference in initial dramatic engagement between
experienced and inexperienced users, at least for the baseline test subjects. We interpret the
experienced players' high initial dramatic engagement as a sign that, having played the game
before, they were focusing on the story beyond what the game actually presents at that point.
As noted in the Result Analysis section, though, we cannot verify the statistical significance
of this, due to the low number of experienced gamers in the tests.

Now, while sensory engagement did not show any statistically significant differences,
three other parameter comparisons did. Together they show an interesting pattern
indicating that the HMD makes it slightly more difficult to get properly engaged in some
areas. The first two physical engagement ratings averaged 0.8 points higher, indicating a
higher physical engagement in the beginning and middle of the test. We attribute this to
players constantly having to adjust the HMD, as it did not always fit them well – it was
built to a standard head shape, with little room for adjustment. The result was the lower
averages on the dramatic and emotional engagement parameters at the five-minute mark:
since the players had to focus just that little bit more on keeping the HMD positioned so
that they could use it properly, they were less engaged in these two other areas to begin
with. The later ratings for the two parameters are on par with the baseline test, showing
that after a short while the test subjects became accustomed to the interface.
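Comparisons of this kind between two groups of ratings can be sketched with Welch’s t-test, which does not assume equal variances between the groups. The sketch below is our own illustration in Python, and the ratings in it are hypothetical placeholders, not our actual test data:

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples of ratings."""
    na, nb = len(sample_a), len(sample_b)
    var_a, var_b = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    return (mean(sample_a) - mean(sample_b)) / sqrt(var_a / na + var_b / nb)

# Hypothetical 1-5 physical engagement ratings at the five-minute mark
baseline = [2, 2, 3, 2, 1, 3, 2, 2]
hmd = [3, 3, 4, 3, 2, 4, 3, 3]

print(mean(hmd) - mean(baseline))        # average rating difference: 1.0
print(round(welch_t(hmd, baseline), 2))  # t statistic: 3.12
```

The resulting t statistic would then be compared against a t distribution with Welch–Satterthwaite degrees of freedom to judge significance; in practice a statistics package handles that step.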

For all of the other parameters there were no statistically significant differences, indicating
that the HMD interface did not alter player engagement significantly beyond what we have
already covered. This can be attributed to the same factor discussed earlier for the sensory
engagement hypothesis: the HMD interface was too similar to the baseline interface.

7.2 Wiimote Interface Discussion

With the Wiimote interface we correctly hypothesized a higher overall physical
engagement compared to the baseline test. We find that this supports the validity of our
mapping method, in that it clearly shows that test subjects can report genuinely different
engagement patterns for the same game when the hardware interface is different. It should be
noted that for our test, playing with the Wiimote and Nunchuk was not a very physically
active experience: players would stand still, aiming the Wiimote and making small,
controlled gestures with it.
Despite that, and relating to the concept of Wii fatigue mentioned in the Choice of Interfaces
section, we did get several comments from test subjects that standing up and playing Portal
for thirty minutes with a Wiimote tired their arms. This is understandable, and it was also
why one of our two original hypotheses, described in the Result Hypothesis section, simply
stated that engagement with a Wiimote would be very high initially, but would then drop
sharply due to Wiimote fatigue. This hypothesis was made before we developed a proper
understanding of engagement, so we were quite surprised to see that reports of fatigue did
not adversely affect the engagement parameters. We had expected tired players to focus more
on their physical engagement via fatigue and less on dramatic or emotional engagement,
similar to the pattern seen in the HMD data.

As with the HMD interface, there are no other statistically significant differences in the
engagement ratings beyond what has been discussed already. We speculate that this is again
because the game experience did not differ as much as expected with the alternate
interface.

7.3 Experienced and Inexperienced Player Discussion

While we proposed several possible differences between the results of experienced players
and inexperienced players, we never attempted to balance the test subject groups in a way
that would make the results suitable for examining these hypotheses. The whole experienced
player versus inexperienced player comparison should in that respect be considered very much
secondary to the overall stated goal of simply testing the mapping method.
Amongst the different parameters we had expected the greatest difference to be in dramatic
engagement. Indeed, one of the only notable differences was that the experienced baseline
test subjects had a higher initial dramatic engagement than the inexperienced baseline test
subjects, but nothing beyond that. However, as one test subject who had played the game
before commented, he rated his five-minute-mark dramatic engagement high because at the
start of the game he was reminiscing about previous experiences with the game. Later in the
game his dramatic engagement dropped as he focused more on intellectual engagement to
figure out the puzzles.

However, with the spread of our test subjects’ gaming frequency being so wide, we argue
that this benefits the overall statistics: the data represents a much broader segment of the
population, instead of only applying to a specific gamer group. Specifically, the data can be
said to reflect the overall engagement patterns of university students, average age 24,
standard deviation 3.9.

However, while the data might seem very similar between experienced and inexperienced
players, it is not difficult to imagine different reasons behind it. High exploration
engagement can be attributed to first-time players attempting to discover what is possible in
a game, while experienced players can show high exploration engagement even after
mastering the basics, when they attempt to push the limits of what is possible in the
game, examining every nook and cranny for hidden features and things to exploit. Similar
examples could be made for all of the engagement parameters, such as intellectual
engagement reflecting both immediate problem solving and long-term strategy.

This highlights the fact that while we did endeavour to specify what each engagement
parameter encompassed, and to ensure that none overlapped, each parameter is still broad
enough to encompass sufficiently different concepts that very different experiences can
yield similar engagement ratings, as the above examples show. The question then becomes
whether more engagement concepts should be declared, such as with Schønau-Fog’s grounded
theory approach [19]. However, as we became aware of in the Synthesis of Engagement Theory
section, one would have to be very careful not to coin completely redundant engagement
parameters, or include partial parameter definitions that could inevitably always be said to
be present even if a player is not focusing on them, such as the Interfacing engagement
parameter in [19].

The question of the accuracy of the parameters remains, and many possibilities for
refinement exist, such as dividing each of our chosen seven parameters, plus social
engagement, into sub-categories that reflect more specific concepts. This could be the
subject of future investigation.

7.4 Overall Evaluation

In the end, few things seemed to spoil the experience for our test subjects. Out of the 57
test subjects, only three reported that it was not a fun game experience, either due to
difficulty in using the interface because of inexperience, or, in a single case, because the
test subject became nauseous during his test run with the HMD interface. Everyone else, even
the most inexperienced players who would become stuck for up to twenty minutes on a single
level, invariably stated that the game was fun, despite the difficulty. We had considered
that if more test subjects reported negative game experiences, it would be possible to
analyse their engagement ratings to check for specific ‘unhappy’ engagement patterns, and
vice versa. Age, gender and game experience did not appear to have much effect on having
fun, or on the engagement ratings.

This brings us to a final evaluation. Did we achieve our goal of creating a method to map a
game experience during active gameplay? Did we make one that is actually useful, one that
can register differences in gameplay and not just report the same ratings no matter how you
play?

We would argue that we did, although further testing would be advisable. The differences
found between the baseline data and the Wiimote interface show that the method can be
used to detect differences in the user experience within the same game, between different
hardware interfaces.

Regarding the HMD interface, we speculate that the setup was quite simply too similar to the
baseline interface for the game experience to be any different. It did show some small
variations in the parameters, relating to test subjects’ difficulty in getting the HMD to fit
just right so they could see properly, which increased their physical engagement at the cost of
their emotional and dramatic engagement. It is because of this that we would suggest further
testing with better HMDs, perhaps also comparing different games with each other.

As for our requirement that the mapping method can detect and display changes in
engagement over time, which was one of our core requirements for the method, we would
very much argue that we have succeeded. It cannot be seen clearly in the overall
statistics, but when looking at each individual test subject’s data set, every test subject
reported changes in the various engagement parameters over time. With every single test
subject showing these changes, we consider this requirement fulfilled.
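Checking this requirement is mechanical: for each subject, a parameter’s ratings across the test marks either vary or stay flat. A minimal sketch of such a check in Python, using made-up subject data purely for illustration:

```python
def changed_over_time(ratings):
    """True if a subject's ratings for one parameter vary between time marks."""
    return len(set(ratings)) > 1

# Hypothetical per-subject dramatic engagement ratings at three time marks
subjects = {
    "s01": [3, 4, 2],
    "s02": [1, 1, 1],  # a flat series like this would fail the requirement
    "s03": [2, 3, 3],
}

# Collect any subjects who reported no change at all
flat = [name for name, r in subjects.items() if not changed_over_time(r)]
print(flat)  # -> ['s02']
```

In our data the equivalent of `flat` was empty: every test subject showed at least some change over time.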

The fact that these changes in engagement are very diverse, as shown by the standard
deviations of the various engagement parameter statistics, simply highlights that engagement
is an incredibly individual concept that cannot be averaged out. So many individual variables
define what a person pays attention to and what they think about, that getting clear statistics
with narrow standard deviations is something that we now suspect is not entirely possible
when using a broad target group.

Another factor that might explain the lack of statistically significant differences in the
data is that we may have overestimated the influence of the different interfaces. Only with
the Wiimote interface did we find a difference, in physical engagement. Everything else was
similar enough to the baseline test for the interfaces not to change the engagement rating
averages very much.

Equally, another possibility is that the game Portal, while acceptable by the initial criteria
we set forth, had flaws we did not take into account when choosing it. Some test subjects got
stuck during the testing, unable to advance any further, running around the level aimlessly
and frustrated, while others rushed through the maps, figuring out what to do instantly. The
nature of a puzzle is that it can be very difficult for some and very easy for others, which
was something we did not take into account. Our idea with the tests was that each test
subject should have the same overall gameplay experience and difficulty in figuring out the
puzzles, but on different interfaces. What we got was test subjects having different game
experiences on different interfaces. This could very well explain why our data is varied and
scattered. A game with more linear gameplay, with less room for such player errors, could
thus have been more suitable.

Ultimately, we find the method valid. The individual test subject data shows clear changes in
engagement over time, and the statistically significant difference in physical engagement
for the Wiimote interface shows that the method can detect both individual and large-scale
differences. There is room for improvement, but as a proof of concept we find it valid.

8 Conclusion

As stated in our Final Problem Statement section, we set out with the goal of creating an
engagement mapping method and testing it via different interfaces, using the game Portal.
This we accomplished, starting with an investigation of immersion, presence, flow and
engagement theories and a review of different testing methodologies, which led to the
development of our own testing method.

In the Pre-Analysis chapter we concluded that a mix of three different interfaces best suited
our goal, combined with the game Portal.

In the Synthesis of Engagement Theory chapter we applied Schønau-Fog and Bjørner’s
engagement theory [16] combined with Schønau-Fog’s independent work [19], and we
concluded which specific combination and definition of engagement parameters would best
fit our goal.

Additionally, while not part of the final problem statement, as stated in the Motivation
chapter we also wished to use this project to help reach a consensus in the area of user
experience and game experience theory. We did this in the Synthesis of Engagement Theory
section, explaining how we came to understand the concepts of engagement, immersion,
presence and flow as working together.

As described in the Methodology chapter, we developed a questionnaire and conducted a pilot
test, discovering that our method did not interrupt the gameplay experience too much. We
then concluded that certain hypothetical engagement patterns were likely to be identified
during testing, and found these hypotheses to be partially correct, as shown in the Test
Results chapter.

As argued in the Discussion chapter, we ultimately find the method valid, in that it
demonstrated the ability to show different engagement levels, as well as engagement changes,
during a gameplay experience. We acknowledge that further testing would be very relevant
to refine the method further, but as a proof of concept, showing that engagement theory can
successfully be used to map a game experience without being too obtrusive and interrupting
during gameplay, it met our requirements.

9 Future Perspectives

As stated in the Discussion chapter, we found our testing method to be partially flawed, in
that our HMD interface was insufficiently different from the baseline test. A simple further
development of this project could be a new test using our mapping method, but with the
HMD interface redesigned around an HMD device and a game that work together to make
head tracking possible.

Another possibility could be expanding the rating scale from 5 points to 7, or 10. One would
then have to consider whether test subjects would be able to make proper use of an expanded
rating scale, or whether such a scale is too broad and confuses test subjects instead of
giving them clear choices. This is not something we can conclude on at this moment, but it
would be interesting to test at a later date.
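If the scale were expanded, ratings gathered on the old 5-point scale could still be compared against new data by mapping them linearly onto the wider scale. The sketch below is our own illustration, not part of the method itself:

```python
def rescale(rating, old_min=1, old_max=5, new_min=1, new_max=7):
    """Linearly map a rating from one Likert-style scale onto another."""
    fraction = (rating - old_min) / (old_max - old_min)
    return new_min + fraction * (new_max - new_min)

print(rescale(3))  # midpoint of 1-5 maps to the midpoint of 1-7: 4.0
print(rescale(5))  # top of the old scale maps to the top of the new: 7.0
```

Note that a linear mapping assumes the perceptual distance between adjacent scale points is equal, which Likert-style ratings do not guarantee; it is a convenience, not a statistical equivalence.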

Another option described earlier was broadening the spectrum of the engagement parameters,
either by defining more parameters or by creating sub-parameters for each main parameter.
The point of this would be to avoid situations where different game experiences yield
similar engagement ratings. Again, this could be an interesting concept to test.

A simpler way to expand the sensitivity of the method, without altering it, could be to also
collect data on immersion, presence and flow, depending on the goal of the test in question.

Finally, we also discussed the possibility of testing on more specific target groups, such as
comparing engagement between experienced and inexperienced players. However, this would
require a much more thorough taxonomy of game experience, since no consensus currently
exists on how to define such players, or what constitutes a casual or hardcore gamer, if
that is the intended target group for testing. A greater focus on recruiting test subjects
belonging to each group would also be needed.

9.1 HMD Specific Future Perspectives

The famous futurist, author and inventor Ray Kurzweil described his predictions about the
development of technology in his books ‘The Age of Spiritual Machines’ and ‘The Age of
Intelligent Machines’ in the 1990s. A more recent book on the same topic is ‘The Singularity
is Near’. In one of his earlier books [55], he writes that by 2019 people will experience 3D
virtual realities through glasses and contact lenses that project video and images directly
onto the retina (retinal displays), and that by 2029 eyeglasses and headphones will be
obsolete, replaced by computer implants that connect directly to our eyes and ears.
Some of his predictions have already begun to materialize, for example in the form of
contact lenses [56]. They are still a prototype, under development by B. Parviz from the
University of Washington. He claims that these could serve as a head-up display for gamers
as well as for pilots, and would allow a wider FOV. However, the concept behind them is
based on beaming virtual graphics onto the real-world environment (augmented reality).
Several devices have been built using the same concept, displaying only certain
information, such as maps, text and small images; in those cases the devices take the form
of eyeglasses.

Figure 16: Future vision of visual displays in the form of plain eyeglasses [57]

To conclude, we would consider it very interesting to repeat the tests in this project at a
later date with superior HMDs.

9.2 Future Theoretical and Practical Applications

With all of this said and done, one question remains: what are the possible applications of
this study? Who would use the information we have gathered?

We would argue that for academics this framework allows for the mapping of a game
experience, which can be very useful for those who wish to understand what gamers gain
from gaming. In knowing what a person is engaged in when playing, it theoretically
becomes easier to understand what they learn or become better at from the gameplay
experience. Certain multiplayer games could help improve a person’s ability to handle group
work, expressed through social engagement, and FPS gaming could improve pattern
recognition and hand-eye coordination, expressed through sensory engagement. The
possible applications are nearly endless.

For commercial use, a future iteration of this framework could be used to refine target group
definitions, in order to find out exactly what kind of people certain games appeal to. Already
the concept of different age and gender groups is commonly used, but with this framework
it could be possible to market games much better by indicating what engaging qualities they
offer. A future advert could thus be “If you liked game X, then you’ll love game Y – it has
even more of what you like”, allowing for better-targeted marketing.

10 Bibliography

[1]. IJsselsteijn, Wijnand. Characterising and Measuring User Experiences in Digital
Games. Salzburg : ACM Press, 2007.

[2]. Sweetser, Penelope and Wyeth, Peta. GameFlow: A Model for Evaluating Player
Enjoyment in Games. s.l. : ACM Computers in Entertainment, 2005. Vol. 3.

[3]. Chen, Jenova. Flow in games (and Everything Else). s.l. : COMMUNICATIONS OF
THE ACM, 2007. Vol. 50.

[4]. Poels, Karolien, Kort, Yvonne de and IJsselsteijn, Wijnand. Identification and
Measurement of Post Game Experiences. Firenze : CHI Conference, 2008.

[5]. Douglas, Yellowlees and Hargadon, Andrew. The Pleasure Principle: Immersion,
Engagement, Flow. San Antonio : Hypertext 2000, 2000. ISBN:1-58113-227-1.

[6]. Brockmyer, Jeanne H., et al. The development of the Game Engagement Questionnaire:
A measure of engagement in video game-playing. Toledo : Journal of Experimental Social
Psychology, 2009. Vol. 45.

[7]. Cairns, Paul and Brown, Emily. A Grounded Investigation of Game Immersion.
Vienna : ACM, 2004. ISBN:1-58113-703-6.

[8]. Nacke, Lennart and Lindley, Craig. Boredom, Immersion, Flow – A Pilot Study
Investigating Player Experience. Amsterdam : IADIS Press, 2008. ISBN: 978-972-8924-64-
5.

[9]. Ermi, Laura and Mäyrä, Frans. Fundamental Components of the Gameplay
Experience: Analysing Immersion. Vancouver : s.n., 2005.

[10]. Lindley, Craig A. Ludic Engagement and Immersion as a Generic Paradigm for
Human-Computer Interaction Design. Berlin : Springer Berlin / Heidelberg, 2004. Vol. 3166.

[11]. Calleja, Gordon. Revising Immersion:A Conceptual Model for the Analysis of Digital
Game Involvement. s.l. : DiGRA, 2007.

[12]. Slater, Mel and Wilbur, Sylvia. A Framework for Immersive Virtual Environments
(FIVE) - Speculations on the role of presence in virtual environments. 1997. Vol. 6.

[13]. Lombard, Matthew and Ditton, Theresa. At the Heart of It All: The Concept of
Presence. s.l. : Journal of Computer-Mediated Communication, 1997. Vol. 3.

[14]. Dow, Steven, et al. Presence and Engagement in an Interactive Drama. San Jose :
ACM, 2007.

[15]. Slater, Mel, Usoh, Martin and Steed, Anthony. Depth of Presence in Virtual
Environments. s.l. : Presence: Teleoperators and Virtual Environments, 1994. Vol. 3.

[16]. Schønau-Fog, Henrik and Bjørner, Thomas. Experiencing Motivation in Computer
Games. Copenhagen : Submitted for publication, February 8, 2010.

[17]. O’Brien, Heather L. and Toms, Elaine G. What is User Engagement? A Conceptual
Framework for Defining User Engagement with Technology. Halifax, Nova Scotia :
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND
TECHNOLOGY, 2008.

[18]. Csikszentmihalyi, Mihaly. Flow – The Psychology of Optimal Experience. 1990.

[19]. Schønau-Fog, Henrik. Motivation in Computer Games. Copenhagen : Unpublished,
2010.

[20]. Consolidated Sales. Nintendo.co.jp. [Online] [Cited: April 16, 2010.]
http://www.nintendo.co.jp/ir/library/historical_data/pdf/consolidated_sales_e0912.pdf.

[21]. News. Mcvuk.com. [Online] Intent Media 2010. [Cited: April 16, 2010.]
http://www.mcvuk.com/news/37043/Xbox-360-sales-hit-39m.

[22]. Corporate Information. Scei.co.jp. [Online] Sony Computer Entertainment, Inc. [Cited:
April 16, 2010.] http://www.scei.co.jp/corporate/data/bizdataps3_sale_e.html.

[23]. Best selling Wii Games. Listal.com. [Online] [Cited: April 16, 2010.]
http://www.listal.com/list/bestselling-wii-games.

[24]. Best selling PS3 Games. Listal.com. [Online] [Cited: April 18, 2010.]
http://www.listal.com/list/bestselling-ps3-games.

[25]. Best selling X360 Games. Listal.com. [Online] [Cited: April 18, 2010.]
http://www.listal.com/list/bestselling-x360-games.

[26]. Financial Results Briefing Fiscal Year Ended March 2009.
[http://www.nintendo.co.jp/ir/pdf/2009/090508e.pdf] s.l. : Nintendo Co., Ltd., 2009.

[27]. Wii Fatigue. Blogfreespringfield.com. [Online] Blog Free Springfield, 2007. [Cited:
April 24, 2010.] http://blogfreespringfield.com/wii-fatigue.

[28]. Wii-Fatigue? Igameonabudget.blogspot.com. [Online] 2009. [Cited: April 24, 2010.]
http://igameonabudget.blogspot.com/2009/03/wii-fatigue.html.

[29]. Warren, George E. HMD: Helmet Mounted Displays. [Online] 1999.
http://www.combatsim.com/review.php?id=631.

[30]. Lantz, Ed. Future Directions in Visual Display Systems. s.l. : Computer Graphics,
1997.

[31]. Personal Tech. The New York Times. [Online]
http://www.nytimes.com/2008/01/17/technology/personaltech/17basics.html.

[32]. How Video Glasses Work. How Stuff Works. [Online]
http://electronics.howstuffworks.com/gadgets/travel/video-glasses.htm.

[33]. Organic light-emitting diode. Search.com. [Online] CBS Interactive Inc.
http://www.search.com/reference/Organic_light-emitting_diode.

[34]. Stereo Vision Starts with Two Views. Vision3D.com. [Online] [Cited: April 29, 2010.]
http://www.vision3d.com/stereo.html.

[35]. Tang, John G. Enhanced Image Display in Head-Mounted Display. US20080088529
USA, 2008.

[36]. Are You Suffering From Cyber Stress? Aboutstressmanagement.com. [Online] Manage
Stress, 2006-2010. [Cited: April 12, 2010.]
http://www.aboutstressmanagement.com/stressrelief/stress-management/causes-of-stress/are-
you-suffering-from-cyber-stress.htm.

[37]. Why aren't HMD's (Head Mounted Displays) popular? www.ubervu.com. [Online]
UberVU Ltd, 2010. [Cited: April 27, 2010.]
http://www.ubervu.com/conversations/www.reddit.com/comments/bg4cz/why_arent_hmds_head_mounted_displays_popular/.

[38]. Norman, Donald. The Design of Everyday Things. New York : Basic Books, 2002.
ISBN-0-465-06710-7.

[39]. Wachowski, Andy and Wachowski, Lana. The Matrix. Warner Bros. Pictures, 1999.

[40]. Jay Garmon. Geek Trivia: First shots fired. Tech Republic. [Online] [Cited: April 27,
2010.] http://articles.techrepublic.com.com/5100-10878_11-5710539.html.

[41]. Games :Portal. ValveSoftware.com. [Online] Valve Corporation. [Cited: March 23,
2010.] http://www.valvesoftware.com/games/portal.html.

[42]. Preece, Jenny, Rogers, Yvonne and Sharp, Helen. Interaction Design: Beyond
Human-Computer Interaction. s.l. : Wiley, 2007. 0470018666.

[43]. So, Richard H. Y., Lo, W. T. and Ho, Andy T. K. Effects of Navigation Speed on
Motion Sickness Caused by an Immersive Virtual Environment. Clear Water Bay, Kowloon,
Hong Kong : Human Factors and Ergonomics Society, 2001. Vol. Vol. 43.

[44]. IJsselsteijn, W.;. Measuring the Experience of Digital Game Enjoyment. Maastricht :
Proceedings of Measuring Behavior, 2008.

[45]. Ganglbauer, Eva, et al. Applying Psychophysiological Methods for Measuring User
Experience: Possibilities, Challenges and Feasibility. Uppsala, Sweden : Workshop in
Interact'09 conference, 2009.

[46]. Sanchez-Vives, Maria V., et al. Virtual Hand Illusion Induced by Visuomotor
Correlations. s.l. : Plos One, 2010.

[47]. Bejeweled2. Popcap.com. [Online] PopCap Games, Inc., 2000-2010. [Cited: April 29,
2010.] http://www.popcap.com/games/free/bejeweled2.

[48]. iWear VR920 Image. 7gadgets.com. [Online] 2007.
http://www.7gadgets.com/2007/09/02/iwear-vr920/2084.

[49]. iWear VR920 Data Sheet. Vuzix.com. [Online] 2007.
http://www.vuzix.com/iwear/products_vr920.html.

[50]. IVT. bluesoleil.com. [Online] http://www.bluesoleil.com/.

[51]. Moore, Brian "DoctaBu". HOW TO - Make your own wireless Wii sensor bar!
[Online] 2006.
http://blog.makezine.com/archive/2006/11/how_to_make_your_own_wire.html.

[52]. Kenner, Carl. GlovePIE. carl.kenner.googlepages.com/glovepie. [Online] February 4,
2007. [Cited: May 21, 2009.] http://carl.kenner.googlepages.com/glovepie.

[53]. Open Source. Autohotkey.com. [Online] 2010. http://www.autohotkey.com/.

[54]. Xpadder.com. [Online] http://www.xpadder.com/.

[55]. Kurzweil, Ray. The Age of Spiritual Machines: When Computers Exceed Human
Intelligence. s.l. : Penguin Books Ltd., 1999. ISBN: 0-670-88217-8.

[56]. Contact lenses to get built-in virtual graphics. Newscientist.com. [Online] 2009. [Cited:
April 23, 2010.] http://www.newscientist.com/article/dn18146-contact-lenses-to-get-builtin-
virtual-graphics.html.

[57]. Wearable Displays: Mobile Device Eyewear. Microvision.com. [Online] Microvision,
Inc, 1996-2009. [Cited: May 2, 2010.]
http://www.microvision.com/wearable_displays/mobile.html.

[58]. Jung. Left4Dead Wii Zapper GlovePIE Script. Left4Dead411. [Online] February 25,
2009. http://www.left4dead411.com/forums/showthread.php?t=10027&page=5.

I Appendix

I.I Installation Guide – Wiimote

This guide explains how to set up the Nintendo “Wiimote” for use in Windows. You will need
the following:

• a PC with a built-in Bluetooth adaptor, or an external Bluetooth dongle
• a Nintendo Wiimote (MotionPlus not required) and Nunchuk
• a candle placed 1 meter away from the Wiimote to simulate the IR LEDs in the Wii Sensor Bar
• a copy of the PC game “Portal”

First of all make sure the Bluetooth adaptor or dongle is turned on/enabled. Next connect the
Nunchuk to the Wiimote and make sure it has working batteries inside. Place the candle 1
meter away from the area you plan to stand at with your Wiimote.

Now copy the “Setup” folder from the included CD and paste it on your desktop. The folder
should contain the following subfolders and files:

• GlovePIEWithoutEmotiv043 - software to map the Wiimote signals to PC input
• IVT_BlueSoleil_6.4.305.0 - software to connect the Wiimote to the PC
• med8project.PIE - the GlovePIE script used

Once the “Setup” folder is on your desktop, go to the “IVT_BlueSoleil_6.4.305.0/install”
folder and run “setup.exe”. Follow the onscreen instructions and reboot your PC when
prompted. Once you are back in Windows, click on the “Bluetooth Places” icon that should
now be on your desktop. Double-click the “Search Devices” icon and press the “1” and
“2” buttons on the Wiimote at the same time. If everything is set up correctly, the Wiimote
should be recognized. Right-click on the new Wiimote icon under “Bluetooth Places” and
choose Connect. The icon should now turn green, showing that the Wiimote is ready for use.

Next open the “GlovePIEWithoutEmotiv043” folder and run “GlovePIE.exe”. Once running,
go to File>Open... and select the med8project.PIE file. The script should now load and be
visible in the main GlovePIE window.

You can now click the Run button in GlovePIE to launch the script and start “Portal”.
Alternatively you can start Portal first and Tab out to turn the script on/off. If the Wiimote is
properly connected and the script is running, you should now be able to play Portal entirely
with the Wiimote and Nunchuk using the following controls:

• Character movement - analog stick on the Nunchuk
• Shooting blue portals - B button (trigger) on the Wiimote
• Shooting orange portals - A button on the Wiimote
• Jumping - Z button on the Nunchuk
• Picking up objects (use key) - C button on the Nunchuk

I.II GlovePIE Script

Below is the GlovePIE script used for this project. The script is based on the “Left4Dead
WiiZapper Script” developed by the user jung at the left4dead411.com forum [58] and
modified by us to work with Portal.

/* Left4Dead WiiZapper Script by jung. */

var.irAmount = 1 // 1 OR 2

var.xNunchuk = -4.4

/* Set these numbers so that the cursor reaches the edges

of your screen. (Defaults are both 1.1) */

var.xStretch = 1.1

var.yStretch = 1.1

/* Remember to lower the ingame mouse sensitivity to 1.5 */

/* The following variables are all a matter of personal preference. */

/* Remember you can change mouse sensitivity in your options in game. This will
affect your look speed */

var.speed = 1/7 // Master Sensitivity (Default 1/7)

var.zoom = 1/3 // Speed When Aiming Down Sight (Default 1/3)

var.deadzone = 250 // Less Sensitive Area Around Cursor (Default 250px)

// Wiimote Game Control

Mouse.LeftButton = Wiimote.B // primary fire, Blue portal

Mouse.RightButton = Wiimote.A // secondary fire, Orange portal

// Nunchuk Game Controls

Key.E = Nunchuk.C // use key

Key.Space = Nunchuk.Z // jump

// Rumble and LED Flare for Wiimotes

If Pressed(Wiimote.B) Then // Rumbles wiimote when weapon is fired

Wiimote.Rumble = True

wait 125 ms

Wiimote.Rumble = False // LED Flare when weapon fired

Wiimote.LEDS = 6

Wait 75 ms

Wiimote.LEDS = 15

Wait 50 ms

Wiimote.LEDS = 0

EndIf

If Pressed(Wiimote.A) Then // Rumbles wiimote on secondary fire (orange portal)

Wiimote.Rumble = True

wait 100 ms

Wiimote.Rumble = False

EndIf

// Control Stick Movement (WASD movement keys)

w = Wiimote1.Nunchuk.JoyY < -0.5

a = Wiimote1.Nunchuk.JoyX < -0.5

s = Wiimote1.Nunchuk.JoyY > 0.5

d = Wiimote1.Nunchuk.JoyX > 0.5

/* Code for Pointing Mechanism

---------------------------

Do Not Modify */

// If Using A Two-Dot Sensor, Find Midpoint

If var.irAmount = 2 Then

var.xPos = (Wiimote.dot1x + Wiimote.dot2x) / 2

var.yPos = (Wiimote.dot1y + Wiimote.dot2y) / 2

Else

var.xPos = Wiimote.dot1x

var.yPos = Wiimote.dot1y

EndIf

// If At Least One Dot Is Visible, Move Cursor

If Wiimote.dot1vis Then

// Locate Infrared Focal Point

var.xPoint = (1-(round(var.xPos) / 1024)) * Screen.Width

var.yPoint = ((round(var.yPos) / 768)) * Screen.Height

// Create Virtual Grid System And Draw Point Coordinates

var.xGrid = var.xPoint - (Screen.Width / 2)

var.yGrid = var.yPoint - (Screen.Height / 2)

// Find Cursor on Grid

var.xCursor = Mouse.CursorPosX - (Screen.Width / 2)

var.yCursor = Mouse.CursorPosY - (Screen.Height / 2)

// Calculate Distance To Move

var.xDist = (var.xGrid - var.xCursor)

var.yDist = (var.yGrid - var.yCursor)

// Calculate Speed Multipliers Based On DeadZone Parameters

If abs(var.xDist / var.deadzone) > 1 Then var.xMult = 1 Else var.xMult = abs(var.xDist / var.deadzone)

If abs(var.yDist / var.deadzone) > 1 Then var.yMult = 1 Else var.yMult = abs(var.yDist / var.deadzone)

// Calculate Motion Speed & Add In Scope Multiplier If Needed

If Wiimote.A = True Then

var.xMotion = var.xDist * var.xMult * var.Speed * var.Zoom

var.yMotion = var.yDist * var.yMult * var.Speed * var.Zoom

Else

var.xMotion = var.xDist * var.xMult * var.Speed

var.yMotion = var.yDist * var.yMult * var.Speed

EndIf

// Calculate New Cursor Position

If abs(var.xDist) > 0 Then var.xCursor = (var.xCursor + var.xMotion) * var.xStretch

If abs(var.yDist) > 0 Then var.yCursor = (var.yCursor + var.yMotion) * var.yStretch

// Move Cursor

Mouse.CursorPosX = var.xCursor + (Screen.Width / 2)

Mouse.CursorPosY = var.yCursor + (Screen.Height / 2)

// Backup Last Motions

var.xLast = var.xMotion

var.yLast = var.yMotion

// Reload (If Minus Fails)

r = Wiimote.One

ElseIf Pressed(Wiimote.One) Then

// Terminate Motion - Safeguard

var.xLast = 0

var.yLast = 0

Else

// Move Using Last Known Value

If var.xMult = 1 Then Mouse.CursorPosX = Mouse.CursorPosX + var.xLast

If var.yMult = 1 Then Mouse.CursorPosY = Mouse.CursorPosY + var.yLast

EndIf

// Debug

debug = 'Debug: xNunchuk: ' + (Wiimote.Nunchuk.Roll + var.xNunchuk) + ' SpeedY: ' + var.speedY + ' SpeedX: ' + var.speedX
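The core of the pointing mechanism above is the deadzone-scaled motion calculation: each frame the cursor moves toward the IR focal point at a speed proportional to the remaining distance, with the multiplier ramping linearly inside the 250-pixel deadzone and capped at 1 outside it. As an illustration only (the function name and structure are ours, not part of the GlovePIE script), the per-axis update can be sketched in Python:

```python
def cursor_step(target, cursor, deadzone=250, speed=1 / 7, zoom=None):
    """One update of the deadzone-scaled cursor motion.

    `target` and `cursor` are (x, y) positions in pixels. Within
    `deadzone` pixels of the target the speed multiplier ramps down
    linearly; beyond it the multiplier is capped at 1. `zoom`
    optionally slows the motion, mirroring var.zoom, which the
    script applies while Wiimote.A is held.
    """
    new = []
    for t, c in zip(target, cursor):
        dist = t - c
        mult = min(abs(dist / deadzone), 1)  # linear ramp inside the deadzone
        motion = dist * mult * speed
        if zoom is not None:
            motion *= zoom                   # slower while "aiming"
        new.append(c + motion)
    return tuple(new)
```

With the defaults, a cursor 500 px from the target moves at full speed (500/7 px per update), while one 100 px away is slowed by the 0.4 deadzone multiplier; this is what makes small aiming corrections stable while still allowing fast sweeps.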

I.III Installation Guide – Vuzix iWear VR920

To set up the Vuzix iWear VR920 HMD, simply follow steps 1 to 7 in the accompanying
“Quick Start Guide”. The guide, along with the required drivers and software, was included
in the box when borrowing the HMD through the AAU booking system.

I.IV Questionnaire

“Engagement” is defined here as what you focus on while playing a game. Keep in mind
that it is possible to focus on several different areas at the same time, each to a different degree.

Please rate each of the following seven engagement parameters on a scale of 1 to 5,
with 1 being the lowest and 5 the highest value.

Intellectual engagement – Focus on intellectual challenges which encourage creativity and
thinking, for instance solving puzzles or creating strategies.

1 - I didn’t have to think much and was not challenged at all.

5 - The puzzles and level layout made me think a lot and come up with creative solutions.

Physical engagement – Focus on physical actions carried out with the use of input devices,
hand to eye coordination and similar physical activity to play the game.

1 - I didn’t pay much attention to my body movements; using the interface felt easy to me.

5 - Using the interface was challenging and I had to focus a lot on using it correctly.

Sensory engagement – Focus on the game’s visual and auditory cues, such as the graphics,
animations, visual effects, sound effects, music and dialogue; how much attention you pay
to what you see and hear.

1 - The levels looked and sounded boring and I had no need or interest to pay attention.

5 - I paid a lot of attention to the game and the sights and sounds it had.

Dramatic engagement – Focus on the story experienced while playing the game, much as in
books or movies.

1 - The story was boring/predictable and I did not care what would happen next.

5 - I was very curious/intrigued by the story and could not wait to progress in the game to see
what happens next.

Emotional engagement – Focus on the player’s own emotions during gameplay such as
frustration over hard challenges, excitement/joy when overcoming a difficult challenge as
well as feelings towards game characters and non-player characters. This includes both
positive and negative emotions – any emotional connection to the game.

1 - I was not emotionally attached to anything in the game, nor did I feel any emotions myself
relating to the game.

5 - I felt strong connections with characters or parts of the game, such as intense dislike,
annoyance, frustration, the joy from overcoming challenges.

Completion engagement – Focus on completing available tasks and levels in the game.
Trying again and again even in the face of impossible odds.

1 - I didn’t bother with the tasks at all.

5 - I constantly worked towards completing all possible tasks I could find in the game.

Exploration engagement – Focus on exploring everything that is possible to explore in the
game, be it a level, lore, character interaction or anything else in the game.

1 - I didn’t think much about exploring the game at all.

5 - I looked everywhere, at everything, all the time. No stone was left unturned.

II CD Content

The content of the CD is as follows:

- Full versions of this report in both .docx and .pdf format
- All of our test data, in Excel format
- A copy of the BlueSoleil program used
- A copy of the GlovePIE program and script used

III Test Data

The attached printouts contain the raw test data, as well as the data processing spreadsheets
used in the Test Results chapter.
