Declaration
This work has not previously been accepted in substance for any degree and is
not being currently submitted for any degree.
September 4, 2014
Signed:
Statement 1
This dissertation is being submitted in partial fulfillment of the requirements for
the Science Without Borders Programme.
September 4, 2014
Signed:
Statement 2
This dissertation is the result of my own independent work/investigation, except
where otherwise stated. Other sources are specifically acknowledged by clear
cross referencing to author, work, and pages using the bibliography/references. I
understand that failure to do this amounts to plagiarism and will be considered
grounds for failure of this dissertation and the degree examination as a whole.
September 4, 2014
Signed:
Statement 3
I hereby give consent for my dissertation to be available for photocopying and for
inter-library loan, and for the title and summary to be made available to outside
organisations.
September 4, 2014
Signed:
Abstract
Contents
1 Introduction
2 Background Information
  2.1 Virtual Reality
  2.2 Stereoscopy
    2.2.1 Rendering Stereo Images
    2.2.2 Health Concerns
  2.3 Applications
3 The Oculus Rift
  3.1 Technical Aspects
4 Game Development
  4.1 Development Environment
  4.2 FPSpaceInvaders
    4.2.0.1 Design Level
    4.2.0.2 The Player
    4.2.0.3 The Enemies
    4.2.0.4 Oculus Rift Integration
    4.2.0.5 3D Audio Implementation
  4.3 Empirical Evaluation
5 Results and Discussion
6 Summary
  6.1 Conclusion
  6.2 Future Works
A Questionnaire
References
Chapter 1
Introduction
Since the release of the movie Avatar in 2009, the world has begun to realize the potential of 3D technology and its applications. One of the latest ground-breaking inventions is the Oculus Rift [1]. It represents a new era in the gaming experience: the first head-mounted display (HMD) affordable for the general public, bringing virtual reality to a whole new level.
HMDs rely on 3D stereoscopic images to reinforce the sense of depth. Stereoscopy increases the experience of immersion and spatial presence [2]. However, such images aren't natural to our eyes, because the stimulus they produce differs from that of the real world [3]: the image provided to each eye is produced on a flat surface. The most common associated symptoms are eye strain, disorientation and nausea.
The challenge nowadays is to create realistic images in a way that minimizes the problems they may cause to users. Game designers therefore have to take several factors into account when designing games for such devices, bearing in mind that an immersive experience depends on how well the application is adapted to the new environment. For instance, the relative sizes of objects must be realistic, otherwise the perception of depth may be inconsistent. Furthermore, the head-tracking device must reflect the exact signals given by the player in the game: if the player moves his or her head at a certain speed, the game must follow at the same speed, otherwise the brain may lose the sense of movement.
Chapter 2
Background Information
This chapter defines the concepts used throughout this report. Section 2.1 introduces virtual reality and the role stereoscopy plays in it, as well as within the Rift environment. Section 2.2 gives a basic explanation of stereoscopy and its drawbacks, while Section 2.3 presents cases where the Rift was used as a VR display.
2.1 Virtual Reality
The term Virtual Reality has attracted considerable media attention in the last few years due to the incredible advances we are witnessing. The release of novel technologies has brought VR to a level never seen before. VR is by no means an original concept, as various similar systems have been commercially available since the early 1970s [4]. It refers to a synthetic environment created to enhance the user's interaction and immersion with the application, relying on three-dimensional, stereoscopic, head-tracked displays, hand/body tracking, and binaural sound [5]. VR will be introduced into daily life and serve people in various ways. Its applications, going beyond entertainment, could make strong contributions in fields such as education, military training simulation and remote robot operation. The VR market is expected to grow exponentially in the next few years, as major enterprises are interested in investing in these new technologies. The latest deal worth mentioning is the one made by Facebook, which bought the start-up Oculus VR for 2 billion dollars. Sony is also working on an HMD called Morpheus.
Stereoscopic displays play an important role in VR environments. They use stereoscopic images to improve the user's perception of immersion in the virtual world, enabling a better understanding of the presented data and of the proportions and positions of objects. Devices that use stereoscopic displays include head-mounted displays, which have become particularly popular since the release of the Oculus Rift last year. The next section gives a historical and technical background on the process used to generate images for these displays.
2.2 Stereoscopy
People have long realized that viewing with one eye (left or right) is slightly different from viewing with both at the same time. However, this phenomenon was not documented until 1838, when Charles Wheatstone first explained binocular vision and invented the stereoscope (Figure 2.1). He demonstrated that our perception of depth arises from the way our brain combines the images seen by each of our eyes.
Later, at the end of the 19th and the beginning of the 20th century, stereoscopy began to attract interest from the cinematographic industry. William Friese-Greene registered the first patent, in which stereoscopic 3D films were projected on two separate screens that viewers could then watch through a stereoscope. In 1922 the first commercial 3D movie was released, along with anaglyph glasses. In the decades that followed, high costs and the effects of the Great Depression on film studios prevented large investments, and the studios faced several ups and downs. Polarized lenses were introduced in 1934, but they only became popular in 1986, when IMAX released the movie Transitions; this new technology offered several advantages over anaglyph glasses. It was in 2009, when James Cameron's Avatar was released, that the public was truly amazed by the experience. The movie was one of the most expensive (perhaps the most expensive) ever made, because the technology was entirely new. Commercially, however, it was worth the investment: the movie is the highest grossing film so far, proving that viewers are willing to pay when the experience is worth the price.
Stereoscopy is a technique to create or enhance the illusion of depth in images. Most methods use two images rendered with a slight horizontal offset, mimicking the different perspectives through which our eyes see the world (see Figure 2.2). Our brain combines these images, giving the feeling of spatial depth. Stereoscopic vision probably evolved as a means of survival [6]. In games, it increases the experience of immersion and spatial presence, but also what is known as simulator sickness [2]. Results related to attention and cognitive involvement indicate more direct and less thoughtful interactions with stereoscopic games, pointing towards a more natural experience through stereoscopy. However, these advantages come with a health concern: Section 2.2.2 discusses the possible side effects stereo images can cause in users.
2.2.1 Rendering Stereo Images
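As a minimal illustration of the two-offset-images idea described above, the sketch below sets up a stereo pair in a Unity-style scene. The class name, the IPD value and the half-screen viewports are illustrative assumptions, not the project's actual code:

```csharp
using UnityEngine;

// Sketch: renders a stereo pair with two cameras separated
// horizontally by the interpupillary distance (IPD).
public class StereoRig : MonoBehaviour
{
    public float ipd = 0.064f; // assumed average IPD in metres

    void Start()
    {
        CreateEye("LeftEye",  -ipd / 2f, new Rect(0f,   0f, 0.5f, 1f));
        CreateEye("RightEye",  ipd / 2f, new Rect(0.5f, 0f, 0.5f, 1f));
    }

    Camera CreateEye(string name, float xOffset, Rect viewport)
    {
        var go = new GameObject(name);
        go.transform.SetParent(transform, false);
        // horizontal offset gives each eye a slightly different perspective
        go.transform.localPosition = new Vector3(xOffset, 0f, 0f);
        var cam = go.AddComponent<Camera>();
        cam.rect = viewport; // each eye draws to one half of the screen
        return cam;
    }
}
```

The brain fuses the two half-screen views into a single image with apparent depth, exactly as described for the stereoscope.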
2.2.2 Health Concerns
One of the biggest drawbacks of devices that use stereo images is the adverse effects they may have on users. Recently, game developers have voiced concerns about the future of such technologies due to simulator sickness. Aaron Foster, developer of the horror game Routine [10], recently wrote in a Steam Community update that the development team had to slow down the VR integration due to motion sickness, arguing that, though his team is very excited about the immersion level it brings to the game, they are sceptical and, in his own words, "for now, we can't fully commit to a VR version of Routine".
Indeed, the way our brain processes a regular scene differs from the way it processes a stereo image of the scene. In the natural world, objects at different distances provide different amounts of stimulus to the accommodative system. On a computer screen, by contrast, the stimulus is always the same, because there is only one focal distance, even when objects are placed at different distances within the scene. In other words, in the real world, when we focus our view on an object, the others around it appear blurred, while on a screen there is only one focal distance and the entire scene is always in focus. Howarth [3] describes these topics in depth and explains how movement, IPD, discrepancy and focus influence the user experience.
One of the recurring side effects reported by users of stereoscopic displays is Visually Induced Motion Sickness (VIMS). When people are exposed to motion, e.g. traveling by car, plane or cruise ship, they may start having symptoms such as pallor, sweating, nausea and, in some cases, vomiting [11, 12, 13]. As the source of this sickness is our sight, the same symptoms can be induced by moving images on displays. VIMS arises when the movement in an image gives a sense of vection (the illusion of self-motion). Bles [11] defined motion sickness as follows: "all situations which provoke motion sickness are characterized by a condition in which the sensed vertical as determined on the basis of integrated information from the eyes, the vestibular system and the non-vestibular proprioceptors is at variance with the expected vertical as predicted on the basis of previous experience".
2.3 Applications
Because the technology was released so recently (devices have been available for only a year), research involving the Oculus Rift is currently limited. Contributions take the form of open-source projects, work under development, or short publications. In this section we list the most significant contributions in the literature.
Bolton [14] presented PaperDude, a VR cycling-based exergame inspired by Atari's Paperboy. The implementation used an Oculus Rift VR headset, a Trek FX bicycle attached to a Kickr power trainer and a Kinect camera. The idea is that the user rides the bike and throws papers into mailboxes. The Kinect sensor captures the player's arm movement, providing a natural input interface, while the HMD increases the player's immersion in the environment.
Pittman [15] used the Rift combined with other input devices for robot navigation. The results show a preference for head rotation over head gestures. The subjects found that, though it could cause nausea, head movement was a very effective input method.
Ikeuchi [16] combined a drone, a Kinect and an HMD to simulate flying. A video stream is captured and shown in the HMD, while the Kinect tracks the player's movements. Players can control the drone through natural gestures, providing a realistic experience of flying.
Halley-Prinable [17] compared the level of immersion of the Rift with that of traditional monitors, using fear as a measure. A game was developed, and the subjects played half the time with a traditional screen and the rest wearing the Oculus Rift. The players' heart rate was recorded during the experiment. Of the 56 subjects, 2 found the screen more immersive, 3 found both options equally immersive, and the remaining 51 found the Rift the most immersive. The author also concluded, from the questionnaire and heart-rate data, that the fear level increased when using the Rift.
Chapter 3
The Oculus Rift
The Oculus Rift is a virtual reality HMD developed by Oculus VR. The project received community support through Kickstarter [18] to release its first version. Currently, only the Development Kit (DK) version is available; the consumer version is expected in 2015. The Rift is seen as the mark of a new era for virtual reality, as it is free of most of the restrictions that applied to previous products such as Nintendo's Virtual Boy, CAVE environments [17] and the Nvis SX60. The Virtual Boy was reported a commercial failure for a number of reasons, including price and discomfort in use; CAVE environments are likewise expensive and too large. Young [19] compared the Rift and the Nvis in perception and action tasks; the results show that, though some people felt less simulator sickness with the Nvis, it is expensive, and the Rift consistently outperformed it in all other aspects.
The Rift has two main components: the headset and the control box. A diagram is shown in Figure 3.1.
3.1 Technical Aspects
The company offers a Software Development Kit (SDK) [9] for developers to adapt and build new games for the device. The first release of the development kit was in 2013 with the DK1; the current version was released in July 2014. The SDK offers a reliable source of code, samples and documentation about the features and capabilities of the device. It also offers official support for commercial and open-source game engines. There are normally two ways to develop for the Rift: to use a game engine, or to build a stereoscopic rendering environment anew.
Oculus VR offers official support for Rift development in the Unity and Unreal game engines. Other non-commercial engines that support the Rift are also available. On the other hand, for developers who want to create game engines with native support for the Rift, or even simple games without an engine, the SDK offers a C interface that can easily be used to set up the device. The developer also needs a graphics toolkit such as OpenGL or DirectX. The C interface allows developers to bind the code from other languages such as Python or Java.
One important point in developing applications for the Rift is that the stereo images can't be used on their own, as wide-angle optics sit in front of the display. The lenses cause a distortion (pincushion form) and chromatic aberration at the edges. Therefore, there must be a post-processing step called warping, in which a barrel distortion is applied to the original image. The comparison between the two distortions is shown in Figure 3.2.
The warping is normally done at shader level, and since version 0.2 of the SDK this functionality is provided by the SDK itself, which means the developer can simply pass references to the textures and the API will perform the distortion (SDK rendering mode). Developers may also customize the distortion shaders (client-side rendering), as well as combine other shaders in the process.
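The idea behind the warp can be sketched as a radial remapping of texture coordinates. The sketch below is illustrative only: the coefficients k1 and k2 are placeholder values, not the SDK's actual lens parameters, and the helper class name is our own:

```csharp
using UnityEngine;

// Illustrative barrel warp: remaps a texture coordinate radially
// around the lens centre. k1 and k2 are placeholder coefficients;
// the real values come from the Rift's lens calibration.
public static class Warp
{
    public static Vector2 BarrelDistort(Vector2 uv, Vector2 centre,
                                        float k1 = 0.22f, float k2 = 0.24f)
    {
        Vector2 d = uv - centre;   // offset from the lens centre
        float r2 = d.sqrMagnitude; // squared radial distance
        // sampling radius grows with distance from the centre, so the
        // visible image is compressed toward the edges (barrel form),
        // cancelling the lens's pincushion distortion
        float scale = 1f + k1 * r2 + k2 * r2 * r2;
        return centre + d * scale;
    }
}
```

In practice this remapping runs in the fragment shader over the rendered eye texture, which is why the SDK exposes it both as a built-in pass and as customizable shader code.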
Chapter 4
Game Development
The game development process is complex and time-consuming, and demands broad knowledge of graphics, artificial intelligence and game design; in the case of HMDs, knowledge of all aspects related to immersive environments is also needed. Indeed, the paradigms of game design and user-interface components must be revised in order to improve the user's immersion and engagement with the game. This chapter describes the game's development process and the challenges we faced.
4.1 Development Environment
In this project we chose to use the Unity game engine, since building everything from scratch would have been time-consuming and we would not have been able to focus on the game itself. Rift support is offered in the form of a Unity Prefab¹ which is attached to a Unity project. The scripting language used is C#, which is also the main programming language for development in Unity.
The machine used was a Dell laptop with 4 GB of RAM and a 64-bit Intel Core i5-4200U processor at 1.60 GHz, with an NVIDIA GeForce 740M video card, running Microsoft Windows 8. We also used a repository for code maintenance: a Git-based service called Bitbucket [21].
Finally, we had to create an application, bearing in mind that time was limited and the game should be simple enough to be finished and evaluated in time. In previous studies we found that applications set in space gave users very interesting experiences, and one idea came up: Space Invaders. The idea was evaluated and validated, and we then started the development process.
4.2 FPSpaceInvaders
The game developed is a remake of the classic arcade game Space Invaders, created by Tomohiro Nishikado and released in 1978. The original version is a 2D game in which the player controls a cannon and can move it sideways to shoot the alien enemies. The enemies come in waves, attempting to destroy the player by firing at it while they approach the bottom of the screen. If the enemies reach the bottom of the screen, the alien invasion is successful and the game ends. Figure 4.1 shows the game design.
Our remake is a 3D version in which the player is a spaceship and the environment is a stellar skybox. The spaceship can move up and sideways, while the attached cannon is controlled by the HMD sensor, which means that to aim at an enemy spaceship the player can simply rotate his or her head towards the enemy.
¹ An asset type that stores a complete GameObject with its components and properties.
4.2.0.1 Design Level
The design started by looking for ways to create a realistic space environment. First, we searched for a stellar skybox and found a generator called SpaceScape [22], with which we created a skybox with stars and nebulas. The next step was to create some planets and a sun. As part of the environment, there are asteroids that may hit the player.
One important aspect of the development process is object modeling. We were able to model the spaceships in Blender [23], but as this is not our expertise, the models turned out too heavy (in memory consumption) and the textures of low quality. That is why we decided to use the spaceship models provided in the Unity demo Space Shooter; they are free and can be obtained from the Unity Tutorials website.
To start the game, the player has to select the option New Game or Load Game from a main menu. The menu was designed to differ from the conventional style: the player is created as a regular person on a plane, and the menu items are placed on a semi-circle around them. To select an option, the player must walk through the item using the keyboard; the selected action is then performed. This interface offers a more natural interaction. Figure 4.2 shows a segment of the menu in which two options can be seen.
The game controller handles the game when changing states, and is also responsible for keeping score.
4.2.0.2 The Player
The player is attached to the spaceship, as if manning a cannon on a war tank. To accomplish the goal, the player has to shoot and destroy the enemy spaceships before they complete the invasion. The cannon's aim is attached to the HMD sensor, which means that to shoot a target all the player has to do is turn his or her head towards it and press the left mouse button. The player can also move vertically (keys W and S) and horizontally (keys A and D) on the screen. Figure 4.3 is a snapshot showing the player's aim.
The game uses the camera's position and rotation to spawn bullets when the mouse is clicked; therefore the shot and the crosshair are always synchronized with the HMD's sensor.
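A minimal sketch of this behaviour follows; the class and field names are illustrative, not the project's actual script, and the prefab is assumed to be assigned in the Inspector:

```csharp
using UnityEngine;

// Spawns a bullet from the camera's position and rotation on left click,
// so shots always follow the HMD-driven view direction.
public class CannonFire : MonoBehaviour
{
    public GameObject bulletPrefab; // assumed assigned in the Inspector
    public float bulletSpeed = 50f;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Transform cam = Camera.main.transform;
            // bullet starts at the camera, oriented with the view
            GameObject bullet = Instantiate(bulletPrefab,
                                            cam.position, cam.rotation);
            bullet.GetComponent<Rigidbody>().velocity =
                cam.forward * bulletSpeed; // travel along the view direction
        }
    }
}
```

Because the bullet inherits the camera's transform, no separate aiming logic is needed: head tracking alone keeps shot and crosshair aligned.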
Another important component related to the player is the HUD (head-up display), which consists of several pieces of information about the character. It is important for the user experience, and differs from the traditional style because it is not simply a matter of positioning components at screen coordinates. We found that components placed in 3D space were easier to access and created a higher level of immersion. Here we found a lack of guidelines for such environments in the literature and, due to time constraints, we were not able to look more closely into this issue. There is, however, no mandatory formula for good results, as there are multiple ways to achieve them.
Our game contains a very simple HUD with two components: the life bar and the level-progression bar. The first was designed in a curved shape and located on the left-hand side of the view to follow the eye's curvature, while the second was positioned at the top. Both were designed using the GIMP editor [24]. Note that, unlike the traditional paradigm, these components are positioned in world coordinates. Game designers must analyze beforehand the impact such components may have on the game's course, especially when pop-up menus or more detailed HUDs are required, as these may cause confusion and a bad user experience if not well designed.
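Positioning a HUD element in world coordinates can be sketched as keeping it at a fixed offset in front of the tracked camera. The following is a sketch under our own naming assumptions, not the project's actual implementation:

```csharp
using UnityEngine;

// Keeps a HUD element (e.g. the life-bar quad) at a fixed offset in
// front of the HMD camera, in world coordinates rather than screen space.
public class WorldSpaceHud : MonoBehaviour
{
    public Transform hmdCamera;                            // assumed reference
    public Vector3 offset = new Vector3(-0.4f, 0f, 1.5f);  // left of the view

    void LateUpdate()
    {
        // place the element relative to where the player is looking
        transform.position = hmdCamera.TransformPoint(offset);
        transform.rotation = hmdCamera.rotation; // keep it facing the player
    }
}
```

Running in LateUpdate means the element follows the head pose computed earlier in the frame, so the HUD stays stable in the player's field of view.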
The player is destroyed when (1) the enemies complete the invasion, or (2) the player's life ends due to collisions with asteroids or shots from enemies. The player can absorb a percentage of damage before being destroyed, so the incidents mentioned in (2) may happen several times before the player is destroyed.
4.2.0.3 The Enemies
The enemies are spaceships whose goal is to destroy the player. In the original game the enemies have no artificial intelligence; they are simply spawned at the top of the screen and fly in a zig-zag path towards the bottom. Our remake is very similar, except that they are randomly spawned on a plane in front of the player and fly towards a plane behind the player. If any enemy reaches the latter plane, the invasion is complete and the player is destroyed.
Unlike the player, the enemies have no tolerance to damage: when hit by an asteroid or by the player, an enemy ship is destroyed immediately. For future work, it would be interesting to have special ships that require more than one hit to destroy, as well as special weapons.
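The enemy behaviour above can be sketched as follows. All names and values are illustrative assumptions, including the hypothetical GameManager.Invasion() call standing in for whatever game-over routine the project uses:

```csharp
using UnityEngine;

// Moves an enemy ship in a zig-zag path towards the invasion plane
// behind the player; any collision destroys it immediately.
public class EnemyShip : MonoBehaviour
{
    public float forwardSpeed = 5f;   // speed towards the player
    public float zigzagWidth  = 3f;   // sideways amplitude
    public float zigzagRate   = 2f;   // oscillation frequency
    public float invasionZ    = -10f; // z of the plane behind the player

    void Update()
    {
        // sideways position oscillates while the ship advances in -z
        float side = Mathf.Sin(Time.time * zigzagRate) * zigzagWidth;
        Vector3 p = transform.position;
        transform.position = new Vector3(side, p.y,
                                         p.z - forwardSpeed * Time.deltaTime);

        if (transform.position.z <= invasionZ)
            GameManager.Invasion(); // hypothetical game-over call
    }

    void OnCollisionEnter(Collision other)
    {
        Destroy(gameObject); // no damage tolerance: one hit destroys
    }
}
```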
4.2.0.4 Oculus Rift Integration
The integration with the Rift environment is one of the most important steps in our process. Some issues have already been mentioned, such as the design of the menu and user-interface components (Sections 4.2.0.1 and 4.2.0.2). Others arise, such as creating the crosshair so that the player can aim at targets easily; the problem in this case is that the crosshair can't be placed at screen coordinates, which is the conventional way. We also added a feature that changes the crosshair's color when a target is in the aim. This feature can easily be implemented in Unity by casting a ray of length D along the camera's forward vector and detecting whether it collides with any object, a technique known as raycasting.
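A sketch of that raycast, under our own naming assumptions (the "Enemy" tag and the renderer reference are illustrative, not necessarily the project's):

```csharp
using UnityEngine;

// Changes the crosshair colour when the camera's forward ray hits a target.
public class CrosshairAim : MonoBehaviour
{
    public Renderer crosshairRenderer; // assumed assigned in the Inspector
    public float rayLength = 100f;     // the length D from the text

    void Update()
    {
        Transform cam = Camera.main.transform;
        RaycastHit hit;
        // cast a ray of length D along the camera's forward vector
        bool onTarget = Physics.Raycast(cam.position, cam.forward,
                                        out hit, rayLength)
                        && hit.collider.CompareTag("Enemy"); // assumed tag
        crosshairRenderer.material.color =
            onTarget ? Color.red : Color.white;
    }
}
```

Since the camera follows the HMD sensor, the ray (and therefore the crosshair feedback) always matches the player's head direction.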
Oculus VR provides two Prefabs ready to be attached to an object or scene. The OVRPlayerController consists of a regular player controller with a stereo camera attached; it already has the attributes of a regular player, such as walking and gravity, as well as the camera controllers set up for the Rift. This Prefab was used in the main menu. The second Prefab, OVRCameraController, consists of two configured cameras ready to be attached to an object; it was used on the player's spaceship.
4.2.0.5 3D Audio Implementation
4.3 Empirical Evaluation
As part of the game evaluation, we invited 10 participants to try the game and complete a short questionnaire, shown in Appendix A. In developing our questionnaire we followed the approach used by Jennett [26]. We aimed to collect information about users' backgrounds and gaming experience, and their evaluation of our game in terms of level of engagement, immersive experience, and interaction with menus and controls.
The evaluation form was therefore divided into three main sections: (1) demographics, (2) game experience and immersion, and (3) interactions. The first section, demographics, gathers information about the participants, such as age, sex and previous gaming experience. The second section focuses on the game itself: we gather information about the participants' experience during the game, evaluating factors such as immersion, focus and sense of spatial presence. Finally, we evaluate the players' interaction with the game regarding access to controls, performance and menus, as well as suggestions and comments.
Participants were welcomed into the laboratory and verbally informed about the procedure, then played the game for 10-15 minutes. Finally, each participant filled in the questionnaire and was thanked by the test administrator. Note that participants wore stereo headphones during the whole test, both to take advantage of the 3D sound and to prevent distractions and noise.
Most of the participants invited are friends of the test administrator; the others are students of the laboratory where we developed the game. All participants were male students; nine were 18-24 years old and one was 25-34. Six have Intermediate and one has Expert gaming experience, showing that 70% of the participants have relevant experience with games; the other three are Beginners. The most used platform is Desktop (9 participants), followed by Smartphone (7 participants) and Playstation (4 participants).
Chapter 5
Results and Discussion
The results obtained give us feedback about what we have done and serve as a guide for future work. However, in-depth studies with more participants are needed to strengthen and refine the results.
Preliminary results show that, for most players, the sense of spatial presence was enhanced, and they lost track of time while playing because the game was able to hold their full attention. These results match those of [2]. Subjects reported that they felt the game was an experience rather than simply a task, showing that they enjoyed the activity. Subjects also reported a feeling of actually being in the virtual world, stating that this experience was more engaging than their previous ones.
The interactions with the game were an interesting point of this project. Traditionally, aiming at targets is done with the mouse cursor; however, as shown in Section 4.2.0.2, our game uses the HMD's tracking device to control the aim. This new paradigm caused confusion at the beginning, because players tried to move the aim with the mouse instead of their head. This is understandable for first experiences in the new environment: participants were used to aiming with the mouse, and none of them had had any experience with HMDs.
The main menu was the target of both complaints and compliments. While some participants said the walk-through selection provided an interesting interaction, others thought it could be done by approaching the item and pressing a key, or by clicking the mouse. More studies are therefore needed to find out which option provides the best experience.
Though most players found the game easy to control, some had difficulty finding the right keys to press, especially at the beginning. This happened because they can't see the keys they are pressing. A study using joysticks would be interesting, to evaluate whether they could be an effective replacement for the keyboard. The stereo audio experience was clearly noticeable when enemies were nearby; however, the effect was diminished when several objects were emitting sounds at the same time.
The general complaint among the participants was the screen resolution, referring to the evident individual pixels that can be seen. Indeed, this problem was reported by other authors [14, 15, 17] and contributed to VIMS symptoms. Oculus VR claims that the DK2 fixes this problem with a resolution of 960 x 1080 per eye (against the 1280 x 800 of the DK1).
A minority of the participants suffered early symptoms of VIMS after the tests. We attribute these effects to the fact that the game was designed with continuous waves of enemies, leaving no intervals: an unnecessary effort was demanded, which may have contributed to the symptoms felt when they stopped playing. Future work should allow moments of rest, such as a small interval between waves, in which the player can relax instead of being constantly focused. Studies are also needed to design interactions that prevent players from moving their heads too fast and abruptly, which may cause the symptoms mentioned above, as well as neck discomfort.
Chapter 6
Summary
6.1 Conclusion
Recent advances in virtual reality have brought it into a new era. One of the most anticipated devices was the Oculus Rift, the first HMD affordable to the general population, bringing the power of the virtual world to a previously unexplored market. Applications of such devices can be found in the most diverse fields of study.
In this project we developed an application for the Rift environment. We started by studying the stereoscopic images used in such devices, evaluating methods of rendering them and the dangers of poor quality. We then studied the Rift itself, and previous applications developed for it, in order to understand what changes relative to common design principles. Finally, we developed a game based on the classic Space Invaders in which the player controls the spaceship's weapon aim with the HMD's tracking sensor; this input mode was combined with the mouse for shooting and the keyboard for moving the spaceship. We conducted a small experiment with subjects and drew some conclusions from it.
As previously mentioned, the overall experience depends on several factors together. We can say that our game fulfilled its goals and that good results were achieved. The Oculus Rift is indeed a very powerful device that should become widespread in the next few years, bringing immersive experiences to consumers. The game industry is making efforts to adapt and build games for this new environment. This report can serve as a starting point for beginners on the topic, and points out items that should be watched to allow more immersive game experiences within the Rift.
6.2 Future Works
The Rift is still in its early development; however, the outlook is very promising. The field of study that has grown around this technology is still in its early stages, so there is much to be done to improve the user experience. Generally speaking, we found a lack of guidelines on how to create user interfaces (i.e. menus, GUIs, HUDs) that take advantage of this environment. Another challenge is to create efficient input methods, partly to prevent users from suffering adverse effects when using the Rift, and partly to create interactive and immersive experiences combining devices such as the HMD's sensor, mouse, keyboard, joysticks, cameras such as the Kinect, and Leap Motion sensors.
Regarding the developed game specifically, there is room for improvements and
extensions. We found the need for a mechanism that allows the user to rest
between waves of enemies, as its absence may have demanded unnecessary
effort from the players. Furthermore, the aspects mentioned above
about input methods apply here as well. The current game could be extended
by adding special spaceships and weapons, as well as by creating different
space environments. The results encourage us to carry on the work.
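One way to realise the rest mechanism suggested above is a fixed countdown between waves, during which no enemies spawn. The sketch below is an assumption about how such a mechanism could work; the class name, the ten-second rest period, and the update protocol are all illustrative choices, not part of the implemented game.

```python
class WaveManager:
    """Pause enemy spawning for a fixed rest period after each cleared wave."""

    REST_SECONDS = 10.0  # assumed rest length between waves

    def __init__(self):
        self.wave = 1
        self.rest_left = 0.0  # > 0 while the player is resting

    def on_wave_cleared(self):
        """Called by the game when the last enemy of a wave is destroyed."""
        self.rest_left = self.REST_SECONDS

    def update(self, dt):
        """Advance the rest timer by dt seconds; return True exactly when
        the rest is over and the next wave should spawn."""
        if self.rest_left > 0.0:
            self.rest_left -= dt
            if self.rest_left <= 0.0:
                self.wave += 1
                return True
        return False
```

Driven once per frame from the game loop, this would give players a predictable breather after each wave, addressing the fatigue we observed in the experiment.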
Appendix A
Questionnaire
Immersion Questionnaire
Please answer the following questions by circling the relevant answer.
Demographics

Sex: Female / Male

Age group: 12-17 / 18-24 / 25-34 / 35 and over

How often do you play video games? Seldom / Often

How would you rate your gaming skill? Beginner / Intermediate / Expert

If you play video games, which of the following platforms have you used? (You can choose multiple options)
Desktop/Laptop / Smartphone / Tablet / PlayStation / Xbox / Wii / Other (please specify)

To what extent did you feel you were focused on the game? (Not at all – A lot)

How much effort did you put into playing the game? (Very little – A lot)

Assessing Immersion

To what extent did you feel consciously aware of being in the real world whilst playing? (Not at all – Very aware)

To what extent did you notice events taking place around you? (Not at all – A lot)

Did you feel the urge at any point to stop playing and see what was happening around you? (Not at all – Very much so)

To what extent did you feel that you were interacting with the game environment? (Not at all – Very much so)

To what extent did you feel as though you were separated from your real-world environment? (Not at all – Very much so)

To what extent did you feel that the game was something you were experiencing, rather than something you were just doing? (Not at all – Very much so)

To what extent was your sense of being in the game environment stronger than your sense of being in the real world? (Not at all – Very much so)

To what extent were you interested in seeing how the game's events would progress? (Not at all – A lot)

Were you in suspense about whether or not you would win or lose the game? (Not at all – Very much so)

To what extent did you find the immersive experience more engaging than your previous gaming experiences? (Not at all – Very much so)

Assessing Controls

To what extent did you find the menu selection easy to control? (Not at all – Very much so)

To what extent did you find it easy to assess your performance during the game? (Not at all – Very much so)

To what extent did you enjoy the graphics and the imagery? (Not at all – A lot)

How much would you say you enjoyed playing the game? (Not at all – A lot)