
Computers & Education 68 (2013) 570–585


Evaluation of learners' attitude toward learning in ARIES augmented reality environments

Rafał Wojciechowski*, Wojciech Cellary
Department of Information Technology, Faculty of Informatics and Electronic Economy, Poznań University of Economics, Mansfelda 4, 60-854 Poznań, Poland

Article info

Article history:
Received 12 January 2012
Received in revised form 21 October 2012
Accepted 6 February 2013

Keywords:
Interactive learning environments
Evaluation of CAL systems
Multimedia/hypermedia systems
Authoring tools and methods
Augmented reality

Abstract

The ARIES system for creating and presenting 3D image-based augmented reality learning environments is presented. To evaluate the attitude of learners toward learning in ARIES augmented reality environments, a questionnaire was designed based on the Technology Acceptance Model (TAM) enhanced with perceived enjoyment and interface style constructs. For the empirical study, a scenario of an experimental chemistry lesson was developed. The study involved students of the second grade of lower secondary school. As follows from this study, perceived usefulness and enjoyment had a comparable effect on the attitude toward using augmented reality environments. However, perceived enjoyment played a dominant role in determining the actual intention to use them. The interface style based on physical markers had a significant impact on perceived ease of use. Interface style and perceived ease of use had a weak influence on perceived enjoyment. In contrast, these two constructs had a significantly stronger influence on perceived usefulness.
© 2013 Elsevier Ltd. All rights reserved.

1. Introduction
The two most important social and economic processes occurring nowadays are the emergence of the electronic knowledge-based economy and the transformation toward a global information society. Therefore, creativity and innovation are becoming increasingly prominent determinants of competitiveness in the labor market of the 21st century. This is a major challenge for education and teaching that needs to be addressed
in the near future (Cellary, 2002). The aforementioned challenges require significant improvement of teaching methods, which will
transform the role of learners from passive recipients of information to active participants in knowledge acquisition (Walczak,
Wojciechowski, & Cellary, 2006).
In response to this need, interest in teaching based on constructivist learning theory has been growing since the 1990s (Wilson, 1996; Jonassen, 1999; Marshall, 1996). There is a wide variety of perspectives on what the term constructivism means
(Piaget, 1973; Vygotsky, 1978; Bruner, 1996). In this paper, constructivist learning is understood as an active process of constructing
knowledge by the learner, in contrast to the passive acquisition of information (Duffy & Cunningham, 1996).
According to the constructivist approach, a teacher is a facilitator of learning rather than a transmitter of knowledge (Chaille & Britain,
2002). There are a number of possible pedagogic activities implementing the constructivist principles, such as experimentation, conducting
discussions, performing projects, etc. All these activities encourage learners to be active and to make their own discoveries, inferences, and
conclusions. Deployment of constructivist principles in a classroom requires the use of interactive and dynamic learning environments,
where the learners are able to modify appropriate elements, test ideas, and perform experiments (Roussou, 2004).
Learning based on performing experiments and further reflection on their results is the basis of the learning-by-doing paradigm (Schank, Berman, & Macpherson, 1999). This paradigm implies that the best and the most natural way of learning how to do something is trying to do
it. A learning strategy that implements the learning-by-doing approach is experiential learning (Kolb, 1984; Beard & Wilson, 2006). This
strategy greatly increases understanding and retention of the learned material in comparison to the methods that solely involve listening,
reading, or even viewing, as learners are usually intrinsically motivated to learn when they are actively engaged in the learning process
(Yang, 2012).

* Corresponding author.
E-mail addresses: rawojc@kti.ue.poznan.pl (R. Wojciechowski), cellary@kti.ue.poznan.pl (W. Cellary).
0360-1315/$ – see front matter © 2013 Elsevier Ltd. All rights reserved.
http://dx.doi.org/10.1016/j.compedu.2013.02.014

A key determinant of the effectiveness of experiential learning is interactivity (Roussou, 2004). As far as learning content is concerned,
interactivity is defined as "the extent to which users can participate in modifying the form and content of a mediated environment in real time" (Steuer, 1992, p. 14). In traditional learning, the highest level of interactivity can be achieved in teaching labs, where students are able to conduct experiments putting theoretical concepts into practice. However, there are serious limitations associated with experiments performed in teaching labs, since they may require much space, expensive equipment, appropriate safety measures, and trained staff. These restrictions make the large-scale dissemination of experiential learning in educational institutions practically impossible or economically unjustifiable (Jara, Candelas, Puente, & Torres, 2011). Without breaking down those barriers, experiential learning will remain of theoretical rather than practical significance.
In this paper, we consider application of image-based augmented reality (AR), which is an extension of virtual reality (VR), to create
learning environments enabling experiential learning. Virtual reality is defined as "a high-end user–computer interface that involves real time simulation and interactions through multiple sensorial channels. These sensorial modalities are visual, auditory, tactile, smell, and taste" (Burdea & Coiffet, 2003, p. 3). In this paper, we focus on two essential aspects of VR, namely three-dimensional (3D) visualization and
interactivity. Such a VR interface is called a virtual environment (VE), which is a 3D digital model of a real, abstract, or imagined environment.
Virtual environments potentially offer a much broader range of forms of interactivity than real environments. Users are able to freely
navigate in a virtual environment, observe the environment from different perspectives, and interact with selected virtual objects. Virtual
environments can be used to implement virtual laboratories in which users are able to perform experiments (Dalgarno, Bishop, Adlong, &
Bedgood, 2009; Jeong, Park, Kim, Oh, & Yoo, 2011). However, creating a virtual environment offering a high level of credibility requires building a 3D model of an entire real environment, which is both time-consuming and expensive. Moreover, the current state of VR technology separates users from the real world, requires expensive equipment to display and manipulate virtual objects, and offers an indirect, non-intuitive user interface.
In comparison to virtual reality, which is aimed at immersing a user in a synthetic environment, augmented reality supplements the
user's perception of the real world by the addition of computer-generated content registered to real-world locations (Azuma, 1997). Augmented reality combines virtual reality with video processing and computer vision technologies (Parker, 1997; Davies, 2005). AR technology enables merging virtual objects with the view of real objects, resulting in augmented reality environments. In augmented reality environments, both virtual and real objects can co-exist and interact in real time (Milgram & Kishino, 1994).
The creation of AR environments requires designing virtual representations of only a relatively small part of these environments. A significant part of an AR environment consists of real objects, for which it is not necessary to create detailed 3D models, while offering the highest possible level of realism. In AR environments, users are able to interact with virtual objects in a direct and natural way by manipulating real objects
without the need of sophisticated and expensive input devices (Wojciechowski, Walczak, White, & Cellary, 2004). Also, in contrast to virtual
environments, in which users communicate in a mediated way via avatars, AR environments afford users direct face-to-face contact with
each other.
AR environments offer better opportunities for learning-by-doing through physical movements in rich sensory spatial contexts (Dunleavy, Dede, & Mitchell, 2009). Therefore, users have an opportunity to perform experiments on virtual objects through hands-on experiences in their real environments. This feature of AR supports situated learning, which means that learning should take place in the context in which it is
going to be applied (Lave & Wenger, 1991). AR allows students to seamlessly combine learning environments with the real world in which
they live and apply the knowledge and skills learned. AR environments with possible direct face-to-face interaction between learners foster
the creation of communities of practice focused on the goal of gaining knowledge related to the presented content, since the learners are able
to easily share gained information and experiences with the group (Lave & Wenger, 1991).
The main advantages of AR applications in the education domain are learner activity, cost, and safety. AR environments allow learning content to be presented in meaningful and concrete ways, including the training of practical skills. Learners may play active roles in a wide range of learning activities within interactive educational scenarios developed in accordance with the learning-by-doing paradigm. The experience gained by learners during the learning process within an AR environment can be the basis for reflection and further group discussion in a classroom. The main aspects of learning afforded by AR environments are spatial ability, practical skills, conceptual understanding, and inquiry-based activities (Cheng & Tsai, 2012).
Applying AR environments to teaching also reduces costs, by replacing expensive real resources, such as laboratory equipment and supplies, with their virtual counterparts. A significant advantage of AR environments is safety, since unskilled learners may
explore potentially dangerous situations without any risk of harm to themselves or damage to expensive equipment.
There are a number of possible applications of AR environments in education (Walczak et al., 2006). They can be used for teaching about objects and phenomena impossible to see with the naked eye (e.g., molecular movements), simulation of potentially dangerous situations (e.g., chemical reactions), and visualization of abstract concepts (e.g., magnetic fields). In addition, the level of complexity of the presented phenomena can be reduced to allow learners to more easily grasp the underlying concepts. AR environments may be used in
a wide spectrum of domains from natural sciences (e.g., chemistry, physics, biology, astronomy), through computer and information sciences, mathematics, engineering (e.g., mechanical, electrical, biomedical), to humanities (e.g., history, linguistics, anthropology).
This paper is organized as follows. In Section 2, basic concepts related to augmented reality environments are introduced, as well as an
overview of applications of AR in education. In Section 3, an overview of the ARIES system is presented. In Section 4, the TAM-based research
model and an application scenario of the ARIES system are described. This section also contains a description of the research study. In
Section 5, the results of the system evaluation are presented. Finally, Section 6 concludes the paper.
2. Augmented reality in education
2.1. Categories of augmented reality environments
Augmented reality is a broad concept, which applies to technologies that combine the real and the virtual in any location-specific way, where both real and virtual information play significant roles (Klopfer, 2008, p. 92). In general, AR systems are divided into location-based
and image-based systems (Cheng & Tsai, 2012).

The location-based AR systems use the data about the position of mobile devices, determined by the Global Positioning System (GPS) or
WiFi-based positioning systems. Location-based AR systems enable users to move around with mobile devices in the real environment. Users can observe computer-generated information on the screens of the mobile devices, while the information depends on the users' current location in the environment.
In contrast to the location-based AR, the image-based AR is focused on image recognition techniques used to determine the position of
physical objects in the real environment in order to appropriately locate the virtual content related to these objects. Image-based AR systems are divided into those using marker-based and marker-less tracking. Marker-based AR requires the placement of artificial markers in the real environment to determine the position of physical objects in the environment. Marker-less AR does not require artificial markers
placed in the real environment, but instead it is based on tracking of natural features of physical objects present in the environment.
In this paper, we focus on the image-based AR using marker-based tracking of physical objects. An image-based augmented reality
environment consists of a real environment and a virtual scene, which is presented in the context of the real environment. The real environment contains real objects, which are automatically tracked using image processing and computer vision techniques. The virtual scene
consists of virtual objects and virtual representations of real objects present in the real environment. The virtual objects and the virtual
representations of real objects are overlaid on captured views of the real environment, giving users the impression that the virtual content actually exists in the real environment. The virtual objects can be displayed anywhere in the context of the real environment, whereas the
virtual representations of real objects are displayed aligned with the corresponding real objects. In this way, users viewing the augmented
reality environment get the illusion that virtual objects and virtual representations of real objects are an integral part of the real
environment.
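In camera-coordinate terms, this registration amounts to composing the tracked pose of a real object with the camera projection. Using standard pinhole-camera notation (ours, not introduced elsewhere in this paper), a point X of the virtual representation, expressed in the coordinate frame of the tracked real object, is drawn at image position x given by

\[ \mathbf{x} \simeq \mathbf{K}\,(\mathbf{R}\,\mathbf{X} + \mathbf{t}), \]

where (R, t) is the object-to-camera pose recovered by tracking and K is the camera intrinsic matrix. Applying this transformation in every frame keeps the virtual content aligned with the corresponding real objects.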
The image-based augmented reality environments can be presented to end users via different display devices, which are categorized into
four types: head-mounted displays (HMDs), desktop monitors, large-screen projection systems, and handheld displays (Drascic & Milgram,
1996). To superimpose virtual graphics on a real-world view, an AR system requires the user to wear an HMD, optionally combined with a device that measures the position and orientation of the user's head. There are two approaches to generating the augmented views on HMDs: optical see-through and video see-through systems (Azuma, 1997). The optical see-through approach allows a user to look through the display at the real world. The optical see-through display is based on optical image combiners, e.g., half-silvered mirrors, placed in front of the user's eyes and used to mix the virtual and the real images. The video see-through approach requires using one or two video cameras capturing
views of the real world. The cameras are attached in front of a closed-view HMD. The cameras are used to capture the real world images,
which are augmented with virtual content and displayed on the HMD worn by a user. These systems do not allow a user to look directly at
the real world.
Image-based AR systems can also be implemented in monitor-based configurations. Instead of using see-through HMDs, a monitor-based system is composed of one or two video cameras and a monitor. Optionally, if the images displayed are stereoscopic, the user has to wear a pair of stereo glasses. The cameras capture an environment, whereas the monitor displays the captured images overlaid with virtual content. In monitor-based configurations, users observe augmented reality environments displayed on a monitor screen.
Alternatively, a monitor can be replaced with a projection system for presentation to a larger audience.
Handheld displays are usually embedded in mobile devices, such as smartphones and tablets. All currently available AR systems based on
mobile devices are video see-through, where real-world views are captured by cameras built into mobile devices. The two main advantages
of the displays embedded in mobile devices are the portability and ubiquity. The disadvantages of handheld displays are their small size and
image distortion caused by built-in cameras on mobile devices.
2.2. Related works
The application of AR technology in education is still in a very early stage. The reason is that this technology is often perceived by teachers
as too expensive, complicated, and time-consuming (Champion, 2006). In recent years, there have been only a few attempts to apply AR technology to teaching.
In location-based AR systems, the presentation of information does not have to contain 3D virtual content registered in relation to real
environments. In these systems, students usually work in groups to solve a problem. Each of them plays a different role, e.g., a chemist, a
doctor, an environmentalist, or other domain experts. Students taking on different roles have to resolve a variety of tasks, which are pieces of
a larger puzzle. In Alien Contact! students have to investigate a mysterious alien crash (Dunleavy et al., 2009). In Mad City Mystery students
must explain the death of a virtual character (Squire & Jan, 2007), whereas in Environmental Detectives students play the role of environmental engineers investigating a toxic spill within a local watershed (Klopfer & Squire, 2008).
In image-based AR, students can observe a real environment augmented with 3D virtual content registered in relation to real objects. The
existing image-based AR learning systems, such as Construct3D, Augmented Chemistry, Mixed Reality Classroom, and AR-Dehaes, have been
developed to support a relatively narrow range of potential teaching subjects. For instance, Construct3D is a simple 3D construction tool in
an immersive AR environment for educational purposes. The application domain of Construct3D is geometry education (Kaufmann,
Schmalstieg, & Wagner, 2000). The Augmented Chemistry system is an application designed to assist in teaching abstract organic chemistry concepts such as molecular forms, the octet rule, and bonding (Fjeld, Juchli, & Voegtli, 2003). The Mixed Reality Classroom is an AR
educational system developed for primary schools in Singapore (Liu, Cheok, Mei-Ling, & Theng, 2007). The system is composed of two
thematic modules on Solar System and Plant. AR-Dehaes is an augmented book designed to visualize 3D virtual objects in order to help
engineering students to develop spatial skills (Martín-Gutiérrez et al., 2010).
The existing image-based AR solutions are developed for a specific pre-defined domain to teach only a narrow range of topics. In these
systems, the role of a teacher is limited solely to instruction during the learning experience. Teachers are not able to update and adjust the
existing learning content to their needs, changes of curricula, and different levels of learners. They also cannot easily create new learning
content on their own. As a consequence, the potential application of such systems within real curricula is very restricted. As far as experiential learning is concerned, only the Augmented Chemistry system offers the possibility of experimentation to some extent.
In most existing solutions, the creation of advanced interactive AR environments requires the involvement of highly qualified IT professionals who are experts in the design and implementation of interactive 3D content. Also, any changes in the content often
cannot be made without the assistance of the programmers. On the one hand, programmers do not have sufficient domain and pedagogical knowledge required to build a complete AR environment. On the other hand, teachers do not have the appropriate technical knowledge to create and modify the learning content on their own. As a result, teachers are limited to using ready-made content only.
Reusability and adaptability are two of the most important requirements for effective creation of learning materials (Boyle, 2003).
Reusability allows teachers to create learning content that can be used in different learning contexts without much additional effort. Thus,
teachers do not have to create the content from scratch but they can build it reusing some of existing materials. To this end, the learning
materials should be appropriately modularized to enable easy sharing among teachers. The materials should be treated as regular digital
products that are produced and distributed (Landowska & Kaczmarek, 2005). Adaptability makes it possible to adjust learning materials to
the individual and situational needs. It should be possible to tailor the materials to the age, learning styles, abilities, and performance
characteristics of learners.
3. System design
In this section, a system for building AR learning environments, called ARIES, is presented. ARIES is an e-learning system which enables
domain experts, i.e. teachers, to actively participate in the authoring process of interactive educational scenarios. The ARIES system has been
built as an implementation of the Augmented Reality Environment Modeling (AREM) approach (Wojciechowski, 2012). The AREM approach
enables teachers to design and create learning scenes for augmented reality environments.
3.1. AREM approach
In AREM, scenes and objects, both virtual and real, are modeled as instances of classes based on the object-oriented paradigm. The
process of creating learning content is called learning content preparation, while the process of using the learning content is called learning
content use.
Learning content preparation begins with the design of the content form, i.e., scene classes and object classes describing the visual and behavioral aspects of the learning content. Definition of classes is performed by designers who have the programming skills required for designing 3D graphics and writing some high-level scripting code in XML. Next, the actual learning content in the form of learning scenes and objects is created based on the classes. Definition of the scenes and objects is performed through a simple graphical user interface by domain experts without programming skills but with the domain-specific knowledge necessary to produce high-quality learning content.
When the learning content is ready, it can be used for learning in a classroom. To this end, learning content setup is performed by an instructor just before or during a lesson. The instructor selects an appropriate scene and sets it up for use in the classroom environment in
which the lesson is going to take place. Then, a new AR environment is created based on the selected scene. The setup of AR environments
can be performed by people without programming skills but with competence to guide the instruction during the lesson. After the content
setup is completed, an AR environment is ready for use and the learning process can begin. During the learning process, learners can interact
with the learning content using real objects present in the AR environment.
3.1.1. AR-Classes and AR-Objects
The AREM approach is based on two main concepts: AR-Class and AR-Object, according to the object-oriented paradigm (Wojciechowski,
2012). However, the conventional object-oriented approach is not sufficient to model interactive AR environments, so the conventional concepts of class and object have been appropriately extended. AR-Objects are representations of virtual objects, real objects, and scenes composed of both kinds of objects. AR-Classes are created in the content design stage by content designers, whereas AR-Objects are
created during the content creation stage by content creators.
AR-Classes implement the basic class features originating from the object-oriented paradigm, such as: attributes, operations, and inheritance. In the context of AR environments, these features have been extended with 3D geometry, interactive behavior, media objects,
constraints on the attributes, and aggregation relationships with other classes. An AR-Class represents a group of AR-Objects that share
similar characteristics, such as geometry, media objects, behavior, relationships to other AR-Objects, and semantics. There are three kinds of
the relationships among AR-Classes: specialization, composition, and containment. Specialization defines the hierarchical structure of AR-Classes, where one class is a specialization of another class. Composition and containment are kinds of aggregation, where one class is part of another class.
Attributes are used for describing the visual, behavioral, and semantic characteristics of AR-Objects. Therefore, the attributes of AR-Classes are used for parameterization of their geometry and behavior. Since different AR-Objects instantiated from an AR-Class may have different attribute values, the presentation of different AR-Objects may differ in visual and behavioral aspects.
In the context of education, AR-Classes are used for modeling learning concepts. AR-Objects of an AR-Class represent specific instances of the
learning concept. The instances can be presented in an AR environment. Learning concepts encompass all the concepts necessary to create an
AR environment. They are divided into domain concepts and presentation concepts. Domain concepts are directly related to domain-specific knowledge, whereas presentation concepts are related to the presentation of that knowledge. For example, in the chemistry domain, domain-specific concepts are liquid, solid, acid, base, etc., while presentation concepts correspond to the glassware and equipment necessary to set up and conduct chemical experiments, such as a pipette, measuring cylinder, test tube, beaker, Bunsen burner, or thermometer.
For specifying AR-Classes and AR-Objects, including all their constituent elements, a new high-level, XML-based language, called the Augmented Reality Scenario Modeling Language, has been developed.
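The sketch below is not written in that language; it merely illustrates the underlying conceptual model in Python under our own simplifying assumptions: an AR-Class groups attributes (with defaults) that parameterize geometry and behavior, and each AR-Object is an instance of an AR-Class with its own attribute values.

# Illustrative sketch only (our own simplification): the AR-Class / AR-Object
# conceptual model, not the actual XML-based notation used by ARIES.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ARClass:
    name: str
    attributes: dict                     # attribute name -> default value
    geometry: str = ""                   # e.g., a reference to a parameterized X3D/X-VRML model
    parent: Optional["ARClass"] = None   # specialization (inheritance)
    components: list = field(default_factory=list)  # composition/containment relationships

@dataclass
class ARObject:
    ar_class: ARClass
    values: dict                         # attribute values specific to this instance

    def attribute(self, name):
        # An AR-Object falls back to the defaults defined in its AR-Class.
        return self.values.get(name, self.ar_class.attributes.get(name))

# Presentation concept from the chemistry example: a parameterized beaker.
beaker_class = ARClass("Beaker", {"diameter_cm": 6.0, "height_cm": 9.0}, geometry="beaker.x3d")
small_beaker = ARObject(beaker_class, {"diameter_cm": 4.0, "height_cm": 6.0})
large_beaker = ARObject(beaker_class, {"height_cm": 12.0})   # diameter falls back to the default
print(large_beaker.attribute("diameter_cm"))                 # 6.0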
3.1.2. Geometry
Each AR-Class may contain geometry which is a 3D digital model specifying how the AR-Objects instantiated from this AR-Class are
visually presented in AR environments. Geometry may be encoded in any language enabling modeling of virtual environments, for example
X3D (Web3D Consortium, 2011), extended with parameterization features. A notable example of such a language is X-VRML (Walczak &
Cellary, 2003).
Geometry of an AR-Class can be directly parameterized with the attributes of this AR-Class and indirectly with the attributes of its
component AR-Classes. The geometry may be customized in the AR-Objects by setting different attribute values during learning content
preparation. The attribute values can dynamically change during learning content use as a result of behavior of the AR-Objects present in an
AR environment. Hence, visualization of AR-Objects can also dynamically change at runtime. The possible changes of the geometry depend
on the parameterization capabilities offered by this geometry. Consider an AR-Class "Beaker" that has properties specifying the diameter and height of its geometry. Different AR-Objects created based on the "Beaker" AR-Class may represent beakers of different diameters and heights.
Geometry of a composite AR-Class may embed the geometries of its component AR-Classes and may depend on the attributes of these AR-Classes. Consider the AR-Class "Measuring cylinder with liquid" composed of two component AR-Classes: "Measuring cylinder" and "Liquid". The "Measuring cylinder" AR-Class contains geometry that can be displayed in an AR environment. The "Liquid" AR-Class has empty geometry, so it cannot be directly visualized, because the liquid's shape is confined to the container it fills. Therefore, the geometry of "Measuring cylinder with liquid" describes visualization of the associated liquid in the context of the geometry included from the "Measuring cylinder" AR-Class. Geometry of the liquid can be parameterized by the following attributes: diameter of "Measuring cylinder", quantity of liquid in "Measuring cylinder with liquid", and color and opacity of "Liquid". In this example, the geometry of the "Measuring cylinder with liquid" AR-Class is directly parameterized by attributes of this AR-Class and indirectly by attributes of its component AR-Classes.
Parameterization of geometry provides a flexible mechanism for building highly dynamic 3D graphics content. In particular, the visualization of the objects can be dynamically changed as a result of user actions performed in an AR environment. For instance, when a user pours different liquids between different containers, the visualization of the liquids filling the containers is dynamically adjusted according to the properties of the liquids.
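As a concrete illustration of such runtime parameterization (the formula and function names below are ours, not part of ARIES), the visible height of a liquid column in a cylindrical container can be derived from the quantity attribute of the liquid and the diameter attribute of the cylinder:

# Sketch: deriving a visual parameter (liquid column height) from attribute values.
import math

def liquid_column_height(quantity_ml: float, diameter_cm: float) -> float:
    """Height (in cm) of a liquid column of the given volume in a cylinder.

    1 ml = 1 cm^3, so height = volume / cross-section area.
    """
    radius_cm = diameter_cm / 2.0
    return quantity_ml / (math.pi * radius_cm ** 2)

# E.g., 50 ml poured into a measuring cylinder of 2 cm inner diameter:
print(round(liquid_column_height(50.0, 2.0), 1))   # about 15.9 cm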
3.1.3. Behavior
Behavior in AR-Classes is defined by two kinds of operations: methods and activities. Methods are sequences of commands that access and process data in an immediate way. In contrast, activities access and process data continuously over a period of time.
Activities describe the behavior of AR-Objects in time, in particular their reactions to events occurring in an AR environment. Each
activity denotes some distinctive behavior of the AR-Objects instantiated from an AR-Class. Each AR-Object can contain a number of
different activities. Activities can be activated and deactivated at runtime. When an activity is activated for an AR-Object, an activity instance
is created. Execution of activity instances depends on user interaction and behavior of other AR-Objects. An activity instance is executed
until it is explicitly deactivated or it is completed. For one AR-Object, a number of instances of different activities can be executed at the same
time. In particular, a number of instances of one activity can run simultaneously.
Each activity defines an interaction context for instances of an AR-Class. The interaction context of an activity defined for an AR-Class specifies the classes of objects that can interact with the AR-Objects instantiated from the AR-Class. For instance, consider an AR-Class "Pipette" representing pipettes used for transferring liquids between containers. The "Pipette" class should contain specifications of two activities called "DrawingFrom" and "DrippingTo", respectively. The "DrawingFrom" activity enables a pipette to draw liquid from a container, whereas the "DrippingTo" activity enables a pipette to drip liquid into a container. The interaction context of these activities contains the "Container with liquid" class. A number of instances of these activities can be executed simultaneously for different AR-Objects instantiated from the "Container with liquid" class. As a result, for each instance of the "Pipette" class it is possible to indicate which containers can be used to draw liquid from, and which ones can be used to drip liquid to. Those associations can be dynamically changed at
runtime.
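A hedged sketch of how such activities and interaction contexts might be interpreted at runtime is given below; the class and activity names follow the pipette example, but the code is our own illustration, not the ARIES implementation (which specifies activities declaratively in its XML-based language).

# Illustrative sketch: activities with an interaction context restricted to
# containers with liquid, following the "Pipette" example above.
class ContainerWithLiquid:
    def __init__(self, name: str, quantity_ml: float):
        self.name = name
        self.quantity_ml = quantity_ml

class Pipette:
    CAPACITY_ML = 5.0

    def __init__(self):
        self.content_ml = 0.0

    # Activity "DrawingFrom" -- interaction context: ContainerWithLiquid
    def drawing_from(self, source: ContainerWithLiquid) -> None:
        drawn = min(self.CAPACITY_ML - self.content_ml, source.quantity_ml)
        source.quantity_ml -= drawn
        self.content_ml += drawn

    # Activity "DrippingTo" -- interaction context: ContainerWithLiquid
    def dripping_to(self, target: ContainerWithLiquid, amount_ml: float) -> None:
        dripped = min(amount_ml, self.content_ml)
        self.content_ml -= dripped
        target.quantity_ml += dripped

beaker = ContainerWithLiquid("beaker with phenolphthalein solution", 30.0)
cylinder = ContainerWithLiquid("measuring cylinder with NaOH solution", 20.0)
pipette = Pipette()
pipette.drawing_from(beaker)        # fill the pipette from the beaker
pipette.dripping_to(cylinder, 2.0)  # drip into the measuring cylinder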
3.1.4. Media objects
An AR-Class may contain media objects, such as images, videos, and audio clips. AR-Classes may define attributes whose values are allowed
to be media objects. Each such attribute can be associated with a default media object contained in the AR-Class. In the AR-Objects
instantiated from the AR-Class, media object attributes can be set to media objects different from their default values.
Media objects contained in an AR-Class can be referenced in the specification of geometry and behavior of the AR-Class. In the geometry specification, images and videos can be used as textures. Also, audio clips can be embedded in a geometry model, if this is supported by the modeling language used for the geometry specification. The media objects can also be used as an aid in instruction during learning scenarios
presented in AR environments. The media objects can provide supplementary information on the learning content presented. Images and
videos can be displayed as 2D overlays on top of the view of an AR environment, while audio clips can be played in the background. Audio
clips can contain sound effects, background music, or voice instructions.

3.2. ARIES system


3.2.1. System architecture
The overall architecture of the ARIES system is presented in Fig. 1. The central role is played by the learning content repository component, which stores the AR-Classes and AR-Objects used for building AR environments. AR-Classes and AR-Objects are created in the repository in the learning content preparation phase. They are created and managed by the use of the AR-Class Manager and the AR-Object Manager, respectively. AR-Class Manager and AR-Object Manager are web applications accessible over the Internet. Thus, these tools can be used without the need to install any additional software apart from a Web browser. The AR-Class and AR-Object managers cooperate with external authoring tools for the creation and modification of geometry and media objects. The media object authoring tools encompass a variety of graphics, audio, and
video editors.
In the learning content preparation phase, the constituent elements of AR-Classes are defined, i.e., attributes, geometry, behavior, media objects, and relationships with other AR-Classes. AR-Classes are administered by content designers using AR-Class Manager. Since the creation of different elements of AR-Classes requires different skills, the creation of AR-Classes can be an iterative process in which new constituent elements are added incrementally and existing elements are modified by different designers. Using AR-Class Manager, content designers can navigate the hierarchy of AR-Classes, and also create, modify, and delete the AR-Class definitions.


Fig. 1. Architecture of the ARIES system.

The user interface of AR-Class Manager hosted in a web browser window is presented in Fig. 2. On the left side of the window, there is a
tree representing an AR-Class inheritance hierarchy. The root of this hierarchy is the "Object" class, which has three subclasses: "Real Object", "Virtual Object", and "Scene". All the classes representing real objects, virtual objects, and scenes are descendants of the "Real Object", "Virtual Object", and "Scene" classes, respectively. The real object classes are denoted by the "R" icon, the virtual object classes are denoted by the "V" icon, and the scene classes are denoted by the "S" icon. Abstract classes are marked with icons in gray, whereas concrete classes are marked in green. On the right side of the window, there are a number of tabs allowing content designers to edit the constituent elements of AR-Classes. In particular, using the "details" tab, a user can associate 3D geometry with an AR-Class.
Geometry and media objects associated with AR-Classes are created and edited using external authoring tools, and then imported by AR-Class Manager into the learning content repository. To create 3D geometry, different tools and methods can be used depending on its complexity and whether the geometry represents a concrete or an abstract concept. Geometry of abstract concepts can be modeled with a 3D modeling package such as 3ds Max (3ds Max, 2012). In contrast, geometry of real objects can be modeled with photogrammetry techniques. Photogrammetry enables automatic generation of textured 3D models of real objects from photographic images of the objects (Luhmann, Robson, Kyle, & Harley, 2006). Parameterization of geometry can be performed using a 3D modeling package extended with additional plug-ins enabling the parameterization. Using AR-Object Manager, domain experts can easily create, modify, and delete AR-Objects. When a domain expert creates a new AR-Object, he/she sets values of the attributes defined in the corresponding AR-Class.

Fig. 2. AR-Class Manager – hierarchy of AR-Classes for a chemistry lesson.
The user interface of AR-Object Manager hosted in a web browser window is presented in Fig. 3. Similarly to AR-Class Manager, AR-Object
Manager contains the tree representing the AR-Class inheritance hierarchy. In the central part of the window, there is a list of AR-Objects
being instances of the AR-Class selected in the hierarchy. In the example, there are three instances of the "Cylindrical container with liquid" AR-Class in the list. Each of the AR-Objects is defined with different attribute values. On the right side of the window, there are two tabs with control elements enabling content creators to specify the attribute values for the currently selected AR-Object. By setting the attribute values, content creators are able to affect the geometry and behavior of the AR-Objects being defined. The attributes of AR-Objects are divided between the tabs depending on whether they can be set during the creation or setup stage. Furthermore, the values of the setup attributes can be changed in the content setup stage before a lesson takes place.
In the learning content use phase, AR-Objects are retrieved from the repository and loaded into the presentation module, which is
responsible for visualization of AR environments based on the AR-Objects retrieved from the learning content repository.
3.2.2. Presentation module
The presentation module is used for building AR environments. To this end, the module uses an AR installation, which comprises a video
camera, a display device, and a set of real objects in the form of square cardboard markers located in a real environment.
The presentation module consists of two components: Web Browser and AR Browser. Web Browser offers a web-based user interface for
browsing AR-Classes and AR-Objects representing scenes. An instructor can browse scene AR-Objects created in the content creation stage and
select the scene that should be used for building his/her AR environment. Next, the instructor can customize the visualization and behavioral
properties of the AR environment. To this end, the instructor may set values of the attributes defined in the setup modification interface of the
scene AR-Class. These attributes allow instructors to customize learning scenarios to their needs immediately before the learning stage.
The instructor can initiate a learning scenario, and then the presentation module switches to AR Browser, which generates an AR
environment based on the AR-Objects contained in the scene. AR Browser operates in full-screen mode and enables learners to see the AR
environment in which they can interact with the learning content. The AR Browser displayed on a monitor is shown in Fig. 4.

Fig. 3. AR-Object Manager – creating AR-Objects representing different virtual cylindrical containers for liquids.

Fig. 4. AR Browser component – students performing a chemical experiment in an AR environment.

AR Browser combines virtual objects and representations of real objects with live video images captured by a video camera. Visualization
of an AR environment is performed in a loop. In each cycle of the loop, a video frame is grabbed and analyzed to find and identify particular
real objects present in the real environment. Then, positions and orientations of the real objects relative to the camera are calculated. Finally,
virtual content is rendered and superimposed on the captured image. In particular, the rendered content is transformed according to the
locations of the recognized real objects.
In the AR Browser, tracking of real objects is performed using the ARToolKit library (Kato, Billinghurst, Poupyrev, Imamoto, & Tachibana, 2000). ARToolKit is capable of tracking special square-shaped markers placed in a real environment. The ARToolKit library uses computer vision techniques to calculate the position and orientation of a camera relative to the markers in real time. The markers have the form of black squares with a white inner area containing a non-symmetrical pattern. The markers have to be attached to real objects that should be tracked in an AR environment. The real objects have the form of square cardboard pieces with markers printed on their surfaces. Learners can freely
manipulate the real objects and in this way interact with the virtual content presented in the AR environment, as shown in Fig. 4.
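The per-frame loop described above can be summarized by the following sketch. ARToolKit itself is a C library, so the marker detection and pose estimation steps are represented here by hypothetical stand-in functions; only the generic OpenCV capture and display calls are real API.

# Minimal sketch of the AR Browser visualization loop (our own illustration).
# detect_markers, estimate_pose, and render_virtual_content are hypothetical
# stand-ins for the steps that ARIES delegates to ARToolKit and its renderer.
import cv2

def detect_markers(frame):
    # Stand-in: ARToolKit's square-marker detection would run here.
    return []   # list of (marker_id, corner_coordinates) tuples

def estimate_pose(corners):
    # Stand-in: ARToolKit computes the marker-to-camera transform here.
    return None

def render_virtual_content(frame, marker_id, pose):
    # Stand-in: draw the AR-Object registered to this marker onto the frame.
    return frame

capture = cv2.VideoCapture(0)                    # video camera of the AR installation
while True:
    ok, frame = capture.read()                   # 1. grab a video frame
    if not ok:
        break
    for marker_id, corners in detect_markers(frame):            # 2. find and identify markers
        pose = estimate_pose(corners)                            # 3. marker pose relative to camera
        frame = render_virtual_content(frame, marker_id, pose)   # 4. superimpose virtual content
    cv2.imshow("AR Browser", frame)
    if cv2.waitKey(1) & 0xFF == 27:              # Esc ends the presentation
        break
capture.release()
cv2.destroyAllWindows()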
A complete view of an AR environment can be displayed by the presentation module on a head-mounted display, desktop monitor, or
projection system. Using an HMD would be appealing for learners who would like to interact with learning content while looking directly at the real environment instead of a computer display. However, using an HMD in a real classroom is rather difficult for organizational and financial reasons. Thus, it is recommended to use large-screen displays instead, to enable easier access and allow a number of learners to collaborate in an AR environment at the same time. In the evaluation of the system, we used 22-inch LCD monitors. Monitors of this size were satisfactory, because each AR installation was used by at most two users at a time.
4. Methods
4.1. Research model
The aim of the experiment was to evaluate the learners' attitude toward experiential learning in AR environments. In the experiment we adopted the Technology Acceptance Model (TAM), which explains the determinants that encourage system use (Davis, 1989; Davis,
Bagozzi, & Warshaw, 1989). The TAM model is a widely used model in technology acceptance studies (Teo, 2009; Sun & Cheng, 2009). The
basic TAM model is shown in Fig. 5.
In the TAM model, acceptance of a system is represented by intention to use, which is determined by the user's attitude toward using the system and perceived usefulness. Attitude toward using a system is determined by the user's perceptions of the usefulness and ease of use of the
system. According to TAM, perceived usefulness is determined by perceived ease of use. In addition, perceived usefulness and perceived ease
of use can be affected by various external variables. These variables describe user characteristics, system features, and the setting in which
the system is used.

Fig. 5. Technology Acceptance Model (TAM).


Perceived usefulness is defined as "the degree to which a person believes that using a particular system would enhance his or her job performance" (Davis, 1989, p. 320). In the learning context, the user believes that a system would yield positive benefits for learning. Perceived ease of use refers to "the degree to which a person believes that using a particular system would be free of effort" (Davis, 1989, p. 320). Perceived usefulness and perceived ease of use have been extensively investigated in a number of studies, which showed that they are important factors positively influencing computer acceptance (Yuen & Ma, 2002; Liaw & Huang, 2003; Lin & Wu, 2004). However, some studies criticized the original TAM model due to the omission of intrinsic factors that influence computer acceptance (Moon & Kim, 2001; Chung & Tan, 2004). Furthermore, prior studies showed that perceived enjoyment has a significant positive influence on attitude toward using, and thus it should be included in the TAM model (Chung & Tan, 2004; Wu, Chen, & Lin, 2007; Teo & Noyes, 2011).
The original TAM model takes into consideration only extrinsic motivation in the form of perceived usefulness. Extrinsic motivation is
considered to be instrumental in achieving objectives that are distinct from the activity itself. In contrast, intrinsic motivation is related to
the process of performing the activity per se. Thus, perceived usefulness is a form of extrinsic motivation, whereas perceived enjoyment is
considered as intrinsic motivation (Davis, Bagozzi, & Warshaw, 1992; Teo, Lim, & Lai, 1999).
Davis et al. proposed a revised TAM model including perceived enjoyment as an intrinsic motivational factor. Perceived enjoyment is defined as "the extent to which the activity of using the computer is perceived to be enjoyable in its own right, apart from any performance consequences that may be anticipated" (Davis et al., 1992, p. 1113).
For evaluation of the acceptance of AR environments by learners we adopted the TAM model enhanced with perceived enjoyment
proposed by Davis et al. (1992). The research model for examining the impact of extrinsic and intrinsic factors on using the ARIES system by
learners is presented in Fig. 6. According to the model, perceived usefulness and perceived enjoyment directly influence attitude toward using and intention to use the system. Furthermore, perceived ease of use may directly affect both extrinsic and intrinsic motivation, and the attitude toward using.

Fig. 6. Research model based on TAM.
The AR interface based on real objects enabling direct manipulation of virtual objects particularly distinguishes AR environments from
other learning environments built with traditional computer systems. Therefore, in this work we explored the influence of the AR interface on the main constructs directly determining the attitude toward using, i.e., perceived usefulness, perceived enjoyment, and perceived ease of use. To this end, in the research model we included one external variable, interface style (IS), which may have a significant influence on the determinants of attitude toward using. A significant impact of interface style on the attitude of users toward using a system was demonstrated in
previous studies (Hasan & Ahmed, 2007; Sun & Cheng, 2009).
The following research hypotheses were formulated on the basis of the research model:
H1. Perceived usefulness (PU) will positively affect attitude toward using (ATU).
H2. Perceived usefulness (PU) will positively affect intention to use (ITU).
H3. Perceived enjoyment (PE) will positively affect attitude toward using (ATU).
H4. Perceived enjoyment (PE) will positively affect intention to use (ITU).
H5. Perceived ease of use (PEU) will positively affect perceived usefulness (PU).
H6. Perceived ease of use (PEU) will positively affect perceived enjoyment (PE).
H7. Perceived ease of use (PEU) will positively affect attitude toward using (ATU).
H8. Attitude toward using (ATU) will positively affect intention to use (ITU).
H9. Interface style (IS) will positively affect perceived usefulness (PU).
H10. Interface style (IS) will positively affect perceived enjoyment (PE).
H11. Interface style (IS) will positively affect perceived ease of use (PEU).
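Taken together, hypotheses H1–H11 can be summarized as the following set of structural relations (a sketch in regression form; the coefficients β and error terms ε are to be estimated from the questionnaire data and are not part of the hypotheses themselves):

\[
\begin{aligned}
\mathrm{PEU} &= \beta_{1}\,\mathrm{IS} + \varepsilon_{1},\\
\mathrm{PE}  &= \beta_{2}\,\mathrm{IS} + \beta_{3}\,\mathrm{PEU} + \varepsilon_{2},\\
\mathrm{PU}  &= \beta_{4}\,\mathrm{IS} + \beta_{5}\,\mathrm{PEU} + \varepsilon_{3},\\
\mathrm{ATU} &= \beta_{6}\,\mathrm{PU} + \beta_{7}\,\mathrm{PE} + \beta_{8}\,\mathrm{PEU} + \varepsilon_{4},\\
\mathrm{ITU} &= \beta_{9}\,\mathrm{PU} + \beta_{10}\,\mathrm{PE} + \beta_{11}\,\mathrm{ATU} + \varepsilon_{5},
\end{aligned}
\]

with each hypothesis corresponding to the claim that the respective β coefficient is positive.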
4.2. Application scenario
The AREM approach can be applied to teaching in different domains such as chemistry, physics, geography, biology, and cultural heritage
(Wojciechowski, Walczak, & Cellary, 2005; Walczak & Wojciechowski, 2005). To illustrate the concept of AR environments we have chosen
the chemistry domain for several reasons. First, chemistry is fundamentally an experimental science, in which experimentation and observation are essential for understanding many chemical concepts. Second, chemistry is a particularly challenging area as far as the interactive visualization of chemical experiments is concerned, due to the high dynamism of the visual and behavioral aspects of chemical reactions.
Consider an example interactive scenario that shows learners the reaction of hydrochloric acid (HCl) and sodium hydroxide (NaOH). The
scenario allows learners to gain knowledge of the acidic and basic nature of the substances. In an acid–base reaction, an acid and a base react to form a salt and water. In the example scenario, the reaction of hydrochloric acid (HCl) and sodium hydroxide (NaOH) produces table salt (NaCl) and water.
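In chemical notation, the neutralization carried out in the scenario is:

\[ \mathrm{HCl} + \mathrm{NaOH} \rightarrow \mathrm{NaCl} + \mathrm{H_{2}O}. \]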
The experiment requires the following laboratory equipment: laboratory beakers, a measuring cylinder, a pipette, a porcelain evaporating dish, a Bunsen burner, and a pair of tongs. In the experiment the following chemical supplies are used: HCl solution in water, NaOH solution in water, phenolphthalein solution in ethanol, and distilled water. One of the beakers is filled with the HCl solution, the other is filled with the phenolphthalein solution. The measuring cylinder contains the NaOH solution. To conduct the experiment, a learner must perform the following steps:
1. Fill the pipette with the phenolphthalein solution from the beaker and drip it into the measuring cylinder with the NaOH solution. The
solution in the cylinder changes color from colorless to pink.
2. Rinse the pipette with distilled water. If a learner does not do so, he or she should get an appropriate message to remind him/her of the need to rinse
the pipette before drawing another liquid.
3. Fill the pipette with the HCl solution from the beaker and drip it into the measuring cylinder with the NaOH solution. The mixture in the
cylinder changes color from pink to colorless. Continue dripping until the neutralization process is complete.
4. Take the measuring cylinder and pour the mixture into the evaporating dish.
5. Place the evaporating dish over the laboratory burner. Heat the evaporating dish over the burner flame until the solution evaporates to dryness. The evaporation process produces steam, leaving NaCl crystals in the dish.
In traditional teaching, such an experiment is carried out by a teacher, who demonstrates and explains the phenomena occurring during
the experiment. The participation of students in carrying out such an experiment is limited to the observation of the phenomena taking
place and asking questions to the teacher. However, this way of learning is not very engaging for students, who are not allowed to carry out
chemical experiments in person due to safety measures, limited resources in laboratories, and the limited time available for a given group of
students. In addition, teachers cannot carry out potentially dangerous experiments, which could cause an explosion, fire, or the emission of hazardous substances.
4.2.1. Learning content preparation
The AR-Classes and AR-Objects created for the acid–base reaction scenario are presented in Fig. 7. The upper part of the diagram shows a
fragment of the AR-Class inheritance hierarchy dened for the scenario. Below the AR-Classes, a selection of AR-Objects used in the scenario
is presented.

Fig. 7. AR-Classes and AR-Objects defined for the application scenario.

AR-Classes have a single-line border, whereas AR-Objects are framed by a double line. The is-instance-of relationships between AR-Objects and AR-Classes are represented by a dashed line with a solid arrowhead that points to an AR-Class. The composition and containment relationships are denoted by a solid line with a solid or an empty diamond, respectively.
The acid–base reaction scenario is implemented as the "HCl–NaOH reaction" object, which is an instance of the "Acid–base reaction" class. The "HCl–NaOH reaction" object represents a scene describing an AR environment. The "Acid–base reaction" class is connected by the containment and composition relationships with AR-Classes representing both virtual and real objects. The "HCl–NaOH reaction" object is connected by the containment and composition relationships with the appropriate AR-Objects conforming to the AR-Classes specified for the relationships defined in the "Acid–base reaction" class.
4.2.2. Learning content use
When a lesson in a classroom is going to take place, an instructor selects the "HCl–NaOH reaction" object from the learning content
repository and sets the learning scenario presentation parameters. Next, the scenario is started and the required AR-Objects are retrieved
from the learning content repository and loaded into the presentation module, which builds a complete AR environment.
When the acid–base reaction scenario is initiated, a learner is provided via the presentation module with the AR environment containing
both real and virtual objects. There are two beakers with the phenolphthalein and HCl solutions standing on the virtual caption boxes with
appropriate labels, the measuring cylinder with the NaOH solution, the pipette, and two beakers for rinsing the pipette with distilled water
standing on the green virtual caption box. The presented laboratory glassware is represented as virtual objects. The cylinder and the pipette
are attached to real objects having the form of square cardboard pieces.
At the beginning of the scenario a learner should drip some quantity of the phenolphthalein solution into the NaOH solution. Manipulating the appropriate marker, the learner draws the phenolphthalein solution into the pipette from the beaker. To this end, he/she has to
move the pipette close enough to the beaker with the solution. While the tapered pipette end is located close enough over the beaker, the pipette fills with the solution. The filling stops when the learner moves the pipette away from the beaker. Next, the learner drips the
phenolphthalein solution into the NaOH solution by placing the pipette end over the measuring cylinder. While the dripping takes place, the
solution in the cylinder changes color from colorless to pink, because the NaOH solution has a basic pH.
The next step of the chemical experiment is transferring the HCl solution from the beaker to the measuring cylinder. Before that, the
learner has to rinse the pipette with distilled water for safety reasons. If he/she forgets about it, a message appears on the screen, which
reminds him/her that the pipette has to be rinsed every time before using it for transferring various liquids. After rinsing the pipette, the
learner transfers some quantity of the HCl solution with the pipette from the beaker to the measuring cylinder containing the NaOH solution
mixed with the pH indicator. While the learner drips the HCl solution, the mixture in the cylinder changes color from pink to colorless,
because the acid neutralizes the basic pH of the NaOH solution. When the mixture in the cylinder has a neutral pH, the learner pours the mixture from the cylinder into the evaporating dish. Next, the learner grabs the evaporating dish with the tongs and places the dish above the burner flame in order to heat the mixture.
When the mixture is heated sufficiently, water evaporates and condenses into mist. At the same time, sodium chloride (NaCl) appears at the bottom of the evaporating dish. The learner can control the heating process by manipulating the tongs with the marker. The scenario finishes when all the water evaporates from the mixture, leaving dry NaCl at the bottom of the evaporating dish.
4.3. Design of empirical study
The empirical study consisted of using the ARIES system to carry out a chemical experiment in an AR environment according to the
application scenario presenting the reaction between hydrochloric acid and sodium hydroxide. The study involved 42 participants from the second grade of lower secondary school, aged 14–16 years. The chemistry curriculum in the second grade of lower secondary school
in Poland includes topics such as acids, bases, salts, and pH. Therefore, the application scenario used in the study concerned the topic of the
reaction between acids and bases, which is consistent with the curriculum of the students.
In order to perform the study, we set up six AR installations for carrying out chemical experiments in AR environments. Each AR
installation was composed of a desktop PC with a monitor, a webcam, and a set of square cardboard markers. One of the AR installations is
shown in Fig. 8. In the installation, a webcam is placed on top of the monitor. It captures the area in which a set of square cardboard markers
are placed. Students sit in front of the monitor and can freely manipulate the markers. The image captured by the webcam and displayed on the screen is flipped horizontally. This allows the students to see on the monitor their mirror image augmented with virtual objects. In this way,
students get the illusion that virtual objects exist in their environment. Furthermore, the students have the opportunity to directly interact
with the virtual objects using the real markers in a natural and intuitive way, as shown in Figs. 9 and 10.

Fig. 8. AR installation – a desktop PC with a monitor, a webcam, and a set of physical markers.
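The horizontal mirroring of the captured image described above is a single operation in a typical video pipeline; for instance, assuming an OpenCV-based capture loop like the sketch in Section 3.2.2:

# Sketch: horizontal flip of each captured frame to obtain the mirror view.
import cv2

capture = cv2.VideoCapture(0)
ok, frame = capture.read()
if ok:
    mirrored = cv2.flip(frame, 1)   # flip code 1 = flip around the vertical axis
capture.release()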
Students performed the chemical experiments in groups of two. They collaborated on performing the experiments, exchanged remarks about the presented learning content, and gave each other instructions on how to use the system. After a few minutes of working with the system, the students had acquired the relevant experience in interacting with the learning content using the interface based on the cardboard markers. The students were free to manipulate the markers and could focus on carrying out chemical experiments in an AR
environment, as shown in Fig. 10.
Each student could carry out the experiment at his/her own pace tailored to his/her personal preferences. Students were able to freely
manipulate the real markers but the experiment scenario restricted possible interactions between virtual objects to those that were
essential for the proper execution of the experiment. While performing the experiment, the students were following the guidance displayed
on the screen. The guidance comprised instructions and explanations of the chemical and physical phenomena, and thus teacher involvement was kept to a minimum. There were no technical problems during the study, so the students could focus on the merits of the
experiment.
All of the 42 participants completed the experiment successfully. After the completion of the experiment, the participants were asked to
fill out an anonymous questionnaire with statements about working with the ARIES system and their attitude toward using such a system in the learning process in the future. We developed a questionnaire to measure each of the constructs comprising the research model presented in Fig. 6. The participants were asked to provide demographic information and to respond to 18 statements grouped into six groups representing the constructs of the research model. The questionnaire statements were adapted from previous studies dealing with the TAM model, with wording changes to adjust them to the context of AR environments. Each statement in the questionnaire was measured on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree), with the exception of one reversed item for attitude toward using, which was measured on a five-point Likert scale ranging from 1 (strongly agree) to 5 (strongly disagree).

Fig. 8. AR installation: a desktop PC with a monitor, a webcam, and a set of physical markers.
5. Results
5.1. Descriptive statistics
The statements of the questionnaire and the descriptive statistics for each statement are presented in Table 1. All mean values fall within the range of 3.93 to 4.62. The standard deviations range from 0.623 to 1.310.
To measure the internal consistency of the statements, Cronbach's alpha coefficient was calculated for the statements belonging to each construct specified in the research model. For the internal reliability of statements concerning the same construct to be considered satisfactory, Cronbach's alpha should be greater than 0.7. The obtained Cronbach's alpha values for each construct except ATU are at a satisfactory level, as shown in Table 2. In the case of ATU, the value is slightly lower, which may indicate minor differences between the statements formulated regarding attitude toward using. This discrepancy could be influenced by the fact that one of the three statements was a reversed item phrased in the opposite semantic direction from the other statements. Negative statements used together with positive statements can decrease the degree of internal consistency, because the negative items may not be considered the exact opposite of the positive ones (Barnette, 2000).
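As an illustration of how such reliability figures can be reproduced, the sketch below computes Cronbach's alpha from a respondents-by-items score matrix and applies the usual reverse-scoring of a negatively worded item on a five-point scale. The array contents are made-up example answers, not the study data, and the function names are hypothetical.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: (n_respondents, n_items) matrix of Likert answers for one construct."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def reverse_item(item_scores: np.ndarray, scale_max: int = 5) -> np.ndarray:
    """Reverse-score a negatively worded item on a 1..scale_max scale."""
    return (scale_max + 1) - item_scores

# Example with made-up answers for three ATU items (third column is the reversed item).
atu = np.array([[5, 4, 2],
                [4, 5, 1],
                [5, 5, 1],
                [3, 4, 2]], dtype=float)
atu[:, 2] = reverse_item(atu[:, 2])
print(f"Cronbach's alpha (ATU): {cronbach_alpha(atu):.3f}")
```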
5.2. System evaluation
To verify hypotheses H5, H6, H9, H10, and H11, we examined the relationships between pairs of the appropriate constructs defined in the research model using regression analysis. The results of the regression analysis are presented in Table 3.

Fig. 9. Mirrored image augmented with virtual objects: direct interaction of students with virtual objects.


Fig. 10. Students carrying out a chemical experiment by manipulating real objects.

The p value was less than the assumed significance level of 0.05 for all of the calculated regressions. Thus, for each of the hypotheses we rejected the null hypothesis indicating the lack of dependence.
Perceived usefulness depends to a similar extent on perceived ease of use (R² = 0.491) and interface style (R² = 0.478). Based on the regression values, hypotheses H5 and H9 were supported. Perceived enjoyment was dependent to a relatively small extent on both perceived ease of use (R² = 0.346) and interface style (R² = 0.368). However, the regression values were sufficient to accept hypotheses H6 and H10. Interface style had a significant impact on perceived ease of use (R² = 0.596), so hypothesis H11 was supported.
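Each of these pairwise tests corresponds to a simple linear regression of one construct score on another. A minimal sketch of how such R² and p values can be obtained is given below; the construct scores would typically be per-participant means of the three items, and the variable names and numbers are hypothetical, not the study data.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical per-participant construct scores (means of the three Likert items each).
peu_scores = np.array([4.3, 3.7, 4.0, 4.7, 3.3, 4.0, 4.3, 5.0])
pu_scores  = np.array([4.0, 3.7, 4.3, 4.7, 3.0, 4.3, 4.0, 4.7])

result = linregress(peu_scores, pu_scores)   # e.g. the PEU -> PU relationship (H5)
print(f"R^2 = {result.rvalue ** 2:.3f}, p = {result.pvalue:.4f}")
```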
In order to thoroughly investigate the factors that affect attitude toward using and intention to use, we used stepwise multiple regression analysis. The results of the analysis are presented in Table 4.
As a result of the stepwise multiple regression analysis, we obtained a regression model of attitude toward using based on perceived usefulness and perceived enjoyment (R² = 0.827). The results of the regression analysis supported hypotheses H1 and H3. The stepwise multiple regression algorithm excluded perceived ease of use due to a p value higher than the significance level (p = 0.243). This means that hypothesis H7 was not supported. Based on these results, perceived usefulness and perceived enjoyment had a strong impact on attitude toward using the system.
Based on the stepwise multiple regression analysis, intention to use depended on attitude toward using and perceived enjoyment (R² = 0.737). The results of the regression analysis supported hypotheses H4 and H8. The stepwise multiple regression algorithm excluded perceived usefulness because of a p value above the significance level (p = 0.953). This means that hypothesis H2 was not supported. Based on the regression model, perceived enjoyment and attitude toward using had a strong positive effect on intention to use the system.
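A stepwise procedure of this kind can be sketched as forward selection on p values: predictors are added one at a time and kept only while their p value stays below the significance level, which is how a candidate such as PEU (p = 0.243) or PU (p = 0.953) ends up excluded. The following statsmodels-based implementation is illustrative only; the exact selection rule and the statistical package used in the original analysis are assumptions.

```python
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(y: pd.Series, X: pd.DataFrame, alpha: float = 0.05):
    """Forward stepwise OLS: repeatedly add the predictor with the lowest p value while it is below alpha."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for candidate in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [candidate]])).fit()
            pvals[candidate] = model.pvalues[candidate]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break                       # no remaining predictor is significant
        selected.append(best)
        remaining.remove(best)
    final = sm.OLS(y, sm.add_constant(X[selected])).fit()
    return selected, final              # final.rsquared gives the model R^2

# Usage sketch: ATU ~ {PU, PE, PEU}, with df holding per-participant construct scores.
# predictors, model = forward_stepwise(df["ATU"], df[["PU", "PE", "PEU"]])
```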

Table 1
The questionnaire statements and the means and standard deviations of the answers.

Interface style (IS)
- Moving virtual objects using the cardboard cards is easy. (M = 3.93, S.D. = 0.921)
- Operation of a computer with the cards is a good idea. (M = 4.17, S.D. = 0.935)
- I could easily control the course of the chemical experiment using the cards. (M = 4.10, S.D. = 0.958)
Perceived usefulness (PU)
- The use of such a system improves learning in the classroom. (M = 4.14, S.D. = 1.095)
- Using the system during lessons would facilitate understanding of certain concepts. (M = 4.19, S.D. = 1.087)
- I believe that the system is helpful when learning. (M = 4.31, S.D. = 0.869)
Perceived ease of use (PEU)
- I think the system is easy to use. (M = 4.29, S.D. = 0.708)
- Learning to use the system is not a problem. (M = 4.50, S.D. = 0.707)
- Operation with the system is clear and understandable. (M = 4.33, S.D. = 0.846)
Perceived enjoyment (PE)
- I think the system allows learning by playing. (M = 4.40, S.D. = 0.767)
- I enjoyed using the system. (M = 4.43, S.D. = 0.859)
- Learning with such a system is entertainment. (M = 4.29, S.D. = 0.944)
Attitude toward using (ATU)
- The use of such a system makes learning more interesting. (M = 4.55, S.D. = 0.803)
- Learning through the system was boring (reversed item). (M = 4.21, S.D. = 0.782)
- I believe that using such a system in the classroom is a good idea. (M = 4.12, S.D. = 1.310)
Intention to use (ITU)
- I would like to use the system in the future if I had the opportunity. (M = 4.29, S.D. = 0.835)
- Using such a system would allow me to perform chemical experiments on my own. (M = 4.62, S.D. = 0.623)
- I would like to use the system to learn chemistry and other subjects. (M = 4.40, S.D. = 1.037)


Table 2
The Cronbach's alpha values.

Interface style (IS): 0.808
Perceived usefulness (PU): 0.839
Perceived ease of use (PEU): 0.737
Perceived enjoyment (PE): 0.735
Attitude toward using (ATU): 0.639
Intention to use (ITU): 0.839

Table 3
The regression analysis.

Dependent variable             Independent variable            R²       p
Perceived usefulness (PU)      Perceived ease of use (PEU)     0.491    <0.001
Perceived usefulness (PU)      Interface style (IS)            0.478    <0.001
Perceived ease of use (PEU)    Interface style (IS)            0.596    <0.001
Perceived enjoyment (PE)       Perceived ease of use (PEU)     0.346    <0.001
Perceived enjoyment (PE)       Interface style (IS)            0.368    <0.001

Table 4
The stepwise regression analysis.

Dependent variable             Predictors                      R²       p
Attitude toward using (ATU)    Perceived usefulness (PU)       0.827    <0.001
                               Perceived enjoyment (PE)                 <0.001
Intention to use (ITU)         Attitude toward using (ATU)     0.737    <0.001
                               Perceived enjoyment (PE)                 0.020

6. Conclusions
Following the empirical study, we found that perceived usefulness and perceived enjoyment had a similar effect on attitude toward using image-based AR environments. With regard to the intention to use AR environments, perceived enjoyment was a much more significant factor than perceived usefulness. Therefore, the use of AR environments during lessons could provide extra motivation to learn for young students. Before performing the empirical study, we wondered whether the interface style based on physical markers would act as a disincentive to use the system. Based on the analysis, it turned out that although interface style had a strong influence on perceived ease of use, these two factors had little effect on perceived enjoyment. Despite the fact that the user interface based on physical markers required a little practice at the beginning, it was not a factor limiting perceived enjoyment. Furthermore, perceived enjoyment was the factor having the essential influence on the willingness of students to use the system in the learning process.
An alternative interpretation of the study results is that the positive attitude of learners to the AR technology was expressed due to its
novelty. It can be assumed that the positive attitude of students to learning in AR environments will fade with time, since the learners will
get used to the technology. To maintain the learners' interest in AR technology in the longer term, continuous provision of engaging learning
content is of crucial importance. Thus, successful dissemination of the AR technology in education on a large scale will greatly depend on the
availability and quality of learning content for AR environments. Therefore, research on the application of AR in education should focus
primarily on the development of new methods for the creation of interactive 3D content for AR learning environments. The proposed ARIES
system fits in with this trend and enables teachers to develop new learning content by creating AR-Classes and AR-Objects. The
opportunity for the creation of new content by teachers themselves fosters continuous development of high-quality learning content, since
they have the substantive and pedagogical knowledge required to prepare such content in accordance with the curriculum and pedagogy.
Image-based AR environments seamlessly combine interactive 3D learning content with real environments containing physical objects.
The learners can interact with the content in a direct and intuitive way by manipulation of physical objects, thus they have an opportunity to
perform different experiments in person. The active participation of learners in hands-on activities has a particularly positive effect on the
perceived enjoyment, resulting in their increased motivation for learning.
Another important advantage of image-based AR environments is the freedom of experimentation, which could be impossible to achieve
in the real world due to cost and safety reasons. In the presented application scenario, AR environments are used to implement experiential
learning for performing chemical experiments. In these environments students are able to carry out experiments in person using virtual
counterparts of real laboratory equipment and chemicals. The replacement of the actual laboratory resources with their virtual counterparts
enables educational institutions to achieve significant financial savings. Once designed and developed, virtual objects can be reused by a
number of students for performing various experiments. One installation for learning in image-based AR environments can be used for a
broad spectrum of chemical experiments without having to make changes to the physical configuration of this installation. The AR
installation takes up much less space than a typical workbench for chemical experiments and does not require any special chemistry
laboratory infrastructure.
It is possible to set up a number of AR installations in a classroom in order to enable a number of students, individually or in small groups, to perform chemical experiments in parallel. Setting up a number of AR installations in a classroom enables each student to perform an experiment independently at his/her individual pace. Also, students can perform different experiments in parallel, if decided by a teacher.


Experimenting in person encourages students to test various what-if scenarios that are possible for a given chemical experiment. In
particular, students can perform potentially dangerous tasks without compromising their health and safety.
AR has great potential for educational applications because it supports situated learning (Johnson, Smith, Willis, Levine, & Haywood,
2011). To ensure successful situated learning, AR environments should provide a reliable representation of reality to allow students to
gain knowledge applicable in the real world. The realistic learning contexts offered by AR environments considerably facilitate the transfer
of the abilities learned to the real world, in comparison to learning out of the real context.
However, not all learning experiences are equally educative (Dewey, 1938). Some experiences can be mis-educative unless they are followed by the opportunity to reflect on what happened. Learners should be able to draw generalizations from the experiences and understand how to use these generalizations in future experiences. Thus, teachers have to create learning environments in which learners are actively engaged in meaningful tasks and carefully guided in reflecting on their experiences. During the preparation of learning content for AR environments, special attention must be paid to the representation of potentially dangerous activities. The ease of exploring the consequences of such actions and the sense of security offered by AR environments must not foster the students' illusory belief that performing the same actions in the real world would also not result in hazardous consequences. Therefore, AR learning environments should include appropriate guidance for the operations performed by students, and particularly dangerous activities should always be accompanied by the relevant safety warnings, or even prohibited.
We conclude that learning in image-based AR environments can be particularly attractive and evocative for younger generations, who may perceive it more as edutainment than pure learning. This is the right time to introduce AR to teaching on a large scale. In recent years, the wide availability of 3D computer games and movies based on 3D computer graphics has led to widespread familiarity with 3D technologies. The young generations accustomed to 3D games and movies demand similar experiences in education. In this context, AR technology may offer great help to educational institutions in increasing the attractiveness of teaching, thereby providing better motivation for students to learn.
The study on the attitude of learners toward AR learning environments is only the first step in the dissemination of AR technology in education. Further research should focus on whether students are actually acquiring knowledge and to what extent their knowledge of the concepts and processes presented in AR environments is increased. Next, a comparative experimental study should be carried out to determine whether students taught with the use of AR achieve significantly better results and higher levels of self-efficacy compared to a control group taught using traditional methods.

References
3ds Max. (2012). Autodesk 3ds Max. Accessed 19.10.12. http://www.autodesk.com/3dsmax.
Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators and Virtual Environments, 6(4), 355–385.
Barnette, J. J. (2000). Effects of stem and Likert response option reversals on survey internal consistency: If you feel the need, there is a better alternative to using those negatively worded stems. Educational and Psychological Measurement, 60, 361–370.
Beard, C., & Wilson, J. P. (2006). Experiential learning: A best practice handbook for educators and trainers (2nd ed.). London: Kogan Page Limited.
Boyle, T. (2003). Design principles for authoring dynamic, reusable learning objects. Australian Journal of Educational Technology, 19(1), 46–58.
Bruner, J. S. (1996). The culture of education. Cambridge, MA: Harvard University Press.
Burdea, G. C., & Coiffet, P. (2003). Virtual reality technology. NJ: John Wiley & Sons.
Cellary, W. (2002). Social changes. In W. Cellary (Ed.), Poland and the global information society: Logging on (pp. 29–33). Warsaw: UNDP. Accessed 19.10.12. http://hdr.undp.org/es/informes/nacional/europacei/poland/poland_2001_en.pdf.
Chaille, C., & Britain, L. (2002). The young child as scientist: A constructivist approach to early childhood science education (3rd ed.). Boston: Allyn & Bacon.
Champion, E. (2006). Enhancing learning through 3D virtual environments. In E. K. Sorensen, & D. Ó Murchú (Eds.), Enhancing learning through technology (pp. 103–124). London: Idea Group Inc., Information Science Publishing.
Cheng, K.-H., & Tsai, C.-C. (2012). Affordances of augmented reality in science learning: suggestions for future research. Journal of Science Education and Technology. http://dx.doi.org/10.1007/s10956-012-9405-9.
Chung, J., & Tan, F. B. (2004). Antecedents of perceived playfulness: an exploratory study on user acceptance of general information-searching websites. Information and Management, 41(7), 869–881.
Dalgarno, B., Bishop, A. G., Adlong, W., & Bedgood, D. R. (2009). Effectiveness of a virtual laboratory as a preparatory resource for distance education chemistry students. Computers & Education, 53(3), 853–865.
Davies, E. R. (2005). Machine vision: Theory, algorithms, practicalities (3rd ed.). San Francisco, CA: Morgan Kaufmann Publishers.
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: a comparison of two theoretical models. Management Science, 35(8), 982–1003.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1992). Extrinsic and intrinsic motivation to use computers in the workplace. Journal of Applied Social Psychology, 22(14), 1111–1132.
Dewey, J. (1938). Experience & education. New York, NY: Simon & Schuster (republished in 1997).
Drascic, D., & Milgram, P. (1996). Perceptual issues in augmented reality. In M. T. Bolas (Ed.), SPIE Volume 2653: Stereoscopic displays and virtual reality systems III (pp. 123–134). San Jose, CA: SPIE.
Duffy, T. M., & Cunningham, D. J. (1996). Constructivism: implications for the design and delivery of instruction. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (pp. 170–198). NY: Simon & Schuster.
Dunleavy, M., Dede, C., & Mitchell, R. (2009). Affordances and limitations of immersive participatory augmented reality simulations for teaching and learning. Journal of Science Education and Technology, 18(1), 7–22.
Fjeld, M., Juchli, P., & Voegtli, B. M. (2003). Chemistry education: a tangible interaction approach. In M. Rauterberg (Ed.), Proceedings of human-computer interaction INTERACT'03 (pp. 287–294). Amsterdam: IOS Press.
Hasan, B., & Ahmed, M. U. (2007). Effects of interface style on user perceptions and behavioral intention to use computer systems. Computers in Human Behavior, 23, 3025–3037.
Jara, C. A., Candelas, F. A., Puente, S. T., & Torres, F. (2011). Hands-on experiences of undergraduate students in automatics and robotics using a virtual and remote laboratory. Computers & Education, 57(4), 2451–2461.
Jeong, J.-S., Park, C., Kim, M., Oh, W.-K., & Yoo, K.-H. (2011). Development of a 3D virtual laboratory with motion sensor for physics education. In T.-H. Kim (Ed.), Proceedings of ubiquitous computing and multimedia applications: Second international conference (UCMA 2011), part I (pp. 253–262). Berlin Heidelberg: Springer-Verlag.
Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 horizon report. TX: The New Media Consortium.
Jonassen, D. H. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional design theories and models: Their current state of the art (pp. 215–239). Mahwah, NJ: Lawrence Erlbaum Associates.
Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., & Tachibana, K. (2000). Virtual object manipulation on a table-top AR environment. In Proceedings of international symposium on augmented reality ISAR 2000 (pp. 111–119).
Kaufmann, H., Schmalstieg, D., & Wagner, M. (2000). Construct3D: a virtual reality application for mathematics and geometry education. Education and Information Technologies, 5(4), 263–276.
Klopfer, E. (2008). Augmented learning. Cambridge, MA: MIT Press.
Klopfer, E., & Squire, K. (2008). Environmental detectives – the development of an augmented reality platform for environmental simulations. Educational Technology Research and Development, 56(2), 203–228.
Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. NJ: Prentice Hall.
Landowska, A., & Kaczmarek, J. (2005). Educational resources as digital products. In K. Bauknecht (Ed.), Proceedings of e-commerce and web technologies: Sixth international conference EC-Web 2005, Copenhagen (pp. 228–237). Berlin Heidelberg: Springer-Verlag.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. NY: Cambridge University Press.
Liaw, S.-S., & Huang, H.-M. (2003). An investigation of user attitudes toward search engines as an information retrieval tool. Computers in Human Behavior, 19(6), 751–765.
Lin, F.-H., & Wu, J.-H. (2004). An empirical study of end-user computing acceptance factors in small and medium enterprises in Taiwan: analyzed by structural equation modeling. Journal of Computer Information Systems, 44(3), 98–108.
Liu, W., Cheok, A. D., Mei-Ling, C. L., & Theng, Y.-L. (2007). Mixed reality classroom: learning from entertainment. In K. K. W. Wong (Ed.), Proceedings of the second international conference on digital interactive media in entertainment and arts (DIMEA'07) (pp. 65–72). NY: ACM Press.
Luhmann, T., Robson, S., Kyle, S., & Harley, I. (2006). Close range photogrammetry: Principles, techniques and applications. NJ: John Wiley & Sons, Inc.
Marshall, H. H. (1996). Implications of differentiating and understanding constructivist approaches. Educational Psychologist, 31(3/4), 235–240.
Martín-Gutiérrez, J., Saorín, J. L., Contero, M., Alcañiz, M., Pérez-López, D. C., & Ortega, M. (2010). Design and validation of an augmented book for spatial abilities development in engineering students. Computers & Graphics, 34(1), 77–91.
Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, E77-D(12), 1321–1329.
Moon, J.-W., & Kim, Y.-G. (2001). Extending the TAM for the world wide web context. Information and Management, 38(4), 217–230.
Parker, J. R. (1997). Algorithms for image processing and computer vision. Indianapolis, IN: Wiley Publishing, Inc.
Piaget, J. (1973). To understand is to invent: The future of education. NY: Grossman Publishers.
Roussou, M. (2004). Learning by doing and learning through play: an exploration of interactivity in virtual environments for children. ACM Computers in Entertainment, 2(1), 1–23. NY: ACM Press.
Schank, R. C., Berman, T. R., & Macpherson, K. A. (1999). Learning by doing. In C. M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory, Vol. II (pp. 161–181). Mahwah, NJ: Lawrence Erlbaum Associates.
Squire, K. D., & Jan, M. (2007). Mad City Mystery: developing scientific argumentation skills with a place-based augmented reality game on handheld computers. Journal of Science Education and Technology, 16(1), 5–29.
Steuer, J. (1992). Defining virtual reality: dimensions determining telepresence. Journal of Communication, 42(4), 73–93.
Sun, H.-M., & Cheng, W.-L. (2009). The input-interface of Webcam applied in 3D virtual reality systems. Computers & Education, 53(4), 1231–1240.
Teo, T. (2009). Modelling technology acceptance in education: a study of pre-service teachers. Computers & Education, 52(1), 302–312.
Teo, T. S. H., Lim, V. K. G., & Lai, R. Y. C. (1999). Intrinsic and extrinsic motivation in Internet usage. OMEGA: International Journal of Management Science, 27(1), 25–37.
Teo, T., & Noyes, J. (2011). An assessment of the influence of perceived enjoyment and attitude on the intention to use technology among pre-service teachers: a structural equation modeling approach. Computers & Education, 57(2), 1645–1653.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Walczak, K., & Cellary, W. (2003). X-VRML for advanced virtual reality applications. IEEE Computer, 36(3), 89–92.
Walczak, K., & Wojciechowski, R. (2005). Dynamic creation of interactive mixed reality presentations. In Y. Chrysanthou, & R. Darken (Eds.), Proceedings of ACM symposium on virtual reality software and technology (VRST 2005) (pp. 167–176). NY: ACM Press.
Walczak, K., Wojciechowski, R., & Cellary, W. (2006). Dynamic interactive VR network services for education. In Proceedings of ACM symposium on virtual reality software and technology (VRST 2006) (pp. 277–286). NY: ACM Press.
Web3D Consortium. (2011). X3D specification website. Accessed 19.10.12. http://www.web3d.org/x3d/specifications/.
Wilson, B. G. (1996). Constructivist learning environments: Case studies in instructional design. Englewood Cliffs, NJ: Educational Technology Publications.
Wojciechowski, R. (2012). Modeling interactive augmented reality environments. In W. Cellary, & K. Walczak (Eds.), Interactive 3D multimedia content: Models for creation, management, search and presentation (pp. 137–170). London: Springer.
Wojciechowski, R., Walczak, K., & Cellary, W. (2005). Mixed reality for interactive learning of cultural heritage. In S. Richir, & B. Taravel (Eds.), Proceedings of the first international VR-learning seminar, in conjunction with the 7th international conference on virtual reality, VRIC Laval Virtual 2005 (pp. 95–99).
Wojciechowski, R., Walczak, K., White, M., & Cellary, W. (2004). Building virtual and augmented reality museum exhibitions. In Proceedings of the 9th international conference on 3D web technology (Web3D 2004) (pp. 135–144). NY: ACM Press.
Wu, J. H., Chen, Y. C., & Lin, L. M. (2007). Empirical evaluation of the revised end user computing acceptance model. Computers in Human Behavior, 23(1), 162–174.
Yang, Y.-T. C. (2012). Building virtual cities, inspiring intelligent citizens: digital games for developing students' problem solving and learning motivation. Computers & Education, 59(2), 365–377.
Yuen, A., & Ma, W. (2002). Gender differences in teacher computer acceptance. Journal of Technology and Teacher Education, 10(3), 365–382.
