
Computers & Education 114 (2017) 286–297


Accessing online learning material: Quantitative behavior patterns and their effects on motivation and learning performance

Liang-Yi Li a, *, Chin-Chung Tsai b

a Research Center for Science and Technology for Learning, National Central University, Taoyuan, Taiwan
b Program of Learning Sciences, National Taiwan Normal University, Taipei, Taiwan

Article history: Received 15 November 2016; Received in revised form 12 July 2017; Accepted 13 July 2017; Available online 15 July 2017

Keywords: Distributed learning environments; Media in education; Post-secondary education; Teaching/learning strategies

Abstract

Accessing learning materials, that is, lecture slides, video lectures, shared assignments, and forum messages, is the most frequently performed online learning activity. However, students with different purposes, motivations, and preferences may exhibit different behaviors when accessing these materials. These different behaviors may further affect their learning performance. This study analyzed system logs recorded by a Learning Management System in which 59 computer science students participated in a blended learning course to learn mobile phone programming. The results revealed several significant findings. First, the students viewed the learning materials related to their classroom lectures (i.e., lecture slides and video lectures) for longer and more often than other learning materials (i.e., shared assignments and posted messages). Second, although the students spent a great deal of time viewing the online learning materials, most did not use annotation tools. Third, students' viewing behaviors showed great variety and were clustered into three behavior patterns: "consistent use students" who intensively used all of the learning materials, "slide intensive use students" who intensively used the lecture slides, and "less use students" who infrequently used any learning material. These different behavior patterns were also associated with their motivation and learning performance. The results are discussed, and several suggestions for teachers, researchers, and system designers are proposed.

© 2017 Published by Elsevier Ltd.

1. Introduction

Learning management systems (LMSs), such as Blackboard and Moodle, have been widely used in companies, universities,
and educational institutions. They provide several advantages for teaching and learning. First, LMSs support teachers in
conducting a variety of in- and after-classroom activities such as reading learning materials, asynchronous discussion,
quizzes, and self-/peer-assessment. Second, students can freely select the course materials and control their learning pace and
paths. This freedom of selection and control can increase students' motivation (Kopcha & Sullivan, 2008) and provide
them with more learning opportunities. Finally, students’ learning tracks are recorded in system logs. These logs can be
analyzed to generate useful information for teachers to improve their instructional design.

* Corresponding author. No.300, Jhongda Rd., Jhongli City, Taoyuan County 32001, Taiwan.
E-mail addresses: lihenry12345@gmail.com (L.-Y. Li), tsaicc@ntnu.edu.tw (C.-C. Tsai).

http://dx.doi.org/10.1016/j.compedu.2017.07.007

With the wide adoption of LMSs in the recent decade, many students' learning tracks have been recorded and accumulated
in system logs. Researchers have analyzed these logs using statistical and data mining techniques for understanding students'
engagement levels of online participation (Henrie, Halverson, & Graham, 2015), for discovering students' behavior patterns of
online participation (Lust, Elen, & Clarebout, 2013a,b; Lust, Vandewaetere, Ceulemans, Elen, & Clarebout, 2011), and for
exploring the relationships between online participation behaviors and learning performance (Cheng & Chau, 2016; Graf, Liu,
& Kinshuk, 2010). The results of these studies have revealed that students’ behaviors of accessing LMS tools vary greatly, and
the variations can significantly affect their learning performance (Lust, Collazo, Elen, & Clarebout, 2012).
Accessing online learning materials is one of the most frequently performed learning activities on LMSs (Lust, Elen, &
Clarebout, 2013a,b; Lust et al., 2012). In general, teachers and students publish and create several kinds of online learning
materials. These materials provide different advantages for learning. For example, lecture slides can provide students with an
overview of the lecture contents and facilitate students' note-taking (Worthington & Levasseur, 2015), video lectures can let
students review difficult concepts and prepare for examinations (Kay, 2012), while peers’ assignments and messages posted
in discussion forums are important resources for self-reflection (Cho & Cho, 2011; Dennen, 2008). Students with different
purposes, preferences, and motivations may demonstrate different engagement levels and behavior patterns when accessing
these learning materials. These different engagement levels and behavior patterns may in turn affect their learning perfor-
mance. Therefore, understanding how students access different kinds of learning materials and how their behaviors in
accessing these materials affect their learning performance may give teachers useful information to improve their instruc-
tional design and provide insights for content and instructional designers to develop personalized learning supports.

2. Analyses of LMS participation behaviors

2.1. LMS learning material use

Most previous studies analyzed discussion activities (Canal, Ghislandi, & Micciolo, 2015; Xie, Yu, & Bradshaw, 2014; Zheng & Warschauer, 2015) or overall LMS online participation (Cerezo, Sanchez-Santillan, Paule-Ruiz, & Nunez, 2016; Lust et al., 2013a,b; You, 2016). In our search, we found very few studies focusing on the behaviors of accessing online learning materials. The most closely related study, by Heffner and Cohen (2005), examined the relationships between the behaviors of viewing online course materials (e.g., number of syllabus views, number of course information views, and number of instructor information views), individual differences (gender), and final course grade. They found that the number of times course materials were accessed had a positive relationship with the final course grade, and that female students accessed course materials more frequently than male students.

2.2. Overall LMS tool use

Studies which analyze overall LMS tool use can provide useful information for understanding how students access online
learning materials and how the different accessing behaviors affect learning performance. Several studies have used
descriptive statistics to report their findings. For example, Lust et al. (2013a,b) found that basic information tools (i.e., number
of course outlines viewed and number of Web-lectures viewed) were the most frequently accessed tools. You (2016) found
that students visited instructional videos on a regular basis, and that their video viewing time varied widely.
Studies have also used correlation and regression analyses to explore the relationships between online participation
behaviors and learning performance. For example, You (2016) used six variables (i.e., regular study score, total time of viewing
instructional videos, number of logins, late submission score, number of times replying to the course information, and
number of messages created) to explore the relationship between these variables and the final course score and to predict
students’ course achievement. The results revealed that total time of viewing instructional videos demonstrated positive
relations to number of logins and number of messages created. They also found that students' regular study score, late
submission score, number of logins, and number of replies to the course information significantly predicted their course
achievement. However, total time viewing instructional videos did not significantly predict course achievement.
Cerezo et al. (2016) examined the relationships between six variables (i.e., time spent on quizzes, time spent viewing
theoretical content, time spent on forums, number of words written on forums, number of other relevant actions, and number
of days that students took to hand in the task) and the final examination score. They found that time spent on viewing
theoretical content was significantly positively related to time spent on forums and to number of words in forum posts, but
was not related to the final examination score. These results revealed that time spent accessing lecture slides and video
lectures is related to the engagement levels of other learning activities. However, time spent accessing these learning ma-
terials was unrelated to learning performance.

2.3. Students’ behavior patterns in LMS

In addition to using statistical methods, several studies have used cluster analysis to classify students into distinct groups
and to examine the effects of the groups on learning performance. For example, Lust et al. (2011) used cluster analysis with
the tool-use variables of online participation (i.e., number of homepage hits, number of messages read and posted, number of
quiz attempts, course material outline use, number of accessed Web links, number of accessed Web lectures, number of
downloads of learning support, number of posts made, number of posts read, and time spent viewing Web lectures) and face-to-face time (e.g., amount of participation in support sessions) and generated three tool-use patterns: no-users (N = 36), that is, those who did not use face-to-face tools and who used online tools with low or medium frequency and for a short time; intensive users (N = 67), that is, those who used the available face-to-face tools and accessed the online tools with a high frequency; and incoherent users (N = 53), that is, those who used the available face-to-face tools with a medium frequency, and selectively accessed the online tools. They found that the intensive users used Web lectures more frequently and intensively than the incoherent users and no-users.
Cerezo et al. (2016) used six variables, that is, time spent on quizzes, time spent viewing theoretical content, time spent on
forums, number of words written on forums, number of other relevant actions, and number of days that students took to hand
in the task, to classify students into four clusters: the non-task oriented and low procrastination group (N = 42), that is, those who did not delay completing the quizzes and allocated a small amount of time to working on the quizzes and a large amount to viewing the theoretical contents; the task oriented and low procrastination group (N = 41), that is, those who did not delay completing the quizzes and allocated a large amount of time to working on the quizzes and forums and a small amount to viewing the theoretical contents; the task oriented and medium procrastination group (N = 27), that is, those who sometimes delayed completing the quizzes and allocated a large amount of time to working on quizzes but a small amount to viewing the theoretical contents and to participating in forums; and the non-task oriented and high procrastination group (N = 30), that
is, those who frequently delayed completing the quizzes and allocated a small amount of time to working on the quizzes but a
large amount to viewing the theoretical contents and to participating in the forums. They found that the students who spent
more time on the quizzes (i.e., the task oriented groups) performed better than those who spent less time on them (i.e., the
non-task oriented groups). The students who handed in the tasks later (i.e., the medium and high procrastination groups)
received lower scores.

2.4. Motivation and online participation

Learning on LMSs is a self-regulated process in which learners can freely select the course materials and control their
learning pace and paths. Motivation is crucial for effective self-regulation (Cho & Kim, 2013; de Barba, Kennedy & Ainley,
2016). It affects the cognitive and metacognitive strategies used during the self-regulated process (Pintrich & Degroot,
1990). The use of these strategies can further influence online participation behaviors.
Studies have found that motivation influences and is influenced by students' online participation. For example, de Barba,
Kennedy and Ainley (2016) examined the relationships between motivation, online participation behaviors, and learning
performance. They found that the impact of value beliefs on final grade was mediated by video hits and situational interest,
while the mastery-approach impact on final grade was mediated by quiz attempts and situational interest. They concluded
that motivation influenced and was influenced by the students’ online participation; and the outcomes were indirectly
influenced through their participation. Giesbers, Rienties, Tempelaar, and Gijselaers (2013) investigated the relationship
between student motivation, participation, and learning performance. They found that the students participating in four
web-videoconferences had significantly higher scores on the intrinsic motivation subscale than the students who did not
participate or who participated once or twice.
In addition to the relationships between motivation and online participation behaviors, different motivational factors can
have different effects on online participation behaviors. For example, de Barba, Kennedy and Ainley (2016) found that value
beliefs were positively correlated to video hits and mastery approach was positively correlated to quiz attempts. Therefore,
value beliefs and mastery approach contributed to the students’ participation with video lectures and quizzes in different
ways.

2.5. Research questions

In sum, most previous studies used overall LMS tool use to classify students into different behavior pattern groups (Cerezo et al., 2016; Lust et al., 2011). In this study, by contrast, the time spent accessing four kinds of online learning materials was used. This
can give us a clearer understanding of the behavioral differences in viewing these learning materials and of the relationships
between the behavioral differences and learning performance.
In addition, motivation is a crucial factor that is related to online participation behaviors (de Barba, Kennedy & Ainley,
2016). In our search, we found only one study that examined the relationships between behavior patterns and learning motivation (Lust et al., 2013a,b). However, that study only investigated students' goal orientation and self-efficacy. Our study
examined the effects of different behavior pattern groups on multiple motivation subscales (i.e., intrinsic goal orientation,
extrinsic goal orientation, task value, control belief of learning, self-efficacy, and test anxiety). This can give us a more in-
depth understanding of the relationships between behavior patterns and different motivation factors.
Based on the above discussion, this study attempted to answer the following four questions:

1. What online learning materials did the students spend more time on?
2. What kinds of behavioral patterns exist when students viewed online learning materials?
3. How did the different behavior patterns relate to students' learning performance?
4. How did the different behavior patterns relate to students' learning motivation?

3. Method

3.1. Participants

A total of 59 third-year computer science undergraduate students (44 males and 15 females) participated in this study.
They were enrolled in a blended course (classroom-based course with online components) called “mobile phone program-
ming” to learn Windows Phone (WP) programming. They attended face-to-face classes for four hours each week in a
computerized classroom, in which each student used one computer with internet access. In the classroom, the course
teacher's lecture involved three steps. First, he introduced a WP programming concept with lecture slides. Next, he
demonstrated a programming example related to the concept. Finally, he assigned a programming assignment related to this
concept for all students and gave them 10–20 min to complete it. When doing their assignments, they could search for
relevant resources and refer to the learning materials on the Web for their assignments. If a student could not finish them in
the classroom, he/she could do them after class.
In addition to lecturing in the classroom, the course teacher published video lectures, lecture slides, and programming
homework assignments on the 21CS LMS (http://www.21cs.tw). The students could learn from the published materials,
submit their assignments, and post messages on the LMS after class.

3.2. Learning management system

The 21CS LMS was developed in the Java programming language and is hosted on an Apache Tomcat server (http://tomcat.apache.org/). To set up the system on the server, a WAR file containing all of the system's code is first uploaded to the server. Next, a user links to the setup page, where he/she must create a user with the role "administrator" and configure the database server. Finally, the user clicks the setup button, and the server automatically completes the setup process.
The LMS provides seven modules for teachers to manage instructional activities, to enhance students' after-classroom
interaction, and to scaffold students’ learning. These modules are user management, course management, assignment
management, discussion forum, assignment sharing, lecture slide reading, and video annotation modules. A teacher can
create, edit, delete, and query user accounts, courses, and assignments in the user, course, and assignment modules
respectively. A student can post or reply to a question using the discussion forum module, can access shared assignments
using the assignment sharing module, can read, highlight text, and comment on text in the lecture slides using the lecture
slide reading module, and can view the published video lectures and comment on any frame of a video using the video
annotation module.
All students' clicking events on the system are recorded in system logs. Each recorded event comprises four pieces of information: who caused the event, what the event was, when the event occurred, and on which web page the event occurred.
In general, LMSs such as Moodle only record the operation of opening a web page (Kovanovic, Gasevic, Joksimovic, Hatala, & Adesope, 2015). The time spent viewing a web page is then the sum of the time that the web page is active in the browser. This measurement cannot detect the time during which the student's mouse cursor is focused on other windows. However, the LMS used in this study not only recorded the operation of opening a web page, but also recorded the operations of the mouse focusing in and out of the web page. By doing so, we could obtain more accurate measurements of the time spent on a web page, based on the user switching his/her mouse focus.
Although we used the difference between the mouse focusing in on a page and focusing out of the page to represent the time a student spent viewing learning material, we cannot know whether the student had left his/her seat. Therefore, following previous studies, we used a time-oriented heuristic with a threshold of 30 min (Khan & Pardo, 2016; Kovanovic, Gasevic, Dawson et al., 2015). If a student's mouse cursor stayed on a Web page longer than the threshold, the time spent was replaced by the threshold. For example, if a student's mouse cursor stayed on a Web page for 50 min, the time spent on that page was recorded as 30 min.
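A minimal Python sketch of this time-on-page calculation is given below. The log record format (event type plus timestamp per focus change) and the function name are hypothetical; this only illustrates the 30-min thresholding heuristic, not the actual log-processing code used in the study.

```python
from datetime import datetime, timedelta

THRESHOLD = timedelta(minutes=30)  # cap for a single focus-in/focus-out interval

def viewing_time(events):
    """Sum focus_in -> focus_out intervals, capping each interval at 30 minutes.

    `events` is a chronologically ordered list of (event_type, timestamp)
    tuples for one student on one learning material (hypothetical format).
    """
    total = timedelta()
    focus_start = None
    for event_type, ts in events:
        if event_type == "focus_in":
            focus_start = ts
        elif event_type == "focus_out" and focus_start is not None:
            total += min(ts - focus_start, THRESHOLD)
            focus_start = None
    return total.total_seconds()

# Example: a 50-minute focus interval is counted as 30 minutes (1800 seconds).
log = [("focus_in", datetime(2017, 3, 1, 20, 0)),
       ("focus_out", datetime(2017, 3, 1, 20, 50))]
print(viewing_time(log))  # 1800.0
```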

3.3. Instruments

In this study, the students could learn from different kinds of learning materials, namely lecture slides, video lectures,
shared assignments, and messages posted in the forum. Table 1 lists the primary content involved in these materials and the
tools that supported the students in accessing these materials. Before the final examination, 7 sets of lecture slides (108 pages in total), 20 video lectures (2 h 25 min 34 s in total), 29 shared assignments, and 25 discussion threads had been published.
The learning performance of the students was evaluated by homework assignments and a final examination. All home-
work assignments involved completing a program. They were designed by the course teacher and two teaching assistants
who were Master's students majoring in Information Management. There were two kinds of homework assignments, namely
basic programming assignments that involved modifying the programming syntax of a program demonstrated by the course
teacher in the classroom lecture, and advanced programming assignments that involved integrating different concepts to
complete a program.

Table 1
Learning materials published in the LMS.

Learning materials | Primary content | Supporting tools
Lecture slides | Explanations of the concepts of Windows Phone programming, and programming code with executed results | Highlighting text, commenting on text, and posting questions
Video lectures | Screen capture of the teacher's classroom lecture | Commenting and questioning on any frame of a video
Shared assignments | Programming code | Highlighting text, commenting on text, and posting questions
Posted messages | Questions and answers related to how to implement a function and to compiling errors | A notification page where students can view how many posts they have not viewed

Each homework assignment was evaluated by the course teacher and the teaching assistants with two scores: a correct score and a time score. For the basic programming assignments, if a submission could be correctly executed, it was given a correct score of 100. If it could not be correctly executed, the teacher and the teaching assistants examined the programming code and assigned the correct score based on the percentage of the programming syntax that was correctly modified. In addition, if the students submitted their solution within one day of the assignment being published, within one week of the assignment being published, within two weeks of the assignment being published, one week before the final examination, or before the final examination, they were awarded a time score of 1, 0.9, 0.8, 0.7, or 0.6, respectively. The score of a basic programming assignment equals the correct score multiplied by the time score.
Regarding the advanced programming assignments, the teacher and teaching assistants examined each submission and
marked the correct score based on the following criteria: (1) whether the submitted program could be correctly executed; (2)
how many requirements of the assignment were satisfied by the submitted program; (3) whether the interface of the sub-
mitted program was user-friendly; and (4) whether the submitted program was easily operated. In addition, each advanced
programming assignment was given one week for completion. If the students submitted their solutions within four days of
the assignment being published, they were awarded a time score of 1. The teacher then selected two to five submissions for
each advanced programming assignment and shared them using the assignment sharing module. The students who had not
completed the assignment could then refer to the shared submissions. If students submitted their solution within three days
after an assignment was shared, they were awarded a time score of 0.9. The score of an advanced programming assignment
equals the correct score multiplied by the time score.
Before the final examination, 33 basic programming assignments and 11 advanced programming assignments were published. The homework score was the mean of two values: the average score of the basic programming assignments and the average score of the advanced programming assignments.
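As a concrete illustration of this scoring scheme, the sketch below computes a basic-assignment score and the overall homework score. The function names and the reading of the day thresholds are a hypothetical reconstruction of the schedule described above, not the teacher's actual grading script.

```python
def basic_assignment_score(correct_score, days_after_publication, days_before_exam):
    """Correct score multiplied by a time score (illustrative reconstruction)."""
    if days_after_publication <= 1:
        time_score = 1.0
    elif days_after_publication <= 7:
        time_score = 0.9
    elif days_after_publication <= 14:
        time_score = 0.8
    elif days_before_exam >= 7:   # submitted at least one week before the exam
        time_score = 0.7
    else:                         # submitted later, but still before the exam
        time_score = 0.6
    return correct_score * time_score

def homework_score(basic_scores, advanced_scores):
    """Mean of the basic-assignment average and the advanced-assignment average."""
    basic_avg = sum(basic_scores) / len(basic_scores)
    advanced_avg = sum(advanced_scores) / len(advanced_scores)
    return (basic_avg + advanced_avg) / 2

# A fully correct basic assignment submitted five days after publication:
print(basic_assignment_score(100, 5, 30))          # 90.0
print(homework_score([90.0, 100.0], [80.0, 72.0])) # 85.5
```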
The final examination designed by the course teacher consisted of two parts: a paper-based test and a programming test.
The paper-based test primarily evaluated the students' cognitive levels of remembering and understanding, for example, the
syntax of opening a local file and the procedure of setting the permission. The students were required to answer 16 true/false
(each correct answer = 0.25 points), 6 single-selection (each correct answer = 1 point), and 10 open-ended (each correct answer = 1 point) questions. The total score is 20. In the programming test, the students had to complete four programs. The
test primarily evaluated students’ cognitive levels of applying, analyzing, and integrating. The students were awarded 20
points for each correctly completed program. The total score of the programming test is 80. The final examination score equals
the sum of the scores of the paper-based test and the programming test. Table 2 is a summary of the measurements of
learning performance.
The level of students' motivation was measured by the Motivated Strategies for Learning Questionnaire (MSLQ) (Pintrich,
Smith, Garcia, & McKeachie, 1993). The questionnaire, consisting of 31 items in 6 subscales, namely intrinsic goal orientation (4 items), extrinsic goal orientation (4 items), task value (6 items), control of learning beliefs (4 items), self-efficacy (8 items), and test anxiety (5 items), was translated into Chinese. The items were scored on a five-point Likert scale ranging from (5) strongly agree to (1) strongly disagree. A student's score on a subscale is the average score of the items in that subscale. In our study, the Cronbach's alpha reliability of the subscales was 0.78, 0.87, 0.89, 0.84, 0.90, and 0.76, respectively. The global reliability of the questionnaire was 0.92.
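For illustration only, the subscale scoring (item means) and Cronbach's alpha reported here can be reproduced with standard formulas, as in the sketch below. The response matrix is fabricated for the example, and the original reliability analysis was presumably run in standard statistical software rather than this code.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_students x n_items) matrix of Likert responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1 - item_vars / total_var)

def subscale_score(items):
    """Each student's subscale score is the mean of the items in the subscale."""
    return np.asarray(items, dtype=float).mean(axis=1)

# Four hypothetical intrinsic-goal-orientation items answered by three students:
responses = [[5, 4, 5, 4],
             [3, 3, 4, 3],
             [4, 5, 4, 5]]
print(subscale_score(responses))   # per-student subscale scores
print(cronbach_alpha(responses))   # reliability of the hypothetical subscale
```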

3.4. Procedure

This study lasted for eight weeks. Each week the teacher taught the slide content and used screen capture software to
record what he taught in the classroom. The lecture slides were published one week before the slide content was taught; the
video lectures were immediately published on the system after the class. Each week, the teacher assigned 3–9 basic programming assignments and 1–3 advanced programming assignments for the students to practice after class.
The final examination was conducted in the eighth week. The paper-based test took one hour. The students could not refer
to any resources while taking the test. After the paper-based test, the students had to complete four programs in the pro-
gramming test within two hours. The students could refer to the learning materials published on the LMS to complete the
programs.

Table 2
Summary of the measurements of learning performance.

Learning performance | Assignment/Test | Tasks | Measurement
Homework score (average of the scores of the basic and advanced programming assignments) | Basic programming assignment | Modifying the programming syntax of a program demonstrated by the course teacher in the classroom lectures | Correct score × Time score
 | Advanced programming assignment | Integrating different concepts to complete a program | Correct score × Time score
Final examination score (sum of the scores of the paper-based test and the programming test) | Paper-based test | Answering 16 true/false, 6 single-selection, and 10 open-ended questions | Each correct true/false answer = 0.25 points; each correct single-selection answer = 1 point; each correct open-ended answer = 1 point; total score = 20
 | Programming test | Completing four programs | Each correctly completed program = 20 points; total score = 80

After the final examination, the MSLQ was administered to all students. The students were required to answer the questionnaire on the 21CS LMS. Before answering the questionnaire, the students had to log into the system; we could therefore identify each student's answers.

3.5. Data analysis

Three kinds of data were collected, namely (1) system logs, (2) the results of the MSLQ questionnaire, and (3) the students' scores on the final examination and homework assignments. To analyze the data, we first used hierarchical clustering analysis to determine the number of clusters based on the dendrogram. Next, a k-means clustering analysis was conducted, which generated three clusters.
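A sketch of this two-step clustering procedure using scipy and scikit-learn is shown below. The data matrix here is random placeholder data standing in for the 59 students' recoded viewing-time variables, and the original analysis was performed in SPSS, so this is only an illustrative equivalent.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from sklearn.cluster import KMeans

# X: one row per student, one column per recoded viewing-time variable
# (lecture slides, video lectures, shared assignments, posted messages).
rng = np.random.default_rng(0)
X = rng.integers(1, 4, size=(59, 4)).astype(float)  # placeholder data, values 1-3

# Step 1: hierarchical clustering; the dendrogram is inspected to choose k.
tree = linkage(X, method="ward")
dendrogram(tree, no_plot=True)  # set no_plot=False to actually draw the dendrogram

# Step 2: k-means with the chosen number of clusters (k = 3 in this study).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)           # cluster assignment for each student
print(kmeans.cluster_centers_)  # per-cluster means of the recoded variables
```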
To compare the online participation behaviors and test scores of the students in the three clusters, group comparison methods were used. If a dependent variable is on an interval scale and satisfies the assumptions of independent observations, normal distribution, and homogeneity of variance, a one-way analysis of variance (ANOVA) is suitable for comparing three independent groups. However, if a dependent variable is on an interval or ordinal scale and satisfies only the assumption of independent observations, a Kruskal-Wallis nonparametric test can be used to determine whether there are statistically significant differences among the three groups on that variable. If a Kruskal-Wallis test is statistically significant, indicating that at least one group differs from another, a post hoc test (e.g., a Mann-Whitney U test) should then be conducted to discover which groups differ from which other groups (Lane et al., 2014; Lopez, Valenzuela, Nussbaum, & Tsai, 2015).
The data were analyzed using SPSS software. Before analysis, the dependent variables were checked for normal distribution and homogeneity of variance. Except for the homework scores, all variables violated the assumption of normality, as assessed by Shapiro-Wilk tests (p < 0.05). To address these violations, and for the sake of consistency, Kruskal-Wallis nonparametric tests were used. Post hoc analyses were performed using Mann-Whitney U tests.
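For illustration, the same Kruskal-Wallis and Mann-Whitney U procedure can be expressed with scipy; the group score arrays below are hypothetical, since the study's tests were run in SPSS.

```python
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

# Hypothetical homework scores for the three clusters.
groups = {
    "cluster1": [82, 79, 85, 90, 76, 88],
    "cluster2": [55, 61, 58, 63, 52, 60],
    "cluster3": [74, 70, 77, 72, 69, 75],
}

# Omnibus test across the three groups.
h, p = kruskal(*groups.values())
print(f"Kruskal-Wallis: H = {h:.3f}, p = {p:.3f}")

# Post hoc pairwise Mann-Whitney U tests, run only if the omnibus test is significant.
if p < 0.05:
    for (name_a, a), (name_b, b) in combinations(groups.items(), 2):
        u, p_pair = mannwhitneyu(a, b, alternative="two-sided")
        print(f"{name_a} vs {name_b}: U = {u:.1f}, p = {p_pair:.3f}")
```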

4. Results

This study aimed to answer four research questions. This section presents the results for answering the four questions in
the same order as the questions posed for the study.

4.1. The time spent accessing online learning material

During the eight-week course, all 59 students viewed lecture slides and video lectures, 55 viewed shared assignments, 56 viewed posted messages, and 44 posted messages in the discussion forum. However, only five students ever highlighted text or commented on text. Table 3 lists the means and standard deviations of the variables related to the behaviors of viewing the online learning materials. As Table 3 indicates, the students spent most of their time viewing lecture slides, followed by video lectures, shared assignments, and posted messages.

4.2. The differences in the students’ viewing behaviors

To classify the students with similar viewing patterns into a homogeneous group, k-means cluster analysis was performed
on the four variables: time spent viewing lecture slides, time spent viewing video lectures, time spent viewing shared assignments,
and time spent viewing posted messages. Before doing the analysis, the four variables were transformed in order to reduce
the bias in the cluster analysis (Lust et al., 2011). The 33.33% lowest, intermediate, and highest time durations were allocated a value of 1, 2, and 3, respectively, indicating low, moderate, and high viewing time.

Table 3
Descriptive statistics of the behavioral variables of accessing online learning materials.

No. | Variable | Mean | S.D.
1 | Total time spent accessing the four kinds of learning materials (seconds) | 16258.90 | 10409.36
2 | Time spent accessing lecture slides (seconds) | 9305.71 | 6348.64
3 | Number of lecture slides accessed | 33.64 | 16.85
4 | Average time spent on each access of lecture slides (No. 2/No. 3) | 302.63 | 182.78
5 | Time spent accessing video lectures (seconds) | 5541.90 | 4684.30
6 | Number of video lectures accessed | 30.19 | 16.89
7 | Average time spent on each access of video lectures (No. 5/No. 6) | 180.50 | 135.02
8 | Time spent accessing shared assignments (seconds) | 1084.83 | 1116.29
9 | Number of shared assignments accessed | 19.90 | 17.09
10 | Average time spent on each access of shared assignments (No. 8/No. 9) | 53.79 | 56.25
11 | Time spent accessing posted messages (seconds) | 326.46 | 588.77
12 | Number of posted messages accessed | 16.25 | 16.19
13 | Average time spent on each access of posted messages (No. 11/No. 12) | 28.85 | 56.61
14 | Number of messages posted | 2.63 | 2.84

As shown in Table 4, three clusters were
identified. The students in cluster 1 (n = 21) spent more time viewing video lectures, shared assignments, and posted messages than the students in the other clusters. The students in cluster 3 (n = 19) spent more time viewing lecture slides than the students in the other clusters. However, the students in cluster 2 (n = 19) spent less time viewing any online learning
materials than the students in the other clusters.
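As an illustration, the tertile recoding described above can be reproduced with pandas as sketched below. The values in the example frame are fabricated, and this is only a sketch of the transformation, not the authors' actual procedure.

```python
import pandas as pd

def recode_tertiles(series):
    """Recode a raw time variable into 1/2/3 for the lowest, middle, and highest thirds."""
    return pd.qcut(series.rank(method="first"), q=3, labels=[1, 2, 3]).astype(int)

# Hypothetical raw viewing times (seconds) for six students.
times = pd.DataFrame({
    "slides":      [1200, 5400, 9800, 15000, 300, 7200],
    "videos":      [800, 4000, 6100, 12000, 150, 5300],
    "assignments": [60, 900, 1500, 2400, 10, 700],
    "messages":    [30, 120, 600, 900, 5, 200],
})
recoded = times.apply(recode_tertiles)  # input to the cluster analysis
print(recoded)
```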
In order to clearly understand the students’ behaviors when accessing these materials among the three clusters, we used
Kruskal-Wallis tests to compare the three clusters in terms of the number of accesses, the average time spent on each access,
and the total time spent on all accesses. As shown in Table 5, the students in the three clusters demonstrated significantly different behaviors when viewing the online learning materials, except for the average time spent on each access for viewing lecture slides (χ²(2, N = 59) = 5.372, p = 0.068) and for viewing posted messages (χ²(2, N = 59) = 3.676, p = 0.159). Pairwise Mann-Whitney U tests revealed statistically significant differences in all comparisons except the following: the total time viewing the four kinds of learning materials (cluster 1 vs. cluster 3), the number of accesses of lecture slides (cluster 1 vs. cluster 3), the average time spent on each access for viewing video lectures (cluster 1 vs. cluster 3), and the average time spent on each access for viewing shared assignments (cluster 1 vs. cluster 3). Although the average time spent on each access for viewing lecture slides only showed a marginally significant difference among the three clusters (χ²(2, N = 59) = 5.372, p = 0.068), a Mann-Whitney U test revealed a significant difference (U = 290.0, z = 2.451, p = 0.014 < 0.05) between cluster 1 and cluster 3. In addition, the total time viewing lecture slides of cluster 1 and cluster 3 also demonstrated a marginally significant difference (U = 271.0, z = 1.937, p = 0.053).
In sum, the above results give us a clearer understanding of the behavioral differences in viewing these learning materials
among the three clusters. Because cluster 2 accessed the materials significantly fewer times and spent a shorter time
accessing slides, videos, shared assignments, and posted messages than the cluster 1 and cluster 3 students, cluster 2 was
labeled as "less use students". In addition, cluster 3 spent a significantly longer average time on each access of the lecture slides, and marginally significantly longer total time viewing them, than cluster 1, while cluster 1 accessed the video lectures, shared assignments, and posted messages significantly more often and for longer than cluster 3. Therefore, cluster 1
was labeled as “consistent use students,” and cluster 3 was labeled as “slide intensive use students”.

4.3. Learning performance

To examine whether the three clusters were different in terms of their learning performance, two Kruskal-Wallis tests
were conducted. The results revealed significant effects of the three clusters on the homework scores (χ²(2, N = 59) = 31.072, p = 0.000) and examination scores (χ²(2, N = 59) = 6.070, p = 0.048). Pairwise Mann-Whitney U tests revealed statistically significant differences in all comparisons except for the examination scores of the "consistent use students" and the "slide intensive use students" (U = 210.0, z = 0.284, p = 0.776). These results indicate that "consistent use students" gained
significantly higher homework scores than “less use students” and “slide intensive use students,” while “slide intensive use
students” gained significantly higher homework scores than “less use students.” Regarding the final examination scores,

Table 4
Cluster analysis of the behaviors of viewing learning materials.

 | Cluster 1 (n = 21) | Cluster 2 (n = 19) | Cluster 3 (n = 19)
Time spent viewing lecture slides | 2.10 | 1.21 | 2.63
Time spent viewing video lectures | 2.62 | 1.26 | 2.00
Time spent viewing shared assignments | 2.71 | 1.26 | 1.89
Time spent viewing posted messages | 2.76 | 1.53 | 1.58

Table 5
Behaviors of accessing learning materials among clusters.

Variable | Consistent use students (cluster 1), Mean (SD) | Less use students (cluster 2), Mean (SD) | Slide intensive use students (cluster 3), Mean (SD) | Kruskal-Wallis test, p | Post hoc tests (Mann-Whitney U test)
Time spent viewing lecture slides (seconds) | 10576.86 (6011.41) | 3984.89 (2025.79) | 13221.58 (6136.87) | 0.000** | cluster1 > cluster2**; cluster3 > cluster2**
Number of times viewing lecture slides | 44.00 (14.47) | 16.58 (9.00) | 36.26 (11.65) | 0.000** | cluster1 > cluster2**; cluster3 > cluster2**
Average time spent on each access of lecture slides | 238.97 (99.11) | 318.08 (240.43) | 357.54 (175.40) | 0.068 | cluster3 > cluster1*
Time spent viewing video lectures (seconds) | 8720.10 (4459.57) | 1916.53 (1749.44) | 5654.53 (4458.09) | 0.000** | cluster1 > cluster2**; cluster1 > cluster3*; cluster3 > cluster2**
Number of times viewing video lectures | 40.05 (14.80) | 18.42 (14.62) | 31.05 (14.20) | 0.000** | cluster1 > cluster2**; cluster1 > cluster3*; cluster3 > cluster2**
Average time spent on each access of video lectures | 239.48 (151.60) | 107.28 (94.19) | 188.54 (120.61) | 0.002** | cluster1 > cluster2**; cluster3 > cluster2*
Time spent viewing shared assignments (seconds) | 1902.86 (788.47) | 255.37 (322.81) | 1101.16 (1306.70) | 0.000** | cluster1 > cluster2**; cluster1 > cluster3**; cluster3 > cluster2**
Number of times viewing shared assignments | 32.71 (18.67) | 7.16 (7.88) | 18.47 (11.27) | 0.000** | cluster1 > cluster2**; cluster1 > cluster3**; cluster3 > cluster2**
Average time spent on each access of shared assignments | 73.18 (66.12) | 32.18 (36.84) | 53.96 (55.22) | 0.006** | cluster1 > cluster2**; cluster3 > cluster2*
Time spent viewing posted messages (seconds) | 588.29 (697.87) | 230.63 (657.78) | 132.89 (118.14) | 0.000** | cluster1 > cluster2**; cluster1 > cluster3**
Number of times viewing posted messages | 27.00 (16.91) | 7.53 (11.82) | 13.11 (12.69) | 0.000** | cluster1 > cluster2**; cluster1 > cluster3**; cluster3 > cluster2*
Average time spent on each access of posted messages | 43.88 (86.56) | 26.81 (37.46) | 14.27 (10.21) | 0.159 | 
Total time spent (seconds) | 21788.10 (9124.46) | 6387.42 (3175.33) | 20019.16 (9648.92) | 0.000** | cluster1 > cluster2**; cluster3 > cluster2**

*p < 0.05, **p < 0.01.

“consistent use students” and “slide intensive use students” gained significantly higher final examination scores than “less
use students” (see Table 6).

4.4. Student motivation

In order to understand whether the three clusters differed in terms of their level of motivation, six Kruskal-Wallis tests
were conducted. The results revealed significant effects of the three clusters on their intrinsic goal orientations (χ²(2, N = 59) = 9.958, p = 0.007), task value (χ²(2, N = 59) = 7.329, p = 0.026), and self-efficacy (χ²(2, N = 59) = 9.590, p = 0.008). Pairwise Mann-Whitney U tests revealed statistically significant differences between "consistent use students" and "less use students" in intrinsic goal orientations (U = 70.5, z = 2.609, p = 0.009), task value (U = 73.5, z = 2.493, p = 0.013), and self-efficacy (U = 63.5, z = 2.819, p = 0.005). These results indicated that the "consistent use students" had higher intrinsic goal orientations, task value, and self-efficacy than the "less use students." In addition, the intrinsic goal orientations (U = 94.5, z = 2.705, p = 0.007) and self-efficacy (U = 111.5, z = 2.191, p = 0.028) of the "consistent use students" and the "slide intensive use students" also demonstrated significant differences. These results indicated that the "consistent use students" had higher intrinsic goal orientations and self-efficacy than the "slide intensive use students" (see Table 7).

5. Discussion

This study analyzed the system logs recorded in a Learning Management System (LMS) in order to answer four questions.
For Question 1, the descriptive statistics revealed several interesting results. First, the students accessed the lecture slides
more often and, on average, spent more time on each access than they did for the other learning materials; the lecture slides
were followed by video lectures, shared assignments, and posted messages. Lecture slides and video lectures are associated
with classroom lectures, while shared assignments and posted messages can be seen as supplementary information. This
result suggests that the students more frequently viewed the learning materials that were directly associated with the
classroom lectures. A previous study reported a similar finding that students used the online tools that were related to
lectures (i.e., course material outlines and the online scaffolding tools) more often than other LMS tools (Lust et al., 2011).

Table 6
Learning performance among clusters.

 | Consistent use students (cluster 1), Mean (SD) | Less use students (cluster 2), Mean (SD) | Slide intensive use students (cluster 3), Mean (SD) | Kruskal-Wallis test, p | Post hoc tests (Mann-Whitney U test)
Homework score | 81.69 (8.84) | 58.96 (9.97) | 73.68 (10.52) | 0.000** | cluster1 > cluster2**; cluster1 > cluster3*; cluster3 > cluster2**
Final examination score | 51.22 (21.31) | 35.22 (26.97) | 52.96 (19.04) | 0.048* | cluster1 > cluster2*; cluster3 > cluster2*

*p < 0.05, **p < 0.01.

Table 7
Motivation variables among clusters.

 | Consistent use students (cluster 1), Mean (SD) | Less use students (cluster 2), Mean (SD) | Slide intensive use students (cluster 3), Mean (SD) | Kruskal-Wallis test, p | Post hoc tests (Mann-Whitney U test)
Intrinsic Goal Orientation | 4.45 (0.33) | 3.91 (0.65) | 3.96 (0.67) | 0.007** | cluster1 > cluster2**; cluster1 > cluster3**
Extrinsic Goal Orientation | 4.48 (0.55) | 3.89 (1.02) | 4.29 (0.58) | 0.191 | 
Task Value | 4.52 (0.41) | 3.90 (0.75) | 4.14 (0.61) | 0.026* | cluster1 > cluster2*
Control of Learning Beliefs | 3.83 (0.50) | 3.75 (0.61) | 3.55 (0.46) | 0.186 | 
Self-Efficacy | 4.20 (0.57) | 3.54 (0.60) | 3.81 (0.55) | 0.008** | cluster1 > cluster2**; cluster1 > cluster3*
Test Anxiety | 3.24 (0.78) | 3.36 (0.89) | 3.53 (0.79) | 0.578 | 

*p < 0.05, **p < 0.01.

Second, although all students viewed lecture slides and spent a considerable amount of time viewing them, only five
students highlighted text or commented on text. There may be three reasons to explain this result. First, the students were not
used to highlighting text or commenting on text. Second, the students were used to highlighting and taking notes on paper
books, but they did not like to use digital annotation tools on digital books. Third, the primary contents of the lecture slides are
programming code (Noyes & Garland, 2006). The students copied and executed the code in the integrated development environment (IDE) (e.g., Visual Studio 2015). They may have taken notes near their code in the IDE. Therefore, they did not
highlight the text or comment on the text on the lecture slides. This finding is in line with the study of Junco and Clem (2015)
who found that most students did not highlight, take notes, or use bookmarks in their digital textbooks. Third, 44 students posted messages in the forum. However, posted messages were accessed fewer times than the other learning materials. One possible reason is that the LMS provided a notification page where students could see how many posts they had not yet viewed. The students therefore did not
need to enter the forum to check whether there were new posts.
Regarding Question 2, in this study we performed cluster analysis based on the variables of time spent viewing lecture
slides, video lectures, shared assignments, and posted messages to classify the students with similar viewing behaviors into
distinct groups. Three clusters, namely “consistent use students,” “less use students,” and “slide intensive use students,” were
generated. "Less use students" accessed all four kinds of learning materials less often and for a shorter amount of time than the "consistent use students" and "slide intensive use students". The "consistent use students" and "slide intensive use students" spent a similar total amount of time viewing the four types of learning material. However, the "consistent use students" accessed the video lectures, shared assignments, and posted messages significantly more often and for significantly more time than the "slide intensive use students," who in turn spent significantly longer on each access of the lecture slides and marginally significantly longer total time viewing them than the "consistent use students."
These results indicated that the students actually demonstrated very different behavior patterns of viewing the online
learning materials.
These results are similar to those of several previous studies that examined overall LMS tool use. First, 32% of the students in this
study were labeled as “less use students” as they infrequently accessed the online learning materials. This is in line with
several studies which found that there was a high percentage of students who did not access or who infrequently accessed the
online learning tools and resources. For example, Kovanovic, Gasevic, Dawson et al. (2015) labeled 27% of the students in their
study as no-users as they had below-average scores for all behavioral variables, except for number of logins. Second, the
current study found that the students who intensively engaged in accessing the online learning materials could be divided into two types: those who intensively used all materials ("consistent use students") and those who selectively and intensively used some materials ("slide intensive use students"). This finding is similar to that of the study by Lust et al. (2011), who labeled 43% of the
students in their study as intensive users who used all LMS tools with a high frequency, while labeling 34% as incoherent users
who accessed the LMS tools with a medium frequency, and more frequently used the tools directly associated with the
classroom lectures.
To answer Question 3, Kruskal-Wallis tests and Mann-Whitney U tests were conducted. The results showed that the homework scores and examination scores of the three clusters were significantly different, suggesting that the behaviors of accessing online learning materials were associated with learning performance. In particular, "consistent use students" and "slide intensive use students" gained significantly higher homework scores and examination scores than the "less use students." This result suggests that the students who invested more time and effort in viewing the online learning materials had better learning performance. It is in line with the study of Lust et al. (2011), who found that intensive users and incoherent users significantly outperformed the no-users in terms of their assignment scores, and Heffner and Cohen (2005), who found that the number of times course materials were accessed had a positive relationship with the final course grade.
In addition, "consistent use students" gained significantly higher homework scores than "slide intensive use students," but their examination scores did not differ significantly. There may be two explanations for these results. First, the students may have needed to refer to different kinds of learning materials to complete their homework assignments; accordingly, the "consistent use students," who had higher homework scores, spent more time accessing video lectures, shared assignments, and posted messages than the "slide intensive use students." Second, there may be a variable that affected not only students' behaviors in accessing learning materials but also their willingness to complete homework assignments. This variable may be intrinsic goal orientation. We explicate this suggestion in the discussion of Question 4 below.
In response to Question 4, it was found that the intrinsic goal orientations, self-efficacy, and task value among the three
clusters demonstrated significant differences. These results suggest that the students' behavior patterns affected or were affected by their motivation. In particular, the "consistent use students" (cluster 1) were significantly higher in intrinsic goal
orientations, self-efficacy, and task value than the “less use students” (cluster 2). On the one hand, the “consistent use stu-
dents” who had higher motivation were willing to invest more time and effort in learning and completing their tasks.
Therefore, they spent more time accessing the online learning materials than the “less use students.” These results are similar
to those revealed by the study of Yi and Hwang (2003) who found that enjoyment, learning goal orientation, and self-efficacy
positively influenced the frequency of accessing a Web-based information system. On the other hand, studies have revealed
that students’ self-efficacy not only amplifies the perceived goal achievement, but is also enhanced by their perceived goal
achievement (David, Song, Hayes, & Fredin, 2007; Waschle, Allgaier, Lachner, Fink, & Nuckles, 2014). The “consistent use
students” spent more time learning from the online learning materials than the “less use students.” Therefore, they should
have been more confident about their learning and performance and thus scored higher on the self-efficacy scale.
In addition, the intrinsic goal orientations and self-efficacy of the “consistent use students” were significantly higher than
those of the “slide intensive use students.” There may be two reasons to explain these results. First, previous studies have
found that students who had higher intrinsic goal orientations were more curious and demonstrated more explorative be-
haviors. For example, Martens, Gulikers, and Bastiaens (2004) found that intrinsically motivated students did not do more,
but liked to do different things when learning in a simulation learning environment. In our study, the “consistent use stu-
dents” who had higher intrinsic goal orientations spent more time viewing video lectures, shared assignments, and posted
messages than the “slide intensive use students,” although they spent the same amount of total time viewing the online
learning materials. Therefore, our results are similar to those of the study of Martens et al. (2004). Second, because the
“consistent use students” may have originally had higher motivation (e.g., intrinsic goal orientations), they were motivated to
complete their homework assignments and thus gained higher homework scores than the “slide intensive use students.” All
of the students could know their homework scores one week after the assignments were published. Their good performance
in the homework assignments probably increased the students’ confidence in their learning and performance. Therefore, the
“consistent use students” reported higher self-efficacy than the “slide intensive use students” by the end of the course. These
two reasons can also explain why the “consistent use students” gained significantly higher homework scores than the “slide
intensive use students.”

6. Conclusion

Viewing online learning materials is the most frequently performed online learning activity. Therefore, understanding how students view different learning materials and how their viewing behaviors affect their learning has been an important
issue. This study analyzed the system logs of 59 students who viewed online learning materials in a programming course. The
results of the analysis revealed several significant findings. First, the students viewed the learning materials related to their
classroom lectures (i.e., lecture slides and video lectures) for longer and more often than other learning materials (i.e., shared
assignments and posted messages). Second, students' viewing behaviors showed great variety and were clustered into three
behavior patterns: “consistent use students” who intensively used all of the learning materials, “slide intensive use students”
who intensively used the lecture slides, and “less use students” who infrequently used any learning material. Third, students'
viewing behaviors were associated with their learning performance. In particular, the “less use students” had lower learning
performance. Finally, the students’ behavior patterns were associated with their motivation. In particular, “consistent use
students” had higher motivation in terms of intrinsic goal orientations and self-efficacy.
Although this study has demonstrated these significant findings, several issues still exist, which should be further
examined in the future. First, one third of the students were still “less use students” in this study. They had lower learning
performance than the other students did. Therefore, future studies could examine why "less use students" demonstrated relatively low learning performance and which approaches could improve it. Second, "consistent use students" and
"slide intensive use students" demonstrated different behavior patterns but had statistically similar examination scores. These results
should be further investigated. Third, although students' motivation can partly explain why their behavior patterns were
different, this is still not the complete answer. For example, why did the "slide intensive use students" and "less use students" demonstrate different behavior patterns when their motivation did not differ significantly? Are the different
behavior patterns associated with individual characteristics, students' tasks, system features, or teachers' instructional ap-
proaches? We suggest that future research examine which factors determine students' behavior patterns when accessing learning materials and incorporate other data, such as interviews or behavioral sequence analyses, to provide more evidence for answering these questions.
Although this study made a number of significant findings, several limitations should be mentioned. First, the sample size
was small (N = 59). Second, the course did not use a textbook, so most of the students learned from the learning materials published on the LMS. The results may therefore not generalize to courses that use a textbook. Further
research can compare the online viewing behaviors of courses with and without a textbook. Third, although we used the
difference between mouse focusing in a page and mouse focusing out of the page to represent the time a student spent
viewing learning material, this measurement is not entirely accurate because the student may have left the computer. In order to detect students' focus more accurately, future research could periodically sample the position of the mouse cursor. If the cursor does not move within a pre-determined time interval (e.g., five minutes), this may indicate that the student has left his/her seat. Fourth, this study used the time spent accessing online learning materials to classify the students into three behavior
pattern groups, and examined the effects of the patterns on learning performance and motivation. The results revealed that
the learning performance of the three groups demonstrated significant differences. However, students' learning performance
is determined not only by the time they spent viewing the materials, but also by how they viewed the materials. It cannot be
determined if a student is actively and effectively learning from a material once he or she has accessed it. Students who access
a learning material for the same amount of time may have different levels of behavioral, cognitive and emotional engagement
(Henrie et al., 2015). Time spent accessing a specific learning material can measure the levels of behavioral engagement.
However, system logs cannot easily measure the levels of cognitive and emotional engagement. Future research can explore
what log data can represent students’ cognitive and emotional engagement. For example, Sinha, Jermann, Li, and Dillenbourg
(2014) used a sequence of click actions on a video player to represent a cognitive strategy. Finally, the students could access
the internet in the classroom. They could also access the online learning materials published on 21CS LMS in the classroom
and after class. Therefore, the results are not suitable to be inferred to whole online courses. Future research can distinguish
between the system logs recorded in the classroom or after class, and can analyze the differences in the in-class and after-
class behaviors.
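To illustrate two of the measurement refinements suggested above, the following minimal Python sketch shows how focus-in/focus-out log events might be converted into per-material viewing times while discarding likely idle periods, and how each event could be labeled as in-class or after-class. The log field names (student_id, material_id, focus_in, focus_out, last_mouse_move), the five-minute idle threshold, and the class time slot are illustrative assumptions, not the actual schema or schedule of the 21CS LMS.

from datetime import datetime, time, timedelta
from collections import defaultdict

IDLE_THRESHOLD = timedelta(minutes=5)               # assumed idle cut-off
CLASS_START, CLASS_END = time(13, 0), time(16, 0)   # assumed weekly lecture slot

def viewing_time(events):
    """Sum focus-in/focus-out intervals per material, split into in-class and after-class time.

    `events` is a list of dicts with hypothetical fields:
    student_id, material_id, focus_in, focus_out, last_mouse_move (all datetimes).
    """
    totals = defaultdict(timedelta)
    for e in events:
        duration = e["focus_out"] - e["focus_in"]
        # If the mouse did not move for longer than the threshold before focus-out,
        # treat the trailing idle period as time away from the computer.
        idle = e["focus_out"] - e["last_mouse_move"]
        if idle > IDLE_THRESHOLD:
            duration -= idle
        if duration <= timedelta(0):
            continue
        # A full implementation would also check the weekday against the course schedule.
        in_class = CLASS_START <= e["focus_in"].time() <= CLASS_END
        period = "in_class" if in_class else "after_class"
        totals[(e["student_id"], e["material_id"], period)] += duration
    return totals

# Minimal usage example with fabricated timestamps.
sample = [{
    "student_id": "s01",
    "material_id": "slides_week3",
    "focus_in": datetime(2016, 3, 14, 14, 0),
    "focus_out": datetime(2016, 3, 14, 14, 40),
    "last_mouse_move": datetime(2016, 3, 14, 14, 10),
}]
print(viewing_time(sample))  # 10 minutes of in-class viewing after removing the idle tail

This kind of aggregation is only a sketch of one possible operationalization; richer engagement measures would still require additional data beyond system logs.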

Acknowledgement

This project was supported by the Ministry of Science and Technology of Taiwan under contract number MOST 104-2511-S-008-016-MY2.

References

de Barba, P. G., Kennedy, G. E., & Ainley, M. D. (2016). The role of students' motivation and participation in predicting performance in a MOOC. Journal of Computer Assisted Learning, 32(3), 218–231. http://dx.doi.org/10.1111/jcal.12130.
Canal, L., Ghislandi, P., & Micciolo, R. (2015). Pattern of accesses over time in an online asynchronous forum and academic achievements. British Journal of Educational Technology, 46(3), 619–628. http://dx.doi.org/10.1111/bjet.12158.
Cerezo, R., Sánchez-Santillán, M., Paule-Ruiz, M. P., & Núñez, J. C. (2016). Students' LMS interaction patterns and their relationship with achievement: A case study in higher education. Computers & Education, 96, 42–54. http://dx.doi.org/10.1016/j.compedu.2016.02.006.
Cheng, G., & Chau, J. (2016). Exploring the relationships between learning styles, online participation, learning achievement and course satisfaction: An empirical study of a blended learning course. British Journal of Educational Technology, 47(2), 257–278. http://dx.doi.org/10.1111/bjet.12243.
Cho, Y. H., & Cho, K. (2011). Peer reviewers learn from giving comments. Instructional Science, 39(5), 629–643. http://dx.doi.org/10.1007/s11251-010-9146-1.
Cho, M. H., & Kim, B. J. (2013). Students' self-regulation for interaction with others in online learning environments. Internet and Higher Education, 17, 69–75. http://dx.doi.org/10.1016/j.iheduc.2012.11.001.
David, P., Song, M., Hayes, A., & Fredin, E. S. (2007). A cyclic model of information seeking in hyperlinked environments: The role of goals, self-efficacy, and intrinsic motivation. International Journal of Human-Computer Studies, 65(2), 170–182. http://dx.doi.org/10.1016/j.ijhcs.2006.09.004.
Dennen, V. P. (2008). Pedagogical lurking: Student engagement in non-posting discussion behavior. Computers in Human Behavior, 24(4), 1624–1633. http://dx.doi.org/10.1016/j.chb.2007.06.003.
Giesbers, B., Rienties, B., Tempelaar, D., & Gijselaers, W. (2013). Investigating the relations between motivation, tool use, participation, and performance in an e-learning course using web-videoconferencing. Computers in Human Behavior, 29(1), 285–292. http://dx.doi.org/10.1016/j.chb.2012.09.005.
Graf, S., Liu, T. C., & Kinshuk. (2010). Analysis of learners' navigational behaviour and their learning styles in an online course. Journal of Computer Assisted Learning, 26(2), 116–131. http://dx.doi.org/10.1111/j.1365-2729.2009.00336.x.
Heffner, M., & Cohen, S. H. (2005). Evaluating student use of web-based course material. Journal of Instructional Psychology, 32(1), 74–82.
Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53. http://dx.doi.org/10.1016/j.compedu.2015.09.005.
Junco, R., & Clem, C. (2015). Predicting course outcomes with digital textbook usage data. Internet and Higher Education, 27, 54–63. http://dx.doi.org/10.1016/j.iheduc.2015.06.001.
Kay, R. H. (2012). Exploring the use of video podcasts in education: A comprehensive review of the literature. Computers in Human Behavior, 28(3), 820–831. http://dx.doi.org/10.1016/j.chb.2012.01.011.
Khan, I., & Pardo, A. (2016). Data2U: Scalable real time student feedback in active learning environments. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, Edinburgh, United Kingdom.
Kopcha, T. J., & Sullivan, H. (2008). Learner preferences and prior knowledge in learner-controlled computer-based instruction. Educational Technology Research and Development, 56(3), 265–286. http://dx.doi.org/10.1007/s11423-007-9058-1.
Kovanović, V., Gašević, D., Dawson, S., Joksimović, S., Baker, R. S., & Hatala, M. (2015). Penetrating the black box of time-on-task estimation. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge, Poughkeepsie, New York.
Kovanović, V., Gašević, D., Joksimović, S., Hatala, M., & Adesope, O. (2015). Analytics of communities of inquiry: Effects of learning technology use on cognitive presence in asynchronous online discussions. Internet and Higher Education, 27, 74–89. http://dx.doi.org/10.1016/j.iheduc.2015.06.002.
Lane, D. M., Scott, D., Hebl, M., Guerra, R., Osherson, D., & Zimmer, H. (2014). Introduction to statistics. Houston: Rice University.
López, X., Valenzuela, J., Nussbaum, M., & Tsai, C. C. (2015). Some recommendations for the reporting of quantitative studies. Computers & Education, 91, 106–110. http://dx.doi.org/10.1016/j.compedu.2015.09.010.
Lust, G., Collazo, N. A. J., Elen, J., & Clarebout, G. (2012). Content management systems: Enriched learning opportunities for all? Computers in Human Behavior, 28(3), 795–808. http://dx.doi.org/10.1016/j.chb.2011.12.009.
Lust, G., Elen, J., & Clarebout, G. (2013a). Regulation of tool-use within a blended course: Student differences and performance effects. Computers & Education, 60(1), 385–395.
Lust, G., Elen, J., & Clarebout, G. (2013b). Students' tool-use within a web enhanced course: Explanatory mechanisms of students' tool-use pattern. Computers in Human Behavior, 29(5), 2013–2021. http://dx.doi.org/10.1016/j.chb.2013.03.014.
Lust, G., Vandewaetere, M., Ceulemans, E., Elen, J., & Clarebout, G. (2011). Tool-use in a blended undergraduate course: In search of user profiles. Computers & Education, 57(3), 2135–2144.
Martens, R. L., Gulikers, J., & Bastiaens, T. (2004). The impact of intrinsic motivation on e-learning in authentic computer tasks. Journal of Computer Assisted Learning, 20(5), 368–376. http://dx.doi.org/10.1111/j.1365-2729.2004.00096.x.
Noyes, J., & Garland, K. (2006). Explaining students' attitudes toward books and computers. Computers in Human Behavior, 22(3), 351–363. http://dx.doi.org/10.1016/j.chb.2004.09.004.
Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33–40. http://dx.doi.org/10.1037/0022-0663.82.1.33.
Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement, 53(3), 801–813. http://dx.doi.org/10.1177/0013164493053003024.
Sinha, T., Jermann, P., Li, N., & Dillenbourg, P. (2014). Your click decides your fate: Inferring information processing and attrition behavior from MOOC video clickstream interactions. arXiv preprint.
Wäschle, K., Allgaier, A., Lachner, A., Fink, S., & Nückles, M. (2014). Procrastination and self-efficacy: Tracing vicious and virtuous circles in self-regulated learning. Learning and Instruction, 29, 103–114. http://dx.doi.org/10.1016/j.learninstruc.2013.09.005.
Worthington, D. L., & Levasseur, D. G. (2015). To provide or not to provide course PowerPoint slides? The impact of instructor-provided slides upon student attendance and performance. Computers & Education, 85, 14–22. http://dx.doi.org/10.1016/j.compedu.2015.02.002.
Xie, K., Yu, C., & Bradshaw, A. C. (2014). Impacts of role assignment and participation in asynchronous discussions in college-level online classes. Internet and Higher Education, 20, 10–19. http://dx.doi.org/10.1016/j.iheduc.2013.09.003.
Yi, M. Y., & Hwang, Y. J. (2003). Predicting the use of web-based information systems: Self-efficacy, enjoyment, learning goal orientation, and the technology acceptance model. International Journal of Human-Computer Studies, 59(4), 431–449. http://dx.doi.org/10.1016/S1071-5819(03)00114-9.
You, J. W. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. Internet and Higher Education, 29, 23–30. http://dx.doi.org/10.1016/j.iheduc.2015.11.003.
Zheng, B. B., & Warschauer, M. (2015). Participation, interaction, and academic achievement in an online discussion environment. Computers & Education, 84, 78–89. http://dx.doi.org/10.1016/j.compedu.2015.01.008.
