
TechTrends

https://doi.org/10.1007/s11528-018-0327-0

ORIGINAL PAPER

Patterns in Faculty Learning Management System Use


Szymon Machajewski · Alana Steffen · Elizabeth Romero Fuerte · Eleanor Rivera

© Association for Educational Communications & Technology 2018

Abstract
Technology in higher education affects teaching and learning excellence while being a significant expense for universities. There is a need for evaluation of current instructional technology use when planning for renewal or adoption of a new learning management system (LMS). This study was conducted to understand the patterns of course tools used by faculty in a commercial LMS used at a large public research university. Course data was extracted from 2562 courses with 98,381 student enrollments during the Fall of 2016. A latent class analysis was conducted to identify the patterns of LMS tool use based on the presence of grade center columns, announcements, assignments, discussion boards, and assessments within each course. Three latent classes of courses were identified and characterized as Holistic tool use (28% of the courses), Complementary tool use (51%), and Content repository (21%). These classes differed in the mean number of students per course and whether courses were exclusively online. These descriptions provided data-based information to share with deans across the university to facilitate discussion of faculty needs for LMS tools and training.

Keywords Course design · Learning management system · Latent analysis · Blackboard Learn · BbStats · Educational technology

Insight into the use of educational technology is an important contribution to strategic decision making at academic institutions. Today, virtually all academic institutions offer a set of technology tools dedicated to instruction. Included in this ecosystem of tools is the learning management system (LMS), a tool which is particularly pervasive in higher education. Indeed, the LMS has become integral to student learning experiences and faculty teaching experiences (Dahlstrom et al. 2014).

Over the past decade, competition in the LMS market space, coupled with demands from faculty and students exposed to cutting-edge Internet systems, forced universities to re-evaluate their use of the LMS. In fact, a technical report by EDUCAUSE predicted that nearly one in five institutions were preparing to replace their LMS in the next three years (Dahlstrom et al. 2014). Many universities have moved from their original LMS to another as a result of such evaluations, while some universities have decided to stay with their current LMS. After 19 years of using the same LMS, IT Governance at the University of Illinois at Chicago (UIC) decided to conduct an evaluation in order to advise executive decision makers on the future direction.

Relevance of the Study

A review of the literature informing LMS evaluations suggested that decisions are being made largely based on feature sets provided by the vendor, matched with a compiled comprehensive feature list of requirements provided by the institution (Berking and Gallagher 2013). The process of gathering tool requirements relies strongly on survey data of self-reported faculty needs and usage, expert opinions, or other anecdotal data points (Abazi-Bexheti et al. 2010; Medina-Flores and Morales-Gamboa 2015; Orfanou et al. 2015). While LMSs have evolved over time, they generally have the same capabilities as in the late 1990s (Kroner 2014), and relying on self-reported data to define "adoption" and "basic and deep use" was considered problematic for UIC.

* Szymon Machajewski, szymonm@uic.edu (corresponding author)
Alana Steffen, steffena@uic.edu
Elizabeth Romero Fuerte, elromero@uic.edu
Eleanor Rivera, sriver25@uic.edu
University of Illinois at Chicago, Chicago, IL, USA

Strategic decision making for the investment in instructional technology is best informed by data available within the institution; however, there are few empirical studies that have looked at faculty instructional use of the LMS using data from the databases supporting the LMS itself. Previous conceptual studies were conducted by Janossy (2008) as well as Dawson et al. (2008), attempting to answer the question of course design by analyzing digital data points. For instance, based on a self-developed scale of functions supported by commonly used LMSs, Janossy (2008) evaluated instructional use of the LMS by looking at the contents of the LMS database course tables without content and then comparing the tables as faculty added course features to the system. The researcher claimed that the "usage model and metric can be applied by any institution using any LMS package to unambiguously and accurately assess its usage across the entire institution" (Janossy 2008). While this might be the case, the scales did not emerge from the data itself but were applied to the data for tool use classification.

A more recent study took an approach of measuring and analyzing time spent on task by students as a way to evaluate the use of the LMS (Whitmer et al. 2016). With this approach, even if faculty designed a course with rich features, the features were not reported unless students used them. Other empirical analyses have been performed in a single-campus study (Fritz 2013) and in a multi-campus study of 927 institutions with 70,000 courses (Whitmer et al. 2016b). From the student use perspective, five course archetypes were identified: supplemental (53%), complementary (24%), social (11%), evaluative (10%), and holistic (2%). The strength of this study was the large sample and the multiple institutions providing data. The potential gap is that it is unknown whether student use archetypes reflect the course design elements provided by faculty. Students can use only what is offered by the faculty, but they may not use all that is offered. Ideally, we would look at both student use and the context of the course archetypes. The intentions of faculty can be understood better not from student activity only, but from the course design itself.

The lack of empirical studies using data from the LMS itself may be due to the fact that, like many large systems, the vast amounts of data generated by the LMS are difficult to extract and analyze, and often require specialized skills for both data extraction and interpretation. Blackboard Inc.'s own data science group admits that common indicators such as activity, enrollment, or other measures can be difficult to understand, and the relationship between student achievement and the use of electronic tools may have a large variation (Whitmer et al. 2016a, b). While proprietary software is available for collecting activity data within Blackboard, as well as through the vendor's consulting services, both methods require additional expenditures. For some institutions, this type of expense may not be a part of the LMS budget.

This research contributes to the literature in two ways. First, it describes the process of extracting meaningful data from the Blackboard databases used by the institution, utilizing a popular free software package from the Open Source Community for Educational Learning Objects and Tools (OSCELOT) called BbStats – Activity Dashboard (Machajewski 2014), which was the second most frequently downloaded project in this repository with 14,490 downloads between 2010 and 2016. Second, it demonstrates the use of latent class analysis, a novel approach that provides data-driven insight into the patterns of LMS tool use.

Data Collection

The data for this research was collected from a Managed Hosted system in the Virginia Blackboard data centers using the BbStats Activity Dashboard. The data was filtered to a specific term, Fall 2016, based on the course id identifier (course_id). The data sample represented all courses at the University of Illinois at Chicago which created an artifact in the Blackboard Learn system. While other courses and organizations were also used during the specific term, only term-bound, academic courses were included. Examples of excluded courses were continuous courses, courses from past terms which may be reusing old course shells, or merged sections with non-standard parent sections.

The Blackboard tools reported on by BbStats included course content items, which can be any artifact created in the course, such as a file, folder, or text item. The use of any artifact qualified a course to be included in the research sample. For each element of course design (i.e., content items, announcements, assessments, assignments, discussion board forums, grade center columns, and blogs/wikis/journals combined), a separate Excel data file was created that included the course_id, a count of that element within the course, and the number of students. These files were imported into SAS 9.4 and merged to create a comprehensive raw data file with one record per course_id, zero-filled for elements with no use in the course. Indicator variables were coded for each design element for presence/absence in the course, in addition to the count variable. Additional variables such as delivery method (online versus in-person) and college/department were associated with each course as well. Design of the course is then defined by the use of the grade center, content items, announcements, assignments, assessments, blogs, wikis, journals, and discussion forums. The resulting file contained a single record for each course, indicators and counts of each design element, the number of students enrolled in the course, the delivery method, and the college.

Statistical Analysis

The comprehensive raw data file was imported into Mplus, version 7.4, to conduct latent class analysis. Latent class analysis (LCA) is a model-based clustering approach that identifies homogeneous subgroups within a heterogeneous population (Lazarsfeld and Henry 1968). Maximum likelihood estimation identifies a latent categorical variable with k levels, or classes, based on observed indicators, with parameters that indicate the size of each class and the probability of each indicator by class. The process involves fitting consecutive models, k = 2, 3, 4, etc., and examining likelihood ratio tests, fit indices (e.g., the Bayesian Information Criterion, BIC), entropy (a measure of class separation), and standardized residual z-scores to determine the best solution (i.e., the number of classes that provides the best fit to the data). Our input for the LCA models was the following design element indicators (present/absent): announcements, assessments, assignments, discussion board forums, and grade center. Content was universally present across the courses (97%) and wikis, blogs, and journals were minimally used (4%); therefore, these elements were excluded due to lack of variability. Once the solution was selected, class assignment was done using the highest posterior probability. We used this class indicator to further profile and describe the subgroups using other variables not used to create the classes, e.g., delivery method, college/department, and number of students.
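The models were fit in Mplus 7.4, which is commercial software. Purely as an illustration of what the estimation involves, the sketch below implements EM for a binary-indicator latent class model, along with the BIC, the relative entropy used for class separation, and the modal (highest posterior probability) class assignment described above. This is not the authors' code, and all names are hypothetical.

```python
# Illustrative EM estimation of a latent class model with binary indicators
# (the study itself used Mplus 7.4). X is an (n courses x 5) 0/1 array of the
# indicators named above.
import numpy as np

def fit_lca(X, k, n_iter=1000, tol=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    n, j = X.shape
    pi = np.full(k, 1.0 / k)                      # class proportions
    theta = rng.uniform(0.25, 0.75, size=(k, j))  # P(indicator present | class)
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: per-course posterior class probabilities, on the log scale.
        logp = np.log(pi) + X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        m = logp.max(axis=1, keepdims=True)
        unnorm = np.exp(logp - m)
        denom = unnorm.sum(axis=1, keepdims=True)
        post = unnorm / denom
        ll = float((m + np.log(denom)).sum())     # observed-data log-likelihood
        if ll - ll_old < tol:
            break
        ll_old = ll
        # M-step: update class sizes and conditional item probabilities.
        pi = post.mean(axis=0)
        theta = (post.T @ X) / post.sum(axis=0)[:, None]
        theta = theta.clip(1e-6, 1 - 1e-6)
    n_params = (k - 1) + k * j                    # free parameters
    bic = -2 * ll + n_params * np.log(n)          # lower is better
    # Relative entropy (1 = perfect class separation), as reported by Mplus.
    entropy = 1 + (post * np.log(post.clip(1e-12))).sum() / (n * np.log(k))
    return pi, theta, post, bic, entropy

# Fit k = 2..5 and keep the lowest-BIC solution, mirroring the study:
# cols = ["announcements_present", "assessments_present", "assignments_present",
#         "discussion_forums_present", "grade_center_columns_present"]
# X = courses[cols].to_numpy()
# fits = {k: fit_lca(X, k) for k in range(2, 6)}
# best_k = min(fits, key=lambda k: fits[k][3])
# lca_class = fits[best_k][2].argmax(axis=1)  # modal class assignment
```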
Results

Sample Description The data set included 2562 courses. On average, there were 38.4 (SD = 56.9) students enrolled in a course, with a median of 22 and a range from 1 to 1304. Table 1 provides an overview of the design elements used by faculty across the courses. There is considerable variation in the use of tools and, when present, the extent of use. For example, 49% of courses used assignments through Blackboard, and the median use was 5 assignments for those courses. Grade center was used in 79% of courses, with a median of 15 grade columns.

Table 1 Course elements for 2562 courses at University of Illinois at Chicago, Fall 2016

Course Element         % of courses using that element   Range    Median (IQR)
Announcements          64                                0–234    9 (16)
Assessments            17                                0–177    6 (11)
Assignments            49                                0–102    5 (8)
Content Items          97                                0–407    39 (45)
Discussion Board       32                                0–191    6 (13)
Grade Center           79                                0–250    15 (20)
Blogs/Wikis/Journals   4                                 0–48     2 (4)

Median and interquartile range (IQR) refer to the count of each course element when present in a course.

Model Results Latent class models with k = 2 to 5 classes were examined and showed that 3 classes provided the best fitting solution (lowest BIC; Lo-Mendell-Rubin adjusted likelihood ratio test showing a significantly better fit than 2 classes) and fairly good separation of classes (entropy = .79). The three subgroups identified and named were Holistic tool use (28%), Complementary tool use (51%), and Content repository (21%). Figure 1 represents the latent class profiles identified by the final model, their relative sizes estimated for the population, and the probability of tool presence within a course.

Class 1: Holistic tools use (100% of courses use grade center, >90% use announcements, and the least used tool is assessments, at 40% of courses in this subgroup). This latent class represents academic courses with the highest likelihood of using all of the examined tools. This group is estimated to encompass 28% of the courses offered.

Class 2: Complementary tools use (100% of courses use grade center, roughly 55% of courses use announcements, and 48% use assignments). This largest identified latent class comprises 51% of courses and takes advantage of the grade center benefits noted above; about half use announcements and assignments, but discussion boards and digital assessments such as quizzes or exams are rarely used.

Class 3: Content repository was the smallest (21%) identified class (roughly 45% of courses use announcements and 19% included discussion forums; other tools were not used). This third latent class notably takes no advantage of the digital grade center, but uses some class-wide collaboration tools such as the discussion forums, instructor-to-student communication tools such as announcements, and the digital repository of content, similar to all groups included in the case study.

Additional results describe the typical number of enrolled students per latent class, the number of tools for each course, and the proportion delivered online or in-person (Table 2). These variables were not directly used to create the latent class solution, but statistically significant differences across the subgroups support the validity of the latent class solution. Finally, the results were grouped for each of the schools or colleges identified by UIC's Student Information System (Fig. 2). At UIC we note that colleges with the largest proportions of Holistic tools courses have more online and blended courses.

Discussion

The purpose of this study was to better understand how our faculty members use the current LMS to inform discussions and decision-making about our teaching technology needs.

Fig. 1 Latent class profiles: estimated probability of tool presence within a course for each class (Holistic Tools Use, 28%; Complementary Tools Use, 51%; Content Repository, 21%)

We used a cost-effective approach; that is, we extracted our own data using BbStats, freely available software. Our analyses revealed three patterns of tool use among our courses that illustrate the extent to which we use the functionality of our current LMS. The Holistic subgroup (28% of courses) had courses with a larger class size, a greater likelihood of online delivery, and, on average, 5 tools used per course (content, grade center, announcements, and possibly assignments, discussions, and/or assessments). The Complementary tool use group was the middling half (51%) of all courses with regard to tool use, class size, and proportion of online courses. These courses averaged 3 tools per course (content, grade center, and announcements or assignments). The Content repository subgroup (21% of courses) had almost no (<2%) online courses, had the smallest class sizes, and averaged about 2 LMS tools per course (i.e., content and announcements or discussion board). There was no use of grade center, assignments, or assessments. These findings allow us to characterize how we use our current LMS, yet do not address faculty satisfaction with the status quo, knowledge of available tools and functions, or faculty desire to learn and incorporate new technologies into their teaching.

The three identified latent class profiles represent the intent behind the instructional design of the courses. They do not provide evidence of student use or satisfaction. When courses are deployed in the Blackboard Learn system in the "Holistic tool use" profile, 28% of courses offer the option for students to access their grades in a digital system, read announcements (through email or in the course), and complete online quizzes or exams. Instructors designing courses within this category might take advantage of student tools such as grade notifications through email and the Blackboard notification framework, the Bb Student mobile app, and browser-based communication to students in the global navigation of the Blackboard Learn system.

The "Complementary tool use" profile represents the largest number of courses in the UIC system, at 51%. The use of the grade center and significant support for announcements and assignments show an intention to use Blackboard Learn to communicate grades, keep students informed of course progress, and accept digital assignments. These are key features of a learning management system, which track academic progress and automate key teaching methods.

The use of the Blackboard Learn system as a "Content repository" makes up only 21% of the system. This latent class profile may represent initial phases of a faculty member digitizing a course experience. It may represent a view of the role of technology in teaching as faculty-to-student communication and content-to-student communication.

Table 2 Number of students and course tools by latent class subgroup

                                   Holistic Tools Use   Complementary Tools Use   Content Repository   Test (p value)
                                   n = 706              n = 1309                  n = 547

                                   Mean (SD)            Mean (SD)                 Mean (SD)            F test
Number of students per course      44.5 (60.1)          41.0 (52.2)               24.5 (60.9)          22.0 (<0.0001)
Number of tools used per course*   5.3 (0.7)            3.2 (0.8)                 1.6 (0.7)            3687.9 (<0.0001)

                                   n (%)                n (%)                     n (%)                χ²
Online                             205 (29.2%)          86 (6.8%)                 9 (1.7%)             284 (<0.0001)
In-Person                          496 (70.8%)          1184 (93.2%)              523 (98.3%)

*Tools include content, grade center, announcements, assignments, discussion board, assessments, and blogs/wikis/journals (combined)
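The group comparisons reported in Table 2 are standard one-way ANOVA and chi-square tests (the study used SAS 9.4); a minimal SciPy sketch, continuing the hypothetical objects from the earlier sketches, might look as follows.

```python
# Sketch of the Table 2 profiling tests; `courses` and `lca_class` continue
# the hypothetical objects from the earlier sketches.
import pandas as pd
from scipy import stats

courses["lca_class"] = lca_class  # class labels are arbitrary until profiled

# One-way ANOVA (F test): students per course across the three classes.
groups = [g["students"].to_numpy() for _, g in courses.groupby("lca_class")]
f_stat, f_p = stats.f_oneway(*groups)

# Chi-square test: delivery method (online vs. in-person) by latent class.
table = pd.crosstab(courses["delivery"], courses["lca_class"])
chi2, chi2_p, dof, expected = stats.chi2_contingency(table)
print(f"F = {f_stat:.1f} (p = {f_p:.4g}); chi2 = {chi2:.0f} (p = {chi2_p:.4g})")
```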

Fig. 2 Distribution of course types (Content Repository, Complementary Tools Use, Holistic Tools Use) by college

Certainly, this intent by faculty does not tap into student-to-student communication, digital collaboration between either faculty and students (assignments) or student-to-student collaboration (groups, discussions), or digital assessment (assessments as online quizzes or exams) in the online system. It may be that these teaching and learning dimensions are facilitated in the classroom or in other systems.

Undoubtedly, the cost-effective use of BbStats to access data provides other institutions using Blackboard with an excellent opportunity to replicate this study, and it may provide an opportunity for faculty collaboration for statistical analysis. The aggregate profiles of courses by school or college often reflect the general nature of the programs and curricular approach. The adoption of specific tools in the online portion of a course should not be correlated to the academic quality of the program or the effectiveness of instruction. This approach reports only on the selection of tools in the Blackboard Learn portion of the course design. However, this presentation of the results may suggest resources that may be needed by specific colleges, such as assigned instructional designers or instructor training sessions in specific Blackboard tool use. Finally, as the body of knowledge about the Scholarship of Teaching and Learning (SoTL) continues to increase, exploring tool use in the local learning management system may help to distribute SoTL findings to instructors and colleges according to the digital evidence of their course design (Englund et al. 2017; Openo et al. 2017).

Conclusions

The findings in this study need to be related to the patterns identified previously by measuring student use of courses across a large data set (Whitmer et al. 2016). That study did not report on faculty intent, in terms of the design of the course, but on the time students spent consuming content. The present study adds that while students may be spending a large amount of time in the content of the course, at least 28% of courses at UIC were created with well-rounded opportunities for students to engage in assessments, discussions, announcements, assignments, and reviewing grade postings. This affects approximately 31,417 student course enrollments (706 courses with 44.5 enrollments on average, out of the total of 2562 courses with an average of 38.4 student course enrollments). The definition of the Complementary group in the study by Whitmer et al. (2016) included content with announcements and the use of the gradebook. Our study identifies a different course design profile as Complementary tool use: it includes digital assignments for roughly half of the courses. Perhaps the time required to complete the assignments cannot be recorded in the student activity data; however, there is a clear intention on the side of faculty for students to submit their assignments through the LMS. Along with the Holistic tool use profile of courses, these make up 79% of courses at UIC.

Comparing the student use of time in online courses and faculty design intentions, there is clearly a gap. Perhaps time spent on course items by students reflects their best judgment of what will make them successful in the course. Faculty may be designing opportunities for students which are not well communicated and utilized. Further research is needed to bridge this gap and match student online behavior with faculty expectations and their design for learning.

We observed that almost all courses in the sample had content items and almost none used the blogs/wikis/journals tools, making those tools irrelevant to understanding the patterns of use due to their limited variability. Content may have been as minimal as a syllabus, but the median number of items across courses seemed plausible, as did the median within each subgroup. The range of the number of content items and other tools was noteworthy. The largest observed values may represent outlier courses that use a high frequency of tools, but it is also possible that courses become cluttered as they are copied repeatedly each time the course is offered. It may be that maintaining an uncluttered course design is not easily accomplished with the current LMS, which suggests a usability issue to be explored with faculty and instructional designers. We also observed that a low proportion of courses used LMS assessments across the sample (17%).

While online classes might be expected to use LMS assessments, 12% of courses had an online component. We will want to explore whether assessment tool options address faculty needs and concerns about possible cheating.

Future Research

The pattern of course tool use at UIC can be used to gain an understanding of the appropriateness of the LMS or the need to evaluate other solutions. Further research is needed to identify what external tools are used by each college to get a better picture of the instructional technology ecosystem in place at UIC. Furthermore, another dimension may be brought in via focus groups, to learn about the teaching approaches that correspond to these identified groups and whether different tools or training might be needed to enhance teaching.

In addition, to deepen the insight into LMS use, it will be necessary to collect qualitative feedback from users such as students and faculty. What is the faculty satisfaction with the system? How would they describe the ease of use? How much time is involved in accessing necessary resources by students or faculty? How does the use of audio and video resources, which are often external to the LMS, affect their course design and LMS use?

Some of the key questions requiring answers include: Can the knowledge of course patterns be used to improve student learning? Can the design archetypes be correlated to student satisfaction, grades, or course evaluation feedback? How can the findings of the study be applied in a larger context? Certainly, analyzing course design and faculty intentions into emerging patterns sheds light on the eLearning ecosystem, but actionable next steps would be of great value.

Limitations

Data collection was highly mechanical, without advanced embedded logic or machine learning. This means that items which were disabled on purpose to prevent students from viewing them (perhaps lecture notes), or items that were not deployed for instructional use (experimental use or a course in development), were included in the study. The simple presence of a digital item in the course was counted as intentional use; therefore, we were not confident in the accuracy of the tool count data. We are confident in the presence/absence of tools from our extraction method, but a limitation comes from the lack of student data to verify whether tools were used within a course. We can assume assignments and assessments are used by students when course credit is earned, but not for discussions or content when the features are not graded.

Courses were reported on with equal weight despite their low or high enrollment. Therefore, the impact on the student body of a specific tool incorporated in courses and present in the report may disproportionally reflect the importance of that tool. The association with specific schools and colleges may be further out of balance if a school has a large number of smaller class offerings. Courses created in the Blackboard LMS at UIC are already filtered to gradable sections only and exclude conference or discussion sections. Courses covered by the study used content items (97%), which is why an assumption was made that all courses in the data set had a link, a folder, or a text item of some sort created. The latent classes were identified and named subject to these limitations.

Compliance with Ethical Standards

Conflict of Interest On behalf of all authors, the corresponding author states that there is no conflict of interest.

References

Abazi-Bexheti, L., Kadriu, A., & Ahmedi, L. (2010). Measurement and assessment of learning management system usage. In Proceedings of the 6th WSEAS/IASME International Conference on Educational Technologies. WSEAS Press.

Berking, P., & Gallagher, S. (2013). Choosing a learning management system. Advanced Distributed Learning (ADL) Co-Laboratories, Version 3.0. Serco Services, Inc., OPM contract no. OPM0207008, Project Code: 02EA3TTAN MP, Vol. 3. https://www.adlnet.gov/public/uploads/ChoosingAnLMS.docx. Accessed 2 Jan 2017.

Dahlstrom, E., Brooks, C., & Bichsel, J. (2014). The current ecosystem of learning management systems in higher education: Student, faculty, and IT perspectives. Retrieved from http://www.educause.edu/ecar.

Dawson, S., McWilliam, E., & Tan, J. (2008). Teaching smarter: How mining ICT data can inform and improve learning and teaching practice. Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education (pp. 221–230). Melbourne, Australia: Deakin University.

Englund, C., Olofsson, A. D., & Price, L. (2017). Teaching with technology in higher education: Understanding conceptual change and development in practice. Higher Education Research & Development, 36(1), 73–87. https://doi.org/10.1080/07294360.2016.1171300.

Fritz, J. (2013). Using analytics at UMBC: Encouraging student responsibility and identifying effective course designs (Research Bulletin). Louisville, CO: EDUCAUSE Center for Applied Research. Retrieved from https://library.educause.edu/resources/2013/4/using-analytics-at-umbc-encouragingstudent-responsibility-and-identifying-effective-course-designs. Accessed 2 Jan 2017.

Janossy, J. (2008). Proposed model for evaluating C/LMS faculty usage in higher education institutions. Paper presented at the MBAA International Conference, Chicago, IL.

Kroner, G. (2014). Does your LMS do this? [Blog post]. Retrieved from https://edutechnica.com/2014/01/07/a-model-for-lms-evolution/.

Lazarsfeld, P., & Henry, N. (1968). Latent structure analysis. Boston: Houghton Mifflin.

Machajewski, S. (2014). Open source analytics for Blackboard Learn, BbStats – Activity Dashboard. Paper presented at the Big Data Conference, Allendale, MI. Retrieved from https://scholarworks.gvsu.edu/bigdata_conference2014/10.

Medina-Flores, R., & Morales-Gamboa, R. (2015). Usability evaluation by experts of a learning management system. IEEE Revista Iberoamericana de Tecnologias del Aprendizaje, 10(4), 197–203. https://doi.org/10.1109/rita.2015.2486298.

Openo, J., Laverty, C., Kolomitro, K., Borin, P., Goff, L., Stranach, M., & Gomaa, N. (2017). Bridging the divide: Leveraging the scholarship of teaching and learning for quality enhancement. The Canadian Journal for the Scholarship of Teaching and Learning, 8(2), 1–18. https://doi.org/10.5206/cjsotl-rcacea.2017.2.6.

Orfanou, K., Tselios, N., & Katsanos, C. (2015). Perceived usability evaluation of learning management systems: Empirical evaluation of the system usability scale. The International Review of Research in Open and Distributed Learning, 16(2), 227–246. https://doi.org/10.19173/irrodl.v16i2.1955.

Whitmer, J., Nuñez, N., Harfield, T., & Forteza, D. (2016). Patterns in Blackboard Learn tool use: Five course design archetypes. Retrieved from https://www.blackboard.com/Images/Bb_Patterns_LMS_Course_Design_r5_tcm21-42998.pdf.

Whitmer, J., Nuñez, N., & Forteza, D. (2016a). Research in progress: Learning analytics at scale for Blackboard Learn [Blog post]. Retrieved from http://blog.blackboard.com/research-in-progress-learning-analytics-at-scale/.

Whitmer, J., Nuñez, N., & Forteza, D. (2016b). How successful students use LMS tools – confirming our hunches [Blog post]. Retrieved from http://blog.blackboard.com/how-successful-students-use-lms-tools/.
