
ABSTRACT

INTEGRATION OF TECHNOLOGY INTO HIGHER EDUCATION:


SELF-REGULATED LEARNING AND MOTIVATION
This study examined how integration of technology into higher education
was associated with self-regulated learning and motivation. Six courses for this
study were selected from a tablet initiative program at California State University,
Fresno, where all students were required to have and use tablets in class. The
Substitution, Augmentation, Modification, and Redefinition (SAMR) model of
technology integration was used to determine a level of integration (Substitution,
Augmentation, Modification, or Redefinition) for each course. Students in each of
the six courses (total N = 177) were evaluated on their level of motivation and
self-regulated learning six times during the semester, using two separate
measures. The study aimed to determine if students' motivation and self-regulated learning differed based on the level of technology integration in the
classroom, and if self-regulated learning is a cyclical process, as hypothesized by
Zimmerman (2008). Repeated-measures ANOVAs and correlations were used to
analyze obtained data. Findings from this study suggest that the highest level of
technology integration was associated with lower levels of self-efficacy and less
effective cognitive strategies over time. Correlations supported Zimmerman's
(2008) model of self-regulated learning, showing a cyclical nature over three tasks
throughout the semester.
Amber Costantino
March 2016

INTEGRATION OF TECHNOLOGY INTO HIGHER EDUCATION:


SELF-REGULATED LEARNING AND MOTIVATION

by
Amber Costantino

A thesis
submitted in partial
fulfillment of the requirements for the degree of
Master of Psychology
in the College of Science and Mathematics
California State University, Fresno
March 2016

APPROVED
For the Department of Psychology:
We, the undersigned, certify that the thesis of the following student
meets the required standards of scholarship, format, and style of
the university and the student's graduate degree program for the
awarding of the master's degree.

Amber Costantino
Thesis Author

Constance Jones (Chair)

Psychology

Martin Shapiro

Psychology

Ronald Yockey

Psychology

For the University Graduate Committee:

Dean, Division of Graduate Studies

AUTHORIZATION FOR REPRODUCTION
OF MASTER'S THESIS
I grant permission for the reproduction of this thesis in part or
in its entirety without further authorization from me, on the
condition that the person or agency requesting reproduction
absorbs the cost and provides proper acknowledgment of
authorship.

Permission to reproduce this thesis in part or in its entirety must


be obtained from me.

Signature of thesis author:

ACKNOWLEDGMENTS
I would like to express my gratitude and appreciation to my advisor, Dr.
Constance Jones; you have been an incredible mentor. Thank you for pushing me
and encouraging me every step of the way and for allowing me to grow as a
research scientist and as a person. I would also like to thank my committee
members, Dr. Martin Shapiro, Dr. Henry Delcore, and Dr. Ronald Yockey for
serving as my committee and giving me valuable and effective feedback. I would
especially like to thank all the DISCOVERe faculty and students who allowed me
to utilize their classrooms for my study, as well as the DISCOVERe administrators
who offered their support and expertise throughout the research process. Your
cooperation and support were vital to the completion of this study, and I
sincerely appreciate all of you.
I would also like to take this opportunity to thank my family and friends,
who have been tremendously supportive and encouraging. To my
father: thank you for everything you have done for me and for being the person I
strive to be; you are my inspiration. To my mother, brother, and sister: thank you
for your unconditional love and support; you are my strength. Lastly, to my
boyfriend, Ross: thank you for sticking with me through everything and always
making me laugh; you are my happiness.

TABLE OF CONTENTS
Page
LIST OF TABLES...........................................................................................................
LIST OF FIGURES............................................................................................................
CHAPTER 1: Introduction...............................................................................................
CHAPTER 2: Literature Review..............................................................................
Learning.....................................................................................................................
Learning with Technology.......................................................................................
The Human Situation.............................................................................................
Current Study..........................................................................................................
CHAPTER 3: Methods...................................................................................................
Participants..............................................................................................................
Instruments..............................................................................................................
Design and Procedure............................................................................................
CHAPTER 4: Results..................................................................................................

Descriptives of Primary Variables........................................................................
Inferential Statistics.................................................................................................
CHAPTER 5: Discussion...........................................................................................
Summary of Purpose..............................................................................................
Weaknesses..............................................................................................................
Strengths...................................................................................................................
Future Research.......................................................................................................
REFERENCES..................................................................................................................50
APPENDICES..................................................................................................................56
APPENDIX A: Faculty Informed Consent Form........................................................57
APPENDIX B: Student Informed Consent Form........................................................60
APPENDIX C: Student Demographic Questionnaire................................................62
APPENDIX D: Faculty Demographics Survey............................................................65
APPENDIX E: SAMR Faculty Survey............................................................................67
APPENDIX F: Motivated Strategies for Learning Questionnaire (MSLQ) Student
Survey....................................................................................................................72
APPENDIX G: Microanalysis Student Surveys..........................................................76

LIST OF TABLES
Page

Table 1. Level of Technology Integration.....................................................................


Table 2. Primary Concepts and Variables of the Study.............................................

Table 3. MSLQ Pretest-Posttest Means and Standard Errors....................................
Table 4. Total Microanalysis Pearson's r Correlation Matrix....................................
Table 5. Low Level of Technology Integration Microanalysis Pearson's r
Correlation Matrix............................................................................................
Table 6. Medium Level of Technology Integration Microanalysis Pearson's r
Correlation Matrix............................................................................................
Table 7. High Level of Technology Integration Microanalysis Pearson's r
Correlation Matrix............................................................................................

LIST OF FIGURES
Page
Figure 1. Mean motivation: self-efficacy score across time, separated by level
of technology integration..............................................................................
Figure 2. Mean motivation: intrinsic value score across time, separated by
level of technology integration.....................................................................
Figure 3. Mean motivation: test anxiety score across time, separated by level
of technology integration..............................................................................
Figure 4. Mean SRL: cognitive strategies score across time, separated by level
of technology integration..............................................................................
Figure 5. Mean SRL: self-reflection score across time, separated by level of
technology integration...................................................................................

CHAPTER 1: INTRODUCTION
Self-regulated learning (SRL) is a learner-directed process in which
initiating, monitoring, and adapting learning behaviors and strategies allow the
learner to independently transform cognitive ability into academic skill
(Barak, 2010; Bjork, Dunlosky, & Kornell, 2013; Puustinen & Pulkkinen, 2001;
Winne, 2010; Zimmerman, 2008). Research suggests that successful self-regulated
learners are highly effective learners both inside and outside academic settings
(Butler & Winne, 1995), and that those with self-regulatory skills have more
opportunities for life-long learning (Puustinen & Pulkkinen, 2001). SRL allows
learners to have a more pronounced and active role in their education. Research
suggests that this autonomy increases motivation (Rissanen, 2014). Thus,
motivation is one of the key features of SRL (Barber, Bagsby, Grawitch, & Buerck,
2011).
Technology can potentially foster SRL by allowing a high level of control
and manipulation of the information to be learned (Barak, 2010; Barber et al.,
2011) and by increasing motivation (Barak, 2010). Technology not only provides
flexibility in terms of time and place of learning, but also offers access to large
amounts of information from which learners can construct their own meaning.
With information-rich resources available and new tools constantly in
development, technology tools can increasingly support or facilitate effective
learning processes. This highlights the importance of effective integration of
technology in order to foster SRL (Barak, 2010; Barber et al., 2011).
Given differing levels of technology integration, this study examined the
connection between technology integration (Substitution, Augmentation,
Modification, or Redefinition) and the motivation and self-regulated learning
strategies of college students. First, the study aimed to determine if there is a
significant difference in motivation and self-regulated learning strategies by
differing levels of technology integration. Multiple measures were taken
throughout the semester in order to observe how motivation and self-regulated
learning fluctuate. If the multiple
measurements of self-regulated learning show a cyclical pattern, it provides
additional support for the current theoretical standpoint of SRL (Zimmerman, 2008)
and offers evidence for the reliability and validity of the measurements.

CHAPTER 2: LITERATURE REVIEW


Learning
Teaching is the transfer of information or skill from one person to another;
learning is the acquisition of new knowledge or skills. Motschnig-Pitrik and
Standl (2013) identified three types of learning: understanding knowledge or
intellect; acquiring relevant skills; and developing attitudes, feelings, and
personality. Successful acquisition of knowledge does not come from the
capability to passively respond to events, but the ability to make decisions and
actively apply information to new situations (Derry, 2008).
Self-Regulated Learning
Self-regulated learning (SRL) is defined broadly as the way individuals
dictate and monitor their own cognitive processes while acquiring new
information (Puustinen & Pulkkinen, 2001). It is not a specific academic skill but
rather a self-directed process such that learners convert mental abilities into
academic skills (Barak, 2010). SRL is particularly important for creating
opportunities for learning outside the formal classroom setting (Bjork et al.,
2013), including life-long learning (Puustinen & Pulkkinen, 2001). Numerous
studies show that SRL strategies increase academic achievement in many
contexts (Zimmerman, 1990). Empirical studies have also shown a strong
correlation between self-regulating behavior and motivation to complete a
challenging task. SRL also fosters problem solving skills in informal settings
where problems are not clearly defined or do not have a clearly correct answer
(Barak, 2010). Many researchers agree that self-regulated learners are the most
effective learners (Butler & Winne, 1995).
Recent trends in educational research show an increase in learning outside
the formal classroom setting. Without instructor supervision, self-regulated
learning and motivation to learn become increasingly important. Unfortunately,
studies show that learners are not very skilled at managing and assessing their
own learning (Bjork et al., 2013). Studies that focus on judgments of learning and
learners' predictions of future performance show that people tend to be
underconfident in their ability to learn more in the future and overconfident in
their current level of knowledge, leading to non-optimal study habits (Bjork et
al., 2013). Further, societal attitudes and assumptions can foster misconceptions
about how to become an effective learner. For example, there is an assumption
that people do not need to be taught how to manage their learning activities and
that the purpose of education is purely to deliver information. This assumption
leads to a lack of curriculum focused on teaching and learning skills needed for
successful SRL.
There are five empirically tested models, all defining SRL in slightly
different ways, primarily depending on the model upon which the learning
theory draws (Puustinen & Pulkkinen, 2001). The current study primarily focuses
on Zimmerman's (1990) model of self-regulation, which stems from Bandura's
(1986) social cognitive theory and therefore emphasizes the interaction
between the self, behavior, and the environment as well as what Bandura called
self-motivation and self-efficacy. There are three phases of Zimmerman's (1990)
model: forethought, performance, and self-reflection. The forethought phase
consists of task analysis (goal setting and strategic planning) and self-motivational
beliefs (self-efficacy, outcome expectations, task interest/value, and
goal orientation). The performance phase consists of self-control (self-instruction,
self-imagery, attention focusing, and task strategies) and self-observation
(self-recording and metacognitive monitoring). The self-reflection phase consists of
self-judgment (self-evaluation and causal attributions) and self-reaction
(self-satisfaction/affect and adaptive or defensive inferences). Importantly,
Zimmerman (2008) suggests that these self-regulatory phases are cyclical in
nature, meaning that self-reflection from one task influences the forethought
phase of the next task. In particular, feelings of satisfaction in the self-reflection
phase are predictive of self-efficacy and task interest in the forethought phase.
Motivational beliefs known to promote SRL are self-efficacy, task value,
and mastery goal orientation (Banyard, Underwood, & Twiner, 2006). Self-efficacy
is the belief in one's own ability to learn and confidence in one's skills.
Task value belief is the attitude about the content being learned (i.e., is it
interesting, important, useful?). Mastery goal orientation is learning for the sake
of learning. In contrast, external or performance goal orientation is learning to
get a good grade, earn a reward, etc. Those who are confident in their skills, place
a high value on the task at hand, and identify with the mastery goal orientation
are more likely to use self-regulated strategies.

Students who have a more pronounced role in their education tend
to be more motivated and active (Rissanen, 2014) and successful self-regulated
learners understand that learning requires active participation (Bjork et al., 2013).
Active and peer learning in the classroom are recognized as important
techniques that improve student engagement, creativity, and motivation and, in
turn, produce better learning outcomes. The conversational learning model
(Atif, 2013) encompasses collaboration between peers and instructors,
cooperative learning, and a community of practice. Studies show that
conversational learning promotes authentic learning, in which students are able
to apply what they learned in a formal school setting to real-world situations.
Although lecturing is still important for conveying concepts, especially to large
groups, some researchers suggest lectures should primarily serve to model
problem solving or to provide pre-knowledge needed for activities or discussions
that are student-managed.
Attempts to contextualize learning to the students' world have also
increased motivation (Barak, 2010). The intention to create a learning
environment similar to students' actual life experiences may lead the student to
find the task more valuable. These types of interventions include (but are by no
means limited to) introducing technology and mimicking ludic pedagogy (the
way games teach players to play) to create gamified active learning
environments (Broussard & Machtmes, 2012). In a gamified classroom students
use popular gaming strategies to get through the course. A simple change in
language (e.g., "quests" versus "assignments") can change students' perspectives
of assignments and, in turn, lead students to be more willing to put in effort
beyond what was expected from them and increase student performance overall.

Ultimately, many of the studies mentioned above criticize non-interactive, lecture-style presentations of factual information that only serve to
reiterate what is in the textbook. This does not effectively improve the learners
experience or learning in the classroom (Atif, 2013). Mastery learning is not
centered on the content being learned, but rather the process of learning to
master specific material (Puzziferro & Shelton, 2008). Sadly, the fact that learners
do not make good judgments regarding learning indicates that students are not
being taught effective learning strategies crucial to SRL. Harmful misconceptions
about learning are particularly detrimental to SRL because effective SRL requires
a true understanding of human learning and recognition of effective learning
strategies (Bjork et al., 2013). Misconceptions and bad learning habits will
continue to plague students if we do not teach these important skills (Bjork et al.,
2013).
Learning with Technology
The abundance of modern technology has created a new digital world that
has caused irreversible changes in how we learn, work, communicate, play, and
live (Ping Lim, Yong, Tondeur, Ching Sing, & Chin-Chung, 2013). The growth of
this new digital world is remarkable: in 2004 the number of internet users was
800 million; by 2010 that number had jumped to 1.97 billion. Increased
technology utilization has been shown to significantly improve productivity and
cost savings outside education, but such significant gains have yet to be observed
in education.
In a critical review of Technology Enhanced Learning (TEL) research,
Kirkwood and Price (2014) describe three primary levels of research interventions
in which technology is integrated into education: replication, supplementation,
and transformation. Similarly, the Substitution, Augmentation,
Modification, or Redefinition (SAMR) model describes four primary levels of
technology integration. The substitution level of technology integration is the
lowest level of integration and is characterized by no functional change in
teaching or learning (i.e., technology is used to perform the same tasks that were
done before the integration). The augmentation level of technology integration is
characterized by some functional benefit to the student or teacher (i.e., students
receive immediate feedback). The modification level of technology integration is
characterized by significant functional change in the classroom. The redefinition
level of technology integration is characterized by student-centered learning and
expansion of the learning environment. The SAMR model also classifies
substitution and augmentation as classroom enhancement, while modification
and redefinition are classified as classroom transformation.
The Good: Fostering Self-Regulated
Learning
Advancements in technology provide many new opportunities for self-regulated
and lifelong learning (Bjork et al., 2013). Due to the increased
accessibility of the internet, there has never been so much information available
to learners with such minimal time or effort devoted to acquiring it. Providing
massive amounts of information in a non-authoritative fashion provides learners
the opportunity to construct their own meaning, which promotes deeper
learning (Derry, 2008). These advancements also give learners the ability to learn
virtually anywhere at any time. In turn, SRL becomes increasingly obtainable and
important (Sha, Looi, Chen, & Zhang, 2012; Wang, Shannon, & Ross, 2013).

Technology as a resource creates endless opportunities for SRL. It
not only creates a resource-rich environment that facilitates learners' access to
information (Derry, 2008), but it can also help them develop and adapt ideas and
learning goals, receive and use feedback effectively, and communicate with peers,
mentors, teachers, or professionals (Barak, 2010). Studies show that
conversational learning can be facilitated by use of information and
communication technology in a studio-type classroom. In an experimental study,
students in a technology-based conversational learning classroom outperformed
those in the traditional classroom style (Atif, 2013). Specific technology tools,
such as personalized feedback provided by multimedia presentations, have been
found to promote deep learning and active engagement (Moreno & Mayer, 2000).
Feedback is a critical catalyst of monitoring SRL, fostering a reassessment
of goals and strategies to reach those goals (Butler & Winne, 1995). Self-regulated
learners are aware of their own knowledge, beliefs, motivation, and cognitive
processing, allowing them to monitor their own learning activities through
internal feedback. This allows self-regulated learners to identify a perceived
discrepancy between current state of learning and their goals. In other words,
they can identify when there is a lower rate of progress than expected. However,
successful self-regulated learners also seek feedback from external sources
(Butler & Winne, 1995). In some cases, technology offers more instant and
accessible external feedback. For example, information on grades is usually
communicated faster when using technology tools. Timely grading is an
important type of feedback that promotes understanding of how today's
performance contributes to the overall course outcome (Barber et al., 2011).

A common misunderstanding about the role of errors and mistakes
in learning can be critically damaging to how one sets goals and interprets
feedback during self-regulated learning (Bjork et al., 2013). Goals set to avoid
imperfection, otherwise known as performance goals, are less beneficial to
learners than goals focused on learning and progress, known as mastery goals
(Puustinen & Pulkkinen, 2001; Zimmerman, 2013). Nevertheless, there is a
widespread assumption that errors and mistakes are something to be avoided.
On the contrary, research suggests they are necessary for efficient learning (Bjork
et al., 2013). In fact, when feedback is given after a learner makes a mistake with
high confidence, the effects on learning are even more substantial than when the
error is made with low confidence. This is known as the hypercorrection effect.
Importantly, technologies such as video games create an environment in which
trial and error strategies are attractive and motivating (Mellecker, Witherspoon,
& Watterson, 2013). Not only do such environments provide the learner with a
safe place to make mistakes, but they are also more conducive to learning than
traditional teacher-led classroom environments. This results in continued
persistence and the use of various
strategies to complete a challenging task. In turn, it promotes greater interest and
better performance on that task.
In summary, technology has been shown to facilitate self-regulation by
providing feedback (Barber et al., 2011), training specific SRL strategies (Nunez et
al., 2011; Zimmerman, 2008), prompting metacognitive monitoring (Sha et al.,
2012), providing conceptual scaffolding (Azevedo & Hadwin, 2005), and
providing a way to navigate information that allows for personal exploration and
deeper learning (Minhong, Jun, Bo, Hance, & Jie, 2011). Additionally, technology
increases autonomy and control over the learning environment, which has
been shown to increase motivation (Barak, 2010; Sha et al., 2012).
Motivation tends to increase as technology is introduced to the classroom
(Banyard et al., 2006; Barak, 2010; Motschnig-Pitrik & Standl, 2013). This may be
because technology contextualizes learning in the students' world (Barak, 2010),
where technology is a part of their day-to-day lives (Renes & Strange, 2011). In
other words, students place a higher value on the task because it is relevant and
important to their lives. As previously stated, certain motivational beliefs foster
SRL. It is currently unknown if technology increases the specific motivational
beliefs that are crucial in SRL.

The Bad: Cognitive Overload, Cost,


and Failure
Advancements in technology can provide a resource-rich learning
environment that is flexible and promotes self-regulated and lifelong learning
(Minhong et al., 2011). However, this resource-rich environment can also be
detrimental to learning, particularly when attempting to learn complex
knowledge structures without guidance. This is because learning complex
knowledge by oneself means searching for, selecting, assembling, and organizing
huge amounts of information, which may cause cognitive overload and
conceptual and navigational disorientation. Minhong et al. (2011) propose
knowledge visualization (KV) in online courses as a possible solution to this
problem. KV is achieved by creating visual cognitive roadmaps to guide learners
through a non-linear knowledge space and help them recognize important
connections between abstract concepts. Other researchers recommend that online
courses set aside a specific time for students to concentrate on the course
(Wang et al., 2013). Although this imitates an SRL skill, it eliminates the attractive
flexibility of time associated with online courses.
People often believe that online courses are less expensive due to the
reduction of physical space and labor cost, but there has not been an actual
lowered cost of tuition for students. In fact, online courses can be more expensive
due to the cost of review processes, training faculty, computer hardware and
software, and cyber support services. When the cost of tuition is lowered it is
usually due to the utilization of Massive Open Online Courses (MOOC), which
typically have high dropout and failure rates. There is a fear that low-cost
MOOCs will, in turn, produce a low-quality education (Casement, 2013).
Those against integration of technology in education point out that many
studies find a negative correlation between frequency of technology use and
academic success. However, the technologies mentioned in these studies are not
integrated into the students' learning but instead serve as a distraction
(Wentworth & Middleton, 2014). Time spent on the computer is often found to be
negatively correlated with time spent studying, and less time studying may be
the reason for drops in academic success (Wentworth & Middleton, 2014).
However, students drop out of or fail online courses at twice the rate of
regular classroom courses (Casement, 2013).
Technology Integration Models
Models of interaction between technologies and university teaching have
been attempted. From a systems approach, the model involves interaction
between instructors, students, content, and technology (Svinicki & McKeachie,
Each component requires attention to make the integration of technology
successful. The content component involves an analysis of course goals
and choosing activities that will actively engage students. The instructor
component involves an analysis of technology skill level, time availability, and
defining the role of the teacher. The student component involves an analysis of
technology skill level and technology access. Lastly, the technology component
involves an analysis of the type of technology needed and how it can be used to
support teaching.
The technology, pedagogy, and content knowledge (TPACK) model
emphasizes the initial instability in content and pedagogy caused by the
introduction of new technology into teaching. The model suggests that there
must be communication between teachers, technologists, and academic
developers to ensure that pedagogy is not added to technology-enhanced
teaching materials as an afterthought. This reflects the shift in research away
from e-learning environments themselves and towards learning and student-teacher relationships. Puzziferro and Shelton (2008) use Bloom's Taxonomy to
create the Active Mastery Learning Model that emphasizes learning through
applying and analyzing information as well as group projects that promote
interaction.
Other models suggest more emphasis on person-centered dimensions in
technology-enhanced learning, along with the idea that the technology should fit
the learning and teaching, not vice versa. Motschnig-Pitrik and Standl (2013)
implemented a web-based, open source technology to support person-centered
learning that included modules such as peer evaluations and team space. This
model emphasizes the incorporation of several key pedagogical elements, such as
motivating students, being respectful and empathetic, using practical and
engaging exercises, using self-initiated and self-organized projects, and
using team-based and multinational projects. In this study, students perceived
the model in a hybrid class as superior to either a strictly eLearning or a face-to-face class by itself.
Assessments of Learning
The most commonly used evidence for technology's enhancement of
learning is test or assessment scores (Kirkwood & Price, 2014). Typical studies
utilize experimental techniques such as pretest-posttest or comparison of parallel
groups. However, it is difficult to attribute academic achievement to any one
thing in particular (Winne, 2010). The second most common evidence is positive
perceptions and attitudes (Kirkwood & Price, 2014). Studies show a positive
correlation between student satisfaction with the course and persistence
throughout the course (Wang et al., 2013). However, people often do not make
good judgments of their own learning (Bjork et al., 2013) and often do not
accurately self-report their learning strategies (Puustinen & Pulkkinen, 2001).
Measurements of SRL
Measuring SRL can be particularly difficult, but can be aided by
technology (Zimmerman, 2008). In an attempt to measure SRL, researchers have
conducted structured interviews and think aloud studies and have developed
questionnaires to measure such aspects of SRL as motivation and strategies for
learning (Puustinen & Pulkkinen, 2001). However, the correlation between self-reports and actual use of learning tactics is low, suggesting that self-reports by
themselves are an inadequate measure of SRL. Observation of behavior that
represents cognitive, metacognitive, and motivational occurrence may be key to
fully modeling and measuring SRL (Winne, 2010). These observations are
known as traces. Trace-based measurement can be aided by technology with the
use of time-stamped logs of learners' activities while interacting with the
material (Sha et al., 2012; Zimmerman, 2008). However, researchers tend to agree that other measures of SRL used in conjunction with trace measures increase the validity of those conclusions (Winne, 2010; Zimmerman, 2008).
Another way to approach the measurement of SRL is through the
microanalytic methodology (DiBenedetto & Zimmerman, 2013b; Zimmerman,
2008). These are brief, context-specific measures of established SRL processes and
motivational beliefs. These types of measures can be both qualitative and
quantitative and can be compared to other measures to optimize reliability of the
measure. The key feature of microanalytic measures is that they can be used
during multiple learning events and the results can be plotted to show trends in
SRL processes and beliefs. These brief measures have been shown to be reliable
(Zimmerman, 2008) and valid (DiBenedetto & Zimmerman, 2013b). However,
there is a need to apply microanalytics to learning contexts that span longer
periods of time, where motivation is expected to fluctuate (Zimmerman, 2008).
The Human Situation
Teachers
Some teachers are digital immigrants; they did not grow up in the digital
world. For those teachers, adapting teaching methods in attempt to teach those
who are not immigrants can be a struggle (Renes & Strange, 2011). Primarily this
is because technology shifts the role of the teacher. In a traditional classroom the
teacher is the sole source of information. When technology is introduced to the
classroom, the role of the teacher becomes far more complex. The
teacher's role becomes that of a negotiator of lessons, a supporter for students
with differing levels of experience, a monitor of student progress, a motivator for
discussion and reflection (Ping Lim et al., 2013), a model of effective technology
usage (Wedman & Diggs, 2001), a source of external feedback (Butler & Winne,
1995), and a promoter of self-regulated learning (Barak, 2010).
Scanlon and Issroff (2005) established common evaluation factors used by
students and teachers to judge learning with technology and how those factors
affected rules in the classroom. Importantly, students' efficiency goals were defined as gaining knowledge, but this conflicted with teachers' efficiency goals to
maximize the number of students being taught. Cost, serendipity, interactivity,
and failure of technology changed the classroom rules and the division of labor.
Teachers' complicated relationship with technology has been recognized in research. Problems tend to stem from either teachers being ill-equipped to
model technology use or technology that is isolated from the pedagogy of the
teacher. This reflects a failure of teacher education programs (Wedman & Diggs,
2001) and a failure of communication between technologists, academic
developers, and teachers (Kinchin, 2012). Attempting to use technology to teach
without sufficient preparation or commitment can be detrimental to student
learning (Svinicki & McKeachie, 2011). Factors that influence teachers to
implement technology are peer support, institutional support, and perceived
improvement in student learning (Renes & Strange, 2011).
Students
Students are typically digital natives and see technology as a natural
extension of themselves. Interacting and learning from digital technologies
comes as second nature to digital natives (Renes & Strange, 2011), although this is
not equally true for all students. Although the differences have become less
substantial in recent years, students differ from each other with respect to
technology skill level and technology access (Svinicki & McKeachie, 2011).
Further, the integration of technology not only changes the role of the teacher but
also the role of the student. Students are expected to assume new responsibilities
that encourage SRL. Students who are used to learning passively may be
resentful and unwilling to take initiative and be successful self-regulated
learners.
The ability to adapt to increases in technology in education differs
demographically. This implies that the increase in technology may reinforce
educational inequality, rather than weaken it. Abdul-Alim (2013) calls attention
to these differences, pointing out that those with lower academic preparation,
such as those of African-American descent and men, are more likely to drop out
and get lower grades in online courses.
Community college students who fail or withdraw from online courses
usually attribute their failure to personal problems or getting too far behind to
catch up, not necessarily to problems with technology. This may suggest that
student success support systems are lacking for online courses, and that early
alert systems need to be implemented to identify struggling students early in the
course. Orientation into online courses has also been shown to be helpful
(Milman, 2013). It is important that students use technology to learn instead of
only learning how to use technology (Wedman & Diggs, 2001).
Students who tend to do well in online courses are adult learners,
self-directed learners, students living in rural areas, and students who
understand the concept of interdependence in learning (Renes & Strange, 2011).
Simonds and Brock (2014) found statistically significant differences by age of
student with performance and preference of activities in online courses. Younger
students preferred interactive learning strategies and felt more confident in their
ability to learn in an online course. Adult students preferred videos of lectures
but also posted more on discussion boards and spent more time logged into their
online account. Adult students got better scores on quizzes and on their final
projects, while younger students did better at assignments where they could chat
and share biographical information. Along with age, experience with online
courses can dictate preference for activities in online courses. Importantly, those
who had taken online courses thought they would learn more from online
courses than in traditional class courses. Those who had not taken online courses
thought they would learn more from traditional in class courses.
It is important to note that these differences are very likely due to
important generational differences. A longitudinal study that compared high
school seniors who graduated between 1988 and 1992 (Generation X) to those
who graduated between 2002 and 2004 (Millennials) showed some interesting
differences in use of technology. While 23.5% of Generation X seniors used a
personal computer at least once per week, 85.8% of Millennials used personal
computers at least once per week. Due to the increase in technology and information being at their fingertips, Millennials hunger for knowledge but also expect instantaneous and customizable services. Zimbardo, as cited in McManus
(2010), attributes this generational change to the change in time orientation due
to the increase in technology. He states that the increase in technology has caused our brains to be digitally rewired toward a present-hedonistic time orientation and a need to be in control of our environment and that, in turn, millennial students do not do well in a traditional analog classroom. In conclusion, Millennials may have a heightened sense of entitlement and may not tolerate outdated teaching strategies (Broussard & Machtmes, 2012).
Current Study
The current study proposes that the integration of technology into
education can enhance learning by increasing motivation and promoting self-regulated learning processes. Using computers and informational technologies in education can foster self-regulated learning in two ways: by increasing specific types of motivation and by providing increased opportunities for SRL. Motivational beliefs that are known to increase SRL (self-efficacy, task value, and mastery goal orientation) were evaluated. This was compared to the level of
technology integration based on the SAMR and TPACK models of integration.
Other variables that may have an effect on motivation in a technology-enhanced
course were measured for comparison, including comfort with technology, age,
congruency between the students major and the topic of course, and the reasons
for taking the course. The overall aim of this study is to provide insight on how
technology integration influences SRL and motivation.
The proposed study asks the following research questions: (1) Do students
enrolled in courses with a higher level of technology integration have increased
motivational beliefs that are crucial to SRL? (2) Do students enrolled in courses
with a higher level of technology integration use more SRL strategies? (3)
Do microanalytic measures of SRL show a cyclical nature over the span of a
semester?
CHAPTER 3: METHODS
Participants
Participants for this study were recruited from six different courses at
California State University, Fresno, which incorporate tablets into the classroom
via a university program titled DISCOVERe. This program is ideal for measuring
the integration of technology into the classroom because all students are required
to have tablets, but each course has differing levels of engagement with that
technology. DISCOVERe faculty members were recruited from the DISCOVERe summer institute, which encourages faculty to use a variety of applications that can aid in teaching and learning. It is ultimately left up to the faculty member to choose which applications to use, and to what extent. This results in
classrooms that all use tablets but in different ways, and with differing levels of
integration. There is substantial university support for this program, for both the
students and the instructors.
Participants' (N = 177) ages ranged from 18 to 42, with a mean age of 21.66
years (SD = 4.02). The sample consisted of 60 men and 117 women (33.9% men,
66.1% women). A total of six courses from various disciplines were selected for
this study. As classified by the SAMR model of technology integration, a total of
42 students were enrolled in a course (engineering and chemistry courses)
classified as having a low level of integration of technology, 98 students were
enrolled in a course (nutrition and journalism courses) classified as having a
medium level of integration of technology, and 37 students were enrolled
in a course (construction and earth science courses) classified as having a high
level of integration of technology (see Table 1). When students were asked to self-report their technological skill level, 80.80% of the students indicated that they were "Good" (36.20%), "Very Good" (29.90%), or "Excellent" (14.7%), while the other 20.20% indicated that their technological skill level was either "Poor" (1.70%) or "Fair" (17.50%).

Table 1. Level of Technology Integration

Level     N     Percent of Total Sample
Low       42    24%
Med       98    55%
High      37    21%
Total     177   100%
Six faculty members were recruited for this study. Faculty members were
all women, with an age range from 32 to 55 and a mean age of 42.83. All faculty members self-reported their technological skill level as "Good" (50%) or "Very Good" (50%). None of the faculty members had taught a DISCOVERe course in
the past.
Instruments
Two consent forms were used for this experiment: one for faculty (see
Appendix A), and one for students (see Appendix B). The SAMR faculty survey
was given before any student surveys were completed. It included a disclaimer that this information might be used to qualify them for future studies, that the principal investigators might contact them again, and that further data collection might be pursued. The student consent form informed students that, if they chose to participate, they would be asked to answer short pre- and
post-assignment questions on certain assignments as well as a set of surveys at
the beginning and end of the semester. They were informed that all assessments
should be taken seriously and that their participation would be important and
appreciated by the scientific community.
A demographics questionnaire was distributed to both students (see
Appendix C) and faculty (see Appendix D). The students' demographic
questionnaire inquired about their age, major, class level, ethnicity, technological
skill level, the course they are enrolled in, previous exposure to DISCOVERe
courses, and the reason for taking the course. The faculty demographic
questionnaire inquired about age, ethnicity, technological skill level, previous
experience instructing DISCOVERe courses, and the course(s) they are
instructing this semester.
Independent Variable
The independent variable of level of integration of technology was
measured by a faculty self-report survey at the end of the Spring 2015 semester
and end of the Fall 2015 semester. The first survey, which was given at the end of
the Spring 2015 semester, was created specifically for this study and is based on
the SAMR (from low to high; Substitution, Augmentation, Modification,
Redefinition) model (see Appendix E). This survey asked questions about how often, if at all, they planned to use activities that are typically aligned with various levels of the SAMR model. Each question was answered on a 5-point Likert scale that ranges from "never" to "always." For example, questions that measure
substitution ask faculty members to report how often technology was integrated without functional change in teaching or learning, such as "I use PDFs or Word documents to distribute documents via email" and "Students take notes using Microsoft Word or other text-only software (i.e., iOS Notes, TextEdit, etc.)." Questions that measure augmentation ask about how often technology was used to slightly improve the functionality of teaching or learning, such as "Students are instructed on how to search within a document and define words within a document using their tablets" and "Students are instructed to categorize or tag notes using Evernote or other comparable applications." Questions that measure modification ask about how often technology was used in a way that significantly improved functionality, such as "Students are encouraged to integrate online sources or material into their notes (i.e., Evernote, SlingNote, etc.)" and "Students turn in assignments using Blackboard, Google Drive, Dropbox, or similar websites." Questions that measure redefinition asked how often technology was used in a way that was only accomplishable with technological tools, where substantial changes were made in teaching and learning, such as "Students take collaborative notes by using shared, online notebooks (i.e., Evernote, Google Docs, etc.)" and "I create interactive documents using iBooks Author or similar applications." The activities of inquiry came from the researcher's knowledge
and general research of the SAMR model. The sum of the scores for each subset
of questions (Substitution, Augmentation, Modification, Redefinition) was taken
from the first set of 25 questions in the survey. Two courses were selected with
the highest score on the subscale that measures the lowest level of integration
(substitution), two courses were selected with the highest score on the subscale
that measures one of the intermediate levels of integration (augmentation or

24
242424
modification), and two courses were selected with the highest score on the
subscale that measures the highest level of integration (redefinition) (see Table 2
for summary of primary concepts and variables).
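To make the subscale scoring and course-selection step concrete, the computation could be sketched as follows. This is a hedged illustration, not the instrument itself: the item groupings mirror the four subscales listed above, while all function and variable names are hypothetical.

```python
# Illustrative sketch of the SAMR subscale scoring described above.
# Item-to-subscale groupings follow the survey; all names are hypothetical.

SAMR_ITEMS = {
    "substitution": [2, 5, 8, 12, 14, 18, 22],
    "augmentation": [3, 10, 15, 19, 23],
    "modification": [4, 9, 16, 20, 24],
    "redefinition": [1, 7, 11, 13, 17, 21, 25],
}

def samr_subscale_scores(responses):
    """Sum the 1-5 Likert ratings for each SAMR subscale.

    responses: dict mapping item number (1-25) to a rating of 1-5.
    """
    return {
        level: sum(responses[item] for item in items)
        for level, items in SAMR_ITEMS.items()
    }

def top_courses(courses, level, k=2):
    """Rank (name, responses) pairs by one subscale's score; return the top k names."""
    ranked = sorted(courses,
                    key=lambda c: samr_subscale_scores(c[1])[level],
                    reverse=True)
    return [name for name, _ in ranked[:k]]
```

Selecting, say, the two highest-scoring courses on the substitution subscale would then be `top_courses(all_courses, "substitution", k=2)`, matching the selection rule described above.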
Table 2. Primary Concepts and Variables of the Study

Level of Integration of Technology
  Substitution: SAMR questions 2, 5, 8, 12, 14, 18, & 22 (sum of scores, 5-point Likert scale)
  Augmentation: SAMR questions 3, 10, 15, 19, & 23 (sum of scores, 5-point Likert scale)
  Modification: SAMR questions 4, 9, 16, 20, & 24 (sum of scores, 5-point Likert scale)
  Redefinition: SAMR questions 1, 7, 11, 13, 17, 21, & 25 (sum of scores, 5-point Likert scale)

Motivation
  Self-efficacy: MSLQ questions 2, 7, 10, 11, 13, 15, 20, 22, & 23 (sum of scores, 7-point Likert scale)
  Intrinsic value: MSLQ questions 1, 5, 6, 9, 12, 17, 18, 21, & 25 (sum of scores, 7-point Likert scale)
  Test anxiety: MSLQ questions 3, 14, 24, & 27 (sum of scores, 7-point Likert scale)
  Task analysis: Microanalysis Q1-Q3 (sum of scores, 7-point Likert scale)
  Self-motivation beliefs: Microanalysis Q4-Q5 (sum of scores, 7-point Likert scale)

Self-regulated learning
  Cognitive strategy use: MSLQ questions 30, 31, 33, 35, 36, 38, 39, 42, & 44 (sum of scores, 7-point Likert scale)
  Self-regulation: MSLQ questions 32, 34, 40, 41, 43, 45, 46, 52, & 55 (sum of scores, 7-point Likert scale)
  Self-reflection: Microanalysis Q6-Q7 (sum of scores, 7-point Likert scale)

Cyclical learning
  Time lag: Microanalysis Q8 & Q1-Q3; Q8 & Q4-Q5 (correlation (r) between sums of scores)
  Concurrent: Microanalysis Q4-Q5 & Q8; Q6-Q7 & Q8 (correlation (r) between sums of scores)
Dependent Variables
Motivation
The Motivated Strategies for Learning Questionnaire (MSLQ; Pintrich & De Groot, 1990) (see Appendix F) was used to measure the dependent variable of motivation. The survey contains 44 items (22 of them pertaining to motivation) that were answered on a 7-point Likert scale, from 1 (not at all true of me) to 7 (very true of me). The MSLQ considers three components of motivation: Self-Efficacy, Intrinsic Value, and Test Anxiety. Each component was scored separately by summing the appropriate items. This survey was given at the beginning and end of the course, resulting in three separate pretest and posttest scores for motivation (see Table 2 for summary of primary concepts and variables).
Microanalytic assessment is an umbrella term under which many context-specific questions are asked in an attempt to measure target behaviors (Cleary, Callan, & Zimmerman, 2012). Whereas the MSLQ attempts to measure overall
motivation, microanalytic measures evaluate motivation in relation to a
particular task. Although many studies use interview protocols to take
microanalytic measures, questions have been structured for online learning as
well (Zimmerman, 2008). Further, although it is common to take measures of
many of the subprocesses of SRL to show its cyclical nature, it is not
common to take measures of all subprocesses or even all the phases of the SRL
process (Cleary et al., 2012). The current study focuses on motivational beliefs in
the forethought phase and self-reflection in the reflection phase. Therefore, there
were no measures in the performance phase of the SRL process.
Given the nature of microanalytic measurements, reliability and validity
measures are not typically reported. However, microanalytic measures of SRL
have been shown to have high predictive validity for athletic tasks (Kitsantas &
Zimmerman, 2002), science learning (DiBenedetto & Zimmerman, 2013a), and
mathematical problem solving (Callan, 2014). Further, microanalytic measures
have been shown to be a superior predictor of SRL when compared to self-report
surveys and teacher surveys (Callan, 2014).
In the current study, microanalytic measures (Appendix G) were used to measure the dependent variable of students' motivation in relation to four semi-regular or regularly assigned tasks throughout the semester. When completing these tasks, students were asked to fill out a brief questionnaire before (forethought) and after (reflection) each task. Motivation was measured as part of the pre-task portion (forethought) of the questionnaire, which consisted of five items that attempted measurement of two components of motivation: task analysis (goal setting and planning; Q1-Q3) and self-motivation beliefs (self-efficacy and outcome expectations; Q4-Q5) (see Table 2 for summary of primary
concepts and variables).
Each participant earned two motivation scores between 2-14. Task
Analysis scores are calculated from the first three questions. The first question
inquired about what percentage grade they would like to achieve. Scoring for this

28
282828
item ranged from 0-9: a selected grade between 0%-10% was scored as 0, 11%-20% as 1, 21%-30% as 2, 31%-40% as 3, 41%-50% as 4, 51%-60% as 5, 61%-70% as 6, 71%-80% as 7, 81%-90% as 8, and 91%-100% as 9. The second question indicated their goal orientation and asked them to check all goals that applied, with the option to check "other" and specify their own. There were three types of goals evaluated in this study: no goal, outcome goals, and process goals. Process goals are more effective than general or "do my best" goals (Cleary et al., 2012). In turn, participants were assigned two points if mostly process goals were selected. Three points were assigned if mostly outcome goals were selected. One point was added if "I don't have any other goals" was selected. For categorization of process and outcome goals, see Appendix G. The third question indicated whether or not they were using strategic planning. If they chose "no" their score was 1; if they chose "yes" their score was 2. The scores from these three questions were summed and resulted in
a single Task Analysis score for each task. The Self-Motivation Beliefs score was calculated by summing the scores of questions four and five, which used a response scale from 1-7. The questions used are based on previous
studies that used developed coding techniques (Callan, 2014; Cleary et al., 2012;
DiBenedetto & Zimmerman, 2013a). Optimally, a total of at least four measures would have been taken. However, compliance with these measures declined throughout the semester, causing very low Ns for the last of the four microanalytic measures.
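The Task Analysis scoring rules above can be sketched in code as follows. This is an illustrative reading of the scoring description, not the study's actual scoring script: the function names and the boolean encoding of goal categories are assumptions, and Appendix G (not reproduced here) defines the actual goal categorization.

```python
# Sketch of the Task Analysis (Q1-Q3) scoring rules described above.
# Names and the goal-category encoding are illustrative assumptions.

def bin_grade_goal(percent):
    """Q1: map a desired grade of 0-100% onto 0-9
    (0-10% -> 0, 11-20% -> 1, ..., 91-100% -> 9)."""
    if percent <= 10:
        return 0
    return min((percent - 1) // 10, 9)

def score_goal_orientation(mostly_process, mostly_outcome, no_other_goals):
    """Q2: 2 points if mostly process goals were selected, 3 if mostly
    outcome goals; add 1 if 'I don't have any other goals' was selected."""
    points = 2 if mostly_process else 3 if mostly_outcome else 0
    return points + (1 if no_other_goals else 0)

def task_analysis_score(percent, mostly_process, mostly_outcome,
                        no_other_goals, used_strategic_plan):
    """Sum of Q1-Q3: grade bin + goal points + planning (yes = 2, no = 1)."""
    q3 = 2 if used_strategic_plan else 1
    return (bin_grade_goal(percent)
            + score_goal_orientation(mostly_process, mostly_outcome,
                                     no_other_goals)
            + q3)
```

For example, a student aiming for an 85% grade who selected mostly process goals and reported a strategic plan would score 8 + 2 + 2 = 12.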
Self-Regulated Learning
The Motivated Strategies for Learning Questionnaire (MSLQ) (Pintrich &
De Groot, 1990) (see Appendix F) was also used to measure the dependent variable of SRL. The survey contains 44 items (22 of them pertaining to SRL) that were answered on a 7-point Likert scale, from 1 (not at all true of me) to 7 (very true of me). The MSLQ considers two components of SRL:
Cognitive Strategy Use, and Self-Regulation. Each component was scored
separately by summing the appropriate items. This survey was given at the
beginning and end of the course resulting in two separate pretest and posttest
scores for SRL (see Table 2 for summary of primary concepts and variables).
In the current study, microanalytic measures (Appendix G) were also used to measure the dependent variable of students' SRL in relation to four semi-regular or regularly assigned tasks throughout the semester. When completing these tasks, students were asked to fill out a brief questionnaire before (forethought) and after (reflection) each task. SRL was measured as part of the post-task portion (reflection) of the questionnaire, which consisted of two items that attempted to measure self-reflection by inquiring about students' self-judgment (self-evaluation; Q6) and self-reaction (self-satisfaction; Q7) (see Table 2
for summary of primary concepts and variables).
Each participant earned a self-reflection score between 2 and 14, calculated by summing the scores of questions six and seven, which used a response scale from 1-7. The questions used are based on previous studies that used developed coding techniques (Callan, 2014; Cleary et al., 2012; DiBenedetto & Zimmerman, 2013a). Optimally, a total of at least four measures would have been taken. However, compliance with these measures declined throughout the semester, causing very low Ns for the last of the four microanalytic measures.
Cyclical Nature of SRL
Microanalytic measures (Appendix G) were also used to test the cyclical nature of students' SRL in relation to four semi-regular or regularly assigned
tasks throughout the semester. In addition to the seven questions that measured
Task Analysis, Self-Motivation, and Self-Reflection, the scale also included an
item (Q8) to represent the cyclical nature of SRL. The question inquired about how well the student believed they would do on the next task in the series after completing the current task, using a Likert scale of 1-7. This allowed
evaluation of how the beliefs in the reflection phase in one task were related to
the forethought phase of the next task using bivariate correlations. These are
considered time-lagged measures of cyclical nature. In addition, concurrent
measures of cyclical nature were evaluated through the correlations between the
Task Analysis, Self-Motivation, and Self-Reflection within a single task.
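The time-lagged and concurrent correlations described above can be sketched as follows. The per-student score lists are hypothetical stand-ins for the real microanalytic data, and a plain Pearson r is computed to mirror the bivariate correlations used in the study.

```python
from math import sqrt

def pearson_r(x, y):
    """Plain Pearson correlation between two equal-length score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical per-student scores for two consecutive tasks.
q8_task1 = [5, 6, 3, 7, 4]                 # Q8 (reflection phase of task 1)
task_analysis_task2 = [10, 12, 6, 13, 8]   # Q1-Q3 sum (forethought, task 2)
q8_task2 = [6, 6, 4, 7, 5]                 # Q8 (reflection phase of task 2)
self_motivation_task2 = [11, 12, 8, 13, 9] # Q4-Q5 sum (forethought, task 2)

# Time-lagged evidence of cyclical SRL: reflection on one task should
# correlate with forethought on the next task.
r_lagged = pearson_r(q8_task1, task_analysis_task2)

# Concurrent evidence: measures taken within the same task should correlate.
r_concurrent = pearson_r(self_motivation_task2, q8_task2)
```

A positive `r_lagged` would indicate that students who reflected favorably after one task also set higher goals for the next, which is the pattern Zimmerman's (2008) cyclical model predicts.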
Design and Procedure
The current study is a non-experimental, mixed-method, nonequivalent-groups design. Six separate classes were selected and compared according to the
level of integration of technology into the classroom, which can be thought of as
the independent variable. The SAMR faculty survey was the measure of this
variable. The students' motivation and self-regulated learning, although
intertwined, can be thought of as the dependent variables. Secondarily and
independently, motivation was evaluated as an independent variable that affects
self-regulated learning. These variables were measured in a pre-test post-test
manner by the MSLQ as well as microanalytic measures that were taken
throughout the semester. See Table 2 for the schedule of all measurements taken.
The SAMR model survey was distributed to faculty who were
scheduled to teach a DISCOVERe course in the Fall 2015 semester. This allowed
the researcher to measure the level of integration of technology, which not only
served as the independent variable but also as a tool to select a sample of courses
with varying levels of integration for deeper analysis. While selecting the sample,
the courses were evaluated on the occurrence of regularly assigned tasks that
would be ideal for microanalytic measures and class size.
Selected faculty were asked to participate and give their students the
opportunity to participate in a study that evaluated the integration of technology
into education and its effects on self-regulated learning and motivation. They were told this would include students' completion of the MSLQ at the beginning and end of the Fall 2015 semester as well as completion of brief questions (microanalytic measures) before and after regularly scheduled tasks. Those tasks
were defined through collaboration between the faculty member and the
researcher. In addition, faculty members were informed that they would be
expected to complete the SAMR survey again at the end of the semester. One-on-one meetings were required to clarify this procedure and explain any benefits or costs it may have.
Once the faculty member agreed to allow his or her course to be
evaluated, students were asked if they would like to participate in the study at
the very beginning of the semester. Incentive to complete the microanalytic
measures was in the form of a lottery. Students' names were entered once for
every microanalysis they completed. Ten students were awarded a $10 Amazon
gift card at the end of the semester. Classroom visits were required to introduce
the study as well as distribute the consent and demographic measures. Those
who completed the consent form and chose to participate were first given the MSLQ (Pintrich & De Groot, 1990) and a general demographics and introductory questionnaire that inquired about age, gender, class level, exposure to DISCOVERe courses in the past, and level of comfort with technology. The
microanalytic measures were given to those students on four regularly scheduled
assignments throughout the semester. At the end of the semester, the Motivated
Strategies for Learning Questionnaire (Pintrich & De Groot, 1990) was distributed
again for a post-integration comparison.
The hypotheses for this study are as follows:
(1) Students enrolled in courses with the highest level of integration of technology into the classroom will have higher scores on measures of motivation (Self-Efficacy, Intrinsic Value, and Test Anxiety).

(2) Students enrolled in courses with highest level of integration of technology


into the classroom will have higher scores on measures of self-regulated learning
(Cognitive Strategies and Self-Regulation).

(3) Correlations will show an association between the concurrent self-efficacy


score (microanalysis Q4-Q5) and the cyclical learning question (Q8), concurrent
self-regulated learning strategy of reflection (microanalysis Q6-Q7) and the
cyclical learning question (Q8), and time lagged measurements of cyclical
learning question (Q8) of one task and the task value (microanalysis Q1-Q3) and
self-efficacy (microanalysis Q4-Q5) of the next task.
CHAPTER 4: RESULTS
Descriptives of Primary Variables
Outliers in the MSLQ dataset were eliminated via visualization of
frequency tables for all five subscales. It was determined that the posttest scores
of intrinsic value, self-efficacy, and cognitive strategies all had one score that was
much lower than the rest. Review of the data revealed that this was a single
participant. That participant was therefore eliminated as an outlier. After the
outlier was eliminated all frequency distributions fit a normal curve fairly well.
All MSLQ subscales had a Cronbach's alpha between .82-.94, indicating good internal consistency, except for self-reflection, which was .60 (pretest) and .56 (posttest). Mean scores, standard deviations, and Ns for each subscale, pretest and posttest, are presented in Table 3.
Inferential Statistics
Hypothesis 1
Students enrolled in courses with the highest level of integration of technology into the classroom will have higher scores on measures of motivation (Self-Efficacy, Intrinsic Value, and Test Anxiety).
Three repeated-measures ANOVAs, one for each of the three MSLQ motivation subscales, were conducted in which time was considered a within-subjects factor and level of technology integration was considered a between-subjects variable.
Motivation: Self-Efficacy
For Self-Efficacy, there was a significant main effect of time, F(1, 100) =
7.73, p < .05, partial η² = .072. There was no significant main effect of level of
technology integration, F(2, 100) = 1.02, p > .05. There was a significant
interaction between time and level of technology integration, F(2, 100) = 3.67,
p < .05, partial η² = .029 (see Figure 1).

Table 3. MSLQ Pretest and Posttest Means and Standard Deviations

SAMR Level   MSLQ Subscale               Pretest M (SD)    Posttest M (SD)
Low          M: Self Efficacy            48.87 (5.94)      46.93 (7.91)
             M: Intrinsic Value          51.47 (7.25)      52.00 (5.98)
             M: Test Anxiety             17.40 (7.89)      16.20 (7.16)
             SRL: Self Reflection        42.27 (5.65)      41.27 (5.40)
             SRL: Cognitive Strategies   63.40 (8.89)      62.53 (5.15)
Medium       M: Self Efficacy            49.26 (7.62)      48.91 (8.83)
             M: Intrinsic Value          50.64 (8.28)      50.11 (7.88)
             M: Test Anxiety             15.79 (5.44)      14.25 (5.25)
             SRL: Self Reflection        41.99 (6.62)      41.61 (7.01)
             SRL: Cognitive Strategies   65.62 (10.88)     64.75 (12.31)
High         M: Self Efficacy            50.42 (6.13)      42.25 (8.51)
             M: Intrinsic Value          53.75 (7.69)      47.08 (8.59)
             M: Test Anxiety             17.75 (4.71)      18.25 (4.27)
             SRL: Self Reflection        43.00 (6.94)      39.00 (5.08)
             SRL: Cognitive Strategies   72.17 (9.96)      60.50 (8.02)
Note. Low N = 15, Medium N = 76, High N = 12; M: Motivation, SRL: Self-regulated
learning
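The partial eta-squared effect sizes reported throughout this chapter can be recovered from an F statistic and its degrees of freedom via partial η² = (F × df1) / (F × df1 + df2); a small sketch of this generic conversion (not the original analysis script):

```python
def partial_eta_squared(f_stat, df1, df2):
    """Convert an F statistic and its degrees of freedom to partial eta squared."""
    return (f_stat * df1) / (f_stat * df1 + df2)
```

For the main effect of time on self-efficacy, F(1, 100) = 7.73 gives partial η² ≈ .072, matching the value reported above.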
Motivation: Intrinsic Value
For Intrinsic Value, there was no significant main effect of time, F(1, 100) =
2.92, p > .05, or of level of technology integration, F(2, 100) = .30, p > .05.
Further, there was no significant interaction between time and level of
technology integration, F(2, 100) = 2.33, p > .05 (see Figure 2).
Motivation: Test Anxiety
For Test Anxiety, there was no significant main effect of time, F(1, 100) =
.72, p > .05, or of level of technology integration, F(2, 100) = 2.77, p > .05.
Further, there was no significant interaction between time and level of
technology integration, F(2, 100) = .51, p > .05 (see Figure 3).
Hypothesis 2
Students enrolled in courses with the highest level of integration of
technology into the classroom will have higher scores on measures of
self-regulated learning (Cognitive Strategies and Self-Regulation).
Two repeated-measures ANOVAs, one for each of the two MSLQ SRL
subscales, were conducted in which time was treated as a within-subjects factor
and level of technology integration as a between-subjects variable.
SRL: Cognitive Strategies
For Cognitive Strategies, there was a significant main effect of time,
F(1, 100) = 5.51, p < .05, partial η² = .052. There was no significant main effect
of level of technology integration, F(2, 100) = .63, p > .05. There was a
significant interaction between time and level of technology integration,
F(2, 100) = 3.10, p < .05, partial η² = .058 (see Figure 4).
SRL: Self Reflection
For Self Reflection, there was no significant main effect of time, F(1, 100) =
2.50, p > .05, or of level of technology integration, F(2, 100) = .13, p > .05.
Further, there was no significant interaction between time and level of
technology integration, F(2, 100) = .96, p > .05 (see Figure 5).

Figure 1. Mean motivation: self-efficacy score across time, separated by level of
technology integration

Figure 2. Mean motivation: intrinsic value score across time, separated by level
of technology integration

Figure 3. Mean motivation: test anxiety score across time, separated by level of
technology integration

Figure 4. Mean SRL: cognitive strategies score across time, separated by level of
technology integration

Figure 5. Mean SRL: self-reflection score across time, separated by level of
technology integration
Additional Analysis
The MSLQ subscale pretest and posttest scores were then analyzed
separately for differences between levels of technology integration. A total of ten
one-way ANOVAs were conducted. Posttest self-efficacy scores were significantly
higher in the medium level of technology integration (M = 48.91, SD = 8.83) than
in the highest level of technology integration (M = 42.25, SD = 8.51), F(2, 100) =
3.15, p < .05, partial η² = .059. All other subscales did not significantly differ by
level of technology integration, p > .05 (see Figures 1-5).
Hypothesis 3
Correlations will show an association between the concurrent
self-motivation belief score (microanalysis Q4-Q5) and the cyclical learning
question (Q8), the concurrent self-regulated learning strategy of reflection
(microanalysis Q6-Q7) and the cyclical learning question (Q8), and time-lagged
measurements of the cyclical learning question (Q8) of one task and the task
analysis (microanalysis Q1-Q3) and self-motivation beliefs (microanalysis
Q4-Q5) of the next task.
Correlations showed an association between the concurrent
self-motivation beliefs score (microanalysis Q4-Q5) and the cyclical learning
question (Q8) in the first three tasks: task 1, r(122) = .26, p < .001; task 2,
r(47) = .49, p < .001; task 3, r(16) = .63, p < .001. Correlations also showed an
association between the concurrent self-regulated learning strategy of reflection
(microanalysis Q6-Q7) and the cyclical learning question (Q8) in the first three
tasks: task 1, r(122) = .73, p < .001; task 2, r(47) = .69, p < .001; task 3,
r(16) = .56, p = .015.
Correlations showed an association between the cyclical learning question
(Q8) of task 1 and both the task analysis (microanalysis Q1-Q3) and the
self-motivation beliefs (microanalysis Q4-Q5) of task 2, r(47) = .55, p < .001,
and r(47) = .60, p < .001, respectively. Correlations also showed an association
between the cyclical learning question (Q8) of task 2 and both the task analysis
(microanalysis Q1-Q3) and the self-motivation beliefs (microanalysis Q4-Q5) of
task 3, r(16) = .79, p < .001, and r(16) = .76, p < .001, respectively. See Table 4
for the total microanalysis correlation matrix.
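Each of these coefficients is a Pearson product-moment correlation computed over the students who completed both measures, with degrees of freedom n − 2; a minimal sketch of the computation (the pairing and cleaning of the thesis data are not reproduced here):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation for paired observations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)           # sum of squares for x
    syy = sum((b - my) ** 2 for b in y)           # sum of squares for y
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return sxy / math.sqrt(sxx * syy)
```

The degrees of freedom reported in the r(df) notation above equal the number of complete pairs minus two.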

When correlations were evaluated separately by level of technology
integration, it was discovered that a very limited number of participants (N = 3)
in the low and medium levels of integration completed the microanalysis for
task 3. In addition, the participants in the medium level of integration failed to
complete the task analysis section of task 2, making it impossible to calculate the
correlations that represent the cyclical nature of self-regulated learning for those
levels of technology integration individually.
Using a Fisher's r-to-z transformation, correlation coefficients were
compared across levels of technology integration. There was no significant
difference between the correlation coefficients relating the cyclical learning
question (Q8) of task 1 to the task analysis (microanalysis Q1-Q3) of task 2 for
low versus high levels of technology integration, z = 0.82, p > .05. There was no
significant difference between the correlation coefficients relating the cyclical
learning question (Q8) of task 1 to the self-motivation beliefs (microanalysis
Q4-Q5) of task 2 for low versus high levels of technology integration, z = 1.67,
p > .05. See Tables 5-7 for correlation matrices for each level of technology
integration.

Table 4. Total Microanalysis Pearson's r Correlation Matrix

          1     2     3     4     5     6     7     8     9     10    11    12
1  TA1    1.00
2  SM1    0.32  1.00
3  R1     0.29  0.71  1.00
4  Q8.1   0.26  0.79  0.74  1.00
5  TA2    0.45  0.57  0.22  0.55  1.00
6  SE2    0.36  0.69  0.45  0.62  0.65  1.00
7  R2     0.19  0.38  0.46  0.22  0.24  0.56  1.00
8  Q8.2   0.42  0.63  0.37  0.55  0.49  0.66  0.69  1.00
9  TA3    0.36  0.65  0.05  0.29  0.77  0.87  0.54  0.79  1.00
10 SE3    0.14  0.53  0.33  0.72  0.64  0.71  0.51  0.76  0.60  1.00
11 R3     0.11  0.23  0.72  0.25  0.37  0.36  0.77  0.35  0.23  0.52  1.00
12 Q8.3   0.22  0.61  0.37  0.77  0.71  0.72  0.48  0.77  0.63  0.95  0.56  1.00
Note. p < .05; TA = Task Analysis, SM = Self-Motivation Beliefs, R = Reflection,
Q8 = cyclical nature question; Task 1 N = 126, Task 2 N = 50, Task 3 N = 18,
Task 4 N = 6

Table 5. Low Level of Technology Integration Microanalysis Pearson's r
Correlation Matrix

         1      2     3     4     5     6     7     8
1 TA1    1.00
2 SE1    0.11   1.00
3 R1     0.33   0.79  1.00
4 Q8.1   0.15   0.85  0.76  1.00
5 TA2    0.56   0.42  0.34  0.68  1.00
6 SE2    0.15   0.71  0.71  0.61  0.82  1.00
7 R2     -0.08  0.41  0.51  0.27  0.06  0.64  1.00
8 Q8.2   0.31   0.57  0.63  0.69  0.54  0.81  0.77  1.00
Note. p < .05; TA = Task Analysis, SM = Self-Motivation Beliefs, R = Reflection,
Q8 = cyclical nature question

Table 6. Medium Level of Technology Integration Microanalysis Pearson's r
Correlation Matrix

         1     2     3      4     5     6     7     8
1 TA1    1.00
2 SE1    0.48  1.00
3 R1     0.27  0.73  1.00
4 Q8.1   0.47  0.94  0.70   1.00
5 TA2    .     .     .      .     .
6 SE2    0.87  0.87  -0.95  0.50  .     1.00
7 R2     0.00  0.87  -0.19  1.00  .     0.50  1.00
8 Q8.2   0.50  1.00  -0.66  0.87  .     0.87  0.87  1.00
Note. p < .05; TA = Task Analysis, SM = Self-Motivation Beliefs, R = Reflection,
Q8 = cyclical nature question

Table 7. High Level of Technology Integration Microanalysis Pearson's r
Correlation Matrix

          1     2     3     4     5     6     7     8     9     10    11    12
1  TA1    1.00
2  SE1    0.34  1.00
3  R1     0.27  0.68  1.00
4  Q8.1   0.25  0.75  0.75  1.00
5  TA2    0.48  0.69  0.19  0.46  1.00
6  SE2    0.39  0.71  0.36  0.58  0.73  1.00
7  R2     0.18  0.45  0.41  0.25  0.36  0.52  1.00
8  Q8.2   0.42  0.68  0.30  0.54  0.54  0.60  0.67  1.00
9  TA3    0.22  0.57  0.16  0.18  0.73  0.86  0.46  0.76  1.00
10 SE3    0.48  0.19  0.04  0.68  0.61  0.68  0.42  0.73  0.56  1.00
11 R3     0.06  0.66  0.13  0.11  0.23  0.27  0.72  0.20  0.10  0.45  1.00
12 Q8.3   0.04  0.57  0.23  0.73  0.69  0.69  0.38  0.74  0.59  0.94  0.48  1.00
Note. p < .05; TA = Task Analysis, SM = Self-Motivation Beliefs, R = Reflection,
Q8 = cyclical nature question

CHAPTER 5: DISCUSSION

Summary of Purpose
This study examined how the integration of technology into higher
education is associated with students' self-regulated learning and motivation. Six
DISCOVERe tablet-based courses were selected and classified by their level of
technology integration according to the Substitution, Augmentation,
Modification, and Redefinition (SAMR) model. Each course was categorized as
having a low (Substitution), medium (Augmentation or Modification), or high
(Redefinition) level of integration of technology. Students in each of the courses
(total N = 177) were evaluated on their level of motivation and self-regulated
learning six times during the semester, using the Motivated Strategies for
Learning Questionnaire (MSLQ) at the beginning and end of the semester and
microanalysis measures on four similar tasks throughout the semester.
It was hypothesized that students enrolled in courses with the highest
level of integration of technology into the classroom would have higher scores on
measures of motivation (Self-Efficacy, Intrinsic Value, and Test Anxiety). This
hypothesis was not supported. In contrast, findings from this study suggest that,
while the low and medium levels of technology integration showed little change
from pretest to posttest MSLQ scores, there was a significant decrease in
self-efficacy and cognitive strategies in the highest level of technology
integration. Further, although none of the MSLQ measures differed by
technology integration level at the start of the semester, an end-of-semester
evaluation found that self-efficacy scores were significantly higher in the
medium level of technology integration than in the highest level of technology
integration.

Successful technology integration into the classroom requires attention
to, and redefinition of, each component of the classroom: the instructors, the
students, the content, and the technology (Svinicki & McKeachie, 2011). Initial
integration of technology shifts the roles of the teacher and the student (Renes &
Strange, 2011), causing instability in content and pedagogy (Kabakci Yurdakul
et al., 2012). Problems can stem from technology being isolated from the
pedagogy of the teacher, especially when teachers are ill equipped to model
technology use (Kinchin, 2012). Since none of the faculty members in the current
study had previous experience instructing DISCOVERe courses, this could have
been a major contributing factor to the results of the study. The highest level of
technology integration would be expected to involve the most difficult
technologies to model and to cause the most initial instability, potentially
explaining the decreased self-efficacy throughout the semester in the highest
level of technology integration.
Students are also expected to make changes with technology integration.
They must use SRL skills and assume new responsibilities (Renes & Strange,
2011). However, passive learners may be resentful of these new responsibilities
and may be unwilling to assume them. This resentment or unwillingness may be
reflected in the self-efficacy scores of the students. Further, although the
resource-rich environments offered by technology can increase opportunities
for SRL, they can also be detrimental to learning, especially when students
attempt to learn complex information (Minhong et al., 2011). We can expect
that students in courses with the highest level of technology integration would
have to assume the most new responsibilities, and that their classrooms would be
the most different from a traditional classroom. Therefore, this could also

potentially explain the decreased self-efficacy throughout the semester in
the highest level of technology integration seen in this study.
There was also a decrease in SRL cognitive strategies throughout the
semester in the highest level of technology integration. Increased technology, in
general, has been associated with negative impacts on academic success
(Wentworth & Middleton, 2014). Previous studies suggest that technology may
serve as a distraction when used for purposes other than academics. In the
current study, it is possible that those in the highest technology integration level
were focused too much on the technology itself and did not focus enough on
learning or interacting with the information. This, along with the initial
content/pedagogy instability and increased student responsibilities, may have
caused cognitive overload and, in turn, a decrease in effective SRL cognitive
strategies. If nothing else, this study emphasizes the need for effective
technology integration into the classroom and a clear understanding of the
consequences, at least initially, of this integration. Accordingly, new faculty
should be cautioned that there will be an adjustment period after the initial
integration and that those who start at a lower level of technology integration
tend to experience less disruption to their classrooms.
Finally, this study's findings further validate microanalytic measures by
providing evidence of SRL as a cyclical process, as proposed by Zimmerman
(2008). Zimmerman (2008) suggested that the three phases of SRL (forethought,
performance, and self-reflection) have a cyclical nature in which the forethought
of one task is affected by the self-reflection of the previous task. Microanalytic
measures were used in an attempt to capture these phases. Further, there was a
need to apply microanalytics to learning contexts that span longer periods of
time, where motivation is expected to fluctuate. In this study, microanalytic
measures were used on four different tasks throughout the semester. It was
hypothesized that correlations would show an association between the
concurrent self-efficacy score (microanalysis Q4-Q5) and the cyclical learning
question (Q8), the concurrent self-regulated learning strategy of reflection
(microanalysis Q6-Q7) and the cyclical learning question (Q8), and time-lagged
measurements of the cyclical learning question (Q8) of one task and the task
value (microanalysis Q1-Q3) and self-efficacy (microanalysis Q4-Q5) of the next
task. Three of the four tasks had a sufficient number of students who completed
the microanalytic measure, and the results supported the hypothesis in every
case. These findings suggest that SRL is cyclical in nature and that
microanalytics are a good measure of context-specific SRL processes.
Weaknesses
The SAMR model survey that faculty members completed at the
beginning of the semester to measure the level of integration of technology was
created by the researcher for the purpose of this study. The first two hypotheses
relied heavily on the validity of this measure. It is possible that it was an
insufficient measure of the level of integration and that differences between the
groups were small because integration was not measured correctly. A further
issue is that the MSLQ subscale scores were highly correlated with each other,
decreasing the discriminant validity of the scales.
This study utilized a nonexperimental design with intact, nonequivalent
groups. This type of design has low control and allows for potential confounding
variables due to pre-existing differences among participants in the intact groups.
In addition to student variation between classrooms, it is also possible that
faculty members who chose to integrate technology at the highest level are
inherently different from faculty members who integrate technology more
cautiously. These differences could potentially explain the results seen in this
study.
Although the university provided training and support, the faculty
members included in this study had never taught a DISCOVERe course before.
This may have caused some disruption in the classroom, and the results could
have been confounded by the initial instability of the technology-based
classrooms.
Although the study started with a large number of participants,
compliance with completing the measures declined throughout the semester,
limiting the number of viable time-delayed matches.
Strengths
Multiple measures were used during this study in order to more
completely understand students' motivation and SRL. The use of microanalysis
measures multiple times during the semester allowed measurement of students'
motivation and SRL in a brief, context-specific way. The MSLQ pre- and
post-semester measures, on the other hand, allowed measurement of students'
motivation and SRL in broader, overall terms.
Pretest/posttest scores provide insight into how individuals change over
the span of the semester in relation to the level of integration of technology.
Pretest scores provide baseline scores for students in each level of technology
integration. Evaluation of these scores between levels of technology integration
showed that they were not significantly different from each other. Self-efficacy
and cognitive strategies scores differed only at the posttest. The pretest-posttest
design suggests that this change was most likely due to differences between
classrooms, the focus here being the different levels of technology integration.
The complexity of the level of technology integration was also a strength
of this study. Rather than comparing classrooms with no technology to those
that use technology, this study evaluated technology integration at three
different levels. This gives insight into how technology-based classrooms can
differ by using technology in different ways and how this affects student
learning.
Future Research
Many of the findings in this study can potentially be explained by the
initial instability caused by the integration of new technology into the classroom.
It would be interesting to follow these classrooms over the next few semesters to
see if this instability resolves, how long it takes to resolve, and whether
instructors regress toward a midlevel of technology integration. It would also be
interesting to examine an already established high-integration classroom to see
if these effects on SRL and motivation persist.
Mastery learning centers on the processes of learning, rather than the
content itself (Puzziferro & Shelton, 2008). Monitoring these processes is an
important skill of a self-regulated learner. Monitoring skills can be promoted
through technology because, in some cases, technology can provide more
instantaneous and accessible feedback (Butler & Winne, 1995). The amount of
feedback was not assessed in this study, but it is something to be considered for
future studies.
Atif (2013) suggests that in order to effectively improve learners'
experience in the classroom, teachers must move beyond reiterating what is in
the textbook and increase interaction with the material. The addition of
technology does not necessarily indicate an increase in integration or an increase
in content beyond the scope of the textbook. Because student engagement with
information in the classroom is an important factor in motivation and SRL, it
should perhaps be measured in future studies.
Academic performance was not measured, nor was it the focus of the
current study. However, future researchers may be interested in the practical
importance of academic performance. Measurements of academic performance,
however, should only be compared across courses that have the same content
and instructor, making it difficult to compare across differing levels of
technology integration.

REFERENCES
Abdul-Alim, J. (2013). Virtual explosion. Diverse Issues in Higher Education, 30(4),
14-15.
Atif, Y. (2013). Conversational learning integration in technology enhanced
classrooms. Computers in Human Behavior, 29(2), 416-423.
doi:10.1016/j.chb.2012.07.026
Azevedo, R., & Hadwin, A. F. (2005). Scaffolding self-regulated learning and
metacognition - implications for the design of computer-based scaffolds.
Instructional Science, 33(5-6), 367-379. doi:10.1007/s11251-005-1272-9
Bandura, A. (1986). Social foundations of thought and action: A social cognitive
theory. Englewood Cliffs, NJ: Prentice-Hall.

Banyard, P., Underwood, J., & Twiner, A. (2006). Do enhanced communication
technologies inhibit or facilitate self-regulated learning? European Journal of
Education, 41(3-4), 473-489. doi:10.1111/j.1465-3435.2006.00277.x
Barak, M. (2010). Motivating self-regulated learning in technology education.
International Journal of Technology & Design Education, 20(4), 381-401.
doi:10.1007/s10798-009-9092-x
Barber, L. K., Bagsby, P. G., Grawitch, M. J., & Buerck, J. P. (2011). Facilitating
self-regulated learning with technology: Evidence for student motivation and
exam improvement. Teaching of Psychology, 38(4), 303-308.
doi:10.1177/0098628311421337
Bjork, R. A., Dunlosky, J., & Kornell, N. (2013). Self-regulated learning: Beliefs,
techniques, and illusions. Annual Review of Psychology, 64(1), 417-444.
doi:10.1146/annurev-psych-113011-143823
Broussard, J. E., & Machtmes, K. (2012). Gaming as curriculum. Curriculum &
Teaching Dialogue, 14(1/2), 89-104.
Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A
theoretical synthesis. Review of Educational Research, 65(3), 245-281.
doi:10.2307/1170684
Callan, G. L. (2014). Self-regulated learning (SRL) microanalysis for mathematical
problem solving: A comparison of a SRL event measure, questionnaires, and a
teacher rating scale. The University of Wisconsin-Milwaukee.
Casement, W. (2013). Will online learning lower the price of college? Journal of
College Admission(220), 14-18.
Cleary, T. J., Callan, G. L., & Zimmerman, B. J. (2012). Assessing self-regulation as
a cyclical, context-specific phenomenon: overview and analysis of SRL
microanalytic protocols. Education Research International, 2012.
Derry, J. A. N. (2008). Technology-enhanced learning: A question of knowledge.
Journal of Philosophy of Education, 42(3-4), 505-519. doi:10.1111/j.1467-
9752.2008.00638.x

DiBenedetto, M. K., & Zimmerman, B. J. (2013a). Construct and predictive
validity of microanalytic measures of students' self-regulation of science
learning. Learning and Individual Differences, 26(0), 30-41.
doi:http://dx.doi.org/10.1016/j.lindif.2013.04.004
DiBenedetto, M. K., & Zimmerman, B. J. (2013b). Construct and predictive
validity of microanalytic measures of students' self-regulation of science
learning. Learning and Individual Differences, 26, 30-41.
doi:10.1016/j.lindif.2013.04.004
Kabakci Yurdakul, I., Odabasi, H. F., Kilicer, K., Coklar, A. N., Birinci, G., & Kurt,
A. A. (2012). The development, validity and reliability of TPACK-deep: A
technological pedagogical content knowledge scale. Computers & Education,
58(3), 964-977. doi:http://dx.doi.org/10.1016/j.compedu.2011.10.012
Kinchin, I. (2012). Avoiding technology-enhanced non-learning. British Journal of
Educational Technology, 43(2), E43-E48. doi:10.1111/j.1467-8535.2011.01264.x
Kirkwood, A., & Price, L. (2014). Technology-enhanced learning and teaching in
higher education: What is 'enhanced' and how do we know? A critical
literature review. Learning Media and Technology, 39(1), 6-36.
doi:10.1080/17439884.2013.770404
Kitsantas, A., & Zimmerman, B. J. (2002). Comparing self-regulatory processes
among novice, non-expert, and expert volleyball players: A microanalytic
study. Journal of Applied Sport Psychology, 14(2), 91-105.
doi:10.1080/10413200252907761
McManus, E. (2010). Phillip Zimbardo on the powers of time: The animation.
Retrieved from http://blog.ted.com/phillip_zimbard/
Mellecker, R. R., Witherspoon, L., & Watterson, T. (2013). Active learning:
Educational experiences enhanced through technology-driven active game
play. The Journal of Educational Research, 106(5), 352-359.
doi:10.1080/00220671.2012.736429
Milman, N. B. (2013). Increasing student success in online courses: Examining
existing research and the need for even more! Distance Learning, 10(3), 63-65.

Minhong, W., Jun, P., Bo, C., Hance, Z., & Jie, L. (2011). Knowledge
visualization for self-regulated learning. Journal of Educational Technology &
Society, 14(3), 28-42.
Moreno, R., & Mayer, R. E. (2000). Engaging students in active learning: The case
for personalized multimedia messages. Journal of Educational Psychology,
92(4), 724-733. doi:10.1037/0022-0663.92.4.724
Motschnig-Pitrik, R., & Standl, B. (2013). Person-centered technology enhanced
learning: Dimensions of added value. Computers in Human Behavior, 29(2),
401-409. doi:10.1016/j.chb.2012.04.013
Nunez, J. C., Cerezo, R., Bernardo, A., Rosario, P., Valle, A., Fernandez, E., &
Suarez, N. (2011). Implementation of training programs in self-regulated
learning strategies in Moodle format: results of a experience in higher
education. Psicothema, 23(2), 274-281.
Ping Lim, C., Yong, Z., Tondeur, J., Ching Sing, C., & Chin-Chung, T. (2013).
Bridging the gap: Technology trends and use of technology in schools.
Journal of Educational Technology & Society, 16(2), 59-68.
Pintrich, P. R., & De Groot, E. V. (1990). Motivated Strategies for Learning
Questionnaire.
Puustinen, M., & Pulkkinen, L. (2001). Models of self-regulated learning: a
review. Scandinavian Journal of Educational Research, 45(3), 269-286.
doi:10.1080/00313830120074206
Puzziferro, M., & Shelton, K. (2008). A model for developing high-quality online
courses: Integrating a systems approach with learning theory. Journal of
Asynchronous Learning Networks, 12(3-4), 119-136.
Renes, S., & Strange, A. (2011). Using technology to enhance higher education.
Innovative Higher Education, 36(3), 203-213. doi:10.1007/s10755-010-9167-3
Rissanen, A. J. (2014). Active and peer learning in STEM education strategy.
Science Education International, 25(1), 1-7.

Scanlon, E., & Issroff, K. (2005). Activity theory and higher education: Evaluating
learning technologies. Journal of Computer Assisted Learning, 21(6), 430-439.
doi:10.1111/j.1365-2729.2005.00153.x
Sha, L., Looi, C. K., Chen, W., & Zhang, B. H. (2012). Understanding mobile
learning from the perspective of self-regulated learning. Journal of Computer
Assisted Learning, 28(4), 366-378. doi:10.1111/j.1365-2729.2011.00461.x
Simonds, T. A., & Brock, B. L. (2014). Relationship between age, experience, and
student preference for types of learning activities in online courses. Journal
of Educators Online, 11(1).
Svinicki, M., & McKeachie, W. J. (2011). Teaching tips: Strategies, research, and
theory for college and university teachers (13th ed.). Belmont, CA: Wadsworth,
Cengage Learning.
Wang, C.-H., Shannon, D. M., & Ross, M. E. (2013). Students' characteristics,
self-regulated learning, technology self-efficacy, and course outcomes in online
learning. Distance Education, 34(3), 302-323. doi:10.1080/01587919.2013.835779
Wedman, J., & Diggs, L. (2001). Identifying barriers to technology-enhanced
learning environments in teacher education. Computers in Human Behavior,
17(4), 421-430. doi:10.1016/S0747-5632(01)00012-7
Wentworth, D. K., & Middleton, J. H. (2014). Technology use and academic
performance. Computers & Education, 78(0), 306-311.
doi:http://dx.doi.org/10.1016/j.compedu.2014.06.012
Winne, P. H. (2010). Improving measurements of self-regulated learning.
Educational Psychologist, 45(4), 267-276. doi:10.1080/00461520.2010.517150
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An
overview. Educational Psychologist, 25(1), 3.
Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical
background, methodological developments, and future prospects. American
Educational Research Journal, 45(1), 166-183. doi:10.3102/0002831207312909

Zimmerman, B. J. (2013). From cognitive modeling to self-regulation: A social
cognitive career path. Educational Psychologist, 48(3), 135-147.
doi:10.1080/00461520.2013.794676
APPENDICES

APPENDIX A: FACULTY INFORMED CONSENT FORM

You are invited to participate in a study conducted by Dr. Constance Jones and
Amber Costantino at California State University, Fresno. We hope to learn about
how technology is being integrated into the classroom via the DISCOVERe
program. You were selected as a possible participant in this study because you
are instructing a DISCOVERe course next semester.

If you decide to participate, you will take a survey about how you plan to
use tablets in your fall 2015 course. Your participation in this survey will take
approximately 15-20 minutes. You may be asked to answer questions about your
use of technology, your comfort with technology, your students' engagement
with technology, and your students' level of comfort with technology. You may
also be asked to study materials on models and examples of integration of
technology that may be beneficial for you, as a DISCOVERe instructor, to
understand. We cannot guarantee, however, that you will receive any benefits
from this study.

Any information that is obtained in connection with this study and that can be
identified with you will remain confidential and will be disclosed only with your
permission or as required by law. If you give us your permission by selecting
"agree" at the bottom of the page, we plan to disclose the overall university
statistics about the level of integration of technology.

Be aware that, upon your participation, you may be selected for additional
studies in the fall. The principal investigators may or may not contact you via
email concerning this additional study.

Your decision whether or not to participate will not prejudice your future
relations with California State University, Fresno or the DISCOVERe program. If
you decide to participate, you are free to withdraw your consent and to
discontinue participation at any time without penalty. The Department of
Psychology Human Subjects Committee at California State University, Fresno has
reviewed and approved the present research.

If you have any questions, please ask us. If you have any additional questions
later, Ms. Costantino [ambercostnatino@gmail.com, (559)824-3333] will be happy
to answer them. Questions regarding the rights of research subjects may be
directed to Constance Jones, Chair, CSUF Committee on the Protection of Human
Subjects, (559) 278-4468.

You should print this page for your own records.

YOU ARE MAKING A DECISION WHETHER OR NOT TO PARTICIPATE.


SELECTING "I AGREE" INDICATES THAT YOU HAVE DECIDED TO
PARTICIPATE, HAVING READ THE INFORMATION PROVIDED ABOVE.
APPENDIX B: STUDENT INFORMED CONSENT FORM
You are invited to participate in a study conducted by Dr. Constance Jones and
Amber Costantino at California State University, Fresno. We hope to learn about
how technology is being integrated into the classroom via the DISCOVERe
program. You were selected as a possible participant in this study because you
are attending a DISCOVERe course this semester.
If you decide to participate, you will be asked to participate in two (2) surveys
that evaluate your level of motivation and self-regulated learning as well as brief
questionnaires that are to be completed immediately before and after selected
assignments throughout the semester. These brief surveys may increase the time it
takes to complete an assignment, causing a moderate inconvenience. However, the
questions asked might be helpful in promoting self-regulatory behavior, which has
been shown to promote a deeper understanding of the material. We cannot
guarantee, however, that you will receive any benefits from this study. Further,
for every one of these brief surveys you complete, your name will be entered into
a lottery. At the end of the semester, ten (10) students will be randomly
selected as winners. The prize will be a $10 Amazon gift card.
Any information that is obtained in connection with this study and that can be
identified with you will remain confidential and will be disclosed only with your
permission or as required by law. If you give us your permission by selecting
agree at the bottom of the page, we plan to disclose the overall statistics about the
level of integration of technology and how that affects motivation and self-regulated learning.
Your decision whether or not to participate will not prejudice your future
relations with California State University, Fresno, the DISCOVERe program, or
the course in which you are enrolled. If you decide to participate, you are free to

withdraw your consent and to discontinue participation at any time without
penalty. The Department of Psychology Human Subjects Committee at California
State University, Fresno has reviewed and approved the present research.
If you have any questions, please ask us. If you have any additional questions
later, Ms. Costantino [ambercostnatino@gmail.com, (559)824-3333] will be happy
to answer them. Questions regarding the rights of research subjects may be
directed to Constance Jones, Chair, CSUF Committee on the Protection of Human
Subjects, (559) 278-4468.
You should print this page for your own records.
YOU ARE MAKING A DECISION WHETHER OR NOT TO PARTICIPATE.
SELECTING "I AGREE" INDICATES THAT YOU HAVE DECIDED TO
PARTICIPATE, HAVING READ THE INFORMATION PROVIDED ABOVE

APPENDIX C: STUDENT DEMOGRAPHIC QUESTIONNAIRE
Student Demographic Questionnaire

Q1 What is the course number of the DISCOVERe course in which you are
currently enrolled?

Q2 Have you ever taken a DISCOVERe course in the past?


Yes
No

Q3 What is your current level of technological skill?


Poor
Fair
Good
Very Good
Excellent

Q4 How old are you?

Q5 What is your gender?


Male
Female
Other ____________________
Decline to State

Q6 What is your major?

Q7 What are your reasons for taking this course? (check all that apply)
The content seems interesting
It fulfills a GE requirement
It fulfills a major requirement
It will be useful to me in other courses
It is an easy elective
It will help improve my academic skills
It will help improve my technological skills
It will improve my career prospects
It was recommended by a friend
It was recommended by my counselor or other faculty member
Other, please specify ____________________

APPENDIX D: FACULTY DEMOGRAPHICS SURVEY
Faculty Demographics Questionnaire

Q1 What is the course number of the DISCOVERe course(s) in which you are
currently instructing?

Q2 Have you ever instructed a DISCOVERe course in the past?


Yes
No

Q3 What is your current level of technological skill?


Poor
Fair
Good
Very Good
Excellent
Q4 How old are you?
Q5 What is your gender?
Male
Female
Other ____________________
Decline to State

APPENDIX E: SAMR FACULTY SURVEY

The SAMR (Substitution, Augmentation, Modification, Redefinition) model was
developed by Dr. Ruben Puentedura and attempts to explain how technology can
impact teaching and learning. The following questions will evaluate which level
of integration of technology you will use in your classroom in the fall 2015
semester. Please answer each of these questions to the best of your ability, using
your previous knowledge of your DISCOVERe course to predict what will be
happening in your fall 2015 DISCOVERe course.

(all questions are answered using a 5-point Likert scale where 1 is Never, 2 is
Rarely, 3 is Sometimes, 4 is Most of the Time, and 5 is Always)
Substitution
Q2 Students take notes using Microsoft Word or other text-only software (i.e. iOS
Notes, TextEdit, etc.)
Q5 Students use their browser (i.e. Safari, Chrome, Firefox, etc.) to research and
collect information
Q8 Students demonstrate their understanding by creating presentations using
PowerPoint, KeyNote, Prezi, or comparable slideshow applications
Q12 I use PowerPoint, KeyNote, Prezi, or comparable slideshow applications to
present information to students
Q14 I copy, paste, and send web addresses using email
Q18 I use PDFs or Word documents to distribute documents via email
Q22 Students turn in assignments by emailing me directly

Augmentation
Q3 Students are instructed to categorize or tag notes using Evernote or other
comparable applications

Q10 Students demonstrate their understanding by creating projects within
applications such as ShowMe Interactive Whiteboard in which they can add
illustrations, pictures, and voiceovers.
Q15 I send "meeting requests" or calendar reminders for deadlines through email
Q19 Students are instructed on how to search within a document and define
words within a document using their tablets
Q23 Students turn in assignments using class folders directly from the
application (i.e. Pages, ePortfolios)

Modification
Q4 Students are encouraged to integrate online sources or material into their
notes (i.e. Evernote, SlingNote, etc.)
Q9 Students demonstrate their understanding by creating short movies in iMovie
(or other comparable movie making software) OR by creating webpages using
GoogleSites (or other comparable site building tools)
Q16 I create QR codes that link students to websites
Q20 I use annotated digital documents such as those made in GoodReader or
iBooks
Q24 Students turn in assignments using BlackBoard, Google Drive, DropBox or
similar websites

Redefinition
Q1 Students take collaborative notes by using shared, online notebooks (i.e.
Evernote, GoogleDocs, etc.)
Q7 Students use their browser (i.e. Safari, Chrome, Firefox, etc.) to create
mindmaps for visual displays of ideas and concepts

Q11 Students demonstrate their understanding by creating interactive
activities using NearPod or other interactive applications
Q13 I use NearPod or other interactive applications to present information to
students
Q17 I use augmented reality applications such as Aurasma AR, which allows
users to unlock digital content from the world around them through the use of a
phone or tablet.
Q21 I create interactive documents using iBooks Author or similar applications
Q25 Students are encouraged to collaborate on assignments and give peer feedback using Wiki, GoogleSites, GoogleDocs, blog posts in BlackBoard, etc.

Additional Questions
Q26 How often is the work assigned technology-based or inconceivable without
technology?
Q27 How often is class achievable without tablets?

APPENDIX F: MOTIVATED STRATEGIES FOR LEARNING

QUESTIONNAIRE (MSLQ) STUDENT SURVEY

Motivated Strategies for Learning Questionnaire
MSLQ

The following scales and items represent the Motivated Strategies for
Learning Questionnaire (MSLQ) that was used in this study to measure
students' motivational beliefs and self-regulated learning. The numbers next
to the items reflect the item's actual position on the questionnaire. Items
marked (*R) were reflected before scale construction. There were 56 items on
the questionnaire, but only 44 were used in this study to form the following
five scales.

Motivational Beliefs
A. Self-Efficacy
2. Compared with other students in this class I expect to do well.
7. I'm certain I can understand the ideas taught in this course.
10. I expect to do very well in this class.
11. Compared with others in this class, I think I'm a good student.
13. I am sure I can do an excellent job on the problems and tasks assigned
for this class.
15. I think I will receive a good grade in this class.
20. My study skills are excellent compared with others in this class.
22. Compared with other students in this class I think I know a great deal
about the subject.
23. I know that I will be able to learn the material for this class.

B. Intrinsic Value
1. I prefer class work that is challenging so I can learn new things.
5. It is important for me to learn what is being taught in this class.
6. I like what I am learning in this class.
9. I think I will be able to use what I learn in this class in other classes.
12. I often choose paper topics I will learn something from even if they
require more work.
17. Even when I do poorly on a test I try to learn from my mistakes.
18. I think that what I am learning in this class is useful for me to know.
21. I think that what we are learning in this class is interesting.
25. Understanding this subject is important to me.

C. Test Anxiety
3. I am so nervous during a test that I cannot remember facts I have
learned.
14. I have an uneasy, upset feeling when I take a test.
24. I worry a great deal about tests.
27. When I take a test I think about how poorly I am doing.

Self-Regulated Learning Strategies

D. Cognitive Strategy Use


30. When I study for a test, I try to put together the information from class
and from the book.
31. When I do homework, I try to remember what the teacher said in class
so I can answer the questions correctly.
33. It is hard for me to decide what the main ideas are in what I read. (*R)
35. When I study I put important ideas into my own words.
36. I always try to understand what the teacher is saying even if it doesn't
make sense.
38. When I study for a test I try to remember as many facts as I can.
39. When studying, I copy my notes over to help me remember material.
42. When I study for a test I practice saying the important facts over and
over to myself.
44. I use what I have learned from old homework assignments and the
textbook to do new assignments.
47. When I am studying a topic, I try to make everything fit together.
53. When I read material for this class, I say the words over and over to
myself to help me remember.
54. I outline the chapters in my book to help me study.
56. When reading I try to connect the things I am reading about with what I
already know.

E. Self-Regulation
32. I ask myself questions to make sure I know the material I have been
studying.
34. When work is hard I either give up or study only the easy parts. (*R)
40. I work on practice exercises and answer end of chapter questions even
when I don't have to.
41. Even when study materials are dull and uninteresting, I keep working
until I finish.

43. Before I begin studying I think about the things I will need to do to learn.
45. I often find that I have been reading for class but don't know what it is
all about. (*R)
46. I find that when the teacher is talking I think of other things and don't
really listen to what is being said. (*R)
52. When I'm reading I stop once in a while and go over what I have read.
55. I work hard to get a good grade even when I don't like a class.

APPENDIX G: MICROANALYSIS STUDENT SURVEYS


Microanalysis

Task Analysis: Goal Setting and Planning


Q1 What is your goal, in terms of percentage grade, for this assignment?
______ Goal Percentage Grade (slide bar to indicate percentage from
0% to 100%)

Q2 Do you have any other goals for this assignment? (check all that apply)
I don't have any other goals (1)
I will try my best (3)
I want to get a better grade in the class (2)
I want to finish as fast as possible (2)
I want to choose the correct strategy for completing the assignment
(3)
I will try different methods until I find one that works (3)
I want to get better at doing this task (2)
Other, please specify

Q3 Do you have a plan for attaining this goal? If yes, you can write what the
plan is in the text box provided (optional).
Yes (2)
No (1)

Self-Motivational Beliefs: Self-Efficacy and Outcome Expectations


Q4 How confident are you that you will be able to successfully understand this
material?
Not Sure At All (1)
(2)
(3)
(4)
(5)
(6)
Very Sure (7)

Q5 How confident are you that you will be able to attain your outcome goal for
this assignment?
Not Sure At All (1)
(2)
(3)
(4)
(5)
(6)
Very Sure (7)

New survey page: Please stay on this page while you complete your assignment.
When you have completed the assignment please return to the survey to
complete the follow up questions. (Timing recorded)

Self-Reflection: Self-Judgment and Self-Satisfaction


Q6 How confident are you that you were able to attain your outcome goal for this
assignment?
Not Sure At All (1)
(2)
(3)
(4)
(5)
(6)
Very Sure (7)

Q7 How satisfied are you with the outcome of this assignment?


Very Dissatisfied (1)
Dissatisfied (2)
Somewhat Dissatisfied (3)
Neutral (4)
Somewhat Satisfied (5)
Satisfied (6)
Very Satisfied (7)

Cyclical Nature
Q8 How confident are you that you will attain your outcome goal for the NEXT
assignment in this series?
Not Sure At All (1)
(2)
(3)
(4)
(5)
(6)
Very Sure (7)
