
Running Head: TECH INTEGRATION AND STUDENT ACHIEVEMENT

An Examination of the Correlation between Teacher Perception of Technology Integration and
Student Success on Standardized Achievement Tests
Vince Moore & Michael Serfin
The University of North Texas


Abstract
The purpose of this research is to identify a relationship between teachers' perceptions of
technology use in the classroom and achievement levels on standardized testing in rural Texas.
The Texas Education Agency gathers and makes available test scores for its K-12 schools and
maintains an online portal for collecting educators' observations of technology usage within their
educational space. While these are separate initiatives within the same agency, both data sets are
collected and sorted by individual institution; therefore, it is possible to match the information
logically by school. As technology advancement and usage in society continues to increase, it is
important to study not only its adoption into educational environments but also the impact it is
having on students with regard to the quality of their education. To that end, the anticipated
outcome was a positive correlation between higher technology awareness and students' test
scores; however, no correlation was found. The findings of this study may be useful in
determining how strong a bond exists between these two variables, and a secondary purpose is to
discuss whether there are better ways to collect, and more accurately present, individual teacher
technology assessments.


An Examination of the Correlation between Teacher Perception of Technology Integration and
Student Success on Standardized Achievement Tests
For almost as long as teachers have been utilizing technology in the learning
environment, researchers have sought ways to measure the effectiveness of that integration.
These studies exhibit a wide variety of results (Ponzo, 2011; Vigdor, Ladd, & Martinez, 2014).
In some instances, technology seems to benefit the learning outcomes of the students, while at
other times the results come back as neutral or negative (Harter & Harter, 2004). However,
despite these contradictory and inconclusive findings, the connection of technology to
achievement is still an important topic to revisit. The growth of technology integration in the
education world has been immense over the past decades. For the students of today and
tomorrow, technology integration holds as much importance as books held for earlier generations
(Hernandez-Julian & Peters, 2012). Several areas hold importance under the
umbrella of technology integration and student achievement. It is relevant to examine how
teachers feel about technology as well as the ways that they integrate technology into their
teaching. An examination of barriers to technology integration also deserves its place in the
topic. Understanding the standardized tests that measure student achievement also holds a high
level of significance. By looking into these areas, it should be possible to find a correlation
between integration of technology, as judged by the teachers, and the achievement of students, as
measured by the standardized tests. This study asks the question: Is there a connection between
student achievement and technological integration in the learning environment? The researchers
hypothesize that in schools where teachers report a higher rate of technological integration, there
will be a corresponding increase in student achievement.


One of the most important factors concerning how technology is implemented can be
seen in teachers' beliefs and perspectives on technology and teaching in general. A study by
Becker (2000) found that technology works best in classrooms with access to technology,
prepared teachers, curricular freedom, and a constructivist pedagogy. The teacher's personal
philosophy of education can have a great impact on the learning environment where technology
is concerned. Ertmer (2005) stated that to understand how technology is implemented, the
researcher must first understand what that educator believes; however, equal attention should be
paid to the understanding that beliefs and practices are not always congruent. Teachers
ultimately make the decisions about how to utilize technology in the learning process, and their
actions may fall short of their stated ideals and beliefs. Cuban (1986) found that
throughout the modern era of education, the effectiveness of technology integration has fallen
upon the shoulders of the teachers. Over and over again, researchers have found that what the
teacher does with the technology is more important than simply exposing the students to the
technology (Cuban, 1986; Cuban, 1999; Ertmer, 2005; Hixon & Buckenmeyer, 2009; Keengwe,
Schnellert, & Mills, 2012; Bowers & Berland, 2013; Vigdor, Ladd, & Martinez, 2014).
One of the causes for the difference between a teacher's belief structure concerning
technology and the end use of computers and other instructional tools lies in the barriers to
implementation. Often, teachers are unable to fully enact their lesson ideas with regard to
technology because of external issues, such as slow internet, poor funding for hardware,
scheduling, etc. Kopcha (2012) found that there is a direct connection between the barriers to
implementation and the decision to use technology. To overcome these barriers, teachers must
be given instructional coping tools through professional development. Those findings match the
results of a study by Hixon and Buckenmeyer (2009), which found that technology is greatly
underutilized in most American public schools because of barriers to integration and poor
administration of technology funds and hardware. Strawn (2011) notes that the responsibility
falls to instructional technologists to find ways of integrating technology that yield real benefits.
How do educational leaders gain insight into teacher perception of technology and
barriers to implementation? In the state of Texas, the agency governing education requires
teachers to take part in a program known as the Long-Range Plan for Technology, or LRPT.
Each year, thousands of teachers around the state answer the questions on the School
Technology and Readiness (STaR) Chart (Texas Education Agency, 2014a). The STaR Chart is
broken into several categories that examine teaching methods, educator preparedness,
administration and support, and the campus and district technology infrastructure (Davidson,
Richardson, & Jones, 2014). Teachers rate questions in each area on a four-point scale; these
results are gathered to provide data for individual campuses as well as entire school districts
(Texas Education Agency, 2006).
The Texas Education Agency also administers the State of Texas Assessment of
Academic Readiness (STAAR) each year to thousands of students. Every student must achieve a
passing score, which varies among the different subjects and grade levels. These results are
compiled and each campus and district is given a rating based on the percent of students meeting
the standard set by the state (Texas Education Agency, 2014b). This practice, however, is not
the best way to utilize standardized test scores. According to a study by Haladyna, Haas, and
Allison (1998), standardized tests should not be used to evaluate schools or districts.
Oftentimes, the standards of the curriculum are not adequately measured by the state-mandated
tests, either (Bowers & Berland, 2013). However, when researchers need student achievement
data, these scores are often used; the STAAR results are readily available online and can provide a
snapshot of an individual student, a whole campus, an entire district, a full region or the entire
state.
Method
Participants
The population of interest for this analysis was a K-12 environment, with a slight
emphasis on secondary schools. The data were obtained from the Texas STAAR program and the
Texas STaR Chart, both administered by the Texas Education Agency, and cover the 2012-2013
school year. Teachers and students included in the data set
were from a wide variety of public school educational settings: High Schools (Grades 9-12),
Middle Schools (6-8), Secondary Schools (6-12), and Single-campus Schools (K-12). The school
districts are located in a predominantly rural area of northwest central Texas. All scores from
students were collected through mandatory statewide testing; however, the teachers'
participation in the technology perception portion was tied to funding initiatives for school
districts regarding technology. While still technically voluntary, it could be stated that school
districts were monetarily incentivized to have their teachers participate.
Materials
The students were given different sections of a test over several days covering reading,
writing, and mathematics. Both the reading and mathematics sections were multiple-choice tests,
while the writing section required the student to provide a writing sample, which was graded on a
four-point scale, with 4 being the highest level of attainment and 1 the lowest. All test scores are
collected and monitored by the Texas Education Agency (TEA). The student testing data were acquired by
examining the TEA School Report Cards that are available from the TEA website. These report
cards compile campus-wide passing results and report them as percentages. The data used in this
study are derived from the percentage of all students on a campus meeting or exceeding the
state-approved standard in the three aforementioned subject-area examinations.
The teachers' perception data were provided by an online questionnaire to be taken at
some point in the school year. It is open to teachers at all grade levels, K-12. The questions
are broken down into four categories: Teaching & Learning (TL); Educator Preparation &
Development (EP); Leadership, Administration, and Instructional Support (L); and Infrastructure
for Technology (INF). Each category has six questions to be rated on a four-point scale ranging
from lowest to highest with 1-Early Tech, 2-Developing Tech, 3-Advanced Tech, and 4-Target
Tech. Each category is compiled and given a score; all four areas are averaged to give a composite
score, which we used in this study. The lowest possible average is 6, while the highest possible
average would be a score of 24.
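As a concrete illustration of this scoring, the short Python sketch below sums the six item ratings in each category and then averages the four category scores; the data layout, values, and names are hypothetical and are not part of the TEA instrument.

# Minimal sketch of the STaR composite scoring described above; the data
# layout and variable names here are hypothetical illustrations.
def star_composite(responses):
    """responses: dict mapping category -> list of six ratings (1-4).
    Returns per-category sums (6-24 each) and their average (6-24)."""
    category_scores = {cat: sum(items) for cat, items in responses.items()}
    composite = sum(category_scores.values()) / len(category_scores)
    return category_scores, composite

example = {
    "TL":  [3, 2, 3, 3, 2, 3],   # Teaching & Learning
    "EP":  [2, 2, 3, 2, 2, 3],   # Educator Preparation & Development
    "L":   [3, 3, 3, 2, 3, 3],   # Leadership, Administration, and Instructional Support
    "INF": [4, 3, 3, 3, 4, 3],   # Infrastructure for Technology
}
scores, composite = star_composite(example)
print(scores, composite)         # the composite falls between 6 and 24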
Procedure
All data were supplied by the Texas Education Agency and included the 2012-2013
school year STAAR reports and STaR Charts. This information was filtered into a spreadsheet
containing the relevant data points for analysis. First, STAAR Reading, Writing, and Mathematics
progress report percentages of students who met or exceeded the base requirement were put into
columns within the spreadsheet. Next, each school's reading, writing,
and mathematics percentages were averaged together, creating one overall progress score for
each campus.
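A minimal pandas sketch of this first step is shown below, assuming the filtered percentages have already been loaded into a spreadsheet-like DataFrame; the column names and values are hypothetical.

import pandas as pd

# Hypothetical campus-level STAAR passing percentages (percent meeting or
# exceeding the standard); some campuses lacked a writing or math score.
staar = pd.DataFrame({
    "campus":  ["Campus A", "Campus B", "Campus C"],
    "reading": [78.0, 64.0, 71.0],
    "writing": [71.0, None, 66.0],
    "math":    [82.0, 69.0, None],
})

# Average the available subject percentages into one overall progress score
# per campus; missing subjects are skipped, which mirrors the two-of-three
# inclusion rule described in the Results section.
staar["overall"] = staar[["reading", "writing", "math"]].mean(axis=1)
print(staar)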
After assimilating the STAAR Report data, attention was turned to incorporating relevant
information from Texas STaR Charts. Teachers from each campus rated technology at their
school by assigning a score in each of four key areas: Teaching and Learning (TL); Educator
Preparation and Development (EP); Leadership, Administration, and Instructional Support (L);
and Infrastructure for Technology (INF). The teacher ratings were averaged to produce a campus
score in each area. These campus scores were compiled and entered into a spreadsheet, and each
school's scores from the four areas were then averaged together for an overall score.
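The assembly of the combined analysis file might look like the following sketch, assuming both data sets were exported to CSV files keyed by campus; the file and column names are hypothetical.

import pandas as pd

# Hypothetical exports: STaR campus averages and STAAR overall percentages.
star = pd.read_csv("star_chart_2012_2013.csv")      # columns: campus, TL, EP, L, INF
staar = pd.read_csv("staar_overall_2012_2013.csv")  # columns: campus, reading, writing, math, overall

# Average the four key areas into the overall STaR score used in the study,
# then join the two sources campus by campus.
star["star_total"] = star[["TL", "EP", "L", "INF"]].mean(axis=1)
merged = star.merge(staar, on="campus", how="inner")
merged.to_csv("analysis_dataset.csv", index=False)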
Results
There were sixty-nine participating schools in the final sample for evaluation. Thirty-two
of the participating schools did not have data for the writing portion of the STAAR reports, while
six of the schools did not have mathematics scores. None of these schools overlapped, and it
was decided that two out of three STAAR scores was acceptable for study inclusion. Because of
this, all sixty-nine schools were included in the final analysis.

When placed in various scatterplots, no correlations could be found between any of the STaR
Chart teacher perceptions
and the STAAR testing outcomes. Each STaR category, as well as the overall total, was placed
in a scatterplot with each of the reading, writing, mathematics, and overall average of each
school from the STAAR testing data. As stated earlier, in every case the scatterplot showed no
relationships between any of the independent variables (STaR Charts) and the dependent
variables (STAAR outcomes).
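A sketch of this exploratory check follows, assuming the merged campus file from the Procedure sketch; a Pearson r accompanies each plot as a quick numeric companion. All file and column names are hypothetical.

import pandas as pd
import matplotlib.pyplot as plt
from scipy.stats import pearsonr

merged = pd.read_csv("analysis_dataset.csv")          # hypothetical merged file
predictors = ["TL", "EP", "L", "INF", "star_total"]   # STaR Chart measures
outcomes = ["reading", "writing", "math", "overall"]  # STAAR outcomes

# One scatterplot (and Pearson r) per predictor/outcome pair.
for x in predictors:
    for y in outcomes:
        pair = merged[[x, y]].dropna()
        r, p = pearsonr(pair[x], pair[y])
        plt.figure()
        plt.scatter(pair[x], pair[y])
        plt.xlabel(f"STaR {x}")
        plt.ylabel(f"STAAR {y} (% meeting standard)")
        plt.title(f"r = {r:.2f}, p = {p:.3f}")
        plt.show()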
This study was set up to run a multiple regression analysis on the data should a
correlation or an encouraging scatterplot have been found. It is for this purpose that each school
was given an overall percentage score, averaged from the reading, writing, and mathematics
sections. This overall score was intended to serve as the dependent variable, with the four
key areas from the STaR Charts (TL, EP, L, and INF) as the independent variables. However, as
stated, a correlation must first be present for a multiple regression analysis to be meaningful.
The scatterplot shown in Appendix A, as well as all of the others, demonstrated that no such
correlation exists.
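Had a correlation emerged, the planned multiple regression could have been run along the lines of this sketch using statsmodels; the file and column names are hypothetical and carry over from the earlier sketches.

import pandas as pd
import statsmodels.api as sm

data = pd.read_csv("analysis_dataset.csv").dropna(subset=["overall"])
X = sm.add_constant(data[["TL", "EP", "L", "INF"]])  # independent variables
y = data["overall"]                                  # dependent variable: campus STAAR average
model = sm.OLS(y, X).fit()
print(model.summary())                               # coefficients, R-squared, p-values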
Discussion
The original goal driving this research study was to examine the STaR Chart data and
STAAR results to see if a correlation exists between technology integration and student
achievement. It was hypothesized that a positive correlation would be found, with increased
technology integration leading to higher student achievement. However, no such correlation was
identified. It is important to remember that failing to find a statistical correlation is not meant to
imply that there is no relation, just that no relation was found in this particular study. Liu,
Maddux, and Johnson (2008) state that studies that find no compelling results cannot be used to
conclude, in a generalized manner, that technology has no positive impact on learning.
And, as the researchers mentioned earlier in this paper found, a great deal depends on how the
technology is utilized, as opposed to simply exposing students to technology. For
instance, simply replacing books with an iPad for reading purposes does not leverage the
technology in any meaningful way other than to simply change the medium of delivery. There is
little chance that reading on a computer screen, as opposed to paper, would have a significant
impact on test scores.
In this particular study, an area of concern is the validity and reliability of the STaR chart,
which was the basis for half of the data used. The STaR chart numbers appeared to have good
internal consistency, with an α = 0.891 (see Appendix B). However, any measure of teacher
perception is subject to the varying interpretations of the different respondents to the survey.
One concern with the STaR Chart is that it cannot account for differences of opinion about what
constitutes a technological classroom. One teacher may believe that having a desktop computer
in the classroom is evidence of high technological usage, while others would reserve the
distinction of high technological usage for classrooms which employ smartphones and other
portable devices in daily activities. Some survey questions could be written in such a technical
way that the teachers who are not as technologically savvy do not fully comprehend the issue
being measured. Also, since the STaR reporting is tied to funding, some campus and district
administrators may pressure their teachers to respond a certain way: a principal may caution
teachers against rating items too low or too high in an effort to secure appropriate funding while
maintaining realistic expectations among those who control the state's technology budget. All of these
issues could lead to a high error variance, and thus a less reliable and valid survey.
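For reference, an internal-consistency check of the kind reported above (α = 0.891) can be computed from an item-by-respondent matrix of STaR ratings, as in this sketch; the sample data here are randomly generated and purely illustrative.

import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = survey items.
    Returns Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

rng = np.random.default_rng(0)
ratings = rng.integers(1, 5, size=(30, 24))   # 30 respondents x 24 STaR items rated 1-4
print(round(cronbach_alpha(ratings), 3))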
In conclusion, this study failed to find a correlation between technology integration and
student achievement. However, this result should not deter researchers from examining the
possibility that a positive correlation exists in other samples. Instead of focusing on general
implementation of technology, it would be more beneficial to study the specific actions of
teachers and students with regard to learning technologies, to see whether particular activities,
hardware, or learning experiences lead to greater growth in students' education. It is also important
to remember that achievement should be measured by much more than a simple standardized test
score.


References
Becker, H. J. (2000). Findings from the teaching, learning, and computer survey: Is Larry Cuban
right? [PDF file]. Education Policy Analysis Archives. Retrieved November 2, 2014 from
http://epaa.asu.edu/ojs/article/view/442/565
Bowers, A. J., & Berland, M. (2013). Does recreational computer use affect high school
achievement?. Educational Technology Research and Development, 61(1), 51-69.
Cuban, L. (1986). Teachers and machines: The classroom use of technology since 1920. New
York: Teachers College Press.
Cuban, L. (1999). High-tech schools, low-tech teaching. Education Digest, 64(5), 53-54.
Davidson, L. Y. J., Richardson, M., & Jones, D. (2014). Teachers' perspective on using
technology as an instructional tool. Research in Higher Education Journal, 24. Retrieved
November 30, 2014 from http://www.aabri.com/manuscripts/141892.pdf.
Ertmer, P. A. (2005). Teacher pedagogical beliefs: The final frontier in our quest for technology
integration? Educational Technology, Research and Development, 53(4), 25-39.
Haladyna, T., Haas, N., & Allison, J. (1998). Continuing tension in standardized testing.
Childhood Education, 74(5), 262-273.
Harter, C. L., & Harter, J. F. R. (2004). Teaching with technology: Does access to computer
technology increase student achievement? Eastern Economic Journal, 30(4), 507-514.
Hernandez-Julian, R., & Peters, C. (2012). Does the medium matter? Online versus paper
coursework. Southern Economic Journal, 78(4), 1333-1345.
Hixon, E., & Buckenmeyer, J. (2009). Revisiting technology integration in schools: Implications
for professional development. Computers in the Schools, 26, 130-146.


Keengwe, J., Schnellert, G., & Mills, C. (2012). Laptop initiative: Impact on instructional
technology integration and student learning. Education and Information Technologies, 17,
137-146.
Kopcha, T. J. (2012). Teacher perception of the barriers to technology integration and practices
with technology under situated professional development. Computers & Education, 59,
1109-1131.
Liu, L., Maddux, C., & Johnson, D. L. (2008). Assessment of integration of technology in
education: Countering the no significant differences argument. Computers in the
Schools: Interdisciplinary Journal of Practice, Theory, and Applied Research, 25(1-2), 1-9.
Ponzo, M. (2011). Does the way in which students use computers affect their school
performance? Journal of Economic and Social Research, 13(2), 1-27.
Stansfield, W. D. (2011). Educational curriculum standards & standardized educational tests:
Comparing apples & oranges. The American Biology Teacher, 73(7), 389-393.
Strawn, C. (2011). What does the research say? Learning & Leading with Technology, 39(2), 38-39.
Texas Education Agency. (2006). Long-range plan for technology, 2006-2020. Retrieved
October 27, 2014 from
http://tea.texas.gov/WorkArea/linkit.aspx?LinkIdentifier=id&ItemID=
2147494561&libID=2147494558.
Texas Education Agency. (2014a). School technology and readiness chart. Retrieved October 27,
2014 from http://tea.texas.gov/starchart/.
Texas Education Agency. (2014b). STAAR resources. Retrieved October 27, 2014 from
http://tea.texas.gov/student.assessment/staar/.


Vigdor, J. L., Ladd, H. F., & Martinez, E. (2014). Scaling the digital divide: Home computer
technology and student achievement. Economic Inquiry, 52(3), 1103-1119.


Appendix A



Appendix B

