
Lorraine McCullough and Arika Collier and Michael Johnson

MEDT 8480

Dr. Westine

13 November 2016

Assignment 5: Evaluation Plan

Background of the Program and Evaluation

Description of the Program Evaluated

In August of 2014, Gwinnett County Public Schools (GCPS) rolled out their digital

curriculum and instruction tool: eCLASS. In an era where blended learning has become a focus,

many report that an online learning management system helps students investigate, collaborate,

focus, and take ownership in their learning (El-Mowafy, Kuhn, & Snow, 2013). In Georgia

specifically, Georgia Virtual School has become very popular with secondary students, and it

uses the Desire2Learn platform (Teague, 2013). Several universities have also followed suit and

incorporated this popular platform into their student learning. Consequently, GCPS has

implemented eCLASS using the Desire2Learn platform.

The vision is to reach beyond the classroom walls and into the world of digital learning.

The eCLASS tool is a system through which technology permeates the education of Gwinnett's learners

(GCPS, 2017). Through the eCLASS Curriculum and Instruction tool, the ultimate goal is to

provide a digital classroom to enhance student engagement and the learning process (GCPS,

2017). eCLASS is the county's digital learning management system. Through features such as

content, calendar, discussion, and dropbox, GCPS teachers create an online classroom for

remediation, enrichment, and daily teaching and learning. The use of eCLASS allows teachers
to build digital content to supplement and enhance their instruction, which leads to the

ultimate goal of student engagement and achievement.

Every GCPS teacher has an eCLASS course page for each course they teach, in which the

students are automatically enrolled. Students have multiple course pages to view for each teacher

they see throughout the day. Both teachers and students log in to eCLASS using their individual

identification number and self-created password to enter the secure program. The course shells or

templates are provided, but teachers import and create the content and digital learning

experience. The expectation at the local high school being evaluated is that teachers use the Calendar

and News feature regularly, and it is expected that teachers will incorporate the Content feature

into every unit they teach. Content features include discussion, dropbox, and assessments. A

discussion board is where a teacher can post a prompt, and students have the opportunity to

respond, discuss, and reply to one another. It creates an online discussion learning environment. A

dropbox is where a student can submit an assignment electronically in a variety of formats such

as Word, PDF, or MP4. The teacher can then grade the submission and immediately send feedback to

the student. A classroom assessment is an online tool for tests and quizzes. Teachers input the

assessment electronically and can set certain permissions or restrictions. Students are then able to

take their assessments online and receive instant feedback. Teachers can also upload outside

materials such as videos, articles, websites, etc. Teachers are expected to update their course

pages daily and embed Content features with each instructional unit.

Evaluation Purpose

After inconsistent training and unclear administrative expectations, the current

administration and technology team wish to evaluate the effectiveness of the new local school

eCLASS technology training and support for teachers. This program should be evaluated because
it was determined, after reviewing the previous year's teacher evaluation reports, that there was a gap in

eCLASS usage and ability level. To help bridge the gap in eCLASS knowledge and skill for

teachers, a new training plan was developed. The evaluation will be used to plan further teacher

technology training and support as well as assess current teacher eCLASS usage and ability

level. The evaluation will not affect a teacher's formative or summative evaluation report. The

focus is to evaluate the training of the tool eCLASS, not to punish teachers for infrequent usage.

For further information, reference Appendix A to see the Program Logic Model.

The first evaluation question asks to what extent teachers use eCLASS Content

features for each instructional unit they teach. This is important to investigate because the

administration expects teachers to use Content features such as assessments, discussions, or

embedded outside digital resources for each instructional unit. The second evaluation question

inquires about the number of teachers who believe eCLASS has improved their teaching and

students' learning. When implementing a relatively new system, it is crucial to have teacher

ownership. If teachers buy into the program and believe in it, it is far more likely to succeed. The third

evaluation question investigates what percentage of the faculty find the training opportunities

such as lead innovators, Tech Tuesday, and Lunch and Learns useful for their technology

professional development. This final evaluation question is critical because it will help plan for

future technology training and professional development. If there is a gap, then the technology

team needs to address it, so it is necessary to discover if teachers find the training beneficial for

their professional growth.

Description of who will be involved in evaluation effort

The people involved in the evaluation effort will include the evaluators, Media Specialist,

Local School Technology Coordinator (LSTC), eCLASS learning specialist, Assistant Principal
of Staff Development, and the teachers. The Media Specialist and Local School Technology

Coordinator can provide information on the training and the content materials. The eCLASS

learning specialist can run the reports and retrieve data. Since the eCLASS learning specialist

travels to other schools, she can also provide comparison data to see where the local school

aligns with other schools. The Assistant Principal of Staff Development can also provide

information on the training, and she can provide information on the teacher evaluation reports. The

teachers will be critical with observations and surveys as we look at their usage, training, and

beliefs.

Evaluator Arika Collins is a veteran teacher of nineteen years. The fifth grade teacher

holds certifications in Early Childhood education, Gifted, and Middle Grades Math. Ms. Collins

is a graduate student at the University of West Georgia, seeking an Ed.S. in Media and

Instructional Technology.

Evaluator Michael Johnson is a high school business and technology instructor with

over six years of experience working with middle grade and high school students. He is pursuing

an Education Specialist Degree in Media and Instructional Technology from the University of

West Georgia. He has earned the following degrees: a Master of Science in Technology Education

from Mississippi State University, a Bachelor of Business Administration with a concentration

in Business Information Systems from Mississippi State University, and an Associate of Arts

degree in General Business from Meridian Community College. He is National Board Certified

in Career and Technical Education from Adolescence to Young Adulthood. He holds a

Mississippi educator's license with several business, computer, and technology endorsements.

He serves as the advisor for his school's chapter of the National Technology Student Association.
Additionally, he has served on several school committees and has been a presenter at the

Mississippi Educational Computing Association.

Lorraine McCullough has worked with high school students for six years at Collins Hill.

She has earned a Bachelor's in English and a Master's in Accomplished Teaching, and she will

graduate in December with her Specialist in School Library Media. Lorraine has earned multiple

certifications including English, Gifted, Special Education, and Media. For the first four years,

she served as an English teacher for freshmen and juniors. After teaching all levels from college-

prep to Advanced Placement, she was hired to be the media specialist at Collins Hill High School

in August 2015. In addition, Lorraine also serves as an eCLASS innovator and offers instruction

and support to teachers with their technology training. Lorraine will bring context knowledge to

the team to educate her colleagues on eCLASS. While she is directly involved with the school

being evaluated, she will step aside during data collection while members of the staff and the

evaluation team compile the responses to avoid any type of bias.

Methodology

Data and Instrumentation

For the first question, examining the extent to which teachers use eCLASS Content modules,

quantitative data will be collected to determine the frequency. A faculty survey, eCLASS

Learning specialist interviews, and eCLASS usage reports will be used to measure how often

teachers use eCLASS modules for their instructional units. The second question investigates how

many teachers believe eCLASS has improved their teaching and students' learning. Qualitative

data and quantitative data will be used to answer this question. Qualitative data will be collected

in the form of a faculty survey. Quantitative data will be collected from GA Milestones from 2015

and 2016, along with eCLASS student user completion reports and Teacher Evaluation Self-
Assessment ratings from 2015 and 2016. All teacher and student names will be removed to protect

identity and privacy. This data will be used to determine the perceived effectiveness of eCLASS

on teaching and learning. The third question determines how many teachers find the training

helpful for their learning. To ascertain the usefulness of eCLASS professional development,

quantitative data will be collected in the form of staff development surveys, eCLASS usage

reports, and staff attendance logs. For an overall timeline with the data collection, please see

Appendix B for the Data Collection Matrix.

An electronic, anonymous faculty survey (Appendix C) will be administered using Google

Forms. Since the school is a Google Apps for Education (GAFE) school, evaluators will want to

use resources already available and familiar to the faculty. When using Google Forms, a

descriptive statistics report is automatically created which makes data analysis readily available.

Data can also be exported into a Google Sheets spreadsheet for line-item comparison. The survey

will ask teachers several things such as how they use eCLASS for each unit and what features

they specifically use such as discussion and dropbox. The survey will also ask faculty to self-

assess whether or not eCLASS has improved their teaching. In addition, faculty will also be

asked about the staff development opportunities. They will be given the opportunity to rate each

opportunity, such as Tech Tuesday and Lunch and Learn, in terms of effectiveness and

usefulness. From this faculty survey, the evaluators will be able to pull data to answer each

evaluation question. Each question will be either selected response or Likert scale.
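
To illustrate how the exported responses could be summarized, the sketch below (Python with pandas) tallies a selected-response item and describes a Likert-scale item. The file name and column labels are hypothetical placeholders; the actual export would carry the question wording from the survey in Appendix C.

```python
# Sketch only: summarizing the exported Google Forms responses with pandas.
# The file name and column labels are hypothetical placeholders; the real
# export will use the question text from the faculty survey in Appendix C.
import pandas as pd

responses = pd.read_csv("faculty_survey_export.csv")  # CSV exported from Google Sheets

# Frequency counts for a selected-response item (e.g., usage in the past month)
print(responses["usage_past_month"].value_counts())

# Descriptive statistics for a Likert item (1 = strongly agree ... 5 = strongly disagree)
likert_item = pd.to_numeric(responses["eclass_critical_to_teaching"], errors="coerce")
print(likert_item.describe())
print(likert_item.value_counts().sort_index())
```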

An interview with the eCLASS Learning Specialist (Appendix D) will also be helpful

with data collection. The interview will be recorded to allow thorough data analysis and detailed

field notes. An eCLASS Learning Specialist was assigned to the local school to provide training

and support to teachers. Her role is eCLASS support and training, and she helps other schools as
well. She will be able to provide great context as well as comparison for other schools. She is

very familiar with the platform, and she sees other schools, their usage, and the training.

Conducting a semi-structured interview with the eCLASS Learning Specialist will be a great

learning opportunity for background knowledge and context. After the interview is conducted,

field notes will be compiled in order to begin coding and data reduction. From here, the

evaluators will determine themes and outliers.
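
As a supplement to manual coding, a simple keyword tally can give the evaluators a first pass at candidate themes before data reduction. The sketch below assumes the recorded interview has been transcribed into a plain-text file of field notes; the file name and the starter codebook are illustrative only, not the final codes.

```python
# Sketch only: a first-pass keyword tally to support (not replace) manual coding
# of the interview field notes. File name and code words are illustrative.
from collections import Counter
import re

with open("interview_field_notes.txt") as f:   # hypothetical typed field notes
    text = f.read().lower()

codebook = {
    "training": ["training", "professional development", "lunch and learn"],
    "usage": ["content", "dropbox", "discussion", "assessment"],
    "support": ["support", "innovator", "specialist", "help"],
}

counts = Counter()
for code, terms in codebook.items():
    counts[code] = sum(len(re.findall(r"\b" + re.escape(t) + r"\b", text)) for t in terms)

print(counts)  # starting point for identifying themes and outliers
```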

School data will be instrumental in order to effectively answer the evaluation questions.

The evaluators will have access to eCLASS usage reports. These reports include several items

such as how much time the teacher and his or her students have spent in the platform. It includes

what features are being used and how often. It provides how many Content activities a teacher

has in their course page. It also provides a list of the top users as well as the non-users. The data

can be overwhelming, but it is very informative at the same time. Teacher evaluation data will

also be key in the evaluation plan, especially since it will support the beliefs or disbeliefs from the

second evaluation question. Professional development attendance logs will be looked at, as well

as teacher self-assessment ratings from the 2015-2016 and 2016-2017 school years. These ratings

will help provide support as to whether or not teachers believe eCLASS has improved their

teaching. To look at whether or not eCLASS has improved student learning, Georgia Milestones

assessment data and student user reports from eCLASS will be analyzed. Comparing the current

GA Milestones test scores to EOCT scores from before eCLASS will provide good comparison context

for the evaluators.
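
A minimal sketch of that descriptive comparison is shown below, assuming the de-identified score files are exported as CSVs; the file and column names are placeholders, and the comparison provides context only rather than evidence that eCLASS caused any change.

```python
# Sketch only: a descriptive comparison of de-identified scores before and after
# eCLASS. File and column names are placeholders; this is context, not a causal claim.
import pandas as pd

eoct = pd.read_csv("eoct_scores_pre_eclass.csv")         # scores before eCLASS rollout
milestones = pd.read_csv("ga_milestones_2015_2016.csv")  # scores after eCLASS rollout

print("EOCT mean:", eoct["scale_score"].mean())
print("Milestones mean:", milestones["scale_score"].mean())
print("Difference:", milestones["scale_score"].mean() - eoct["scale_score"].mean())
```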

Sampling

The evaluators are interested in seeing the differences in eCLASS usage across each

department such as math, science, fine arts, etc. In order to see a perspective from each
department, it would be best practice to conduct a stratified random sampling to ensure equal

representation from each department. As a result, the evaluators would need to take a list of each

department, and then randomly select faculty members from each department. It would be ideal

to obtain five faculty members per department to complete the surveys from math, language arts,

science, social studies, fine arts, career and technical education, special education, and health and

physical education. Five teachers per department would generate 40 participants in the data

sample.
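
A stratified draw of this kind can be scripted so it is reproducible. The sketch below assumes a faculty roster file with a department column; the file name, column names, and random seed are illustrative.

```python
# Sketch only: drawing five teachers per department for the survey sample.
# Roster file and column names are hypothetical; the real roster would come
# from the local school technology coordinator.
import pandas as pd

roster = pd.read_csv("faculty_roster.csv")   # columns assumed: teacher_id, department

sample = roster.groupby("department").sample(n=5, random_state=42)
print(sample["department"].value_counts())   # expect 5 per department, 40 teachers total
```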

When looking at the participants for evaluation question one, the faculty, local school

technology coordinator, and eCLASS Learning Specialist will all be necessary to comment on

the extent teachers use eCLASS modules with each instructional unit. All participants can

provide both quantitative and qualitative data. When addressing evaluation question two to

examine eCLASS improving teaching and learning, the faculty, assistant principal for testing,

assistant principal for teacher evaluations and staff development, and local school technology

coordinator will be needed to provide their feedback and quantitative data. All participants'

cooperation is needed to establish whether or not eCLASS is responsible for improved teaching

and learning. When addressing evaluation question three to assess the eCLASS training, the

assistant principal for teacher evaluation and staff development and local school technology

coordinator are needed to obtain usage reports, attendance logs, and staff development evaluation

surveys. These participants will be key to gathering the aforementioned data.

Participants hold various roles in the implementation of eCLASS: teacher, lead innovator,

media specialist, local school technology coordinator, and eCLASS learning specialist. Their unique

perspectives will offer data on the program needs, context, operation, outcomes, cost and

efficiency. In addition, the following staff members will be crucial in helping with data
collection: the local school technology coordinator, eCLASS Learning Specialist, Assistant

Principal for Testing, and Assistant Principal for Teacher Evaluations & Staff Development.

Analysis

The quantitative data consists of nominal data and ordinal data. From the faculty survey,

eCLASS usage reports, GA Milestones test scores, and Teacher Evaluation reports, the data will be

named in a specific category (nominal) or ranked on a Likert scale (ordinal). The majority of

the data will be ordinal, so the statistical test that would be most appropriate is Goodman and

Kruskal's gamma. Since Goodman and Kruskal's gamma measures rank correlation between ordinal variables, this

test seems the most fitting for comparison purposes. The qualitative data from the interview will

be analyzed through coding. After codes are determined, themes will be developed to fully

understand relationships, comparisons, and inconsistencies.
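
Because Goodman and Kruskal's gamma is not built into every statistics package, the evaluators could compute it directly from concordant and discordant pairs. The sketch below is one straightforward (if unoptimized) implementation; the two example lists are invented ordinal ratings, not collected data.

```python
# Sketch only: Goodman and Kruskal's gamma for two ordinal variables, e.g., a
# Likert rating of training usefulness against an ordinal band of eCLASS usage.
from itertools import combinations

def goodman_kruskal_gamma(x, y):
    """Gamma = (concordant - discordant) / (concordant + discordant); ties ignored.
    Assumes at least one untied pair."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        product = (x1 - x2) * (y1 - y2)
        if product > 0:
            concordant += 1
        elif product < 0:
            discordant += 1
    return (concordant - discordant) / (concordant + discordant)

training_rating = [1, 2, 2, 3, 4, 4, 5, 5]   # illustrative ordinal values only
usage_level     = [1, 1, 2, 3, 3, 4, 4, 5]
print(goodman_kruskal_gamma(training_rating, usage_level))
```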

An analysis will be performed on all data, which includes comments and results from the

surveys. A thematic analysis on the electronic survey data from teachers and quantitative data

from eCLASS usage reports will be used to identify engagement time. The first program

evaluation question is based on system reports and qualitative data from the survey, looking

specifically at usage. Text-based responses will be analyzed qualitatively, extracting common

themes. The next two program evaluation questions are both quantitative and qualitative in

nature, and the survey and focus group/interview process will be used to identify the answers.

Since both quantitative and qualitative data will be used, this evaluation is considered a

mixed-methods evaluation. It is beneficial to utilize this design because the questions require different

responses. Some can be answered with quantitative feedback while others can be answered with

qualitative feedback. All three questions will utilize both qualitative and quantitative feedback.
The qualitative feedback will provide insight and context, while the quantitative will provide

evidence and support.

Standards and Benchmarks for Evaluation Questioning

Evaluation question one will determine how many teachers use eCLASS features such as

discussion and dropbox for each instructional unit they teach. The standard for evaluation

question one is that 70% of the teachers at the local school use at least one eCLASS Content feature for each

instructional unit. While the ultimate goal is 100%, 70% is a reasonable target since eCLASS has

been available to the faculty for several years. During the first two years, the expectation was to

utilize the News and Calendar feature. For the last two years, the expectation has been to

incorporate Content features for each instructional unit. Since the expectation has been set to use

content, after two years it is expected that 70% of the faculty use eCLASS for each instructional

unit they teach. Evaluation question two will survey how many teachers believe eCLASS has

improved their teaching and their students' learning. The standard for evaluation question two is that

75% of the teachers at the local school believe eCLASS has improved their teaching and student

learning. Like the previous question, 100% is the ultimate goal, but there has been a large

turnover with staff. There is a wide range of teacher experience and technology ability level due

to the high turnover, so 75% would be an acceptable measure. Evaluation question three will

inspect how many teachers find the eCLASS training helpful. The standard for evaluation

question three is that 80% of the teachers find eCLASS training helpful. With the high turnover rate

previously mentioned and the large number of new teachers, 80% would be the ideal target

because the school is providing more support now than ever before in a variety of formats.
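
The arithmetic for judging each question against its standard is simple enough to script once survey tallies are in hand. The sketch below uses invented counts purely to show the calculation against the 70%, 75%, and 80% standards named above.

```python
# Sketch only: checking survey tallies against the three standards named above.
# The counts are placeholders to show the arithmetic, not actual findings.
standards = {
    "Q1: uses a Content feature each unit": 0.70,
    "Q2: believes eCLASS improved teaching/learning": 0.75,
    "Q3: finds eCLASS training helpful": 0.80,
}
respondents = 40                      # planned sample (5 teachers x 8 departments)
observed_yes = {                      # hypothetical tallies from the faculty survey
    "Q1: uses a Content feature each unit": 30,
    "Q2: believes eCLASS improved teaching/learning": 32,
    "Q3: finds eCLASS training helpful": 33,
}

for question, standard in standards.items():
    rate = observed_yes[question] / respondents
    verdict = "meets" if rate >= standard else "does not meet"
    print(f"{question}: {rate:.0%} ({verdict} the {standard:.0%} standard)")
```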

In order to effectively draw an evaluative conclusion, the Gwinnett County Public

Schools' Gwinnett Teacher Effectiveness System will be used to provide a benchmark and
standard. The Gwinnett Teacher Effectiveness System is a detailed system that administrators use

to evaluate teachers each year. Teachers can receive the following ratings: ineffective, needs

development, proficient, and exemplary. There are ten performance standards, and two of them

tie to the learning management system eCLASS: instructional planning and instructional

strategies. With instructional planning, it is expected that teachers use the local district resources

to address all needs for all students. eCLASS falls into a local district resource that should be

used for teaching and learning. With instructional strategies, it is expected that teachers will

engage student learning by using best practice strategies, including online learning. eCLASS

falls into this performance standard seamlessly, and many evaluators analyze a teacher's

eCLASS usage.

References

El-Mowafy, A., Kuhn, M., & Snow, T. (2013). Blended learning in higher education:

Current and future challenges in surveying education. Issues In Educational Research, 23(2),

132-150.

GCPS. (2017). Gwinnett effectiveness initiative. Retrieved from

https://publish.gwinnett.k12.ga.us/gcps/myhome/public/employment/gei

GCPS. (2017). What is eCLASS? Retrieved from

https://publish.gwinnett.k12.ga.us/gcps/home/public/about/eclass/content/what-is-eclass
Teague, C. C. (2013). Learning at Georgia Virtual School. Distance Learning, 10(4), 15-22.

Appendix A: Program Logic Model

Inputs:
- Teachers
- eCLASS training materials & documents & tutorials
- eCLASS trainers (lead innovators & learning specialists)
- Instructional technologists train APs how to use specific tools

Activities:
- Technology Tuesday Trainings
- Friday Lunch & Learns for Share & Tell
- eCLASS Lead Innovator (one rep per department) training
- Leads share lessons & instructional practices on eCLASS
- Tech tools, tips, and best practices uploaded to eCLASS
- APs use Content and News features in eCLASS

Outputs:
- Teacher attendance logged & documented at Tech Tuesday, Lunch & Learn, etc.
- Training presentations created for staff development by lead innovators
- Leads create shared lesson plans and template documents for courses with standards
- Leads create a Best Practices document to upload and share
- APs make Admin Bulletin document to post on eCLASS digital course page with news

Short-Term Outcomes:
- Teachers learn how to create a news item, discussion post, dropbox, and assessment in eCLASS
- Teachers view eCLASS lead innovators as resource and support personnel for their eCLASS
- Teachers learn best practices and use shared lessons on their course pages
- Teachers learn AP News & Content & Announcements through eCLASS tools

Intermediate Outcomes:
- Teachers select a few features to incorporate onto their page for their upcoming instructional units
- Teachers use eCLASS as a platform to share & collaborate with their content teams and eCLASS lead innovators
- Teachers take best practices and incorporate them into their individual course pages
- Teachers direct themselves to eCLASS for Admin news & content

Long-Term Outcomes:
- eCLASS is successfully and effectively utilized to build digital content to enhance teaching & learning through extensive teacher technology training and eCLASS lead innovators
- Teachers consistently enhance units with eCLASS features
- Teachers view eCLASS as an effective resource for teaching and learning
- Teachers confidently improve in their instructional practice by implementing eCLASS features supported by Admin

Appendix B: Data Collection Matrix

Research Question 1: To what extent do teachers use eCLASS content features for each instructional unit they teach?
- Instrument/Data Source: eCLASS usage reports; Participants: Faculty (150 teachers); Timing: October 2016
- Instrument/Data Source: Faculty survey; Participants: Local School Technology Coordinator (1); Timing: November 2016-December 2016
- Instrument/Data Source: eCLASS Learning Specialist interviews; Participants: eCLASS Learning Specialist (1); Timing: October 2016

Research Question 2: How many teachers believe eCLASS has improved their teaching and their students' learning?
- Instrument/Data Source: Faculty survey; Participants: Faculty (150 teachers); Timing: April 2016
- Instrument/Data Source: GA Milestones 2015 & 2016 test scores; Participants: Assistant Principal for Testing (1); Timing: June 2016
- Instrument/Data Source: Teacher Evaluation Self-Assessment 2015 & 2016 ratings; Participants: Assistant Principal for Teacher Evaluations & Staff Development (1); Timing: October 2016
- Instrument/Data Source: eCLASS student user completion reports; Participants: Local School Technology Coordinator (1); Timing: May 2016

Research Question 3: How many teachers find the training (lead innovators, Tech Tuesdays, Lunch and Learns) helpful for their learning?
- Instrument/Data Source: Staff Development Surveys; Participants: Assistant Principal for Teacher Evaluation & Staff Development (1); Timing: April 2016
- Instrument/Data Source: eCLASS usage reports; Participants: Local School Technology Coordinator (1); Timing: October 2016
- Instrument/Data Source: Training Attendance Logs; Participants: Assistant Principal for Teacher Evaluation & Staff Development (1); Timing: May 2016

Appendix C: Faculty Survey

eCLASS is the official learning management system for Gwinnett County Public Schools. Please complete the
following survey to assess and evaluate your experience with eCLASS.

1. How often during the past month have you used eCLASS?
Daily
Weekly
Rarely
Never

2. Please indicate how eCLASS is used typically in your courses. (Check all that apply)

To push out information only, such as posting a syllabus or other handouts


To promote interaction outside of the classroom by using discussion boards, dropbox, assessments,
etc.
To flip my classroom completely. All of my material is online, and students are expected to watch
and view the content.

Please use the following rating system for questions 3-4 to select your satisfaction or dissatisfaction
1 = very satisfied 4 = dissatisfied
2 = satisfied 5 = very dissatisfied
3 = neutral 6 = no opinion
3. If you use eCLASS, please indicate your overall satisfaction by selecting the appropriate number for each
component.

System availability
System response time
Ease-of-use
Tech support
Online self-help materials
Training
Course management tool
Online collaboration tool
Overall satisfaction
Student learning

4. Please indicate your overall satisfaction with eCLASS features by selecting the appropriate number for each
feature.
Dropbox
Calendar
News
Content
Safari Montage
Assessments
Discussion
Bookmarks
Course Shells
5. Please indicate your agreement with the following statements.
1 = strongly agree 2 = agree 3 = neutral
4 = disagree 5 = strongly disagree n/a = don't know/don't use
eCLASS is critical in my teaching.
The associated support services are critical in my teaching.
eCLASS is very useful as a tool to enhance student learning.
Appendix D: eCLASS Learning Specialist Interview

1. How do other schools in the county conduct their eCLASS training?


2. Who leads the training?
3. What are some of the features or topics you addressed in training?
4. Who leads and/or facilitates the training?
5. What is the delivery method for training and how often are training opportunities

available?
6. What is an acceptable measure of the number of teachers who use eCLASS

features for each instructional unit?


7. How do other schools measure student learning using eCLASS as the

measurement tool?
8. What are the best practices in eCLASS for teacher collaboration?
9. What are best practice strategies to promote and engage student learning using

eCLASS that you have seen at other schools?


10. How do other schools factor eCLASS usage and participation into the Teacher

Evaluation system?
Appendix E: Evaluation Contract

EVALUATION CONTRACT

This is an agreement between Lorraine McCullough, Arika Collins, and Michael Johnson
(hereinafter referred to as the Evaluator) and Collins Hill High School (hereinafter referred to as
the Evaluation Client).

GENERAL INFORMATION

Title of Project: eCLASS Learning Management Training and Staff Development

Scope of Work: The Evaluator will collect data (or use previously collected data) and
analyze this data to answer three evaluation questions regarding implementation or the
effectiveness of the eCLASS Learning Management Platform as it relates to the mutually agreed
upon components of the program's operation or intended outcomes in a formative capacity.

WORK STEPS

Work steps include the following: a) develop a program logic model for the evaluation; b)
literature review as it relates to the function of the program and/or importance of the
program in the local setting; c) develop evaluation questions that are mutually agreeable
between the Evaluator and the Evaluation Client that will drive data collection and
analysis; d) identify and document data collection methods; e) define data collection
resources; f) define the sample of participants for which data collection and analysis will
be applicable; g) have evaluation plan peer- and self-reviewed; h) collect or gather
relevant data; i) analyze the data using valid and reliable analysis techniques (e.g.,
statistics) and tools; j) review initial findings with Evaluation Client and incorporate input
as necessary; k) prepare draft evaluation report and incorporate feedback from the
Evaluation Client as necessary, and l) submit a final evaluation report.

FIELD VISITS

All on-site fieldwork, data collection, interviews, and/or observations shall be


coordinated with the Evaluation Client. Off-site fieldwork is not expected, but if required
will be coordinated with the Evaluation Client.

PROJECT MANAGEMENT

Background: The purpose of this evaluation is to formatively investigate the


effectiveness of the eCLASS Learning Management Training and Staff Development. Evaluation
findings will be used to improve the program.

Performance Period: Evaluation planning will take place during October-November 2016, and
data collection and analysis will occur between January 2017 and April 2017.
Type of Contract: Time and materials. Any costs of the evaluation will be covered by the
Evaluator.

CONTRACT AWARD MEETING


The Evaluator shall not begin work on the evaluation until the Evaluator and the
Evaluation Client have met and approved the evaluation plan that is outlined within this
Evaluation Contract/Statement of Work.

GENERAL REQUIREMENTS
1. If desired, the Evaluation Client may designate another organizational
representative to serve as the Evaluation Client to act on his or her behalf. A
signature is needed by both the Evaluation Client (who represents
organizational/building level authority) and the designated Evaluation Client.
2. The Evaluation Client will approve and provide access to 2015-2016 GA
Milestones test scores, 2015-2016 and 2016-2017 Teacher Evaluation Self-
Assessment ratings, professional development attendance logs, and eCLASS usage
reports.
3. The Evaluation Client will authorize surveys, interviews, or observations.
4. Requests for additional data or research beyond the items listed above will
require written approval by the Evaluation Client and will be attached to the
Evaluation Contract as an Addendum.
5. All written deliverables shall be phrased in acceptable terminology of the
field; words shall be defined in layperson language. If necessary, statistical and other
technical terms shall be defined in a glossary of terms and referenced for validity and
usefulness.
6. Electronic copies of the draft deliverables will be submitted to the
Evaluation Client via email for review and feedback. When the Evaluator and
Evaluation Client meet to review a deliverable, a hard copy will also be provided by
the Evaluator. If there is not a response from the Evaluation Client within three
business days from the date the item was delivered, it shall be deemed approved.
The Evaluator will have three business days to deliver the final deliverables from the
date of receipt of the Evaluation Client's comments. All deliverables shall be
delivered in software used by the Evaluation Client.
7. The Evaluation Report will be written to adhere to the APA Style (6th
Edition).
8. A formal presentation of the evaluation findings will be conducted with
the Evaluation Client.

SPECIFIC MANDATORY TASKS AND ASSOCIATED DELIVERABLES

Description of Tasks and Associated Deliverables:


The Evaluator will provide the specific deliverables described below.

Deliverable 1: An Evaluation Plan which describes a background of the eCLASS


Learning Management Training and Staff Development, a program logic model, the
agreed upon evaluation questions, a description of the evaluator, a detailed description of
the intended evaluation activities including the sample, data collection methods, data
analyses, relevant instruments including program documentation, contract, and
formative/summative metaevaluation forms.

Task 1: The Evaluator will develop the program logic model to be used to guide the
program evaluation. The program logic model will be shared with the Evaluation Client
for feedback.
Task 2: The Evaluator will use the program logic model to identify evaluation questions.
The questions will be mutually agreed upon by the Evaluator and Evaluation Client.
Task 3: A literature review will be conducted to defend the purpose of the program, its
function, and the need for evaluation work, useful instruments, as well as to identify
relevant standards for drawing an evaluative conclusion.
Task 4: The Evaluator will develop or find instruments for collecting data, or identify
relevant existing datasets.
Task 5: The Evaluator will write a data collection and analysis plan.
Task 6: The Evaluator will have a draft of the evaluation plan formatively metaevaluated
by peers in the MEDT 8480 course.
Task 7: The evaluator will conduct a summative metaevaluation of the evaluation plan.

Deliverable 2: A data collection and analysis report to inform the Evaluation Client on
the Evaluators ability to successfully collect data and complete the relevant data
analyses.

Task 1: The Evaluator will collect new or gather existing data.


Task 2: The Evaluator will use the methodology described in the evaluation plan to
analyze the data.
Task 3: The Evaluator will share the results of the data analysis with the Evaluation
Client for input, and update the Evaluation Client on emerging analyses that may be
relevant or if certain planned analyses are not feasible.

Deliverable 3: A draft evaluation report which contains all relevant components of the
evaluation report will be provided to the Evaluation Client for input and reaction.

Task 1: The Evaluator will write an evaluation report starting with the text of the
evaluation plan. First the evaluator will update this text to the past tense to reflect that
the data collection and analyses have been conducted. The specific methodologies, if
different than what was reported in the plan, should be updated as well to reflect the
realities of the evaluation effort. The evaluation report will include a write up of the
evaluation findings in response to the evaluation questions, and include a description of
any limitations associated with the effort, and recommendations for improving the
program as well as next steps for further evaluation work.
Deliverable 4: A final evaluation report in APA format.

Task 1: The Evaluator should decide which feedback from the Evaluation Client to
include. In some cases, the Evaluator will choose not to incorporate specific feedback,
and will instead justify the decision not to change the findings.
Task 2: A formal presentation with the Evaluation Client and any parties designated by
the Evaluation Client will be conducted to discuss findings and important
recommendations.

SCHEDULE FOR DELIVERABLES

Evaluation Plan (due November 13, 2016), which includes:
- Background Information on Program to be Evaluated
- Program Logic Model
- Evaluation Questions (previously agreed upon)
- Sampling Plan (if necessary)
- Data Collection and Analysis Plan
- Instruments
- Evaluation Contract
- External Metaevaluation (peer review) forms
- Internal Metaevaluation (self-reflection) form

Data Collection and Analysis Report: November 2016-May 2017
Draft Report for Evaluation Client to Review: April 30, 2017
Evaluation Report: May 31, 2017
Presentation to Evaluation Client and Organization: June 15, 2017

The Evaluator shall provide all deliverables to the Evaluation Client as agreed upon in the
schedule established at the initial meeting, and outlined in the table above. Unless
otherwise specified, the number of draft copies and the number of final copies shall be
the same. If for any reason any deliverable cannot be delivered within the scheduled time
frame, or the contents of the deliverable changes, the Evaluator is required to explain why
in writing to the Evaluation Client, including a firm commitment of when the work shall
be completed. This notice to the Evaluation Client shall cite the reasons for the delay and
the impact on the overall project. The Evaluation Client shall then review the facts and
issue a response within three business days.

CHANGES TO STATEMENT OF WORK


Any changes to this statement of work shall be agreed upon and approved by both the
Evaluator and the Evaluation Client. A copy of each change will be documented and kept
in a project folder along with all other products of the program evaluation project. Any
costs associated with the changes will be at the Evaluator's expense.

REPORTING REQUIREMENTS
The Evaluator will provide the Evaluation Client with weekly progress reports via email.
The progress reports shall cover all work completed during the preceding week and shall
present the work to be accomplished during the subsequent week. This report shall also
identify any problems that arose with a statement explaining how the problem was
resolved. This report shall also identify any problems that have arisen but have not been
completely resolved with an explanation.

TRAVEL AND SITE VISITS


No travel is required or expected. Any unexpected travel must be pre-approved by the
Evaluation Client.

SCHOOL RESPONSIBILITIES
The school shall provide access to technical and procedural information regarding the
eCLASS Learning Management Training and Staff Development. The school shall
provide a copy of a confidentiality statement (if required) upon request by the Evaluator.
If required, the Evaluation Client agrees to work with the Evaluator to adhere to any
district-level data access or data use agreements.

CONTRACTOR EXPERIENCE REQUIREMENTS

The Evaluator will perform this evaluation as an authentic learning experience to fulfill
requirements in the Ed.S. program School Library Media and/or Instructional Technology
program at the University of West Georgia, College of Education, Department of Media
and Instructional Technology. The professor for this course is Dr. Carl Westine (Email:
cwestine@westga.edu, Office Telephone: 678-839-6095).

CONFIDENTIALITY AND NONDISCLOSURE

This evaluation effort is considered to be an internal evaluation for the purposes of


program improvement, the results of which will not be released or disseminated beyond
Collins Hill High School. Only staff members as approved by the Evaluation Client will
be able to view the findings of the program evaluation. All data will be kept confidential,
and no student names or identifying characteristics/information will be used in the
evaluation report.
EVALUATION PLAN AND CONTRACT APPROVAL

The evaluation plan including the evaluation contract was reviewed and accepted by:

Evaluation Client: Evaluation Client Representative


(if different):

Print Name Print Name

Name of Organization Name of Organization

Email Address Email Address

<NOT NEEDED> <NOT NEEDED>


Signature Signature
Date Date
Appendix F: External Metaevaluation

Qualitative Metaevaluation Form Using The Program Evaluation Standards, 3rd Edition

MEDT 8480

Evaluator: Lorraine McCullough, Arika Collier

Metaevaluator: Stephanie Stone, Michelle Hanners, 11/6/16


Sundi Cowser, NJemele Bush

NAME DATE

Overall Comments:

Utility: the evaluation purpose is clear, the evaluation serves the needs of stakeholders,
more specification could be used on the individual and cultural values that underpin the
purpose, processes and judgements of the evaluation.

Feasibility: The evaluation procedures are practical. A specified timeline or schedule for
executing various activities of the evaluation plan would be good to include. Google
documents will be used for the surveys, but other resources needed for the evaluation
are unclear.

Propriety: The evaluation appears to be responsive to the school system and the
community it serves. The evaluation agreement is not attached. No discussion of
human rights is evident. Are efforts to protect privacy (FERPA) included? The
evaluation seems to be fairly clear (except where otherwise noted) and fair. Your
evaluation will surely provide your findings in a clear manner. All of the questions
whether in interviews or on forms will be clear, I am sure. Are any of the evaluators
members of the GCPS? This could present a conflict of interest, whether positive or
negative. We think it was unclear to all of us what fiscal notes we should make. We
are not sure how we are supposed to know this without actually participating in an
evaluation.

Accuracy: As is in this report, we believe that your conclusions would be justified


correctly. The information gained should serve its ultimate purpose - to determine if the
program is being used and implemented as intended. We believe your data collection
will yield reliable information. The program description you have provided seems to
provide enough details for the evaluation purpose. We believe that Google Drive is a
reliable and sufficient storage for your information and for collection purposes. Are your
interviews going to be recorded? Your design seems to be adequate and we are sure
the analyses will be as well. The reasoning and communicating standards are hard to
assess through just a pre-report.

Evaluation accountability: We believe that your evaluation report, except as otherwise


noted, documents your review fully. Most of the evaluation standards are evident in your
report. The only problem we find evident is if some of your evaluators are employees of
Gwinnett County. The conflict of interest is of concern to us in regards to that.

Additional Feedback: No appendices are attached; I am sure when you add your google
forms and interview questions, some of our questions will be cleared up.

Each standard below was rated for its relevance to the evaluation (Hardly / Fairly / Clearly / Highly) and given qualitative feedback/comments.

UTILITY

U1 Evaluator Credibility: Evaluations should be conducted by qualified people who establish and maintain credibility in the evaluation context.
Feedback: I would make a paragraph for each of you describing why you are qualified. Kind of like an intro to your resume.

U2 Attention to Stakeholders: Evaluations should devote attention to the full range of individuals and groups invested in the program and affected by its evaluation.
Feedback: Perhaps make mention of students in this section and refer to how they are affected by the program and how they could benefit from the evaluation of the program.

U3 Negotiated Purposes: Evaluation purposes should be identified and continually negotiated based on the needs of stakeholders.
Feedback: Again, it may be useful to include purpose as it relates to student achievement. How is eClass usage in the classroom tied to student success? This is one of the long-term outcomes and should be mentioned in the purpose of the evaluation.

U4 Explicit Values: Evaluations should clarify and specify the individual and cultural values underpinning purposes, processes, and judgments.
Feedback: The cultural values of the school include student achievement and teacher competency. The purpose of the evaluation is to ensure student achievement through using eClass. The processes and judgements are guided by the school culture.

U5 Relevant Information: Evaluation information should serve the identified and emergent needs of stakeholders.
Feedback: The need section is thorough and provides a clear understanding of why the program should be evaluated. When referring to the wide range of capabilities of teachers using the program, it may be beneficial to elaborate on these various levels of familiarity. Are the majority of teachers in the county struggling with the program? 50%? Less than 50%?

U6 Meaningful Processes and Products: Evaluations should construct activities, descriptions, and judgments in ways that encourage participants to rediscover, reinterpret, or revise their understandings and behaviors.
Feedback: It may be a good idea to give a quick overview of the assessment features of the program: discussion posts, quizzes, dropbox capabilities, etc. This is mentioned in the need section, but prior to elaborating on why teachers need to use these features, it would be good for the reader to understand how these features can enhance instruction.

U7 Timely and Appropriate Communicating and Reporting: Evaluations should attend to the continuing information needs of their multiple audiences.
Feedback: The eClass program is currently being utilized county-wide, but is still relatively new in terms of implementation. An evaluation at this time should provide evaluators with a wide range of usages and abilities from which to draw on.

U8 Concern for Consequences and Influence: Evaluations should promote responsible and adaptive use while guarding against unintended negative consequences and misuse.
Feedback: Findings will need to be presented in a way that avoids singling any teacher out for non-compliance. With regards to the GTES information, the evaluator would have to connect teacher usage to GTES rating, and this may be a breach of confidentiality.

FEASIBILITY

F1 Project Management: Evaluations should use effective project management strategies.
Feedback: Appropriate staff members were chosen to help with the evaluation. A timeline or schedule for completing various activities in the evaluation could be included.

F2 Practical Procedures: Evaluation procedures should be practical and responsive to the way the program operates.
Feedback: Involving staff members in pulling data on usage, having teachers complete surveys about professional development, and random sampling in each department are all good evaluation procedures.

F3 Contextual Viability: Evaluations should recognize, monitor, and balance the cultural and political interests and needs of individuals and groups.
Feedback: This doesn't seem to be clearly represented in the evaluation plan.

F4 Resource Use: Evaluations should use resources effectively and efficiently.
Feedback: Other than using Google docs, it is unclear what other resources are being used.

PROPRIETY

P1 Responsive and Inclusive Orientation: Evaluations should be responsive to stakeholders and their communities.
Feedback: I am wondering about the students' and/or their parents' opinions. Is it easy for them to use and follow? Do they use it every night?

P2 Formal Agreements: Evaluation agreements should be negotiated to make obligations explicit and take into account the needs, expectations, and cultural contexts of clients and other stakeholders.
Feedback: There is no evidence of this in the evaluation plan.

P3 Human Rights and Respect: Evaluations should be designed and conducted to protect human and legal rights and maintain the dignity of participants and other stakeholders.
Feedback: Participants' and stakeholders' human and legal rights are protected by the anonymity of using Google surveys.

P4 Clarity and Fairness: Evaluations should be understandable and fair in addressing stakeholder needs and purposes.
Feedback: It is important to reiterate to teachers that the evaluation is not for the purpose of catching teachers not utilizing the program, but to assess their level of understanding of the program and how it can directly and positively affect student achievement and engagement.

P5 Transparency and Disclosure: Evaluations should provide complete descriptions of findings, limitations, and conclusions to all stakeholders, unless doing so would violate legal and propriety obligations.
Feedback: Not applicable in the draft.

P6 Conflicts of Interest: Evaluations should openly and honestly identify and address real or perceived conflicts of interest that may compromise the evaluation.
Feedback: Are the evaluators teachers at the school? Are the evaluators employees of GCPS? If so, this could present a potential conflict of interest that needs to be mentioned at the beginning of the evaluation.

P7 Fiscal Responsibility: Evaluations should account for all expended resources and comply with sound fiscal procedures and processes.
Feedback: Not addressed in the evaluation plan.

ACCURACY

A1 Justified Conclusions and Decisions: Evaluation conclusions and decisions should be explicitly justified in the cultures and contexts where they have consequences.
Feedback: The goal of 80% usage seems a little high given that different content areas will have different levels of usage. For example, a P.E. course may have a significantly lower usage than a language arts or social studies course. Also, what does usage consist of? Teachers integrating instructional materials onto the program or students accessing the instructional materials from the program?

A2 Valid Information: Evaluation information should serve the intended purposes and support valid interpretations.
Feedback: Tying GTES data to eClass usage seems to be a bit of a stretch. GCPS teachers are expected to integrate technology into their teaching strategies, but the evaluation rubric doesn't specifically mention eClass, and technology usage is only one aspect of that particular standard. It may be that teachers are ranked as exemplary or proficient without ever having utilized eClass, and similarly, a teacher could receive a rating of needs improvement after incorporating eClass into every lesson. The usage of eClass and a teacher's performance rating are not mutually exclusive and could result in skewed evaluation findings.

A3 Reliable Information: Evaluation procedures should yield sufficiently dependable and consistent information for the intended uses.
Feedback: The information intended to be gathered is dependable in terms of the evaluation questions posed. One suggestion would be to integrate some kind of student component into the data collection, especially for evaluation question two. Can a teacher fully assess whether a student's learning has been positively impacted?

A4 Explicit Program and Context Descriptions: Evaluations should document programs and their contexts with appropriate detail and scope for the evaluation purposes.
Feedback: Using 40 participants from the school to establish a cohesive sample for the perception of the program is an effective way to gain this information. It may be beneficial to mention the total number of faculty at the school where the program is being evaluated.

A5 Information Management: Evaluations should employ systematic information collection, review, verification, and storage methods.
Feedback: Google Drive is a reliable storage method and Google Forms does a fantastic job of organizing data.

A6 Sound Designs and Analyses: Evaluations should employ technically adequate designs and analyses that are appropriate for the evaluation purposes.
Feedback: The evaluation includes a mixed method design to acquire data and analyze it with various methods. Goodman and Kruskal's gamma is an appropriate statistical test for analysis.

A7 Explicit Evaluation Reasoning: Evaluation reasoning leading from information and analyses to findings, interpretations, conclusions, and judgments should be clearly and completely documented.
Feedback: The reasoning for the evaluation is explained clearly and completely.

A8 Communication and Reporting: Evaluation communications should have adequate scope and guard against misconceptions, biases, distortions, and errors.
Feedback: A formal evaluation agreement would make misconceptions and biases almost nonexistent.

EVALUATION ACCOUNTABILITY

E1 Evaluation Documentation: Evaluations should fully document their negotiated purposes and implemented designs, procedures, data, and outcomes.
Feedback: One form of data mentioned is the number of clicks each teacher has on his/her eClass page. It is possible that this information could be skewed dramatically, and the number of visits to the page doesn't necessarily translate into effective use of the program. The connection between eClass usage and EOC/Milestones scores isn't very clear.

E2 Internal Metaevaluation: Evaluators should use these and other applicable standards to examine the accountability of the evaluation design, procedures employed, information collected, and outcomes.
Feedback: While there are a few standards that are not addressed in the draft, the internal evaluation seems to hone in on almost all of the evaluation standards.

E3 External Metaevaluation: Program evaluation sponsors, clients, evaluators, and other stakeholders should encourage the conduct of external metaevaluations using these and other applicable standards.
Feedback: The external metaevaluation was conducted over the course of a week by a group of evaluators. A Google Doc was used to collaborate on the plan and to record the assessment of standards addressed throughout the plan.
Appendix G: Internal Metaevaluation

Qualitative Metaevaluation Form Using The Program Evaluation Standards, 3rd Edition
MEDT 8480
Evaluator: Group 14

Metaevaluator: Group 14 11/12/2016

NAME DATE

Instructions: Rate the relevance of each standard as it currently applies to the present
evaluation effort. Then provide feedback to the evaluator on each standard by highlighting
where the evaluator addresses compliance with the standard statement in the evaluation
plan, and the extent that the standard is being met. If in your opinion the evaluator has not
or has insufficiently addressed the standard, indicate so and provide constructive feedback
or a suggestion as to how to improve the plan. Finally, in the Overall Comments section
found below, summarize your feedback as it pertains to the evaluation's Utility, Feasibility,
Propriety, Accuracy, and Evaluation Accountability. Try to come up with an overall
statement of the evaluation plan's merit, taking into consideration the context of the
evaluation, the relevance of the evaluation standards to the evaluation effort, and the extent
that the standards are adhered to and met by the evaluator in the evaluation plan. NOTE:
This particular assignment is to provide students an experience with an important and often
overlooked aspect of evaluation: metaevaluation. Your metaevaluative conclusions and
feedback will in no way negatively impact the course grade of the particular evaluator;
however, they should be a constructive, yet fair assessment so as to help improve the overall
evaluation effort.
When addressing each standard, consider the whole evaluation plan including appendices.
This is particularly true of the Accuracy Standards. Are the evaluation instruments (if any)
and methods sufficient to answer the evaluation questions? If you have additional
feedback (e.g., formatting, editing, APA), please comment in the Additional Feedback
section.
Overall Comments:

The utility is the extent that the stakeholders have judged the program in regards to
their needs. We feel that the stakeholders have determined that our program is
viable and will meet their needs. School Assistant Principals, the Local School
Technology Coordinator, the Learning Specialist, and teachers all share a part in
the vision and implementation of the program in order to provide technology
support. The evaluation is feasible because the data collection is effective and
efficient. Utilizing resources immediately available at school, such as the students,
the testing office, and staff development records, proves to be an effective and
efficient use of time and energy, therefore making this plan feasible. The evaluation
plan also appears to hold the highest level of propriety. Data is collected from a
variety of sources, and none of it is manipulated or misrepresented. This makes the
program very reliable, because the numbers are true and reflected accurately. The
evaluation plan adheres to accuracy, but it is important to discuss the cultural values
and standards in a little more detail to give credibility to the interpretations. Finally,
to ensure evaluation accountability, the following documents were included in the
plan: the external metaevaluation, the contract, and the team's internal metaevaluation.
Overall, this evaluation is well-written and justified in its purpose, procedure, and
proposed goals.

Additional Feedback:

N/A

StandardS STANDARD Relevance to the Evaluation (Check One) Qualitative Feedback/Comments


STATEMENTS
HARD FAIR CLEAR HIGH
LY LY LY LY

UTILITY

U1 Evaluator Credibility
Standard statement: Evaluations should be conducted by qualified people who establish and maintain credibility in the evaluation context.
Feedback/Comments: All three evaluators are listed along with their respective qualifications. Arika is a seasoned teacher with many years of experience and expertise in a variety of areas; this experience helps her understand the diverse needs of the different teachers involved with the eCLASS system. Michael has more than five years of experience utilizing learning management systems in the business and technology courses that he teaches. He understands the usefulness and viability of having a way to facilitate blended learning activities. Lorraine is an accomplished educator with strong organizational and leadership skills. She is an eCLASS innovator and is dedicated to facilitating technology training for teachers.

U2 Attention to Stakeholders
Standard statement: Evaluations should devote attention to the full range of individuals and groups invested in the program and affected by its evaluation.
Feedback/Comments: The evaluation team addresses the stakeholders that are integral to the success of the eCLASS initiative: School Assistant Principals, Local School Technology Coordinator, Learning Specialist, and teachers. The administration and technology team is addressed along with the teachers in the evaluation purpose. Under the subheading "Description of Who Will Be Involved in the Evaluation Effort," the evaluation team explains who is involved and what each person is responsible for.

U3 Negotiated Purposes
Standard statement: Evaluation purposes should be identified and continually negotiated based on the needs of stakeholders.
Feedback/Comments: Under the subheading "Evaluation Purpose," the evaluation team plans to evaluate the effectiveness of the new local eCLASS technology training and support activities for teachers. The evaluation is said to be necessary due to inconsistencies in eCLASS utilization and ability. The logic model does address the proposed outcomes designated in the initial statement of purpose.

U4 Explicit Values
Standard statement: Evaluations should clarify and specify the individual and cultural values underpinning purposes, processes, and judgments.
Feedback/Comments: The evaluation does not fully address the cultural values in the program description. It explains that blended learning is a top focus in Georgia and describes the influx of online learning programs for both high school and college students. The program description explains the vision of eCLASS. Student engagement and achievement are the focus; however, the individual reasons why engagement and achievement were low, and which specific groups were subject to low engagement or achievement, were not initially explained.

U5 Relevant Information
Standard statement: Evaluation information should serve the identified and emergent needs of stakeholders.
Feedback/Comments: The evaluation team identifies that there is a growing need to accommodate students with online, blended learning opportunities. The team also explains the need for teachers to be trained in eCLASS and to utilize and adapt their teaching practices through it. Student engagement and achievement are designated in paragraph 2 under the subheading "Description of the Program Evaluated."

U6 Meaningful Processes and Products
Standard statement: Evaluations should construct activities, descriptions, and judgments in ways that encourage participants to rediscover, reinterpret, or revise their understandings and behaviors.
Feedback/Comments: Relevant information is gathered through Appendix A: Program Logic Model. Each activity is broken down according to specific inputs and outputs. Several activities are associated with the eCLASS initiative, including Technology Tuesday Trainings, eCLASS lead innovator training, leaders sharing lessons, and tech tools and practices. Each of these activities has specific outputs along with short-term, intermediate, and long-term outcomes.

U7 Timely and Appropriate Communicating and Reporting
Standard statement: Evaluations should attend to the continuing information needs of their multiple audiences.
Feedback/Comments: The evaluation team explains the needs of the administration and technology team with regard to the effectiveness of the initiative. Although it is implied that data is collected throughout the year from different sources, the collection is also explained in Appendix B, the Data Collection Matrix.

U8 Concern for Consequences and Influence
Standard statement: Evaluations should promote responsible and adaptive use while guarding against unintended negative consequences and misuse.
Feedback/Comments: The evaluation team does not address the misuse or negative application of the data presented, and the information listed does not address potential misuses. It does appear, though, that there are no evident underlying issues associated with negative consequences or misuse.

FEASIBILITY

F1 Project Management
Standard statement: Evaluations should use effective project management strategies.
Feedback/Comments: The evaluation plan demonstrates effective project management in a variety of ways. It explains the roles and responsibilities of the Local School Technology Coordinator, eCLASS learning specialist, and assistant principal, as well as the teachers. The evaluators also have unique experience to assist with project management.

F2 Practical Procedures
Standard statement: Evaluation procedures should be practical and responsive to the way the program operates.
Feedback/Comments: The evaluation team outlines the procedures of the initiative. Some data is meant to be collected to determine frequency of use, and there is both nominal and ordinal data from varied sources, including usage reports, test scores, and surveys. The Analysis subheading also explains how each specific quantitative and qualitative data component will be evaluated for evidence and support.

F3 Contextual Viability
Standard statement: Evaluations should recognize, monitor, and balance the cultural and political interests and needs of individuals and groups.
Feedback/Comments: The evaluation team demonstrated the need to learn what the different stakeholders value about the program and its evaluation. There are several examples: with the teachers, the plan recognizes both the usability and usefulness of eCLASS; with the Learning Specialist, it also considers the amount, extent, and manner in which teachers are being trained.

F4 Resource Use
Standard statement: Evaluations should use resources effectively and efficiently.
Feedback/Comments: Varied resources from assistant principals, lead innovators, teachers, and students are outlined in the Data Collection Matrix. Some of the main sources of data, such as test performance scores, access logs, and survey information, are readily available within the school, which promotes efficiency.

PROPRIETY

P1 Responsive and Inclusive Orientation
Standard statement: Evaluations should be responsive to stakeholders and their communities.
Feedback/Comments: The evaluation team has provided an evaluation draft whose responsiveness is demonstrated by the purpose of the evaluation. It addresses aspects such as bridging the gap in eCLASS knowledge and skill for teachers, along with developing further teacher technology training and support. Teachers are involved in the evaluation process.

P2 Formal Agreements
Standard statement: Evaluation agreements should be negotiated to make obligations explicit and take into account the needs, expectations, and cultural contexts of clients and other stakeholders.
Feedback/Comments: There is a formal agreement plan (Appendix D) issued by the evaluators. Components such as general information, work steps, project management, general requirements, descriptions of tasks, and associated deliverables are all included with completeness and accuracy.

P3 Human Rights and Respect
Standard statement: Evaluations should be designed and conducted to protect human and legal rights and maintain the dignity of participants and other stakeholders.
Feedback/Comments: The evaluation team did not specifically address human rights and respect. However, the data taken from teacher surveys and eCLASS usage will not include personal names. Also, teachers were randomly selected in an attempt to stratify the data. In this regard, the evaluation team took into account the human rights of this particular group.

P4 Clarity and Fairness
Standard statement: Evaluations should be understandable and fair in addressing stakeholder needs and purposes.
Feedback/Comments: The evaluation plan addresses how the stakeholders have developed the needs and purposes of the plan, which evaluates the effectiveness of the eCLASS technology training and support for teachers. Using a mixed-methods approach allows for a variety of data to meet the needs and purposes of the stakeholders.

P5 Transparency and Disclosure
Standard statement: Evaluations should provide complete descriptions of findings, limitations, and conclusions to all stakeholders, unless doing so would violate legal and propriety obligations.
Feedback/Comments: The plan provides for comprehensive disclosure of all findings. The Analysis and the Standards and Benchmarks for Evaluation Questioning sections provide the intent and expectations for the program's findings. The questioning is effective in determining the usefulness and application of eCLASS. There are no limitations that would constitute wrongdoing or legal action.

P6 Conflicts of Interest
Standard statement: Evaluations should openly and honestly identify and address real or perceived conflicts of interest that may compromise the evaluation.
Feedback/Comments: The evaluation team does not identify any discrepancies or conflicts of interest. Each evaluator demonstrates competency as an educator, and the team cohesively shares the same outlook for the program.

P7 Fiscal Responsibility
Standard statement: Evaluations should account for all expended resources and comply with sound fiscal procedures and processes.
Feedback/Comments: There is no mention of the evaluation team's fiscal responsibility, but this is largely irrelevant because there appear to be no costs associated with the evaluation.

ACCURACY

A1 Justified Conclusions and Decisions
Standard statement: Evaluation conclusions and decisions should be explicitly justified in the cultures and contexts where they have consequences.
Feedback/Comments: The evaluation document does not have justified conclusions and decisions because the evaluation has not been conducted yet. This standard is not applicable since this is the plan stage; therefore, conclusions and decisions are not possible at this point in time.

A2 Valid Information
Standard statement: Evaluation information should serve the intended purposes and support valid interpretations.
Feedback/Comments: After analyzing the methodology, it appears the evaluation will serve the intended purpose. Since the evaluation has not been conducted, the interpretations and results cannot be reported at this point in time. Based on the Standards for Evaluation section, the evaluation team will be able to interpret results effectively based on the benchmarks.

A3 Reliable Information
Standard statement: Evaluation procedures should yield sufficiently dependable and consistent information for the intended uses.
Feedback/Comments: In the methodology section, it is evident that the evaluation team uses valid and reliable data to support evaluation assertions, such as eCLASS usage reports, Georgia Milestones test scores, and student user reports. By using this data as part of the methodology, the evaluation team adds dependable and consistent information for the evaluation's purpose.

A4 Explicit Program and Context Descriptions
Standard statement: Evaluations should document programs and their contexts with appropriate detail and scope for the evaluation purposes.
Feedback/Comments: The evaluation team identifies the context appropriately in the introductory program description. The evaluation team also incorporates research into the program description to give the evaluated program more credibility and a point of reference.

A5 Information Management
Standard statement: Evaluations should employ systematic information collection, review, verification, and storage methods.
Feedback/Comments: The evaluation team employs systematic information collection in the Data Collection Matrix. For each evaluation question, the following are identified: indicator, data source, data collection method, responsible party, timing, and analysis plan. By outlining this, the evaluation team effectively addresses information collection, review, and verification.

A6 Sound Designs and Analyses
Standard statement: Evaluations should employ technically adequate designs and analyses that are appropriate for the evaluation purposes.
Feedback/Comments: The evaluation team employs data analysis that includes Goodman and Kruskal's Gamma test; an illustrative calculation of this statistic is sketched after the form below. This is addressed in more detail in the Analysis section. The qualitative data will be coded as well, which is also explained in the Analysis section.

A7 Explicit Evaluation Reasoning
Standard statement: Evaluation reasoning leading from information and analyses to findings, interpretations, conclusions, and judgments should be clearly and completely documented.
Feedback/Comments: Since the evaluation is not complete, it is difficult to assess this standard. However, the team does provide a detailed Analysis section to justify future judgments and interpretations after data collection.

A8 Communication and Reporting
Standard statement: Evaluation communications should have adequate scope and guard against misconceptions, biases, distortions, and errors.
Feedback/Comments: The evaluation team does not appear to have misconceptions, biases, distortions, or errors; however, this is difficult to assess because the report is not complete, as the evaluation has not been conducted.

EVALUATION ACCOUNTABILITY

E1 Evaluation Documentation
Standard statement: Evaluations should fully document their negotiated purposes and implemented designs, procedures, data, and outcomes.
Feedback/Comments: The evaluation draft does document the purpose, design, procedures, and data collection. The outcomes and negotiations are included in the contract.

E2 Internal Metaevaluation
Standard statement: Evaluators should use these and other applicable standards to examine the accountability of the evaluation design, procedures employed, information collected, and outcomes.
Feedback/Comments: As part of the final evaluation plan, the team completed an internal metaevaluation form. Evaluators should continue to apply evaluation standards to assess the quality of the evaluation.

E3 External Metaevaluation
Standard statement: Program evaluation sponsors, clients, evaluators, and other stakeholders should encourage the conduct of external metaevaluations using these and other applicable standards.
Feedback/Comments: This evaluation plan was evaluated by four other student colleagues using the Qualitative Metaevaluation Form that aligns with The Program Evaluation Standards. The four student colleagues each evaluated the plan and then came together to compose one metaevaluation from their findings. The metaevaluation was emailed to the group to provide feedback and is included in the document.
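
Note on the analysis named under A6: the short Python sketch below illustrates, in general terms, how Goodman and Kruskal's gamma can be computed for two ordinal variables of the kind described in the Data Collection Matrix. It is only an illustrative aid under assumed data, not part of the evaluation plan's specified procedures; the variable names and sample values (training attendance and self-rated eCLASS confidence on 1-4 scales) are hypothetical.

from itertools import combinations

def goodman_kruskal_gamma(x, y):
    # gamma = (C - D) / (C + D), where C and D count concordant and
    # discordant pairs of observations; tied pairs are ignored.
    concordant = discordant = 0
    for (x_i, y_i), (x_j, y_j) in combinations(zip(x, y), 2):
        product = (x_i - x_j) * (y_i - y_j)
        if product > 0:
            concordant += 1
        elif product < 0:
            discordant += 1
    if concordant + discordant == 0:
        return float("nan")  # every pair is tied
    return (concordant - discordant) / (concordant + discordant)

# Hypothetical ordinal data: training attendance level (1-4) paired with
# self-rated eCLASS confidence (1-4) for eight teachers.
attendance = [1, 2, 2, 3, 3, 4, 4, 4]
confidence = [1, 1, 2, 2, 3, 3, 4, 4]
print(round(goodman_kruskal_gamma(attendance, confidence), 2))

A gamma near +1 would suggest that higher training attendance tends to accompany higher self-rated confidence, which is the kind of association the evaluation questions are intended to probe.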
