
Compiled & Presented by

Michele Walden-Pinnock
Reflecting on
Curriculum Implementation
How was your first semester in 2011/2012?
How was Course Delivery?
How was Course Evaluation?
How did your students perform? Why?
What or who do your colleagues attribute this to?
Inter-relatedness of
Curriculum Elements
[Diagram: curriculum development as the inter-relation of Objectives, Content, Methodology, and Evaluation]
Who were the Curriculum Designers?
Understanding the Numbers Game
Coursework
Project - 25%
Group Work Research and Presentation - 30%
Essay - 20%
Class Test - 15%
How do we find the coursework grade?
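One way to do it, sketched below in Python purely for illustration: multiply each component's score by its weight and divide by the total weight. The weights come from the slide above; the student scores are hypothetical.

```python
# A minimal sketch of combining weighted coursework components.
# The component names and weights follow the slide; the student
# scores (out of 100) are hypothetical.

components = {
    "Project":                               (25, 68),  # (weight %, score out of 100)
    "Group Work Research and Presentation":  (30, 74),
    "Essay":                                 (20, 81),
    "Class Test":                            (15, 60),
}

total_weight = sum(weight for weight, _ in components.values())

# Weighted average, normalised by the total weight so the result is
# still a mark out of 100 even if the weights do not sum to exactly 100.
coursework_grade = sum(weight * score for weight, score in components.values()) / total_weight

print(f"Coursework grade: {coursework_grade:.1f} / 100")
```

With the hypothetical scores shown, the weighted average works out to about 71.6 out of 100.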

Getting Ahead of Our Game
We were responsible for writing the courses
We designed the evaluation types and weightings:
Coursework
Final Examination
and how the two were to be combined
We identified essential vs non-essential content
Retracing Our Steps
We must ALL take responsibility and right the wrong!
Without a Table of Specifications, assessment will produce scores that are of limited use and difficult to interpret.

A TOS will help all our stakeholders:
Students - study guide
Lecturers - teaching guide; reporting; accountability
Ministry - accountability; accreditation
What can we do at this point?
We MUST develop our Table of Specifications
and stand committed to using it as our
blueprint.
It is against this matrix that our courses will be
evaluated.


Establish Undergirding Principles
Our Philosophy on Teaching and Learning by Students [the bell-curve phenomenon?]
Criterion-Referenced Testing
Mastery Learning
Item Difficulty
Validity

Item Difficulty
The difficulty of the test depends on its purpose.
To monitor the performance of all students, the distribution of item difficulty should match the distribution of achievement in the target population.



NOTE: Do not exclude important areas of the curriculum simply because students perform very poorly or very well on them.

A common rule of thumb for distributing item difficulty:
2/3 of the test - items that 30-70% of students answer correctly
1/3 of the test - a mix of easier items (more than 70% answer correctly) and harder items (only about 30% are likely to answer correctly)
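To make the difficulty index concrete, here is a small Python sketch (not part of the original presentation) that computes each item's difficulty index, the proportion of students answering it correctly, from hypothetical 0/1 response data and sorts the items into the bands above.

```python
# A sketch of checking item difficulty against the distribution above.
# An item's difficulty index p is the proportion of students who
# answered it correctly. The response data below are hypothetical.

responses = {
    "Q1": [1, 1, 0, 1, 1, 0, 1, 1, 1, 0],   # 1 = correct, 0 = incorrect
    "Q2": [1, 0, 0, 1, 0, 0, 1, 0, 0, 1],
    "Q3": [1, 1, 1, 1, 1, 1, 1, 0, 1, 1],
}

for item, scores in responses.items():
    p = sum(scores) / len(scores)            # difficulty index
    if p > 0.70:
        band = "easy (more than 70% answer correctly)"
    elif p >= 0.30:
        band = "moderate (30-70% answer correctly)"
    else:
        band = "hard (about 30% or fewer answer correctly)"
    print(f"{item}: p = {p:.2f} -> {band}")
```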
NRT vs CRT
"Norm-Referenced Assessment: Assessment
designed to provide a measure of performance
that is interpretable in terms of an individual's
relative standing in some known group.

Criterion-Referenced Assessment: Assessment
designed to provide a measure of performance
that is interpretable in terms of a clearly defined
and delimited domain of learning tasks." (p. 42)
Linn and Gronlund (2000)
Understanding How CRT Relates to
Mastery Learning
"... include items that are directly relevant to the
learning outcomes to be measured, without
regard to whether the items can be used to
discriminate among students. No attempt is
made to eliminate easy items or alter their
difficulty.
The goal of the criterion-referenced test is to
obtain a description of the specific knowledge
and skills each student can demonstrate." (p. 43)
Power Tests vs Speed Tests
POWER TEST
On a Power Test all students are given enough time
to attempt and answer all items.

Items are arranged in a hierarchy from knowledge
level (easy) to increasing difficulty.

A power test should be administered so that a very
large percentage (90% is an acceptable minimum) of
the pupils for whom it is designed will have ample
time to attempt all of the items.
Power Tests vs Speed Tests
SPEED TEST
A speed test is one in which a student must, in a limited
amount of time, answer a series of questions or perform a
series of tasks of a uniformly low level of difficulty.
The intent of a speed test is to measure the rapidity with
which a pupil can do what is asked of him or her.
Speed of performance frequently becomes important after
students have mastered task basics as in using a keyboard,
manipulatives, or phonics.

Tests are often a mixture of speed and power, even when the test's purpose is to measure achievement level. Such tests are called partially speeded tests.
Considering TIME vs Weighting
Teachers must check time limits carefully to be sure that all students will have
the opportunity to address each test item adequately before the allotted
time is up.
The amount of time for the test is determined before test construction
and is facilitated by using a Table of Specifications.

Testing time is determined by:
the number of objectives to be tested;
coverage and complexity of the objectives;
levels of acceptable performance;
demographics of students - age and ability levels;
class time available;
types of test items;
length and complexity of test items.
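As a rough illustration of how these factors translate into a time estimate, the Python sketch below multiplies hypothetical item counts by assumed minutes per item type; none of the figures come from the presentation.

```python
# A rough sketch of estimating testing time from a planned item mix.
# The per-item minutes and the item counts are assumptions for
# illustration only; adjust them to your students' age and ability
# levels and to the complexity of the items.

minutes_per_item = {
    "multiple choice": 1.0,
    "short answer":    3.0,
    "essay":          15.0,
}

planned_items = {            # hypothetical counts taken from a draft TOS
    "multiple choice": 30,
    "short answer":     5,
    "essay":            2,
}

estimated_minutes = sum(count * minutes_per_item[item_type]
                        for item_type, count in planned_items.items())

print(f"Estimated testing time: {estimated_minutes:.0f} minutes")
# Compare the estimate with the class time available before finalising
# the number and types of items.
```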
Exploring Content
Content can be classified in many ways
Cognitive
Declarative; Procedural; Strategic (problem solving)
Using Bloom's Taxonomy
Psychomotor
Affective
Classifying content is important because different types of knowledge, skills, and attitudes are best assessed using specific strategies.
Declarative Knowledge
Factual information stored in memory and
known to be static.
Knowledge about something; it describes how things are: things, events, and processes, their attributes, and the relations between them.
Procedural Knowledge
Knowledge of how to perform, or how to
operate. [Know-how]. It involves making
discriminations, understanding concepts, and
applying rules that govern relationships and
often includes motor skills and cognitive
strategies.

Strategic Knowledge
Information that is the basis of problem solving: action plans to meet specific goals; knowledge of the context in which procedures should be implemented; actions to be taken if a proposed solution fails; and how to respond if necessary information is absent.

Content-Process Validity
A TOS ensures that items are representative of the material taught (adequate sampling).
The intellectual reasoning level [process] used
during instruction and intended by the
curriculum designers is mirrored in the
assessment.
Facts about Knowledge
All knowledge starts out as declarative information, and procedural knowledge is acquired through inferences from already existing knowledge.
It is said that one becomes more skilled in problem solving as one relies more on procedural knowledge than on declarative knowledge.
Table of Specifications
A Table of Specifications classifies each test item
according to what topic or concept it tests
AND what objective it addresses.

[Sample Table of Specifications]

Purpose of TOS
To ensure that there exists correspondence
between the learning objectives for the
students and the content of the course.

To ensure proper organization of assessment
procedures that best represent the material
covered in the teaching/learning process.
Benefits of a TOS
Ensures that an assessment has content validity, i.e. it tests what it was supposed to test; there is a match between what was taught and what is tested.

Ensures that the same emphasis on content during
instruction is mirrored on assessment (e.g., more items
about topic X and fewer about topic Y because you consider
X to be more important and you spent more time on X)

Ensures alignment of test items with objectives (e.g., important topics might include items that test interpretation, application, and prediction, while unimportant topics might be tested only with simpler recognition items).

Ensures that content is not overlooked or
underemphasized

Framework of TOS
A Table of Specifications consists of a two-way chart or grid
(Kubiszyn & Borich, 2003; Linn & Gronlund, 2000; Mehrens &
Lehman, 1973; Ooster, 2003) relating instructional objectives to the
instructional content.

The columns of the chart list the objectives or "levels of skills" (Gredler, 1999, p. 268) to be addressed;

The rows list the key concepts or content the test is to measure.

"We have found it useful to represent the relation of content and
behaviors in the form of a two dimensional table with the
objectives on one axis, the content on the other. The cells in the
table then represent the specific content in relation to a particular
objective or behavior" (Bloom et al., 1971).
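The sketch below (Python, for illustration only) shows one way such a two-way chart can be represented and totalled; the topics, levels, and item counts are hypothetical.

```python
# A minimal sketch of the two-way chart: rows are content areas,
# columns are objective levels, and each cell holds the number of
# items planned for that combination. Topics, levels shown, and
# counts are hypothetical.

levels = ["Knowledge", "Comprehension", "Application", "Analysis"]

tos = {
    "Topic A": {"Knowledge": 4, "Comprehension": 3, "Application": 2, "Analysis": 1},
    "Topic B": {"Knowledge": 2, "Comprehension": 2, "Application": 3, "Analysis": 1},
}

# Print the grid with row and column totals.
print("Content".ljust(10) + "".join(level.ljust(15) for level in levels) + "Total")
for topic, cells in tos.items():
    row = "".join(str(cells[level]).ljust(15) for level in levels)
    print(topic.ljust(10) + row + str(sum(cells.values())))

column_totals = [sum(tos[topic][level] for topic in tos) for level in levels]
print("Total".ljust(10) + "".join(str(t).ljust(15) for t in column_totals) + str(sum(column_totals)))
```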
Going Forward
Examine the Units according to their proposed duration and then identify weightings
Classify the objectives according to their emphasis on knowledge/skills/attitudes for each unit
(A worked sketch of this allocation follows the grids below.)
[Blank planning grid. Rows: UNIT 1: Understanding Self; UNIT 2: Diversity in the Classroom; UNIT 3: Professional Ethics and Teacher Relationships; SUBTOTAL; TOTAL. Columns: COGNITION, PSYCHOMOTOR, AFFECTIVE, TOTAL.]
[Blank TOS grid. Rows: UNIT 1: Understanding Self; UNIT 2: Diversity in the Classroom; UNIT 3: Professional Ethics and Teacher Relationships; SUBTOTAL; TOTAL. Columns: COGNITION, subdivided into Knowledge, Comprehension, Application, Analysis, Evaluation, and Synthesis; then PSYCHOMOTOR, AFFECTIVE, and TOTAL.]
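The Python sketch below illustrates the allocation step referred to earlier: unit weightings and cognitive-level emphases are multiplied out into item counts per cell. All of the percentages and the total item count are hypothetical placeholders.

```python
# A sketch of filling the blank grid above: unit weightings (for
# example, shares of teaching time) and cognitive-level emphases are
# turned into planned item counts. The weightings, emphases, and the
# 50-item total are all hypothetical.

total_items = 50

unit_weights = {
    "Unit 1: Understanding Self": 0.30,
    "Unit 2: Diversity in the Classroom": 0.40,
    "Unit 3: Professional Ethics and Teacher Relationships": 0.30,
}

level_emphasis = {
    "Knowledge": 0.20, "Comprehension": 0.25, "Application": 0.25,
    "Analysis": 0.15, "Evaluation": 0.10, "Synthesis": 0.05,
}

for unit, unit_share in unit_weights.items():
    print(unit)
    for level, level_share in level_emphasis.items():
        items = round(total_items * unit_share * level_share)
        print(f"  {level}: {items} item(s)")
# Rounded counts may need a small manual adjustment so that the
# grand total matches the intended number of items.
```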
Factors that can Influence the Design
of the TOS
A person's understanding of the content being measured
A person's understanding of the purpose of assessment
Time and resources will not permit the testing
of every objective/content on a syllabus
Therefore the CORE SKILLS/ ESSENTIAL TOPICS
ought to be agreed upon by the experts.
Let's Not Reinvent the Wheel
Assessment instruments are available online
Checklists
Rubrics
Rating scales

Next Step
Complete a TOS for all courses being
examined this semester
Ensure that our colleagues, when submitting draft questions for final exams, indicate the objective(s) being assessed
If the lecturers teaching a course in your board agree on the TOS, then there will be no need for vetting.
Let's remain
CONFIDENT and COMMITTED
Nobody trips over mountains. It is the small
pebble that causes you to stumble. Pass all
the pebbles in your path and you will find you
have crossed the mountain.
~Author Unknown

Consider the postage stamp: its
usefulness consists in the ability to stick to
one thing till it gets there. ~Josh Billings
Let's commit ourselves to our Mission
Let's vow to stick to each small task
Ensure the TEAM remains motivated

Together Everyone Achieves More
References
Anderson, P., & Morgan, G. (2008). Developing tests and questionnaires for a national assessment of educational achievement. http://www.uis.unesco.org/Education/Documents/National_assessment_Vol2.pdf
Gredler, M. E. (1999). Classroom assessment and learning. New York: Longman.
Linn, R. L., & Gronlund, N. E. (2000). Measurement and assessment in teaching (8th ed.). Upper Saddle River, NJ: Prentice Hall.
