
EDWIN L. CUBERO
Discussant
July 31, 2018
METHODOLOGICAL APPROACH

Research Design
BASIC RESEARCH DESIGNS:
• Experimental Designs:
– True Experimental Studies
– Pre-experimental Studies
– Quasi-Experimental Studies

• Non-Experimental Designs:
– Ex Post Facto/Correlational Studies (Descriptive)

Pre-Experimental Designs:
– One-Group Pretest-Posttest
– One-Shot Case Study (treatment followed by a single posttest)

True Experimental Designs:
– Posttest-Only Control Group
– Pretest-Posttest Control Group
– Solomon Four-Group

Quasi-Experimental Design:
– Time Series

Correlational Design
Research Locale

Respondents/Subjects/Participants of the Study
Sampling Technique

• Probability Sampling:
– Random Sampling
– Stratified Sampling
– Cluster Sampling
– Systematic Sampling

• Non-probability Sampling
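
The probability techniques listed above differ only in how respondents are drawn from the sampling frame. Below is a minimal Python sketch of simple random, systematic, stratified, and cluster selection; the 200-respondent frame, the grade levels, the sections, and the sample size of 50 are made-up assumptions for illustration, not figures from the slides.

```python
import random

# Hypothetical sampling frame: 200 respondents tagged with a grade level (stratum)
# and a class section (cluster). Names and sizes are illustrative assumptions.
frame = [
    {"id": i,
     "grade": random.choice(["G7", "G8", "G9", "G10"]),
     "section": random.choice(["A", "B", "C", "D", "E"])}
    for i in range(200)
]
n = 50  # illustrative sample size

# Simple random sampling: every respondent has an equal chance of selection.
simple_random = random.sample(frame, n)

# Systematic sampling: every k-th respondent after a random start.
k = len(frame) // n
start = random.randrange(k)
systematic = frame[start::k][:n]

# Stratified sampling: draw from each grade level in proportion to its size.
stratified = []
for grade in ["G7", "G8", "G9", "G10"]:
    stratum = [r for r in frame if r["grade"] == grade]
    share = round(n * len(stratum) / len(frame))
    stratified.extend(random.sample(stratum, min(share, len(stratum))))

# Cluster sampling: randomly pick whole sections and take every member of them.
chosen_sections = random.sample(["A", "B", "C", "D", "E"], 2)
cluster = [r for r in frame if r["section"] in chosen_sections]

print(len(simple_random), len(systematic), len(stratified), len(cluster))
```

In the stratified draw, each grade level's share is proportional to its size in the frame, so the sample roughly preserves the frame's grade distribution.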
Data Gathering Instruments

Data Gathering Procedure

Data Analysis Procedure
4-Point Likert Scale
3.50 – 4.00
2.50 – 3.49
1.50 – 2.49
1.00 – 1.49
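
As a sketch of how such ranges are usually applied, the snippet below averages a set of 1–4 ratings and looks up the matching range. The verbal labels are placeholders I have assumed; the slide gives only the numeric ranges.

```python
# Map a weighted mean on the 4-point Likert scale to the ranges above.
# The verbal labels below are common placeholders, not taken from the slide.
RANGES = [
    (3.50, 4.00, "Strongly Agree"),
    (2.50, 3.49, "Agree"),
    (1.50, 2.49, "Disagree"),
    (1.00, 1.49, "Strongly Disagree"),
]

def interpret(responses):
    """Return the weighted mean and its interpretation for a list of 1-4 ratings."""
    mean = sum(responses) / len(responses)
    for low, high, label in RANGES:
        if low <= round(mean, 2) <= high:
            return mean, label
    raise ValueError("mean outside the 1.00-4.00 scale")

print(interpret([4, 3, 3, 4, 2]))   # e.g. (3.2, 'Agree')
```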
Statistical Treatment of Data (if applicable)
Reliability and Validity

Validity refers to the accuracy of inferences drawn from an assessment. It is the degree to which the assessment measures what it is intended to measure.
Types of Validity

• Construct validity – the assessment actually measures what it is designed to measure. (A actually is A)
• Concurrent validity – the assessment correlates with other assessments that measure the same construct. (A correlates with B)
• Predictive validity – the assessment predicts performance on a future assessment. (A predicts B)
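
Concurrent and predictive validity are typically reported as a correlation between scores on assessment A and scores on the comparison (or later) assessment B. A minimal sketch with made-up scores and variable names:

```python
from math import sqrt

def pearson_r(a, b):
    """Pearson correlation between two equally long score lists."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    return cov / sqrt(var_a * var_b)

# Made-up scores: assessment A (new instrument) vs. assessment B (established measure).
scores_a = [78, 85, 62, 90, 70, 88, 75]
scores_b = [74, 88, 65, 93, 72, 84, 78]

# A high positive r is evidence of concurrent validity (or predictive validity,
# if B is a criterion measured later, such as a subsequent term's grades).
print(round(pearson_r(scores_a, scores_b), 2))
```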
Valid Inferences
Validity is closely tied to the purpose or use of an assessment.

DON’T ASK: “Is this assessment valid?”
ASK: “Are the inferences I’m making based on this assessment valid for my purpose?”
Evidence-Centered Design
• Validity is about providing strong evidence
• Evidence-centered design boosts validity:
– What do you want to know?
– How would you know?
– What should the assessment look like?
• Reliability refers to consistency and repeatability.
• A reliable assessment provides a consistent picture of what students know, understand, and are able to do.
For example, an assessment that is highly reliable is not necessarily valid. However, for an assessment to be valid, it must also be reliable.
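
Reliability is often quantified with an internal-consistency coefficient such as Cronbach's alpha. The sketch below computes it from scratch for made-up item scores; the data and function names are illustrative assumptions, not part of the source material.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns (one list per item)."""
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def variance(xs):                   # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(variance(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Made-up responses of 5 students to 3 Likert items.
items = [
    [4, 3, 3, 4, 2],
    [4, 3, 2, 4, 2],
    [3, 3, 3, 4, 1],
]
print(round(cronbach_alpha(items), 2))   # values near 1 indicate high consistency
```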
Guidelines in Writing the Conclusions:
1. Make generalizations based on the findings.
2. Do not include numerals.
3. Answer the specific research questions appropriately.
4. Formulate them concisely, pointing out the facts learned from the inquiry.
5. Do not repeat statements found anywhere else in the thesis.
Guidelines in Writing the Recommendations:
Recommendations should:
1. Solve or help solve the problems discovered in the investigation.
2. Support the continuance of a good practice or system.
3. Aim for the ideal but remain feasible, practical, and attainable.
4. Be addressed to the persons, entities, or agencies in a position to implement them.
5. Include a recommendation for further research related to the topic.
References:
DIWA Learning Series
UB Hand-outs
Images from Google
cyfar.org
en.wikipedia.org
sites.google.com
http://www.ride.ri.gov/Portals/0/Uploads/Documents/Instruction-and-Assessment-World-Class-Standards/Assessment/CAS/CAS-Webinar-6-PPT-12-14-2011.ppt
RESEARCH DESIGN: my.ilstu.edu/~mhemmas/.../2.%20%20Research%20Design-LDR%20280.PPT
