
Statistical Methods

1. Tests for Normality

i. Kolmogorov-Smirnov One-Sample Test

This test is concerned with the degree of agreement between the distribution of a set of observed scores and some specified distribution.

A. Statement of Hypothesis
Ho: The observed scores are normally distributed.
Ha: The observed scores are not normally distributed.

B. Test Statistic

D = maximum | F0(X) - Sn(X) |

where F0(X) is the theoretical (expected) cumulative relative frequency under the normal distribution and Sn(X) is the observed cumulative relative frequency.

C. Procedures:
C1. Arrange the scores into a cumulative distribution and convert the cumulative frequency into cumulative relative frequency, denoted by Sn(X). For each interval, find the expected (theoretical) cumulative relative frequency F0(X).

C2. Compute the maximum deviation D, defined as D = maximum | F0(X) - Sn(X) |.

C3. Look for the two-tailed probability associated with the occurrence under Ho of values as large as the observed value of D. If it is less than or equal to the chosen level of significance, reject Ho. (Refer to TABLE 1)
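A minimal sketch of this test in Python with SciPy is shown below; the scores array is invented for illustration, and estimating the mean and standard deviation from the sample itself (rather than specifying them in advance, as Table 1 assumes) is an approximation.

```python
# Kolmogorov-Smirnov one-sample test against a normal distribution (sketch).
import numpy as np
from scipy import stats

scores = np.array([12, 15, 14, 10, 18, 20, 16, 13, 17, 15], dtype=float)

# Compare the empirical CDF of the scores with a normal CDF whose mean and
# standard deviation are estimated from the sample (an approximation).
d_stat, p_value = stats.kstest(scores, 'norm',
                               args=(scores.mean(), scores.std(ddof=1)))

print(f"D = {d_stat:.3f}, p = {p_value:.3f}")
# Decision rule from the handout: reject Ho if D exceeds the Table 1
# critical value for N = len(scores) at the chosen significance level.
```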

ii. Shapiro-Wilk Test

The Shapiro-Wilk test calculates a W statistic that tests whether a random sample, x1, x2, ..., xn, comes from (specifically) a normal distribution.

A. Statement of Hypothesis
Ho: The distribution follows a normal distribution.
Ha: The distribution does not follow a normal distribution.

B. Test Statistic

W = ( Σ a_i x_(i) )² / Σ ( x_i - x̄ )²

where x_(i) is the ith order statistic, i.e. the ith smallest value in the sample, and x̄ is the sample mean. The constants a_i are given by

(a_1, ..., a_n) = m'V⁻¹ / ( m'V⁻¹V⁻¹m )^(1/2)

where m = (m_1, ..., m_n)' are the expected values of the order statistics of an iid sample from the standard normal distribution, and V is the covariance matrix of those order statistics.

C. Procedures:
C1. Compute the statistic W.
C2. The test rejects the null hypothesis if W is too small.
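SciPy computes the a_i constants internally, so in practice the test reduces to a single call. A minimal sketch with an invented sample:

```python
# Shapiro-Wilk normality test (sketch).
import numpy as np
from scipy import stats

sample = np.array([4.1, 5.3, 4.8, 5.0, 6.2, 4.5, 5.9, 5.1, 4.7, 5.4])

w_stat, p_value = stats.shapiro(sample)
print(f"W = {w_stat:.4f}, p = {p_value:.4f}")

# A small W (equivalently, a p-value below the chosen alpha) is evidence
# against normality.
alpha = 0.05
if p_value < alpha:
    print("Reject Ho: the sample does not appear normally distributed.")
else:
    print("Fail to reject Ho.")
```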

2. Tests for Homogeneity

i. Levene's Test

It is a standard procedure for testing the equality of variances across groups.

A. Statement of Hypothesis
Ho: σ1² = σ2² = ... = σk² (all group variances are equal)
Ha: At least one variance is not equal to the others.

B. Test Statistic

W = [ (N - k) / (k - 1) ] · [ Σ_i N_i ( Z_i. - Z.. )² / Σ_i Σ_j ( Z_ij - Z_i. )² ]

where
k is the number of different groups to which the samples belong,
N is the total number of samples,
N_i is the number of samples in the ith group,
Y_ij is the value of the jth sample from the ith group,
Z_ij = | Y_ij - Ȳ_i. |, with Ȳ_i. the mean of the ith group,
Z.. is the mean of all Z_ij, and
Z_i. is the mean of the Z_ij for group i.

C. Procedures
C1. Compute the statistic W.
C2. If W > F(α; k - 1, N - k), then reject Ho. (TABLE 2)
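A minimal sketch with SciPy follows; the three groups are invented, and center='mean' selects the classical Levene statistic built on |Y_ij - Ȳ_i.| (SciPy's default, center='median', is the Brown-Forsythe variant).

```python
# Levene's test for homogeneity of variances (sketch).
import numpy as np
from scipy import stats

group1 = np.array([8.8, 8.4, 7.9, 8.7, 9.1, 9.8])
group2 = np.array([9.9, 9.0, 11.1, 9.6, 8.7, 10.4])
group3 = np.array([9.3, 8.9, 9.7, 10.2, 9.5, 9.0])

w_stat, p_value = stats.levene(group1, group2, group3, center='mean')
print(f"W = {w_stat:.3f}, p = {p_value:.3f}")
# Reject Ho when W exceeds F(alpha; k-1, N-k), i.e. when p < alpha.
```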

3. Tests for K Independent Samples

i. One-Way Analysis of Variance

This is used for comparing means from different groups.

A. Statement of Hypothesis
Ho: μ1 = μ2 = ... = μK (all group means are equal)
Ha: At least one mean is not equal to the others.

B. Test Statistic

F* = MSTR / MSE = [ SSTR / (K - 1) ] / [ SSE / (N - K) ]

where SSTO is the sum of squares for variable Y, SSTR is determined from the means of the columns taken from the mean of the entire data, SSTO is the sum of SSTR and SSE, K is the number of columns, and N is the number of observations.

C. Procedures:
C1. Compute the value of F*, where F* = MSTR / MSE.
C2. If F* > F(α; K - 1, N - K), reject the null hypothesis; refer to TABLE 2.
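A minimal sketch with SciPy, using invented group data; the critical-value comparison mirrors step C2.

```python
# One-way ANOVA (sketch).
import numpy as np
from scipy import stats

group1 = np.array([23, 25, 21, 22, 24], dtype=float)
group2 = np.array([30, 28, 29, 32, 27], dtype=float)
group3 = np.array([26, 24, 27, 25, 28], dtype=float)

f_stat, p_value = stats.f_oneway(group1, group2, group3)
print(f"F* = {f_stat:.3f}, p = {p_value:.3f}")

# Equivalent decision via the critical value F(alpha; K-1, N-K) from the F table.
alpha, K = 0.05, 3
N = len(group1) + len(group2) + len(group3)
f_crit = stats.f.ppf(1 - alpha, K - 1, N - K)
print(f"Reject Ho: {f_stat > f_crit}")
```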

ii. Tukey's HSD (Honestly Significant Difference) Test

It is a single-step multiple comparison procedure and statistical test generally used in conjunction with an ANOVA to find which means are significantly different from one another.

A. Statement of Hypothesis:
Ho: μ1 = μ2 = μ3 = ... = μn
Ha: At least one mean is not equal to the others.

B. Test Statistic

q_s = ( Y_A - Y_B ) / SE

where
Y_A is the larger of the two means being compared,
Y_B is the smaller of the two means being compared, and
SE is the standard error of the data.

C. Procedures
C1. Compute the statistic q_s.
C2. Compare q_s to the critical q value of the studentized range distribution. If q_s > the q value, reject Ho. (Refer to TABLE 3)
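A minimal sketch follows, reusing the invented ANOVA groups above; it assumes a recent SciPy version that provides scipy.stats.tukey_hsd.

```python
# Tukey's HSD as a follow-up to a one-way ANOVA (sketch).
import numpy as np
from scipy import stats

group1 = np.array([23, 25, 21, 22, 24], dtype=float)
group2 = np.array([30, 28, 29, 32, 27], dtype=float)
group3 = np.array([26, 24, 27, 25, 28], dtype=float)

res = stats.tukey_hsd(group1, group2, group3)
print(res)  # pairwise mean differences with studentized-range p-values

# A pair of groups differs significantly when its p-value is below alpha,
# equivalently when the computed q exceeds the Table 3 critical value.
```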

4. Tests for Association

i. Point Biserial Correlation Coefficient

It is a measure of association where one variable is continuous and the other is truly dichotomous. Dichotomous variables are variables that take on two values only, for example, Dead or Alive, Male or Female, etc.

A. Statement of Hypothesis
Ho: There is no correlation.
Ha: There is a correlation.

B. Test Statistic

r_pb = [ ( X̄_1 - X̄ ) / s_x ] · √( p / q )

where
X̄_1 is the mean of the continuous variable among the group scoring 1,
X̄ is the mean of the continuous variable for the whole sample,
s_x is the standard deviation of the continuous variable,
p is the proportion of the group scoring 1, and
q = 1 - p.

C. Procedures
C1. Compute the statistic r_pb.
C2. Compute the t-statistic

t = r_pb · √[ ( n - 2 ) / ( 1 - r_pb² ) ]

C3. If | t | > t(α/2; n - 2), then reject Ho. (Table 4)
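A minimal sketch with SciPy, using an invented 0/1 group indicator and invented continuous scores:

```python
# Point biserial correlation (sketch).
import numpy as np
from scipy import stats

group = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])        # truly dichotomous variable
scores = np.array([14.0, 10.5, 15.2, 13.8, 9.9, 11.0,   # continuous variable
                   16.1, 10.2, 14.7, 11.5])

r_pb, p_value = stats.pointbiserialr(group, scores)
print(f"r_pb = {r_pb:.3f}, p = {p_value:.3f}")
# The reported p-value corresponds to the t-test of step C2 with n-2 degrees
# of freedom; reject Ho when it falls below the chosen alpha.
```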

ii. Spearman's Rank Correlation Coefficient

It is a measure of correlation between rankings. The variables are both measured at least on an ordinal scale.

A. Statement of Hypothesis
Ho: There is no association between the two variables.
Ha: There is an association between the two variables.

B. Test Statistic

r_s = 1 - [ 6 Σ d_i² ] / [ n ( n² - 1 ) ]

where n is the number of samples (paired observations) and d_i is the difference between the rankings of the two groups for the ith subject.

C. Procedures
C1. Rank the observations on the X variable from 1 to n. Do the same for the Y variable.
C2. Put each subject's rank on the X variable and Y variable next to its entry.
C3. Determine d_i for each subject by subtracting the Y rank from the corresponding X rank. Square this to get d_i².
C4. Sum all d_i² for i = 1, 2, ..., n.
C5. Use TABLE 5 to determine the critical value of the statistic r_s.
C6. If the value of r_s exceeds the critical value, then reject Ho in favor of Ha.
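A minimal sketch with SciPy, using invented paired observations:

```python
# Spearman's rank correlation (sketch).
import numpy as np
from scipy import stats

x = np.array([35, 23, 47, 17, 10, 43, 9, 6, 28])
y = np.array([30, 33, 45, 23, 8, 49, 12, 4, 31])

rho, p_value = stats.spearmanr(x, y)
print(f"r_s = {rho:.3f}, p = {p_value:.3f}")
# Equivalent to ranking both variables, computing d_i for each pair, and
# applying r_s = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)); reject Ho when r_s
# exceeds the Table 5 critical value (or when p < alpha).
```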

iii. Eta Coefficient / Correlation Ratio

This is used if one variable is nominal and the other is interval. It measures a nonlinear relationship between the variables, unlike Pearson's correlation, which measures a linear relationship.

A. Statement of Hypothesis
Ho: There is no association, linear or nonlinear.
Ha: There is an association, whether linear or nonlinear.

B. Test Statistic

η = √( SSTR / SSTO )

where SSTO is the sum of squares for variable Y, SSTR is determined from the means of the columns taken from the mean of the entire data, and SSTO is the sum of SSTR and SSE.

C. Procedure
C1. Compute the ANOVA table.
C2. Compute the statistic η.
C3. Compute the statistic F* = [ η² / ( c - 1 ) ] / [ ( 1 - η² ) / ( N - c ) ], where c is the number of columns.
C4. The statistic F* follows an F distribution with ( c - 1, N - c ) degrees of freedom. Look at TABLE 2 for the critical value of F.
C5. If F* > F(α; c - 1, N - c), then reject Ho.
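SciPy has no dedicated eta-coefficient function, so the sketch below computes it directly from the ANOVA sums of squares, using invented groups; the F-test of steps C3-C5 is reproduced at the end.

```python
# Eta coefficient (correlation ratio) from ANOVA sums of squares (sketch).
import numpy as np
from scipy import stats

groups = [np.array([12.0, 14.0, 11.0, 13.0]),   # nominal category A
          np.array([18.0, 17.0, 19.0, 16.0]),   # nominal category B
          np.array([15.0, 14.0, 16.0, 15.0])]   # nominal category C

all_y = np.concatenate(groups)
grand_mean = all_y.mean()

sstr = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)  # between columns
ssto = ((all_y - grand_mean) ** 2).sum()                           # total
eta = np.sqrt(sstr / ssto)

# F test for the correlation ratio, with c columns and N observations in total.
c, N = len(groups), len(all_y)
f_stat = (eta**2 / (c - 1)) / ((1 - eta**2) / (N - c))
f_crit = stats.f.ppf(0.95, c - 1, N - c)
print(f"eta = {eta:.3f}, F* = {f_stat:.3f}, reject Ho: {f_stat > f_crit}")
```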

Table 1: Kolmogorov-Smirnov Test
(If the calculated D is greater than the value shown, reject the null hypothesis at the chosen level of significance.)

Sample size (N)   Level of significance for D = maximum | F0(X) - Sn(X) |
                    .20      .15      .10      .05      .01
       1           .900     .925     .950     .975     .995
       2           .684     .726     .776     .842     .929
       3           .565     .597     .642     .708     .828
       4           .494     .525     .564     .624     .733
       5           .446     .474     .510     .565     .669
       6           .410     .436     .470     .521     .618
       7           .381     .405     .438     .486     .577
       8           .358     .381     .411     .457     .543
       9           .339     .360     .388     .432     .514
      10           .322     .342     .368     .410     .490
      11           .307     .326     .352     .391     .468
      12           .295     .313     .338     .375     .450
      13           .284     .302     .325     .361     .433
      14           .274     .292     .314     .349     .418
      15           .266     .283     .304     .338     .404
      16           .258     .274     .295     .328     .392
      17           .250     .266     .286     .318     .381
      18           .244     .259     .278     .309     .371
      19           .237     .252     .272     .301     .363
      20           .231     .246     .264     .294     .356
      25           .210     .220     .240     .270     .320
      30           .190     .200     .220     .240     .290
      35           .180     .190     .210     .230     .270
  Over 35       1.07/√N  1.14/√N  1.22/√N  1.36/√N  1.63/√N

Table 2. (F-table)
Table 3. (Studentized Range)
Table 4. (T-distribution)
Table 5. (Spearman's Rank Correlation)

Bibliography

Institute, E. E. (n.d.). Appendix. Retrieved March 6, 2011, from Distance Learning Center: http://www.eridlc.com/onlinetextbook/index.cfm?fuseaction=textbook.appendix&FileName=Table7
