Important Topics
Definitions
- Probabilities of events.
- Cumulative probability distribution.

Continuous probability density function f(x)
1. Mathematical formula.
2. Shows all values, x, and their frequencies, f(x).
3. Properties:
   - ∫ f(x) dx = 1 over all x (the area under the curve is 1)
   - f(x) ≥ 0 for a ≤ x ≤ b
[Figure: a density curve, frequency f(x) plotted against value, with total area 1 over the interval from a to b.]
Uniform Distribution
1. Equally likely outcomes.
2. Probability density function:
   f(x) = 1/(d − c), c ≤ x ≤ d
3. Mean and variance:
   μ = (c + d)/2,  σ² = (d − c)²/12

[Figure: the flat density f(x) = 1/(d − c) between c and d.]
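A minimal numerical check of the Uniform(c, d) formulas above, using illustrative endpoints c = 2, d = 6 (an assumption, not from the text):

```python
# Midpoint-rule check of the Uniform(c, d) area, mean, and variance.
c, d = 2.0, 6.0
f = 1.0 / (d - c)                       # density: f(x) = 1/(d - c) on [c, d]

n = 100_000                             # midpoint-rule grid
dx = (d - c) / n
xs = [c + (i + 0.5) * dx for i in range(n)]

area = f * dx * n                       # integral of f over [c, d]
mean = sum(x * f * dx for x in xs)      # E(X)
var = sum((x - mean) ** 2 * f * dx for x in xs)  # Var(X)

print(round(area, 6))   # 1.0       (total area under the density)
print(round(mean, 6))   # 4.0       = (c + d)/2
print(round(var, 6))    # 1.333333  = (d - c)^2 / 12
```

The numerical integrals agree with the closed-form mean and variance to six decimal places.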
Moments and conditional expectations (example)

E(M | A = 0) = 0(0.70) + 1(0.13) + 2(0.10) + 3(0.05) + 4(0.02) = 0.56
E(M | A = 1) = 0(0.90) + 1(0.07) + 2(0.02) + 3(0.01) + 4(0.00) = 0.14

By the law of iterated expectations:
E(M) = E(M | A = 0) Pr(A = 0) + E(M | A = 1) Pr(A = 1)
     = 0.56 × 0.50 + 0.14 × 0.50 = 0.35
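The conditional-mean arithmetic above can be sketched in a few lines; the probability tables and equal marginals below are assumptions chosen to match the worked numbers.

```python
# Sketch of the conditional-expectation arithmetic; the probability
# tables here are assumptions matching the worked example.
pm_old = [0.70, 0.13, 0.10, 0.05, 0.02]   # assumed Pr(M = m | A = 0), m = 0..4
pm_new = [0.90, 0.07, 0.02, 0.01, 0.00]   # assumed Pr(M = m | A = 1)
p_a = {0: 0.50, 1: 0.50}                   # assumed marginal Pr(A = a)

e_m_old = sum(m * p for m, p in enumerate(pm_old))   # E(M | A = 0)
e_m_new = sum(m * p for m, p in enumerate(pm_new))   # E(M | A = 1)
e_m = e_m_old * p_a[0] + e_m_new * p_a[1]            # law of iterated expectations

print(round(e_m_old, 2), round(e_m_new, 2), round(e_m, 2))   # 0.56 0.14 0.35
```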
Conditional variance
Independence
Correlation
The Normal, Chi-Squared, F(m, ∞), and t Distributions
Copyright 2011 Pearson Education, Inc.
Two-sided standard normal critical values:
90%: ±1.64
95%: ±1.96
99%: ±2.58
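These critical values can be reproduced from the standard normal quantile function; a sketch using Python's statistics module:

```python
from statistics import NormalDist

# Two-sided critical values: the central area between -z and +z
# equals the stated confidence level.
std_normal = NormalDist()                     # mean 0, standard deviation 1
for level in (0.90, 0.95, 0.99):
    z = std_normal.inv_cdf((1 + level) / 2)   # e.g. the 0.975 quantile for 95%
    print(f"{level:.0%}: ±{z:.2f}")
# prints:
# 90%: ±1.64
# 95%: ±1.96
# 99%: ±2.58
```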
The F(m, n) distribution is the distribution of (W/m)/(V/n), where W and V are independent chi-squared random variables with m and n degrees of freedom, respectively. When n is large, the F(m, n) distribution is well approximated by the F(m, ∞) distribution.

The F(m, ∞) distribution is the distribution of a random variable with a chi-squared distribution with m degrees of freedom, divided by m. Equivalently, the F(m, ∞) distribution is the distribution of the average of m squared standard normal random variables.
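This characterization can be checked by simulation; the degrees of freedom, replication count, and seed below are illustrative choices, not from the text.

```python
import random

# Simulate F(m, inf) draws as averages of m squared standard normals
# (equivalently, chi-squared(m) draws divided by m).
rng = random.Random(0)
m, reps = 5, 20_000
draws = [sum(rng.gauss(0, 1) ** 2 for _ in range(m)) / m for _ in range(reps)]

sample_mean = sum(draws) / reps
print(round(sample_mean, 2))   # close to 1, the mean of chi-squared(m)/m
```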
Random Sampling
- Y1, ..., Yn are i.i.d. draws.
- Sampling distribution of the sample mean Ȳ: if Y is normally distributed, then
  Ȳ ~ N(μY, σY²/n).
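A simulation sketch of this sampling distribution; the values of mu, sigma, n, the replication count, and the seed are illustrative assumptions.

```python
import random
from statistics import fmean, stdev

# Simulate the distribution of the mean of n i.i.d. N(mu, sigma^2) draws;
# its standard deviation should be close to sigma/sqrt(n).
rng = random.Random(1)
mu, sigma, n, reps = 10.0, 2.0, 25, 4_000
means = [fmean(rng.gauss(mu, sigma) for _ in range(n)) for _ in range(reps)]

print(round(fmean(means), 1))   # near mu = 10.0
print(round(stdev(means), 2))   # near sigma/sqrt(n) = 0.4
```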
Large-Sample Approximations to Sampling Distributions

Two approaches to characterizing sampling distributions:
- Exact (finite-sample) distribution: available when the distribution of Y is known.
- Asymptotic distribution: a large-sample approximation to the sampling distribution.
The variance of Yi, σY², is finite.
Developing Sampling Distributions

Suppose there's a population ...
- Population size, N = 4
- Random variable, x
- Values of x: 1, 2, 3, 4
- Uniform distribution
Population Characteristics

Summary measures:
μ = (Σ xi)/N = (1 + 2 + 3 + 4)/4 = 2.5
σ = √(Σ (xi − μ)²/N) ≈ 1.12

[Figure: population distribution of x — a flat bar chart with P(x) = 0.25 at each of x = 1, 2, 3, 4.]
16 Sample Means

All possible samples of size n = 2 (drawn with replacement) give 16 sample means:

1st Obs \ 2nd Obs:    1     2     3     4
        1           1.0   1.5   2.0   2.5
        2           1.5   2.0   2.5   3.0
        3           2.0   2.5   3.0   3.5
        4           2.5   3.0   3.5   4.0

Sampling Distribution of the Sample Mean

[Figure: histogram of the 16 sample means, P(x̄) over x̄ = 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0 — peaked at 2.5, no longer uniform.]

Summary measures of the sampling distribution: μx̄ = 2.5, σx̄ ≈ 0.79
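The 16-sample enumeration above can be reproduced directly:

```python
from itertools import product
from statistics import fmean, pstdev

# Enumerate all 16 equally likely size-2 samples (with replacement)
# from the population {1, 2, 3, 4} and summarize their means.
population = [1, 2, 3, 4]
means = [fmean(pair) for pair in product(population, repeat=2)]

print(len(means))               # 16
print(fmean(means))             # 2.5  -- equals the population mean
print(round(pstdev(means), 2))  # 0.79 -- equals sigma/sqrt(2)
```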
Comparison

Population distribution: uniform, P(x) = 0.25 at each of x = 1, 2, 3, 4.
Sampling distribution of x̄: peaked at 2.5 over x̄ = 1.0, 1.5, ..., 4.0.

[Figure: side-by-side bar charts comparing the flat population distribution with the bell-shaped sampling distribution of the sample mean.]
Chebyshev Inequality

For any random variable X with mean μ and variance σ²:
Pr(|X − μ| ≥ kσ) ≤ 1/k² for every k > 0.

Convergence in probability: Sn converges in probability to μ if and only if Pr(|Sn − μ| > δ) → 0 as n → ∞ for every δ > 0. If Sn converges in probability to μ, then Sn is said to be a consistent estimator of μ.
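A quick numerical check of Chebyshev's bound against the exact two-sided tail of one illustrative choice of X, a standard normal:

```python
from statistics import NormalDist

# Check Pr(|X - mu| >= k*sigma) <= 1/k**2 for X ~ N(0, 1).
std_normal = NormalDist()
for k in (1.5, 2.0, 3.0):
    exact = 2 * (1 - std_normal.cdf(k))   # Pr(|Z| >= k), Z ~ N(0, 1)
    bound = 1 / k ** 2
    assert exact <= bound                 # the bound always holds
    print(f"k={k}: tail={exact:.4f} <= bound={bound:.4f}")
```

The bound is loose (e.g. 0.25 versus the exact 0.0455 at k = 2) because it holds for every distribution with a finite variance.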
The law of large numbers: if Y1, ..., Yn are i.i.d. with E(Yi) = μY and Var(Yi) < ∞, then Ȳ converges in probability to μY.
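The law of large numbers can be seen in a simulation; the Bernoulli distribution, seed, and sample sizes are illustrative assumptions.

```python
import random
from statistics import fmean

# Sample means of i.i.d. Bernoulli(0.3) draws settle near E(Y) = 0.3
# as n grows.
rng = random.Random(42)
draws = [1.0 if rng.random() < 0.3 else 0.0 for _ in range(100_000)]

for n in (100, 10_000, 100_000):
    print(n, fmean(draws[:n]))   # drifts toward 0.3 as n grows
```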
Convergence in distribution: Sn converges in distribution to S if and only if Fn(t) → F(t) as n → ∞, where the limit holds at all t at which the limiting distribution F is continuous.

The central limit theorem: if Y1, ..., Yn are i.i.d. with E(Yi) = μY and 0 < Var(Yi) = σY² < ∞, then the asymptotic distribution of (Ȳ − μY)/(σY/√n) is N(0, 1).
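A simulation sketch of the central limit theorem; the Uniform(0, 1) draws, n, replication count, and seed are illustrative assumptions.

```python
import random
from math import sqrt

# Standardized means of i.i.d. Uniform(0, 1) draws (mu = 1/2,
# sigma^2 = 1/12) should be approximately N(0, 1).
rng = random.Random(7)
n, reps = 30, 10_000
mu, sigma = 0.5, sqrt(1 / 12)

z = [(sum(rng.random() for _ in range(n)) / n - mu) / (sigma / sqrt(n))
     for _ in range(reps)]
coverage = sum(abs(v) <= 1.96 for v in z) / reps
print(round(coverage, 3))   # should sit near 0.95, the N(0, 1) coverage
```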
Slutsky's theorem: if an converges in probability to a constant a, and Sn converges in distribution to S, then an + Sn converges in distribution to a + S, and anSn converges in distribution to aS.
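One standard consequence of Slutsky's theorem is that replacing σ with the consistent estimator s leaves the N(0, 1) limit of the standardized mean unchanged; a simulation sketch with illustrative Bernoulli data, sizes, and seed:

```python
import random
from math import sqrt
from statistics import fmean, stdev

# t-statistics built with the estimated s (not the true sigma) should
# still have approximately N(0, 1) coverage, by Slutsky's theorem.
rng = random.Random(3)
n, reps = 200, 5_000
t_stats = []
for _ in range(reps):
    y = [1.0 if rng.random() < 0.5 else 0.0 for _ in range(n)]
    ybar, s = fmean(y), stdev(y)
    t_stats.append((ybar - 0.5) / (s / sqrt(n)))

coverage = sum(abs(t) <= 1.96 for t in t_stats) / reps
print(round(coverage, 3))   # should sit near 0.95
```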