
Statistics: Problem Set 1

Insper - Doctoral Program in Business Economics


Due on Thursday, April 28th, in the tutoring session

Professor: Ricardo Paes de Barros


TA: Ana Carolina Santos

Required problems: 4, 9, 10, 18, 20 and 30.


1. Let P : B → [0, 1] be a probability function. Show that, for all A, B ∈ B (sigma algebra) the following
statements are true:
(i) P (Ac ) = 1 − P (A)
(ii) P (∅) = 0
(iii) 0 ≤ P (A) ≤ 1
(iv) P (B ∩ Ac ) = P (B) − P (A ∩ B)
(v) P (A ∪ B) = P (A) + P (B) − P (A ∩ B)
(vi) A ⊂ B ⇒ P (A) ≤ P (B)
2. Consider Z ∼ fZ and X ∼ fX such that X = σZ + µ where σ > 0 and µ are constants. Now suppose
E(Z) = 0 and Var(Z) = 1. Compute: E(X), Var(X), FX in terms of FZ and fX in terms of fZ .
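Though not part of the proof asked for, the affine relation can be checked numerically. A minimal simulation sketch (the values µ = 2 and σ = 3 are illustrative choices, not given in the problem):

```python
import random
import statistics

# Monte Carlo sketch: draw Z ~ N(0, 1), set X = sigma*Z + mu, and check
# that E(X) is near mu and Var(X) is near sigma^2.
random.seed(0)
mu, sigma = 2.0, 3.0
z = [random.gauss(0.0, 1.0) for _ in range(200_000)]
x = [sigma * zi + mu for zi in z]
mean_x = statistics.fmean(x)
var_x = statistics.pvariance(x)
print(mean_x, var_x)  # close to mu = 2 and sigma^2 = 9
```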
3. X and Y are independent random variables. Show that the following statements are true:
(i) PXY (x ∈ A, y ∈ B) = PX (x ∈ A)PY (y ∈ B)
(ii) FXY (x, y) = FX (x)FY (y)
(iii) E(g(X)h(Y )) = E(g(X))E(h(Y )) for any functions g and h
(iv) Cov(g(X), h(Y )) = 0 for any functions g and h
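A quick numerical illustration of (iv), assuming nothing beyond independence; the choices g(x) = x² and h(y) = sin y are arbitrary:

```python
import math
import random

# For independent X, Y ~ N(0, 1), the sample covariance of g(X) = X^2
# and h(Y) = sin(Y) should be near zero.
random.seed(1)
n = 200_000
gx = [random.gauss(0.0, 1.0) ** 2 for _ in range(n)]
hy = [math.sin(random.gauss(0.0, 1.0)) for _ in range(n)]
mg, mh = sum(gx) / n, sum(hy) / n
cov = sum((a - mg) * (b - mh) for a, b in zip(gx, hy)) / n
print(cov)  # close to 0
```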
4. Let X = (X1 , ..., Xn )T and Y = (Y1 , ..., Ym )T be two multivariate normal variables, of size n and m, respectively. Denote the expectations E[X] = µX and E[Y] = µY . The covariance matrix is (with ΣXY = ΣTY X ):

Σ = [ ΣXX   ΣXY ]
    [ ΣY X   ΣY Y ]

If Σ is positive definite, the joint distribution of (X, Y) has the following density function:

fXY (X, Y) = [(2π)^((m+n)/2) √|det Σ|]^(−1) exp(−(1/2) (X − µX , Y − µY ) Σ^(−1) (X − µX , Y − µY )^T )

Show that if ΣXY = 0, then X and Y are independent.
5. Consider the unit circle. Take some irrational number α. The number nα cannot be an integer for any n ∈ N. Therefore, if we take any point x from [0, 2π], i.e., a point on the circle, and mark all the points obtained by rotating x by the angle 2πnα, n = ±1, ±2, ..., we will never come back to x. For each x in [0, 2π] there is a countable set Kx of all such points, so the circle is naturally divided into disjoint classes {Kx }. Take from each Kx one and only one point and form the set A0 . Denote by An the set of points obtained by rotating the set A0 by the angle 2πnα, n ∈ Z. The union ∪∞n=−∞ An is then exactly the segment [0, 2π], and these sets are disjoint. Is A0 a measurable set? Why?

6. We say F stochastically dominates G when, for all x ∈ R, F (x) ≤ G(x). Suppose there exists a constant c
such that f1 (x) ≤ f2 (x) for all x ≤ c and f1 (x) ≥ f2 (x) for all x > c, where f1 and f2 are the probability
density functions associated with the cumulative distribution functions F1 and F2 , respectively. Show that:
(i) For all x ≤ c, F1 (x) ≤ F2 (x) and for all x > c, 1 − F1 (x) ≥ 1 − F2 (x)
(ii) Based on (i) show that F1 stochastically dominates F2
7. Let X and Y be two random variables with finite variances.
(i) Show that Cov(X − Y, X + Y ) = Var(X) − Var(Y )
(ii) Show that if X − Y and X + Y are stochastically independent, then it must be the case that Var(X) = Var(Y ).
8. Let X and Y be two random variables. Define Z = XY . Suppose it is true that E[Y |X] = X, E[Y 2 |X] =
X 2 + 1, E[X] = 0, Var[X] = 1, and E[X 4 ] = 3.
(i) Show that E[Z|X] = X 2 (remember that E[g(X)h(Y )|X] = g(X)E[h(Y )|X])
(ii) Show that E[Z] = 1 (you may find it useful to use the Law of Iterated Expectations)
(iii) Show that E[Z 2 ] = 4 and hence that Var[Z] = 3
(iv) Show that Var[Z|X] = X 2
(v) Using (iv), show that E[Var[Z|X]] = 1
(vi) Using (i), show that Var[E[Z|X]] = 2
(vii) In this case Var(Z) =Var[E(Z|X)] + E[Var(Z|X)]. Is this a general property?
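The stated moments can be sanity-checked by simulation. One concrete model consistent with all the assumptions (a choice made for this sketch, not given in the problem) is X ∼ N(0, 1) and Y = X + ε with ε ∼ N(0, 1) independent of X, so that E[Y |X] = X and E[Y 2 |X] = X 2 + 1:

```python
import random
import statistics

# Monte Carlo check of E[Z] = 1 and Var[Z] = 3 for Z = XY under the
# model X ~ N(0,1), Y = X + eps, eps ~ N(0,1) independent of X.
random.seed(2)
z = []
for _ in range(400_000):
    x = random.gauss(0.0, 1.0)
    y = x + random.gauss(0.0, 1.0)
    z.append(x * y)
mean_z = statistics.fmean(z)
var_z = statistics.pvariance(z)
print(mean_z, var_z)  # near E[Z] = 1 and Var[Z] = 3
```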
9. Let P : τ → [0, 1] be a probability function. Hence, (a) for all A ∈ τ , P (A) ≥ 0, (b) P (S) = 1 and (c) for any sequence of disjoint sets A1 , A2 , ... ∈ τ , P (∪∞i=1 Ai ) = Σ∞i=1 P (Ai ).
(i) Show that if A ⊂ B then P (A) ≤ P (B)
(ii) Show that for any sets A1 and A2 , (a) A1 ∪ A2 = A1 ∪ (A2 ∩ Ac1 ), (b) A1 and (A2 ∩ Ac1 ) are
disjoint, and (c) P (A1 ∪ A2 ) = P (A1 ) + P (A2 ∩ Ac1 ) ≤ P (A1 ) + P (A2 ).
(iii) For any sequence of sets A1 , A2 , ... ∈ τ , show that

∪∞i=1 Ai = ∪∞i=1 Bi

where B1 = A1 and, for i = 2, 3, ...,

Bi = Ai ∩ (A1 ∪ A2 ∪ · · · ∪ Ai−1 )c

(iv) Show that P (Bi ) ≤ P (Ai )


(v) Show that

Bi = Ai ∩ Ac1 ∩ Ac2 ∩ · · · ∩ Aci−1

(vi) Show that Bi ∩ Bk = ∅ for any i ≠ k


(vii) Based on (iii), (iv) and (vi) show that

P (∪∞i=1 Ai ) = P (∪∞i=1 Bi ) ≤ Σ∞i=1 P (Ai )

10. Let B1 and B2 be sigma-algebras of subsets of a given sample space S. Show that B1 ∩B2 is also a sigma-algebra.
Does this conclusion hold for the union (B1 ∪ B2 )?

11. Assume U is a random variable with a uniform distribution on the interval (0, 1), i.e., FU (x) = x for 0 < x < 1
and FU (x) = 0 if x ≤ 0 and FU (x) = 1 if x ≥ 1. Let X = F −1 (U ) where F is an arbitrary absolutely continuous
cumulative distribution function. Show that the cumulative distribution function of X is F .
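The construction in this problem is the inverse-transform sampling method. A sketch with one concrete, illustrative choice of F, the exponential CDF F (x) = 1 − e^(−x):

```python
import math
import random

# Inverse-transform sampling: X = F^{-1}(U) with U ~ Unif(0, 1).
# For F(x) = 1 - exp(-x), the inverse is F^{-1}(u) = -log(1 - u).
random.seed(3)
xs = [-math.log(1.0 - random.random()) for _ in range(200_000)]
# The empirical CDF at x = 1 should approach F(1) = 1 - e^{-1}.
ecdf_at_1 = sum(v <= 1.0 for v in xs) / len(xs)
print(ecdf_at_1)
```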
12. Let (S, B, P ) be a probability space.
(i) Show that for any A ∈ B, P (A) = Σni=1 P (A ∩ Ci ) for any finite partition {Ci }ni=1 ∈ B.
(ii) Consider the finite version of Boole’s Inequality for any finite sequence of events {Ai }ni=1 ∈ B:

P (∪ni=1 Ai ) ≤ Σni=1 P (Ai )

Given that, show the following version of the Bonferroni Inequality:

P (∩ni=1 Ai ) ≥ Σni=1 P (Ai ) − (n − 1)

(iii) Let {Ci }∞i=1 ∈ B be an infinite sequence of events such that C1 ⊂ C2 ⊂ · · · . It is true that:

P (∪∞i=1 Ci ) = limn→∞ P (Cn )

Now let {Di }∞i=1 ∈ B be an infinite sequence of events such that D1 ⊃ D2 ⊃ · · · . Using the previous result, show that:

P (∩∞i=1 Di ) = limn→∞ P (Dn )

13. Suppose that X has a normal distribution with mean 0 and variance 1. Its probability density function (p.d.f.)
is given by
fX (x) = (1/√(2π)) e^(−x²/2) , −∞ < x < ∞

Find the p.d.f. for Y = X 2 .
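Whatever form the answer takes, the CDF method behind it can be spot-checked by simulation at a single point, e.g. P (Y ≤ 1) = P (−1 ≤ X ≤ 1) = 2Φ(1) − 1 ≈ 0.6827 (a sanity check, not the derivation itself):

```python
import random

# Spot check of the CDF method: estimate P(X^2 <= 1) for X ~ N(0, 1)
# and compare with the exact value 2*Phi(1) - 1, about 0.6827.
random.seed(4)
n = 200_000
p_hat = sum(random.gauss(0.0, 1.0) ** 2 <= 1.0 for _ in range(n)) / n
print(p_hat)  # near 0.6827
```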


14. The Poisson distribution is defined as: X ∼ Pois(λ) such that

Pk = P (X = k) = e^(−λ) λ^k / k!, k = 0, 1, 2, 3, · · ·

Compute E(a^X ), where a is an arbitrary constant.

15. In quantum mechanics we call spin an intrinsic characteristic carried by elementary particles. Photons, the fundamental particles (yes!) of light, have spin 1. In particular, the angular momentum Sz of a particle that has spin 1 can take the values −ℏ, 0 and ℏ, where ℏ is a constant (the Planck constant divided by 2π). We know that, in a given quantum state,

E(Sz ) = ℏ/3 and E(Sz²) = 2ℏ²/3

Find the probability distribution P (Sz ).
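One way to organize the computation is as a small linear system in the three unknown probabilities, working in units of ℏ. A sketch using exact fractions (note it also reveals the answer, so work the problem first if you prefer):

```python
from fractions import Fraction

# Moment conditions in units of hbar:
#   p_m + p_0 + p_p = 1        (normalization)
#  -p_m       + p_p = 1/3      (E(Sz)   = hbar / 3)
#   p_m       + p_p = 2/3      (E(Sz^2) = 2 hbar^2 / 3)
p_p = (Fraction(1, 3) + Fraction(2, 3)) / 2  # add the last two equations
p_m = Fraction(2, 3) - p_p
p_0 = 1 - p_m - p_p
print(p_m, p_0, p_p)
```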

16. The geometric distribution is defined as: X ∼ Geom(p) such that

Pk = P (X = k) = p(1 − p)^k , k = 0, 1, 2, 3, · · · (1)

This distribution is interpreted as the number of failures before the first success, i.e., X represents the number of failures. For example, X = 3 means that you flip a coin 4 times, obtaining a failure in the first three flips and a success in the fourth.
(a) Show that equation (1) is normalized.
(b) Compute E(X)
(c) The probability of a student being approved in the Statistics course is p = 0.3 (maybe I’m being a bit dramatic). On average, how many times will he fail until being approved?
(d) The previous result is not an integer. What does this mean? What is the most likely number of times he will attend Statistics?
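Part (c) can be cross-checked by direct simulation of the failure count; the sample mean should agree with the formula from (b) evaluated at p = 0.3:

```python
import random
import statistics

# Simulate X ~ Geom(0.3) as the number of failures before the first
# success, and inspect the sample mean.
random.seed(5)

def failures_before_success(p):
    k = 0
    while random.random() >= p:  # this trial is a failure
        k += 1
    return k

draws = [failures_before_success(0.3) for _ in range(200_000)]
print(statistics.fmean(draws))
```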
17. Two quantum particles are moving along the x axis with joint probability distribution p(x1 , x2 ) given by

p(x1 , x2 ) = (1/(πx0²)) ((x2 − x1 )/x0 )² exp(−(x1² + x2²)/x0²)

Find the marginal distributions for particle 1 and particle 2. Are they independent?
Given:

∫∞−∞ e^(−αx²) dx = √(π/α)

18. Nanoparticles produced by chemical synthesis have a distribution for their diameter that can be approximated by a lognormal distribution: D ∼ LN(d0 , σd²) such that

p(d) = (1/(√(2π) σd d)) exp(−ln²(d/d0 )/(2σd²)), d > 0

where d0 and σd are parameters. Show that the volume V = πD³/6 is also lognormally distributed and find the parameters of the distribution.
19. Let S be a sample space that consists of four points, each with probability 1/4. Find three events that are
pairwise independent but not independent. Generalize.
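A brute-force harness for testing candidate configurations; the two-coin events below are one classical candidate, offered as an illustration to experiment with rather than as the unique answer:

```python
from itertools import product

# Enumerate a 4-point sample space (two fair coin flips) and test a
# candidate triple: A = first flip is H, B = second flip is H,
# C = both flips are equal.
space = set(product("HT", repeat=2))  # 4 equally likely points

A = {s for s in space if s[0] == "H"}
B = {s for s in space if s[1] == "H"}
C = {s for s in space if s[0] == s[1]}

def pr(event):
    return len(event) / len(space)

pairwise = all(pr(X & Y) == pr(X) * pr(Y)
               for X, Y in [(A, B), (A, C), (B, C)])
mutual = pr(A & B & C) == pr(A) * pr(B) * pr(C)
print(pairwise, mutual)  # pairwise holds, mutual independence fails
```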
20. Let {Xj } be a sequence of independent, identically distributed positive random variables such that E(Xj ) =
a < ∞ and E(1/Xj ) = b < ∞, and let Sn = Σnj=1 Xj .
(a) Show: E(Xj /Sn ) = 1/n if j ≤ n, and E(Xj /Sn ) = aE(1/Sn ) if j > n.
(b) Show: E(Sm /Sn ) = m/n if m ≤ n and E(Sm /Sn ) = 1 + (m − n)aE(1/Sn ) if m > n.
21. Suppose that {Ei }ni=1 is a collection of events in Ω.
(a) If E1 , · · · , En are independent, so are E1 , · · · , En−1 , Enc .
(b) If {Ei }ni=1 is an independent set, so is {Fi }ni=1 , where each Fi is either Ei or Eic .
(c) {Ei }ni=1 is an independent set of events if and only if {χEi }ni=1 is an independent set of random variables.
We define χEi in the following way:

χEi (ω) = 1 if ω ∈ Ei , and 0 otherwise.

22. Suppose you repeat a Bernoulli experiment n times independently (i.e., Xi ∼iid Bern(p)).

(i) First show that if we define Y = Σni=1 Xi , then Y ∼ Bin(n, p).
(ii) Now take the limit in which the probability of success goes to zero and you repeat the experiment
many times (n → ∞), satisfying the condition that n · p is fixed and finite. Show that in this case the
distribution of Y can be approximated by a Poisson distribution with parameter λ = n · p. This is known as the
Poisson paradigm.
(iii) The probability of a Brazilian (population = 200 million) accessing the site www.euamoestatistica.com.br
on a certain day is p = 10−9 . Estimate the probability that at least two people visit this site in one day.
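For part (iii), the Poisson paradigm gives λ = n · p = 0.2, and the requested probability is 1 − P (Y = 0) − P (Y = 1). A sketch of that arithmetic:

```python
import math

# Poisson approximation for part (iii): lambda = n * p, then
# P(Y >= 2) = 1 - P(Y = 0) - P(Y = 1) = 1 - exp(-lam) * (1 + lam).
n, p = 200_000_000, 1e-9
lam = n * p  # = 0.2
p_at_least_two = 1.0 - math.exp(-lam) * (1.0 + lam)
print(p_at_least_two)  # about 0.0175
```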

23. Let S be a sample space and ∅ ≠ B ⊂ P(S). Show that the following statements are equivalent: B is closed
with respect to:
(i) finite union and difference
(ii) finite union and proper difference, defined as A\B with B ⊂ A
(iii) finite intersection and symmetric difference, defined as A∆B := (A\B) ∪ (B\A) = (A ∪ B)\(A ∩ B)
(iv) disjoint finite union, proper difference and finite intersection
24. Let S be the sample space. R ⊂ 2S is defined as a σ-ring if it is closed under countable unions and differences.
Show that if R is a σ-ring, then R is a σ-algebra if and only if S ∈ R.
25. Let (S, B, P ) be a probability space. Show that if E, F ∈ B, then P (E ∪ F ) + P (E ∩ F ) = P (E) + P (F ).

26. Let (S, B, P ) be a probability space. Show that if E, F ∈ B and P (E∆F ) = 0, then P (E) = P (F ).


27. Casella 1.2
28. Casella 1.9

29. Casella 1.11


30. Casella 1.12
31. Casella 1.35
32. Casella 1.38

33. Casella 2.18


34. Casella 3.40
35. Casella 3.44
