
STA347 Problem Set

Problem 1. Using the definition of probability measure, prove that
$$P(A \cup B) = P(A) + P(B) - P(A \cap B).$$
Problem 2. Let $A_n$ be a sequence of events. Prove Boole's inequality, that is,
$$P\Big(\bigcup_{n=1}^{\infty} A_n\Big) \le \sum_{n=1}^{\infty} P(A_n).$$

Problem 3. Suppose $A_n$ is a sequence of events decreasing monotonically to $\emptyset$, that is, $A_1 \supseteq \cdots \supseteq A_{n-1} \supseteq A_n \supseteq A_{n+1} \supseteq \cdots$ and $\bigcap_{n=1}^{\infty} A_n = \emptyset$. Prove that $\lim_{n\to\infty} P(A_n) = 0$.
Problem 4. For $P(A) > 0$, define $Q_A(B) = P(B \mid A) = P(B \cap A)/P(A)$. Prove that $Q_A$ is a probability measure.
Problem 5. Show that the distribution function $F$ of $X$ satisfies the following.
(a) $F$ is non-decreasing.
(b) $\lim_{x\to\infty} F(x) = 1$ and $\lim_{x\to-\infty} F(x) = 0$.
(c) $F$ is right-continuous, that is, $F(x+) = \lim_{h \downarrow 0} F(x + h) = F(x)$.
Problem 6. Suppose $X \sim$ Poisson($\lambda$). Prove that
$$\sum_{n=0}^{\infty} P(X > n) = E(X).$$
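A quick numerical sanity check of this identity (not a proof), using scipy with an assumed value $\lambda = 2.5$ and the tail sum truncated at $n = 200$:

from scipy.stats import poisson

lam = 2.5
# poisson.sf(n, lam) = P(X > n); truncate the infinite sum at n = 200
tail_sum = sum(poisson.sf(n, lam) for n in range(200))
print(tail_sum, lam)  # both should be approximately 2.5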

Problem 7. Let $X$ be a random variable. Define $X_+ = \max(0, X)$ and $X_- = \max(0, -X)$. Show that $X = X_+ - X_-$ and $|X| = X_+ + X_-$.
Problem 8. Prove that if $X_n \ge 0$, then
$$E\big(\liminf_{n\to\infty} X_n\big) \le \liminf_{n\to\infty} E(X_n).$$

Problem 9. Write the probability density/mass function, probability generating function, moment generating function, cumulant generating function and characteristic function of the following distributions.
(a) Bernoulli($p$); (b) Binomial($n, p$); (c) Poisson($\lambda$); (d) Geometric($p$); (e) Negative Binomial($r, p$); (f) $N(\mu, \sigma^2)$; (g) Uniform($\theta_1, \theta_2$); (h) Gamma($\alpha, \beta$); (i) Beta($\alpha, \beta$); and (j) Exponential($\lambda$) $\equiv$ Gamma($1, \lambda$).
Problem 10. Suppose $E(|X|) < \infty$, $F$ is the distribution function of $X$ and $f(x) = F'(x)$ is the derivative of $F$. Prove that
$$\int_0^{\infty} (1 - F(x))\,dx - \int_{-\infty}^{0} F(x)\,dx = E(X).$$
Problem 11. Suppose $X, Y$ have the joint density $f(x, y)$. Prove that $W = \int x f(x, Y)/f_Y(Y)\,dx$ satisfies, for any set $B$, $E(1_A W) = E(1_A X)$ where $A = \{Y \in B\}$.

Problem 12. Prove Slutsky's Theorem.


Problem 13. Suppose $X_i$'s are i.i.d. random variables having $E(X_i) = \mu$ and $E(X_i^2) = \sigma^2$. Let $\bar{X} = (X_1 + \cdots + X_n)/n$ and $S^2 = (n-1)^{-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$. Prove that $E(\bar{X}) = \mu$ and $E(S^2) = \mathrm{Var}(X_i)$.
Problem 14. Let $X_1, \ldots, X_n$ be a random sample from Poisson($\lambda$).
(a) Find the distribution of $T = X_1 + \cdots + X_n$.
(b) Compute $E(T)$, $E(T^2)$, $E(T^3)$ and $E(T^4)$.
Problem 15. Let $X_1, \ldots, X_n$ be a random sample from a distribution having a density
$$f(x) = I(x \ge \theta)\,c(\theta)/x^4 \quad \text{for } \theta > 0.$$
(a) Find $c(\theta)$.
(b) Find the density of $X_{(1)} = \min(X_1, \ldots, X_n)$.
(c) Compute the mean and variance of $X_1$ and $X_{(1)}$.
Problem 16. Assume $X_1, \ldots, X_n$ are i.i.d. random variables sampled from Uniform($\theta - 1$, $\theta + 1$). Find the mean and variance of $\bar{X}_n = (X_1 + \cdots + X_n)/n$, $X_{(1)} = \min(X_1, \ldots, X_n)$ and $X_{(n)} = \max(X_1, \ldots, X_n)$.

Problem 17. Suppose that $X_1 \sim N(\mu, \sigma^2)$ and $X_2 \sim N(3\mu, 4\sigma^2)$ are independent.
(a) Show that $T_{a,b} = aX_1 + bX_2$ is normally distributed.
(b) Compute the mean and variance of $T_{a,b} = aX_1 + bX_2$.
(c) Find a condition on $a, b$ so that $E(T_{a,b}) = \mu$.
(d) Find $a, b$ so that $\mathrm{Var}(T_{a,b})$ is smallest subject to $E(T_{a,b}) = \mu$.
Problem 18. Let $X_1, \ldots, X_n$ be a random sample from the probability density function given by $f(x) = I(x > \mu)\,\sigma^{-1}\exp(-(x - \mu)/\sigma)$ where $\theta = (\mu, \sigma)$. Compute the mean and variance of $X_{(1)}$ and $\bar{X}$, where $X_{(1)} = \min(X_1, \ldots, X_n)$.
Problem 19. Let $X_1, \ldots, X_n$ be an i.i.d. sample from a distribution having density $f(x) = I(x > 0)\,\theta^{-1}\exp(-x/\theta)$.
(a) Find the density function of $T = X_1 + \cdots + X_n$.
(b) Compute the mean and variance of $T$.
Problem 20. Let $X_1, \ldots, X_n$ be a random sample from Uniform($\theta_1, \theta_2$) whose density is $I(\theta_1 \le x \le \theta_2)/(\theta_2 - \theta_1)$.
(a) Show that $n(X_{(1)} - \theta_1)$ converges in distribution.
(b) Show that $n(\theta_2 - X_{(n)})$ converges in distribution.
(c) Prove or disprove that $n(\theta_2 - X_{(n)} + X_{(1)} - \theta_1)$ converges in distribution.

Problem 21. $X_i \sim$ i.i.d. Uniform($0, \theta$) for $i = 1, \ldots, n$.
(a) Find the distribution of $Z$ such that $n(\theta - X_{(n)}) \xrightarrow{d} Z$.
(b) Find $c > 0$ such that $E(T_c) = \theta$ where $T_c = cX_{(n)}$.
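A simulation sketch for part (a), intended only as an empirical check; $\theta = 2$, $n = 500$ and the replication count are assumptions:

import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 500, 10_000
samples = rng.uniform(0.0, theta, size=(reps, n))
z = n * (theta - samples.max(axis=1))          # replicates of n*(theta - X_(n))
print("empirical mean:", z.mean())
print("empirical quartiles:", np.quantile(z, [0.25, 0.5, 0.75]))
# Compare these summaries with the candidate limiting distribution Z.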
Problem 22. Suppose $E(|X|) < \infty$, $F$ is the distribution function of $X$ and $f(x) = F'(x)$ is the derivative of $F$. Prove that
$$\int_0^{\infty} (1 - F(x))\,dx - \int_{-\infty}^{0} F(x)\,dx = E(X).$$

Problem 23. Suppose $X_i$'s are i.i.d. random variables having $E(X_i) = \mu$ and $E(X_i^2) = \sigma^2$ for $i = 1, \ldots, n$. Let $\bar{X} = (X_1 + \cdots + X_n)/n$ and $S^2 = (n-1)^{-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$. Compute the mean of $\bar{X}$ and $S^2$.

Problem 24. Suppose $\phi_X$ and $\phi_Y$ are the characteristic functions of $X$ and $Y$, respectively.
(a) Prove that $X$ is symmetric if and only if $\phi_X$ is real-valued.
(b) Find the characteristic function of $aX + b$ using $\phi_X$.
(c) Prove $|\phi_X|^2$ is also a characteristic function.
(d) Prove $(\phi_X + \phi_Y)/2$ is also a characteristic function.
Problem 25. (a) Find the density function of $X_1/X_2$ when $X_i \sim$ i.i.d. $N(0, \sigma^2)$ for $i = 1, 2$.
(b) Find the characteristic function of $X \sim$ Cauchy($0, \sigma^2$) having density $(\sigma/\pi)/(x^2 + \sigma^2)$.
(c) Suppose $X_i \sim$ i.i.d. Cauchy($0, \sigma^2$) for $i = 1, \ldots, n$. Find the distribution of $\bar{X}$.
Problem 26. Two random variables $X$ and $Y$ are independent. If $X \sim$ Poisson($\lambda$) and $X + Y \sim$ Poisson($\lambda + \mu$), then find the distribution of $Y$.
Problem 27. Suppose $X_n \sim$ Binomial($n, \lambda/n$). Prove that $X_n \xrightarrow{d}$ Poisson($\lambda$).
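A numerical illustration (not a proof) that the Binomial($n, \lambda/n$) pmf is already close to the Poisson($\lambda$) pmf for moderate $n$; $\lambda = 3$ and $n = 200$ are assumed values:

from scipy.stats import binom, poisson

lam, n = 3.0, 200
for k in range(8):
    # Binomial(n, lam/n) pmf vs Poisson(lam) pmf at k
    print(k, binom.pmf(k, n, lam / n), poisson.pmf(k, lam))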


Problem 28. Prove the following memoryless properties.
(a) If $X \sim$ Exponential($\lambda$), then $P(X > a + b \mid X > a) = P(X > b)$ for all positive real numbers $a$ and $b$.
(b) If $X \sim$ Geometric($p$), then $P(X > a + b \mid X > a) = P(X > b)$ for all positive integers $a$ and $b$.
Problem 29. Suppose $X \mid Y = y \sim N(y, \sigma^2)$ and $Y \sim N(0, \tau^2)$.
(a) What is the marginal distribution of $X$?
(b) What is the conditional distribution of $Y$ given $X = x$?
Problem 30. Assume $X_i \sim$ i.i.d. $N(0, \sigma^2)$ for $i = 1, 2, \ldots, n$.
(a) Show that $X_i^2/\sigma^2 \sim$ i.i.d. $\chi^2(1) \equiv$ Gamma($1/2, 1/2$).
(b) Find the kurtosis of $X_i$, i.e., $E[(X - E(X))^4]$.
Problem 31. Let $X_1, \ldots, X_n$ be a random sample from a distribution having a density $f(x) = c(\theta)\,x^2\, I(0 \le x \le \theta)$.
(a) Compute $c(\theta)$.
(b) Show that $X_{(n)} = \max(X_1, \ldots, X_n)$ converges to $\theta$ in probability.
(c) Prove or disprove that $X_{(n)} = \max(X_1, \ldots, X_n)$ converges to $\theta$ almost surely.

Problem 32. Assume $X_1, \ldots, X_n$ are i.i.d. random variables from Poisson($\lambda$).
(a) Find the moment generating function of $X_i$.
(b) Show that $T = X_1 + \cdots + X_n$ also has a Poisson distribution.
(c) Assume $X \sim$ Poisson($\lambda$) and $Y \sim$ Poisson($\mu$) are independent. Show that the conditional distribution of $X$ given $X + Y = t$ is Binomial($t, \lambda/(\lambda + \mu)$).
Problem 33. Determine whether the following statements are True or False.
(a) Assume $X \sim$ Poisson($\lambda$) and $Y \sim$ Poisson($\mu$). If $X$ and $Y$ are independent, then $X + Y \sim$ Poisson($\lambda + \mu$).
(b) $T = \mu + (X_1 - X_2)/2$ is a statistic.
(c) The maximum likelihood estimate and the method of moments estimate are always the same.
(d) {$N(\mu, \sigma^2)$ or Poisson($\lambda$)} is not a model.
(e) If $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} 1$, then $X_n/Y_n \xrightarrow{d} X$.


Problem 34. Assume $X \sim$ Gamma($\alpha, \beta$) having density $(\Gamma(\alpha)\beta^{\alpha})^{-1} x^{\alpha - 1}\exp(-x/\beta)\,I(x \ge 0)$ where $\Gamma(\alpha) = \int_0^{\infty} x^{\alpha - 1}\exp(-x)\,dx$.
(a) Prove $E(X^r) = \beta^r\,\Gamma(\alpha + r)/\Gamma(\alpha)$ for $r > -\alpha$.
(b) Compute the mean and variance of $X$.
Problem 35. The conditional expectation $E(X \mid Y)$ of $X$ given $Y$ is the random variable $Z = Z(Y)$ such that $E[(X - Z(Y))g(Y)] = 0$ for all bounded functions $g$.
(a) Prove that $E(E(X \mid Y)) = E(X)$.
(b) Show that $\mathrm{Var}(X) = \mathrm{Var}(E(X \mid Y)) + E(\mathrm{Var}(X \mid Y))$ where $\mathrm{Var}(X \mid Y) = E(X^2 \mid Y) - [E(X \mid Y)]^2$.
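A numerical check (not a proof) of part (b) in a simple assumed hierarchical model, $Y \sim N(0, 1)$ and $X \mid Y \sim N(Y, 4)$:

import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(0.0, 1.0, size=1_000_000)
x = rng.normal(y, 2.0)                  # X | Y ~ N(Y, 4), so Var(X | Y) = 4

var_x = x.var()
var_of_cond_mean = y.var()              # E(X | Y) = Y in this model
mean_of_cond_var = 4.0                  # Var(X | Y) = 4 is constant here
print(var_x, var_of_cond_mean + mean_of_cond_var)   # both approximately 5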
Problem 36. Consider a probability density function
$$f(x, y \mid \theta) = \begin{cases} c(\theta)\,x^2 y & \text{if } 0 \le x \le y \le \theta \\ 0 & \text{otherwise.} \end{cases}$$
(a) Find $c(\theta)$.
(b) Prove or disprove that $X$ and $Y$ are independent.
Problem 37. Assume that $X_n \xrightarrow{p} X$ and $Y_n \xrightarrow{p} Y$. Using Slutsky's theorem, prove the following.
(a) $X_n + Y_n \xrightarrow{p} X + Y$.
(b) $X_n Y_n \xrightarrow{p} XY$.
Problem 38. Find a continuous function $f$ and a sequence $X_n \to X$ in $L^p$ but $f(X_n) \not\to f(X)$.

Problem 39. (Monte Carlo integration) Let $f$ be a measurable function on $[0, 1]$ with $\int_0^1 |f(x)|^2\,dx < \infty$. Let $U_1, U_2, \ldots$ be i.i.d. Uniform$[0, 1]$, and $I_n = (f(U_1) + \cdots + f(U_n))/n$. Show that $I_n \to I = \int_0^1 f(x)\,dx$ in probability and compute a convergence rate $P(|I_n - I| > \epsilon/n^{1/2})$ using Chebyshev's inequality.
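A minimal Monte Carlo sketch for this problem, with an assumed integrand $f(x) = x^2$ (true integral $1/3$); $n$, the number of replications and $\epsilon$ are also assumptions:

import numpy as np

rng = np.random.default_rng(2)
f = lambda x: x ** 2
true_value = 1.0 / 3.0

n, reps, eps = 1_000, 5_000, 0.5
u = rng.uniform(size=(reps, n))
i_n = f(u).mean(axis=1)                                   # one I_n per replication
freq = np.mean(np.abs(i_n - true_value) > eps / np.sqrt(n))
print("empirical P(|I_n - I| > eps/n^(1/2)):", freq)
# Chebyshev bounds this probability by Var(f(U_1))/eps^2, which does not depend on n.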
Problem 40. Let $X_n$ be an AR (autoregressive) process satisfying $X_0 = \mu$ and $X_n = (1 - \rho)\mu + \rho X_{n-1} + \epsilon_n$, where $|\rho| < 1$ and $\epsilon_n \sim$ i.i.d. $N(0, \sigma^2)$. Prove that $\bar{X}_n = (X_1 + \cdots + X_n)/n \to \mu$ in probability.
Problem 41. Let $X_1, X_2, \ldots$ be i.i.d. with $E|X_1| < \infty$. Show that $\max(X_1, \ldots, X_n)/n \to 0$ in probability.
Problem 42. Prove that $X_n \to X$ in probability if and only if there exist $\epsilon_n \downarrow 0$ such that $P(|X_n - X| > \epsilon_n) \le \epsilon_n$. Compare: $X_n \to X$ a.s. if $\sum_{n=0}^{\infty} P(|X_n - X| > \epsilon_n) < \infty$ for a sequence $\epsilon_n \downarrow 0$.
Problem 43. Suppose $X \ge 0$ and $E(X^2) < \infty$. Prove that $P(X > 0) \ge (E(X))^2/E(X^2)$.
Problem 44. Let $X_1, X_2, \ldots$ be independent random variables with $E(X_n) = 0$ and $\mathrm{Var}(X_n)/n \to 0$ as $n \to \infty$. Show that $\bar{X}_n = (X_1 + \cdots + X_n)/n \to 0$ in $L^2$.
Problem 45. A sequence of random variables $X_n$ is uniformly integrable if $\lim_{M\to\infty} \sup_n E(|X_n|\,1(|X_n| \ge M)) = 0$. Suppose $X_n \to X$ almost surely. Show the following conditions are equivalent:
(a) $X_n$ are uniformly integrable,
(b) $E(|X_n - X|) \to 0$,
(c) $E(|X_n|) \to E(|X|)$.
Problem 46. Let $X_n$ be a martingale with $E(X_1) = 0$ and $\sum_{n=1}^{\infty} E[(X_n - X_{n-1})^2] < \infty$. Show that $X_n$ converges almost surely. [Hint: $X_n^2$ is a submartingale and $E(X_n^2) = E(X_0^2) + \sum_{k=1}^{n} E[(X_k - X_{k-1})^2]$.]
Problem 47. Let $X_{n,i}$ be i.i.d. nonnegative integer-valued random variables with mean $\mu > 0$. Define $Z_0 = 1$ and $Z_{n+1} = X_{n+1,1} + \cdots + X_{n+1,Z_n}$ if $Z_n > 0$, and $Z_{n+1} = 0$ if $Z_n = 0$.
(a) Show that $Z_n/\mu^n$ is a martingale.
(b) Show that $Z_n \to 0$ if $\mu < 1$.
(c) Show that $Z_n \to 0$ if $\mu = 1$ and $P(X_{n,i} = 1) < 1$.
Problem 48. Let $X_1, X_2, \ldots$ be i.i.d. with $EX_n = 0$ and $E|X_n|^p < \infty$ for some $1 < p < 2$. Show that $(X_1 + \cdots + X_n)/n^{p/2}$ converges to $0$ a.s. You may skip this problem; it is a bit beyond our scope.
Problem 49. Show that $X_n \to X$ in probability if and only if $E[|X_n - X|/(1 + |X_n - X|)] \to 0$.
Problem 50. Let $X$ and $Y$ be i.i.d. from a distribution having a finite second moment. Assume also that $(X + Y)/\sqrt{2}$ and $X$ have the same distribution. Find the distribution of $X$.

Problem 51. Show that $X_n + Y_n \to X + Y$ in $L^p$ if $X_n \to X$ and $Y_n \to Y$ in $L^p$.


Problem 52. Let $X_1, X_2, \ldots$ be i.i.d. random variables satisfying $E(|X_n|) < \infty$. Show that $\bar{X}_n \to E(X_1)$ in $L^1$.
Problem 53. Let $X_1, X_2, \ldots$ be i.i.d. with finite second moment.
(a) Show that $\bar{X}_n = (X_1 + \cdots + X_n)/n$ converges to $E(X_1)$ in probability, in $L^2$ and almost surely.
(b) Show that $S_n^2 = [(X_1 - \bar{X}_n)^2 + \cdots + (X_n - \bar{X}_n)^2]/n$ converges to $\mathrm{Var}(X_1)$ in probability, in $L^1$ and almost surely.
Problem 54. Let $X_n$ be a homogeneous Markov chain whose transition matrix, with rows and columns indexed by the states $a, b, c$, is
$$p = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0.2 & 0.3 & 0.5 \end{pmatrix}.$$
(a) Find an irreducible set.
(b) Determine whether each state is recurrent or not.
(c) Find the period of each state.
(d) Prove or disprove the uniqueness of a stationary distribution.
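A numerical companion for parts (b)-(d), a check rather than a proof, with the states ordered $a, b, c$ as in the problem:

import numpy as np

P = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.2, 0.3, 0.5]])

# Inspect p^n for large n (it need not converge if some state is periodic).
print(np.linalg.matrix_power(P, 50))
print(np.linalg.matrix_power(P, 51))

# Any stationary distribution solves pi P = pi with the entries of pi summing to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
print(pi / pi.sum())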
Problem 55. Let $X_n$ be a HMC with state space $S = \{A, B\}$ and transition probability matrix, with rows and columns indexed by $A, B$,
$$p = \begin{pmatrix} 1 - \alpha & \alpha \\ \beta & 1 - \beta \end{pmatrix}.$$
(a) Compute $P_A(T_A = n)$ where $T_A$ is the first return time to $A$, that is, $T_A = \inf\{n \ge 1 : X_n = A\}$.
(b) Compute $E_A T_A$.
Problem 56. There is a plant species blooming in three different colors (red, white and pink). If pollinated within the same flower color group, the flower color of the offspring follows a homogeneous Markov chain having the transition probability matrix, with rows and columns in the order red, white, pink,
$$p = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 1/4 & 1/4 & 1/2 \end{pmatrix}.$$
(a) Compute the probability that a pink color flower is eventually absorbed into the red color flower group.
(b) Compute the expected time (in generations) until a pink color flower is absorbed into either the red or the white color flower group.
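A simulation sketch to cross-check parts (a) and (b) after solving them analytically; the state coding (0 = red, 1 = white, 2 = pink) and the number of runs are assumptions:

import numpy as np

P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.25, 0.25, 0.5]])
rng = np.random.default_rng(3)

runs, hits_red, times = 50_000, 0, []
for _ in range(runs):
    state, t = 2, 0                      # start from a pink-flowered plant
    while state == 2:                    # pink is the only non-absorbing state
        state = rng.choice(3, p=P[state])
        t += 1
    hits_red += (state == 0)
    times.append(t)

print("P(absorbed into red):", hits_red / runs)
print("E(generations to absorption):", np.mean(times))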

Problem 57. Let $X_n$ be a homogeneous Markov chain. Let $A$ be a closed subset of recurrent states and $B$ be the set of recurrent states not in $A$. Assume both $A$ and $B$ are nonempty. Define $h(x) = 1$ for all $x \in A$, $h(x) = 0$ for all $x \in B$ and $h(x) = \sum_{y \in S} p(x, y)h(y)$ for all $x \notin A \cup B$, where $p$ is the transition probability. Show that $h(X_n)$ is a martingale.
Problem 58. John is playing a gambling game. He gains a dollar when he tosses a fair coin and it lands heads; otherwise he loses a dollar. He starts with $3 and will stop gambling when either he loses all his money or his wealth reaches $5. Let $X_n$ be John's wealth at time $n$, which is known to be a homogeneous Markov chain.
(a) Specify the state space and the transition probability.
(b) Compute the probability that John's wealth reaches $5 before it reaches $0.
(c) Compute the expected time for John to stop gambling.
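A simulation sketch to sanity-check the answers to parts (b) and (c); the number of runs is an assumption:

import numpy as np

rng = np.random.default_rng(4)
runs, reaches_5, lengths = 50_000, 0, []
for _ in range(runs):
    wealth, steps = 3, 0
    while 0 < wealth < 5:
        wealth += rng.choice([-1, 1])    # fair coin: lose or gain one dollar
        steps += 1
    reaches_5 += (wealth == 5)
    lengths.append(steps)

print("P(reach $5 before $0):", reaches_5 / runs)
print("E(number of tosses until stopping):", np.mean(lengths))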
Problem 59. Let $X_n$ and $Y_n$ be two positive stochastic processes satisfying $E(X_{n+1} \mid X_0, \ldots, X_n) \le X_n Y_n$. Assume that the $Y_n$'s are functions of $X_0, \ldots, X_n$, that is, $Y_n = g_n(X_0, \ldots, X_n)$ for some functions $g_n$. Show that $Z_n$ defined by $Z_1 = X_1$ and $Z_n = X_n / \prod_{k=1}^{n-1} Y_k$ for $n \ge 2$ is a supermartingale.
Problem 60. Let $X_n$ and $Y_n$ be two positive stochastic processes satisfying $E(X_{n+1} \mid X_0, \ldots, X_n) \le X_n + Y_n$. Assume that the $Y_n$'s are functions of $X_0, \ldots, X_n$, that is, $Y_n = g_n(X_0, \ldots, X_n)$ for some functions $g_n$. Show that $Z_n$ defined by $Z_1 = X_1$ and $Z_n = X_n - \sum_{k=1}^{n-1} Y_k$ for $n \ge 2$ is a supermartingale.
Problem 61. Let $X_1, X_2, \ldots$ be an i.i.d. sequence of random variables with $E(|X_n|^k) < \infty$ for a positive integer $k$. Let $\mu_k = E(X_n^k)$. For a sequence of positive numbers $a_n$ with $a_n \uparrow \infty$, show that $Y_n = (X_1^k - \mu_k)/a_1 + \cdots + (X_n^k - \mu_k)/a_n$ is a martingale.

The following facts might be useful.

1. $e^x = \sum_{n=0}^{\infty} \dfrac{x^n}{n!}$ for any real number $x$.

2. $\log(1 - z) = -\sum_{n=1}^{\infty} \dfrac{z^n}{n}$ for $|z| < 1$.

3. $\sum_{n=0}^{\infty} r^n = \dfrac{1}{1 - r}$ and $\sum_{n=0}^{\infty} n r^n = \dfrac{r}{(1 - r)^2}$ for $|r| < 1$.

4. $\begin{pmatrix} 1 - a & a \\ b & 1 - b \end{pmatrix}^n = \begin{pmatrix} \pi_1 & \pi_2 \\ \pi_1 & \pi_2 \end{pmatrix} + (1 - a - b)^n \begin{pmatrix} \pi_2 & -\pi_2 \\ -\pi_1 & \pi_1 \end{pmatrix}$ where $\pi_1 = \dfrac{b}{a + b}$ and $\pi_2 = \dfrac{a}{a + b}$.

5. $\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1} = \dfrac{1}{ad - bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$ when $ad - bc \ne 0$.
