
Answer – First Internal Assessment

1. True/False/Uncertain. Please briefly explain (no explanation, no points). Answer any five.
(5 × 7 = 35 points)

a. The result E[g(X)] = g[E(X)] holds for g(X) = 2X + 4 and for g(X) = X².


False. The result E[g(X)] = g[E(X)] holds only for linear (affine) functions of X, because expectation is a linear operator. It therefore holds for g(X) = 2X + 4, but for g(X) = X² we have E(X²) = [E(X)]² + Var(X), so equality fails whenever Var(X) > 0.
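A quick numerical check (not part of the original answer sheet; a Python sketch in which X is taken to be U[0, 1] purely for illustration) makes the distinction concrete:

```python
import random

random.seed(0)
# Draw X ~ U[0, 1]; any distribution with positive variance works here.
xs = [random.random() for _ in range(100_000)]
mean_x = sum(xs) / len(xs)

# Linear case g(X) = 2X + 4: E[g(X)] equals g(E[X]) exactly.
lhs_lin = sum(2 * x + 4 for x in xs) / len(xs)
rhs_lin = 2 * mean_x + 4

# Quadratic case g(X) = X^2: E[X^2] exceeds (E[X])^2 by Var(X).
lhs_sq = sum(x * x for x in xs) / len(xs)
rhs_sq = mean_x ** 2

print(lhs_lin - rhs_lin)  # ~0 (floating-point noise only)
print(lhs_sq - rhs_sq)    # ~1/12, the variance of U[0, 1]
```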

b. If the joint pdf of (X, Y) is f(x, y) = xy + y if a < x < b and c < y < x, and f(x, y) = 0 otherwise, then X and Y are not independent.

True. Because the support of Y depends on the value of X, the conditional pdf of Y given X cannot equal the marginal pdf of Y, so X and Y are not independent.

c. The only difference between F(x) and F(x|y) is that in the latter case you know the
value of y, but the distribution of X is the same in both cases.

The statement is False. Suppose Pr(X = 0) = Pr(X = 1) = 0.5 and Y = X. Then F_X(0.5) = 0.5, but F_{X|Y}(0.5 | 1) = Pr(X ≤ 0.5 | Y = 1) = 0. The statement is true only when X and Y are independent.

d. For two nonzero probability events to be independent they must be based on two
different experiments. If not, they are dependent. For example, consider tossing two
coins. The outcome of the first flip is independent of the outcome of the second flip,
because they are based on two different and independent flips (experiments).

False: Independence has nothing to do with the "number of experiments." If it did, then
we would always face existential questions like "Is flipping a coin twice one experiment
or two?" Additionally, consider selecting an individual at random from the MSE student
population. There is no reason that the events "the student is male" and "the student is
originally from Chennai" cannot be independent. Clearly, this is only one experiment;
the events can potentially be independent.

e. Assume that events A and B exhaust S. Then, the intersection between the union of A
and B and the intersection of A and B, is the empty set.
False. Since A ∩ B ⊆ A ∪ B, for any A and B that exhaust S we have (A ∪ B) ∩ (A ∩ B) = A ∩ B. This equals the empty set only if A and B are disjoint, which the statement does not assume.
f. Whenever you need to perform a transformation of one or more random variables,
you need to compute the Jacobian.
False. The Jacobian is not always needed: it arises only in the change-of-variables formula for continuous random variables. For discrete random variables, for example, we simply re-map the probability masses.

2. Suppose that X, Y and e are random variables with Y = a + bX + e. Assume E(X) = 0, Var(X) = σ², e has a uniform distribution over [−1/2, 1/2], and Cov(X, e) = 0. (25 points)

a. What is the pdf of Y|X?

b. Calculate E(Y|X) and Var(Y|X).
c. What are E(Y) and Var(Y)?

a)
The distribution of Y|X is the distribution of a + bX + e, where e is the only random variable and a + bX is fixed. Let K = a + bX. The distribution of K + e is the distribution of e shifted by K, i.e. U[K − 1/2, K + 1/2]. Thus Y|X ~ U[a + bX − 1/2, a + bX + 1/2].

b)

For a random variable Z with a uniform distribution U[a, b], f_Z(z) = 1/(b − a), so

E(Z) = ∫_a^b z/(b − a) dz = [z²/(2(b − a))]_a^b = (b² − a²)/(2(b − a)) = (a + b)/2

E(Z²) = ∫_a^b z²/(b − a) dz = [z³/(3(b − a))]_a^b = (b³ − a³)/(3(b − a)) = (a² + ab + b²)/3

Var(Z) = E(Z²) − [E(Z)]² = (a² + ab + b²)/3 − (a + b)²/4 = (b − a)²/12

Therefore, for Y|X ~ U[a + bX − 1/2, a + bX + 1/2]:

E(Y|X) = ((a + bX − 1/2) + (a + bX + 1/2))/2 = a + bX

Var(Y|X) = ((a + bX + 1/2) − (a + bX − 1/2))²/12 = 1/12

c)

E(Y) = E(a + bX + e) = a + bE(X) + E(e) = a

Var(Y) = Var(a + bX + e) = b² Var(X) + Var(e) + 2b Cov(X, e)
       = b²σ² + [0.5 − (−0.5)]²/12 + 0 = b²σ² + 1/12
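These moments can be sanity-checked by simulation. In this sketch the parameter values a = 1, b = 2, σ = 3 are arbitrary illustrative choices, and X is simulated as normal (any zero-mean distribution with variance σ² would do):

```python
import random
import statistics

random.seed(1)
a, b, sigma = 1.0, 2.0, 3.0   # arbitrary illustrative values
n = 200_000

# X ~ N(0, sigma^2) has E(X) = 0 and Var(X) = sigma^2;
# e ~ U[-1/2, 1/2] is drawn independently, so Cov(X, e) = 0.
ys = [a + b * random.gauss(0.0, sigma) + random.uniform(-0.5, 0.5)
      for _ in range(n)]

print(statistics.fmean(ys))      # ≈ a = 1
print(statistics.pvariance(ys))  # ≈ b²σ² + 1/12 ≈ 36.083
```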

Or

Assume that X is distributed uniformly over the interval [0, 4].

a. Calculate the moment generating function of X. Use the MGF to find the
mean and variance of X.
b. Use the Chebyshev inequality to calculate an upper bound on the probability
that X is outside the interval [0.5, 3.5].
c. Now use the fact that X is distributed uniformly over the interval [0, 4] to
calculate the probability that X is outside the interval [0.5, 3.5]. Is it higher or
lower than the answer you got in part b?

a)
The moment generating function is M_X(θ) = E(e^(θX)). Since X ~ U[0, 4],

M_X(θ) = ∫_0^4 (1/4) e^(θx) dx = (e^(4θ) − 1)/(4θ)

Differentiating and evaluating at θ = 0 (each limit below follows from L'Hôpital's rule):

E(X) = M′_X(0) = lim_{θ→0} (4θe^(4θ) − e^(4θ) + 1)/(4θ²) = lim_{θ→0} 16θe^(4θ)/(8θ) = 2

E(X²) = M″_X(0) = lim_{θ→0} (8θ²e^(4θ) − 4θe^(4θ) + e^(4θ) − 1)/(2θ³) = lim_{θ→0} 32θ²e^(4θ)/(6θ²) = 16/3

Var(X) = E(X²) − [E(X)]² = 16/3 − 4 = 4/3
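Since M_X(θ) has only a removable singularity at θ = 0, these limits can also be checked numerically with finite differences (a sketch; the step size h = 1e−4 is an arbitrary choice, small enough for accuracy but large enough to avoid floating-point cancellation):

```python
import math

def mgf(t: float) -> float:
    """MGF of U[0, 4]; the t = 0 value is the limit, M_X(0) = 1."""
    if t == 0.0:
        return 1.0
    return (math.exp(4 * t) - 1) / (4 * t)

h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)               # central first difference
m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2   # central second difference

print(m1)            # ≈ E(X) = 2
print(m2)            # ≈ E(X²) = 16/3
print(m2 - m1 ** 2)  # ≈ Var(X) = 4/3
```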

b)
The Chebyshev inequality states that:

Pr(|X − E(X)| ≥ θ) ≤ Var(X)/θ²

Here E(X) = 2, Var(X) = 4/3, and θ = 3/2, so

Pr(X ∉ (0.5, 3.5)) = Pr(|X − 2| ≥ 3/2) ≤ (4/3)/(3/2)² = 16/27

c)

Pr(X ∉ (0.5, 3.5)) = ∫_0^0.5 (1/4) dx + ∫_3.5^4 (1/4) dx = [x/4]_0^0.5 + [x/4]_3.5^4 = 1/8 + 1/8 = 1/4

This exact probability is lower than the Chebyshev bound of 16/27 ≈ 0.59 from part b), as it must be: Chebyshev gives only an upper bound.
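The gap between the exact probability and the Chebyshev bound can be illustrated by simulation (a sketch, not required by the question):

```python
import random

random.seed(2)
n = 100_000
# Draw X ~ U[0, 4] and count how often it falls outside (0.5, 3.5).
xs = [random.uniform(0.0, 4.0) for _ in range(n)]
outside = sum(1 for x in xs if x <= 0.5 or x >= 3.5) / n

chebyshev_bound = (4 / 3) / (3 / 2) ** 2  # 16/27 ≈ 0.593

print(outside)          # ≈ 0.25, the exact value from part c)
print(chebyshev_bound)
```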
3. Suppose that the joint pdf of two random variables X and Y is: (40 marks)

f(x, y) = e^(−x−y) if 0 < x < ∞ and 0 < y < ∞
        = 0 otherwise

a. Check whether the pdf is correctly specified.


b. Find the joint cdf of X and Y, F(x,y). Compute Pr( X < 1, Y <1).
c. Find the marginal distribution of X, f(x) and the conditional distribution of X
given Y = y, f(x|y). Check both are well defined pdfs.
d. Define Pr(X-Y > 1) and Pr(X+Y >1, X >Y). You do not need to solve the
integrals.
e. Calculate the cdf of the random variable: X/Y.

Since the exponential function is always positive, the pdf satisfies the positivity requirement. Therefore, we need only check that the pdf integrates to 1 over the entire sample space. Notice that X and Y are independent, since f(x, y) = e^(−x−y) = e^(−x) e^(−y) = f_X(x) f_Y(y).

a)

∫_0^∞ ∫_0^∞ e^(−x) e^(−y) dy dx = ∫_0^∞ e^(−x) dx ∫_0^∞ e^(−y) dy = [−e^(−x)]_0^∞ [−e^(−y)]_0^∞ = (0 + 1)(0 + 1) = 1

Therefore the pdf is correctly specified.


b)

The joint cdf is:


F(x, y) = ∫_0^x ∫_0^y e^(−u) e^(−v) dv du = ∫_0^x e^(−u) du ∫_0^y e^(−v) dv = [−e^(−u)]_0^x [−e^(−v)]_0^y = (1 − e^(−x))(1 − e^(−y))

Pr(X < 1, Y < 1) = F(1, 1) = (1 − e^(−1))(1 − e^(−1)) ≈ 0.40
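Because X and Y here are independent Exponential(1) variables, the value (1 − e⁻¹)² ≈ 0.40 can be checked by simulation (a sketch, not part of the original answer):

```python
import math
import random

random.seed(3)
n = 200_000
# Each trial draws an independent (X, Y) pair of Exponential(1) variates
# and checks whether both fall below 1.
hits = sum(
    1 for _ in range(n)
    if random.expovariate(1.0) < 1.0 and random.expovariate(1.0) < 1.0
)

estimate = hits / n
exact = (1 - math.exp(-1)) ** 2

print(estimate, exact)  # estimate ≈ exact ≈ 0.40
```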


c)
As X and Y are independent,

f(x) = f(x|y) = e^(−x)

From part a), both the marginal and the conditional distribution integrate to 1, so each is a well-defined pdf (the Exponential(1) density).
d)

Pr(X − Y > 1) = Pr(Y < X − 1) = ∫_1^∞ ∫_0^(x−1) e^(−x−y) dy dx

Pr(X + Y > 1, X > Y) = ∫_0.5^1 ∫_(1−x)^x e^(−x−y) dy dx + ∫_1^∞ ∫_0^x e^(−x−y) dy dx

e)
Let us define Z = X/Y. For z > 0,

F(z) = Pr(Z ≤ z) = Pr(X/Y ≤ z) = Pr(X ≤ zY) = ∫_0^∞ ∫_(x/z)^∞ e^(−x−y) dy dx = ∫_0^∞ e^(−x) e^(−x/z) dx
     = ∫_0^∞ e^(−x(1+1/z)) dx = 1/(1 + 1/z) = z/(z + 1)
