
Stat 411 Homework 07

Solutions

1. The distribution $N(\theta, 1)$ is a regular one-parameter exponential family with $K(x) = x$. Therefore, $T = \sum_{i=1}^n X_i$ is a complete sufficient statistic for $\theta$ and, consequently, the MVUE of $\theta$ is $\bar X = T/n$. It is easy to check that $\hat\eta = \bar X^2 - 1/n$ is an unbiased estimator of $\eta = \theta^2$: indeed, $E_\theta(\bar X^2) = V_\theta(\bar X) + \{E_\theta(\bar X)\}^2 = 1/n + \theta^2$. Since $\hat\eta$ is a function of the complete sufficient statistic, by the Lehmann–Scheffé theorem, it must be the MVUE.
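As a quick sanity check, here is a minimal Monte Carlo sketch (the parameter value, sample size, and seed are arbitrary choices) verifying that $\bar X^2 - 1/n$ averages out to $\theta^2$:

```python
# Monte Carlo check that eta-hat = X-bar^2 - 1/n is (approximately) unbiased
# for eta = theta^2 when X1,...,Xn are iid N(theta, 1).
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 1.7, 10, 200_000

xbar = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)
eta_hat = xbar**2 - 1.0/n

print(eta_hat.mean(), theta**2)  # both close to 2.89
```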
2. Problem 7.5.3 from HMC6/7.
(a) The beta distribution with PDF $f_\theta(x) = \theta x^{\theta-1}$, with $x \in (0,1)$ and $\theta > 0$, is a regular exponential family. That is,
$$f_\theta(x) = \exp\{\log\theta + (\theta-1)\log x\}.$$
Since $K(x) = \log x$ here, a complete sufficient statistic, based on $X_1, \ldots, X_n \overset{\text{iid}}{\sim} f_\theta(x)$, is $T_0 = \sum_{i=1}^n \log X_i$. So too is $T = \exp\{T_0/n\} = (X_1 X_2 \cdots X_n)^{1/n}$, the geometric mean, since the function $t \mapsto e^{t/n}$ is one-to-one.
(b) The log-likelihood function is
$$\ell(\theta) = \sum_{i=1}^n \log f_\theta(X_i) = n\log\theta + (\theta-1)\sum_{i=1}^n \log X_i.$$
Differentiating and setting equal to zero gives the likelihood equation
$$\frac{n}{\theta} + \sum_{i=1}^n \log X_i = 0.$$
Therefore, the MLE is $\hat\theta = -n/\sum_{i=1}^n \log X_i = -1/\log T$, a function of the geometric mean $T$ from part (a).
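As an illustration, a minimal simulation sketch (the value $\theta = 2.5$ is an arbitrary choice): the density $f_\theta(x) = \theta x^{\theta-1}$ on $(0,1)$ is the Beta$(\theta, 1)$ density, so draws can come from numpy's beta sampler.

```python
# Sketch: simulate from f_theta(x) = theta * x^(theta-1), i.e., Beta(theta, 1),
# and compute the MLE through the geometric mean T.
import numpy as np

rng = np.random.default_rng(0)
theta, n = 2.5, 5000

x = rng.beta(theta, 1.0, size=n)
T = np.exp(np.mean(np.log(x)))   # geometric mean (complete sufficient statistic)
theta_mle = -1.0/np.log(T)       # equivalently -n / sum(log x)

print(theta_mle)  # close to 2.5
```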

3. The moment-generating function $M_X(t)$ of $X$ is given by
\begin{align*}
M_X(t) = E_\theta(e^{tX}) &= \int e^{tx}\, e^{\theta x + S(x) + q(\theta)}\,dx \\
&= e^{q(\theta) - q(\theta+t)} \int e^{(\theta+t)x + S(x) + q(\theta+t)}\,dx \\
&= e^{q(\theta) - q(\theta+t)},
\end{align*}
where the last equality holds because the integrand in the second line is the exponential family PDF with parameter $\theta + t$ instead of $\theta$, and hence integrates to one.
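For a concrete instance, here is a sketch under the assumption that the family is the exponential distribution written as $f_\theta(x) = e^{\theta x + q(\theta)}$ with $\theta < 0$, $S(x) = 0$, and $q(\theta) = \log(-\theta)$; the formula then gives $M_X(t) = (-\theta)/(-\theta - t)$ for $t < -\theta$, which a simulation can confirm:

```python
# Sketch: check M_X(t) = exp{q(theta) - q(theta+t)} for the exponential
# distribution f_theta(x) = exp{theta*x + log(-theta)}, theta < 0 (rate -theta).
import numpy as np

rng = np.random.default_rng(0)
theta, t, reps = -2.0, 0.5, 500_000   # need t < -theta for the MGF to exist

x = rng.exponential(scale=-1.0/theta, size=reps)
mgf_mc = np.mean(np.exp(t*x))                      # Monte Carlo E[e^{tX}]
mgf_formula = np.exp(np.log(-theta) - np.log(-(theta + t)))

print(mgf_mc, mgf_formula)  # both close to 2/1.5 = 1.333...
```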
4. Problem 7.5.13 in HMC6 = Problem 7.5.12 in HMC7.
(a) The log-likelihood looks like
$$\ell(\theta) = \sum_{i=1}^n \bigl\{X_i \log\theta + \log(1-\theta)\bigr\} = (\log\theta)\sum_{i=1}^n X_i + n\log(1-\theta).$$
Differentiating and setting equal to zero gives the equation
$$\frac{1}{\theta}\sum_{i=1}^n X_i - \frac{n}{1-\theta} = 0.$$
The solution to this is the MLE: $\hat\theta = T/(n+T)$, where $T = \sum_{i=1}^n X_i$.
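A minimal simulation sketch (arbitrary $\theta$ and $n$): under the PMF $\theta^x(1-\theta)$ the success probability is $1-\theta$ and $X_i$ counts failures, while numpy's geometric sampler counts trials, so we subtract one.

```python
# Sketch: simulate geometric data with PMF theta^x * (1-theta), x = 0,1,2,...,
# and compute the MLE theta-hat = T/(n+T). numpy's geometric counts the number
# of trials (support 1,2,...), so subtract 1 to count failures.
import numpy as np

rng = np.random.default_rng(0)
theta, n = 0.3, 5000

x = rng.geometric(1.0 - theta, size=n) - 1  # failures before first success
T = x.sum()
theta_mle = T/(n + T)

print(theta_mle)  # close to 0.3
```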
(b) The distribution is a member of the regular exponential family:
$$f_\theta(x) = \theta^x(1-\theta) = \exp\{(\log\theta)x + \log(1-\theta)\}.$$
Therefore, by Theorem 7.5.2, $T = \sum_{i=1}^n X_i$ is a complete sufficient statistic.
(c) To find the MVUE of $\theta$, we need a function $g(T)$ that's unbiased; by the Lehmann–Scheffé theorem, this $g(T)$ must be the MVUE. To start, recall that $T$ (a sum of independent geometric RVs) has a negative binomial distribution.¹ That is, the PMF of $T$ is
$$f_{T,\theta}(t) = \binom{n+t-1}{n-1}\,\theta^t (1-\theta)^n, \quad t = 0, 1, 2, \ldots$$
Since the MLE $\hat\theta = T/(n+T)$ is a reasonable choice, let's first try $g(T) = T/(n+T)$. The expected value looks like
$$E_\theta\Bigl[\frac{T}{n+T}\Bigr] = \sum_{t=0}^\infty \frac{t}{n+t}\binom{n+t-1}{n-1}\,\theta^t (1-\theta)^n.$$
There's some potential for cancellation if we instead take $g(T) = T/(n+T-1)$. In this case,
\begin{align*}
E_\theta\Bigl[\frac{T}{n+T-1}\Bigr] &= \sum_{t=0}^\infty \frac{t}{n+t-1}\binom{n+t-1}{n-1}\,\theta^t (1-\theta)^n \\
&= \sum_{t=1}^\infty \frac{t}{n+t-1}\cdot\frac{(n+t-1)!}{t!\,(n-1)!}\,\theta^t (1-\theta)^n \\
&= \theta \sum_{t=1}^\infty \frac{(n+t-2)!}{(t-1)!\,(n-1)!}\,\theta^{t-1} (1-\theta)^n \\
&= \theta \sum_{u=0}^\infty \binom{n+u-1}{n-1}\,\theta^u (1-\theta)^n \\
&= \theta.
\end{align*}
The next-to-last line follows from a change of variable, $u = t-1$; the final sum equals one because it is the total mass of the negative binomial PMF. Therefore, $T/(n+T-1)$ is an unbiased estimator of $\theta$; moreover, since it's a function of the complete sufficient statistic, it must be the MVUE.
¹ Each geometric RV $X_i$ counts the number of failures before the first success; $T = \sum_{i=1}^n X_i$ counts the number of failures before the $n$-th success.
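A minimal Monte Carlo sketch (arbitrary settings) makes the contrast in part (c) visible: at small $n$ the MLE $T/(n+T)$ is noticeably biased downward, while $T/(n+T-1)$ is not.

```python
# Sketch: Monte Carlo comparison of the MLE T/(n+T) and the MVUE T/(n+T-1)
# for theta in the geometric model with PMF theta^x * (1-theta).
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 5, 200_000

x = rng.geometric(1.0 - theta, size=(reps, n)) - 1  # failures before success
T = x.sum(axis=1)

print(np.mean(T/(n + T)))      # noticeably below 0.3 (MLE is biased)
print(np.mean(T/(n + T - 1)))  # close to 0.3 (unbiased MVUE)
```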

5. (Graduate only)
(a) The estimator $\hat m = n^{-1}\sum_{i=1}^n K(X_i)$ is unbiased for $m = E_\theta[K(X_1)]$ by definition. Since it's a function of the complete sufficient statistic $T = \sum_{i=1}^n K(X_i)$, it must be the MVUE according to Lehmann–Scheffé.
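For a concrete check, consider the Beta$(\theta,1)$ family from Problem 2, where $p(\theta) = \theta - 1$, $K(x) = \log x$, and $q(\theta) = \log\theta$, so that $m = E_\theta[\log X_1] = -q'(\theta)/p'(\theta) = -1/\theta$. A minimal sketch:

```python
# Sketch: check that m-hat = mean of K(X_i) estimates m = E[K(X1)] = -q'/p'
# for the Beta(theta, 1) family: K(x) = log x, m = -1/theta.
import numpy as np

rng = np.random.default_rng(0)
theta, n = 2.5, 100_000

x = rng.beta(theta, 1.0, size=n)
m_hat = np.mean(np.log(x))

print(m_hat, -1.0/theta)  # both close to -0.4
```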
(b) Here I prove that $\hat m$ is the MVUE of $m$ by showing that its variance is equal to the Cramér–Rao lower bound.
Observe that $V_\theta(\hat m) = V_\theta(T/n) = V_\theta(T)/n^2$. By Theorem 7.5.1(3),
$$V_\theta(T) = n\{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)\}/p'(\theta)^3,$$
which implies
$$V_\theta(\hat m) = \frac{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)}{n\,p'(\theta)^3}.$$
For the Cramér–Rao lower bound, I first need the Fisher information. The second derivative of $\log f_\theta(x)$ is
$$(\partial^2/\partial\theta^2)\log f_\theta(x) = p''(\theta)K(x) + q''(\theta).$$
Therefore,
$$I(\theta) = -E_\theta\Bigl[\frac{\partial^2}{\partial\theta^2}\log f_\theta(X_1)\Bigr] = -p''(\theta)E_\theta[K(X_1)] - q''(\theta) = \frac{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)}{p'(\theta)},$$
where the last equality uses the result in Theorem 7.5.1(2) to express $E_\theta[K(X_1)]$ in terms of $p$ and $q$. Since the estimand is $m_\theta$, not just $\theta$, the Cramér–Rao lower bound for estimating $m(\theta)$ involves the derivative of $m(\theta)$. Since $m_\theta = E_\theta[K(X_1)] = -q'(\theta)/p'(\theta)$ from Theorem 7.5.1(2), its derivative with respect to $\theta$ is
$$m'_\theta = \frac{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)}{p'(\theta)^2}.$$
Therefore, the Cramér–Rao lower bound is
$$\mathrm{LB} = \frac{(m'_\theta)^2}{nI(\theta)} = \frac{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)}{n\,p'(\theta)^3}.$$
Since the Cramér–Rao lower bound is the same as $V_\theta(\hat m)$, there's no other unbiased estimator with smaller variance, i.e., $\hat m$ is the MVUE.
The result in this exercise shows that the Cramér–Rao lower bound is attained in an exponential family problem. There is a partial converse to this result which says, roughly, that if the Cramér–Rao lower bound is attained, then the problem must be of exponential family type: very interesting! See Lehmann and Casella (1998, p. 121) for the precise statement of this result.
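The identity $V_\theta(\hat m) = \mathrm{LB}$ can also be verified symbolically for a concrete family. The sketch below assumes the Poisson family written as $f_\theta(x) = \exp\{x\log\theta - \theta - \log x!\}$, so $p(\theta) = \log\theta$, $K(x) = x$, and $q(\theta) = -\theta$:

```python
# Symbolic check with sympy: for the Poisson family written as
# f_theta(x) = exp{x*log(theta) - theta - log x!}, verify V(m-hat) = LB.
import sympy as sp

theta, n = sp.symbols('theta n', positive=True)
p = sp.log(theta)   # coefficient of K(x) in the exponent
q = -theta          # normalizing term

# the common numerator p''q' - p'q''
D = sp.diff(p, theta, 2)*sp.diff(q, theta) - sp.diff(p, theta)*sp.diff(q, theta, 2)

m = -sp.diff(q, theta)/sp.diff(p, theta)    # E[K(X1)] = -q'/p'
V_mhat = D/(n*sp.diff(p, theta)**3)         # V(m-hat) via Theorem 7.5.1(3)
I = D/sp.diff(p, theta)                     # Fisher information per observation
LB = sp.diff(m, theta)**2/(n*I)             # Cramer-Rao bound for m(theta)

print(sp.simplify(m))            # theta (the Poisson mean)
print(sp.simplify(V_mhat))       # theta/n
print(sp.simplify(V_mhat - LB))  # 0, so the bound is attained
```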
