Solutions
$$\frac{n}{\theta} + \sum_{i=1}^{n} \log X_i = 0.$$
Therefore, the MLE is $\hat{\theta} = -n/\sum_{i=1}^{n} \log X_i = -1/\log T$, where $T$ is the
geometric mean from part (a).
(a) The log-likelihood is
$$\ell(\theta) = \sum_{i=1}^{n} \left\{ X_i \log\theta + \log(1-\theta) \right\} = (\log\theta)\sum_{i=1}^{n} X_i + n\log(1-\theta).$$
Setting the derivative with respect to $\theta$ equal to zero gives
$$\frac{1}{\theta}\sum_{i=1}^{n} X_i - \frac{n}{1-\theta} = 0.$$
The solution to this is the MLE: $\hat{\theta} = T/(n+T)$, where $T = \sum_{i=1}^{n} X_i$.
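As a quick numerical sanity check (my addition, not part of the original solution), the closed-form MLE can be compared against simulated geometric data; the parameter value and sample size below are arbitrary choices:

```python
import numpy as np

# Check that θ̂ = T/(n+T) recovers the true θ for Geometric(θ) data with
# PMF f(x) = θ^x (1-θ), x = 0, 1, 2, ...  (θ = 0.6 is an arbitrary choice.)
rng = np.random.default_rng(0)
theta, n = 0.6, 100_000
# rng.geometric counts trials until the first success, so subtracting 1
# gives the number of failures before the first success, which matches the
# PMF above when the success probability is 1 - θ.
x = rng.geometric(1 - theta, size=n) - 1
T = x.sum()
theta_mle = T / (n + T)
print(theta_mle)  # close to 0.6 for large n
```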
(b) The distribution is a member of the regular exponential family:
$$f(x) = \theta^{x}(1-\theta) = \exp\{(\log\theta)x + \log(1-\theta)\}.$$
Therefore, by Theorem 7.5.2, $T = \sum_{i=1}^{n} X_i$ is a complete sufficient statistic.
(c) To find the MVUE of $\theta$, we need a function $g(T)$ that's unbiased; by the
Lehmann-Scheffé theorem, this $g(T)$ must be the MVUE. To start, recall that
$T$ (a sum of independent geometric RVs) has a negative binomial distribution.¹
That is, the PMF of T is
$$f_{T,\theta}(t) = \binom{n+t-1}{n-1}\theta^{t}(1-\theta)^{n}, \qquad t = 0, 1, 2, \ldots$$
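This PMF can be checked empirically (my addition): sum $n$ independent geometric variables and compare the observed frequencies against the formula. The values of $n$, $\theta$, and the evaluation point are arbitrary choices:

```python
import math
import numpy as np

def nb_pmf(t, n, theta):
    # The negative binomial PMF above: C(n+t-1, n-1) θ^t (1-θ)^n
    return math.comb(n + t - 1, n - 1) * theta**t * (1 - theta)**n

rng = np.random.default_rng(1)
n, theta, reps = 3, 0.5, 200_000
# Each row is one sample X_1, ..., X_n of failures-before-first-success.
x = rng.geometric(1 - theta, size=(reps, n)) - 1
T = x.sum(axis=1)
emp = np.mean(T == 2)            # empirical P(T = 2)
print(emp, nb_pmf(2, n, theta))  # the two should agree closely
```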
Since the MLE $\hat{\theta} = T/(n+T)$ is a reasonable choice, let's first try $g(T) =
T/(n+T)$. The expected value looks like:
$$E_{\theta}[T/(n+T)] = \sum_{t=0}^{\infty} \frac{t}{n+t}\binom{n+t-1}{n-1}\theta^{t}(1-\theta)^{n}.$$
There's some potential for cancellation if we instead take $g(T) = T/(n+T-1)$.
In this case,
$$\begin{aligned}
E_{\theta}[T/(n+T-1)] &= \sum_{t=0}^{\infty} \frac{t}{n+t-1}\binom{n+t-1}{n-1}\theta^{t}(1-\theta)^{n} \\
&= \sum_{t=0}^{\infty} \frac{t}{n+t-1}\cdot\frac{(n+t-1)!}{t!\,(n-1)!}\,\theta^{t}(1-\theta)^{n} \\
&= \theta\sum_{t=1}^{\infty} \frac{(n+t-2)!}{(t-1)!\,(n-1)!}\,\theta^{t-1}(1-\theta)^{n} \\
&= \theta\sum_{u=0}^{\infty} \binom{n+u-1}{n-1}\theta^{u}(1-\theta)^{n} \\
&= \theta.
\end{aligned}$$
The next-to-last line follows from a change of variable: $u = t-1$. Therefore,
$T/(n+T-1)$ is an unbiased estimator of $\theta$; moreover, since it's a function of
the complete sufficient statistic, it must be the MVUE.
¹Each geometric RV $X_i$ counts the number of trials until the first success; $T = \sum_{i=1}^{n} X_i$ counts the
number of trials until the $n$-th success.
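The two expectations in part (c) can also be evaluated numerically by truncating the infinite sums. This sketch (my addition, with arbitrary $n$ and $\theta$) shows that $g(T) = T/(n+T-1)$ is unbiased while the MLE $T/(n+T)$ is not:

```python
import math

def nb_pmf(t, n, theta):
    # PMF of T: C(n+t-1, n-1) θ^t (1-θ)^n
    return math.comb(n + t - 1, n - 1) * theta**t * (1 - theta)**n

n, theta = 5, 0.3
# Truncate the infinite sums at t = 400; the neglected tail is negligible
# for this θ.  (The t = 0 term vanishes, so the sums start at t = 1.)
E_unbiased = sum(t / (n + t - 1) * nb_pmf(t, n, theta) for t in range(1, 400))
E_mle = sum(t / (n + t) * nb_pmf(t, n, theta) for t in range(1, 400))
print(E_unbiased)  # equals θ = 0.3 up to truncation error
print(E_mle)       # strictly less than θ: the MLE is biased downward
```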
5. (Graduate only)
(a) The estimator $\hat{m} = \frac{1}{n}\sum_{i=1}^{n} K(X_i)$ is unbiased for $m = E_{\theta}[K(X_1)]$ by definition.
Since it's a function of the complete sufficient statistic $T = \sum_{i=1}^{n} K(X_i)$,
it must be the MVUE according to Lehmann-Scheffé.
(b) Here I prove that $\hat{m}$ is the MVUE of $m$ by showing that its variance is equal
to the Cramér-Rao lower bound.
Observe that $V_{\theta}(\hat{m}) = V_{\theta}(T/n) = V_{\theta}(T)/n^{2}$. By Theorem 7.5.1(3),
$$V_{\theta}(T) = n\{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)\}/p'(\theta)^{3},$$
which implies
$$V_{\theta}(\hat{m}) = \frac{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)}{n\,p'(\theta)^{3}}.$$
For the Cramér-Rao lower bound, I first need the Fisher information. The
second derivative of $\log f(x)$ is
$$(\partial^{2}/\partial\theta^{2}) \log f(x) = p''(\theta)K(x) + q''(\theta).$$
Therefore,
$$I(\theta) = -E_{\theta}\left[(\partial^{2}/\partial\theta^{2}) \log f(X_1)\right] = -p''(\theta)E_{\theta}[K(X_1)] - q''(\theta) = \frac{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)}{p'(\theta)},$$
where the last equality uses the result in Theorem 7.5.1(2) to express
$E_{\theta}[K(X_1)]$ in terms of $p$ and $q$. Since the estimand is $m$, not just $\theta$, the
Cramér-Rao lower bound for estimating $m(\theta)$ involves the derivative of
$m(\theta)$. Since $m = E_{\theta}[K(X_1)] = -q'(\theta)/p'(\theta)$ from Theorem 7.5.1(2), its
derivative with respect to $\theta$ is
$$m'(\theta) = \frac{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)}{p'(\theta)^{2}}.$$
Therefore, the Cramér-Rao lower bound is
$$\frac{(m')^{2}}{nI(\theta)} = \frac{p''(\theta)q'(\theta) - p'(\theta)q''(\theta)}{n\,p'(\theta)^{3}},$$
which equals $V_{\theta}(\hat{m})$, so $\hat{m}$ attains the bound and is the MVUE.
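To make the general identity concrete, one can plug in a specific member of the family. The geometric distribution from the previous problem, written as $f(x) = \exp\{(\log\theta)x + \log(1-\theta)\}$, has $p(\theta) = \log\theta$, $K(x) = x$, and $q(\theta) = \log(1-\theta)$; this instance, and the values of $\theta$ and $n$, are my own choices for illustration:

```python
# Verify numerically that V(m̂) = (m')² / (n I(θ)) for the geometric family
# with p(θ) = log θ and q(θ) = log(1-θ), at an arbitrary θ and n.
theta, n = 0.4, 10
p1 = 1 / theta             # p'(θ)
p2 = -1 / theta**2         # p''(θ)
q1 = -1 / (1 - theta)      # q'(θ)
q2 = -1 / (1 - theta)**2   # q''(θ)

A = p2 * q1 - p1 * q2      # common numerator p''q' - p'q''
V_mhat = A / (n * p1**3)   # V(T)/n² via Theorem 7.5.1(3)
m1 = A / p1**2             # m'(θ)
I = A / p1                 # Fisher information I(θ)
crlb = m1**2 / (n * I)     # Cramér-Rao lower bound for m(θ)
print(V_mhat, crlb)        # the two agree; both equal θ/((1-θ)² n) here
```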