
STA 113 HW 6

(Provided by Andrew Dreher)


October 31, 2004
6.9
Each of 150 newly manufactured items is examined and the number of scratches per item is recorded (the items are supposed to be free of scratches), yielding the following data:

Number of scratches per item:   0   1   2   3   4   5   6   7
Observed frequency:            18  37  42  30  13   7   2   1

Let X = the number of scratches in a randomly chosen item, and assume that X has a Poisson distribution with parameter $\lambda$.
Part a
Find an unbiased estimator of $\lambda$ and compute the estimate for the data. [Hint: $E(X) = \lambda$ for X Poisson, so $E(\bar{X}) = \,?$]
$$\hat\lambda = \bar{x} = \frac{X_1 + \cdots + X_{150}}{150} = \frac{0\cdot 18 + 1\cdot 37 + 2\cdot 42 + 3\cdot 30 + 4\cdot 13 + 5\cdot 7 + 6\cdot 2 + 7\cdot 1}{150} = \frac{317}{150} = 2.113$$
Part b
What is the standard deviation (standard error) of your estimator? Compute the estimated standard error. [Hint: $\sigma_X^2 = \lambda$ for X Poisson.]
For a Poisson r.v., $E(X) = V(X) = \lambda$. However, we need $V(\bar{X}) = \lambda/n$, so $\sigma_{\bar{X}} = \sqrt{\lambda/n}$.
$$\hat\sigma_{\bar{X}} = \sqrt{\frac{\hat\lambda}{n}} = \sqrt{\frac{317/150}{150}} = 0.1187$$
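As a quick numerical check of parts (a) and (b), the estimate and its estimated standard error can be reproduced with a short Python sketch using the frequency table above:

```python
import math

# Observed data: number of scratches (0..7) and how many of the 150 items had each count
scratches = [0, 1, 2, 3, 4, 5, 6, 7]
freq = [18, 37, 42, 30, 13, 7, 2, 1]

n = sum(freq)                                        # 150 items
total = sum(s * f for s, f in zip(scratches, freq))  # 317 scratches in all

lam_hat = total / n              # unbiased estimator: the sample mean, about 2.113
se_hat = math.sqrt(lam_hat / n)  # estimated standard error sqrt(lambda_hat / n), about 0.1187

print(lam_hat, se_hat)
```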
6.11
Of $n_1$ randomly selected male smokers, $X_1$ smoked filtered cigarettes, whereas of $n_2$ randomly selected female smokers, $X_2$ smoked filtered cigarettes. Let $p_1$ and $p_2$ denote the probabilities that a randomly selected male and female, respectively, smoke filtered cigarettes.
Part a
Show that $\dfrac{X_1}{n_1} - \dfrac{X_2}{n_2}$ is an unbiased estimator for $p_1 - p_2$. [Hint: $E(X_i) = n_i p_i$ for $i = 1, 2$.]
To show that the estimator is unbiased, we require that its expected value equal $p_1 - p_2$.
$$E\left(\frac{X_1}{n_1} - \frac{X_2}{n_2}\right) = \frac{1}{n_1}E(X_1) - \frac{1}{n_2}E(X_2) = \frac{1}{n_1}n_1 p_1 - \frac{1}{n_2}n_2 p_2 = p_1 - p_2$$
Thus, the estimator is unbiased, since
$$E\left(\frac{X_1}{n_1} - \frac{X_2}{n_2}\right) = p_1 - p_2.$$
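A small Monte Carlo check of this unbiasedness claim, with arbitrary illustrative values for $n_1$, $n_2$, $p_1$, and $p_2$ (these are my own choices, not from the problem):

```python
import random

random.seed(1)
n1, n2, p1, p2 = 50, 80, 0.4, 0.7
reps = 100_000

total = 0.0
for _ in range(reps):
    x1 = sum(random.random() < p1 for _ in range(n1))  # Binomial(n1, p1) draw
    x2 = sum(random.random() < p2 for _ in range(n2))  # Binomial(n2, p2) draw
    total += x1 / n1 - x2 / n2

print(total / reps)   # close to p1 - p2 = -0.3
```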
Part b
What is the standard error of the estimator in part (a)?
To find the standard error, the variance is first needed:
$$V\left(\frac{X_1}{n_1} - \frac{X_2}{n_2}\right) = \left(\frac{1}{n_1}\right)^2 V(X_1) + \left(\frac{1}{n_2}\right)^2 V(X_2) = \left(\frac{1}{n_1}\right)^2 n_1 p_1 q_1 + \left(\frac{1}{n_2}\right)^2 n_2 p_2 q_2 = \frac{p_1 q_1}{n_1} + \frac{p_2 q_2}{n_2}$$
So, the standard error is
$$\text{S.E.} = \sqrt{\frac{p_1 q_1}{n_1} + \frac{p_2 q_2}{n_2}},$$
where $q_i = 1 - p_i$ for $i = 1, 2$.
Part c
How would you use the observed values $x_1$ and $x_2$ to estimate the standard error of your estimator?
Plug the observed values into the S.E. formula: replace $p_i$ with $\hat{p}_i = x_i/n_i$ and $q_i$ with $\hat{q}_i = 1 - \hat{p}_i$ for $i = 1, 2$. This gives the estimated standard error. Note that $n_1$ and $n_2$ must also be known: if they are given, the computation is straightforward; if they are not, a realistic estimate cannot be obtained.
Part d
If $n_1 = n_2 = 200$, $x_1 = 127$, and $x_2 = 176$, use the estimator of part (a) to obtain an estimate of $p_1 - p_2$.
Let $\hat{p} = \hat{p}_1 - \hat{p}_2$:
$$\hat{p} = \hat{p}_1 - \hat{p}_2 = \frac{X_1}{n_1} - \frac{X_2}{n_2} = \frac{127}{200} - \frac{176}{200} = -0.245$$
Part e
Use the results from part (c) and the data of part (d) to estimate the standard error of the estimator.
$$\widehat{\text{S.E.}} = \sqrt{\frac{\hat{p}_1\hat{q}_1}{n_1} + \frac{\hat{p}_2\hat{q}_2}{n_2}}$$
$$\hat{p}_1 = \frac{127}{200} = 0.635, \qquad \hat{p}_2 = \frac{176}{200} = 0.880$$
$$\hat{q}_1 = 1 - \hat{p}_1 = 0.365, \qquad \hat{q}_2 = 1 - \hat{p}_2 = 0.120$$
$$\widehat{\text{S.E.}} = \sqrt{\frac{0.635 \cdot 0.365}{200} + \frac{0.880 \cdot 0.120}{200}} = 0.0411$$
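The numbers in parts (d) and (e) can be reproduced with a minimal Python sketch:

```python
import math

n1, n2 = 200, 200
x1, x2 = 127, 176

p1_hat = x1 / n1             # 0.635
p2_hat = x2 / n2             # 0.880

diff_hat = p1_hat - p2_hat   # estimate of p1 - p2: -0.245

# Estimated standard error: plug p_hat_i and q_hat_i = 1 - p_hat_i into the SE formula
se_hat = math.sqrt(p1_hat * (1 - p1_hat) / n1 + p2_hat * (1 - p2_hat) / n2)  # about 0.0411

print(diff_hat, se_hat)
```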
6.19
An investigator wishes to estimate the proportion of students at a certain university who have violated the honor code. Having obtained a random sample of n students, she realizes that asking each, "Have you violated the honor code?" will probably result in some untruthful responses. Consider the following scheme, called a randomized response technique. The investigator makes up a deck of 100 cards, of which 50 are of type I and 50 are of type II.
Type I: Have you violated the honor code (yes or no)?
Type II: Is the last digit of your telephone number a 0, 1, or 2 (yes or no)?
Each student in the random sample is asked to mix the deck, draw a card, and answer the resulting question truthfully. Because of the irrelevant question on type II cards, a yes response no longer stigmatizes the respondent, so we assume that responses are truthful. Let p denote the proportion of honor-code violators (i.e., the probability that a randomly selected student is a violator), and let $\lambda = P(\text{yes response})$. Then $\lambda$ and p are related by $\lambda = 0.5p + (0.5)(0.3)$.
Part a
Let Y denote the number of yes responses, so $Y \sim \text{Bin}(n, \lambda)$. Thus $Y/n$ is an unbiased estimator of $\lambda$. Derive an estimator for p based on Y. If n = 80 and y = 20, what is your estimate? [Hint: Solve $\lambda = 0.5p + 0.15$ for p and then substitute $Y/n$ for $\lambda$.]
From a tree diagram it is easy to see that the estimated probability, $\hat\lambda$, of a yes answer is $0.5\hat{p} + 0.15$:
$$\hat\lambda = 0.5\hat{p} + 0.15 \implies \hat{p} = 2\hat\lambda - 0.3 = 2\frac{Y}{n} - 0.3$$
Plugging in n = 80 and y = 20, it is found that $\hat{p}$ is
$$\hat{p} = 2\cdot\frac{20}{80} - 0.3 = 0.2$$
Part b
Use the fact that $E\left(\frac{Y}{n}\right) = \lambda$ to show that your estimator $\hat{p}$ is unbiased. For the estimator to be unbiased, we need $E(\hat{p}) = p$.
$$E(\hat{p}) = E\left(2\frac{Y}{n} - 0.3\right)$$
Using the properties of linear combinations, $E(\hat{p}) = 2E\left(\frac{Y}{n}\right) - 0.3$, so
$$E(\hat{p}) = 2\lambda - 0.3 = 2(0.5p + 0.15) - 0.3 = p.$$
Part c
If there were 70 type I and 30 type II cards, what would be your estimator for p?
$$\frac{Y}{n} = \hat\lambda = 0.7\hat{p} + (0.30)(0.30)$$
$$\hat{p} = \frac{\hat\lambda - 0.09}{0.70} = \frac{100\hat\lambda - 9}{70}$$
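Parts (a) and (c) follow the same pattern: solve $\lambda = (\text{type I proportion})\cdot p + (\text{type II proportion})\cdot 0.3$ for p and plug in $\hat\lambda = Y/n$. A hedged sketch of that computation (the function name and arguments are my own, not from the text):

```python
def estimate_p(y, n, type1_prop):
    """Randomized-response estimator: solve lambda = type1_prop*p + (1 - type1_prop)*0.3
    for p, with lambda estimated by y/n."""
    lam_hat = y / n
    return (lam_hat - (1 - type1_prop) * 0.3) / type1_prop

# Part (a): 50 type I / 50 type II cards, n = 80, y = 20  ->  p_hat = 2*(20/80) - 0.3 = 0.2
print(estimate_p(20, 80, 0.5))

# Part (c): 70 type I / 30 type II cards  ->  p_hat = (y/n - 0.09) / 0.70
print(estimate_p(20, 80, 0.7))   # same y and n, just to illustrate the formula
```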
6.20
A random sample of n bike helmets manufactured by a certain company is selected. Let X = the number among the n that are flawed, and let p = P(flawed). Assume that only X is observed, rather than a sequence of S's and F's.
Part a
Derive the maximum likelihood estimator of p. If n = 20 and x = 3, what is the estimate?
$$P(x) = L(p) = \binom{n}{x} p^x (1-p)^{n-x}$$
Now, we take the natural log of L(p) and then differentiate:
$$\ln L(p) = \ln\binom{n}{x} + x\ln(p) + (n-x)\ln(1-p)$$
$$\frac{d}{dp}\ln L(p) = \frac{x}{p} - \frac{n-x}{1-p}$$
Setting $\frac{d}{dp}\ln L(p) = 0$ and solving gives the mle $\hat{p} = x/n$. So, for n = 20 and x = 3,
$$\frac{3}{\hat{p}} - \frac{17}{1-\hat{p}} = 0 \implies \hat{p} = \frac{3}{20} = 0.15$$
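As a sanity check on the algebra, a short grid search over candidate values of p confirms that the binomial log-likelihood with n = 20 and x = 3 peaks at x/n = 0.15 (this numerical check is my own addition, not part of the original solution):

```python
import math

n, x = 20, 3

def log_lik(p):
    # log of C(n, x) * p^x * (1-p)^(n-x)
    return math.log(math.comb(n, x)) + x * math.log(p) + (n - x) * math.log(1 - p)

# Evaluate over a fine grid of p in (0, 1) and pick the maximizer
grid = [i / 10000 for i in range(1, 10000)]
p_best = max(grid, key=log_lik)
print(p_best)        # 0.15, i.e. x/n
```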
Part b
Is the estimator of part (a) unbiased?
For the estimator to be unbiased, we require $E(\hat{p}) = p$. We now check whether this holds.
$$E(\hat{p}) = E\left(\frac{X}{n}\right) = \frac{1}{n}E(X)$$
Since X has a binomial distribution, $E(X) = np$, so
$$E(\hat{p}) = \frac{1}{n}np = p.$$
Hence, the derived estimator is unbiased.
Part c
If n = 20 and x = 3, what is the mle of the probability $(1-p)^5$ that none of the next five helmets examined is flawed?
Using the Invariance Principle, we can use $\hat{p}$, which was found in part (a):
$$\widehat{(1-p)^5} = (1-\hat{p})^5 = (1-0.15)^5 = 0.4437$$
So, the estimated probability that none of the next five helmets examined is flawed is 0.4437.
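In code, the invariance computation is just plugging the mle into the expression; a one-line check:

```python
p_hat = 3 / 20             # mle from part (a)
print((1 - p_hat) ** 5)    # about 0.4437
```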
6.23
Two different computer systems are monitored for a total of n weeks. Let $X_i$ denote the number of breakdowns of the first system during the ith week, and suppose that the $X_i$'s are independent and drawn from a Poisson distribution with parameter $\lambda_1$. Similarly, let $Y_i$ denote the number of breakdowns of the second system during the ith week, and assume independence with each $Y_i$ Poisson with parameter $\lambda_2$. Derive the mles of $\lambda_1$, $\lambda_2$, and $\lambda_1 - \lambda_2$. [Hint: Using independence, write the joint pmf (likelihood) of the $X_i$'s and $Y_i$'s together.]
First, I will calculate $\hat\lambda_1$.
$$p(x_1, \ldots, x_n; \lambda_1) = \frac{e^{-n\lambda_1}\,\lambda_1^{\sum_{i=1}^n x_i}}{\prod_{i=1}^n x_i!}$$
The log-likelihood is
$$\ln L(\lambda_1) = -n\lambda_1 + \ln(\lambda_1)\sum_{i=1}^n x_i - \ln\left(\prod_{i=1}^n x_i!\right)$$
$$\frac{d}{d\lambda_1}\ln L(\lambda_1) = -n + \frac{\sum x_i}{\lambda_1} = 0$$
$$n = \frac{\sum x_i}{\lambda_1} \implies \hat\lambda_1 = \frac{\sum x_i}{n} = \bar{x}$$
Next, I will calculate $\hat\lambda_2$.
$$p(y_1, \ldots, y_n; \lambda_2) = \frac{e^{-n\lambda_2}\,\lambda_2^{\sum_{i=1}^n y_i}}{\prod_{i=1}^n y_i!}$$
The log-likelihood is
$$\ln L(\lambda_2) = -n\lambda_2 + \ln(\lambda_2)\sum_{i=1}^n y_i - \ln\left(\prod_{i=1}^n y_i!\right)$$
$$\frac{d}{d\lambda_2}\ln L(\lambda_2) = -n + \frac{\sum y_i}{\lambda_2} = 0$$
$$n = \frac{\sum y_i}{\lambda_2} \implies \hat\lambda_2 = \frac{\sum y_i}{n} = \bar{y}$$
So, using the above results (and the invariance property of mles), the mle for $\lambda_1 - \lambda_2$ is simply $\bar{x} - \bar{y}$.
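A minimal sketch of the result with made-up weekly counts (the data below are illustrative only, not from the problem):

```python
# Hypothetical weekly breakdown counts for the two systems over n = 5 weeks
x = [2, 0, 3, 1, 2]   # system 1
y = [1, 1, 0, 2, 0]   # system 2

n = len(x)
lam1_hat = sum(x) / n            # mle of lambda_1: x-bar
lam2_hat = sum(y) / n            # mle of lambda_2: y-bar
diff_hat = lam1_hat - lam2_hat   # mle of lambda_1 - lambda_2, by invariance

print(lam1_hat, lam2_hat, diff_hat)
```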
6.32
Part a
Let $X_1, \ldots, X_n$ be a random sample from a uniform distribution on $[0, \theta]$. Then the mle of $\theta$ is $\hat\theta = Y = \max(X_i)$. Use the fact that $Y \le y$ iff each $X_i \le y$ to derive the cdf of Y. Then show that the pdf of $Y = \max(X_i)$ is
$$f_Y(y) = \begin{cases} \dfrac{ny^{n-1}}{\theta^n} & 0 \le y \le \theta \\ 0 & \text{otherwise} \end{cases}$$
$$F(y) = P(X_1 \le y, \ldots, X_n \le y), \qquad 0 \le y \le \theta$$
Since all the $X_i$'s are independent, we have
$$F(y) = P(X_1 \le y)\cdots P(X_n \le y) = \prod_{i=1}^n P(X_i \le y)$$
Because the distribution is uniform, $P(X_i \le y) = \frac{y}{\theta}$. Thus,
$$F(y) = \left(\frac{y}{\theta}\right)^n, \qquad 0 \le y \le \theta$$
Now, $f_Y(y) = \frac{d}{dy}F(y)$:
$$f_Y(y) = \frac{d}{dy}\frac{y^n}{\theta^n} = \frac{ny^{n-1}}{\theta^n}, \qquad 0 \le y \le \theta$$
Hence,
$$f_Y(y) = \begin{cases} \dfrac{ny^{n-1}}{\theta^n} & 0 \le y \le \theta \\ 0 & \text{otherwise} \end{cases}$$
Part b
Use the result from part (a) to show that the mle is biased but that $\frac{n+1}{n}\max(X_i)$ is unbiased.
$$E(\hat\theta) = E(Y) = \int_0^\theta y\, f_Y(y)\,dy = \int_0^\theta y\,\frac{ny^{n-1}}{\theta^n}\,dy = \int_0^\theta \frac{ny^n}{\theta^n}\,dy = \frac{n}{\theta^n}\int_0^\theta y^n\,dy = \frac{n}{\theta^n}\cdot\frac{1}{n+1}y^{n+1}\Big|_0^\theta = \frac{n}{\theta^n}\cdot\frac{\theta^{n+1}}{n+1} = \frac{n}{n+1}\theta$$
Therefore, the estimator is biased. However, if we multiply $\hat\theta$ by $\frac{n+1}{n}$, the estimator would be unbiased. So, $\frac{n+1}{n}\hat\theta$ is an unbiased estimator; that is, $\frac{n+1}{n}\max(X_i)$ is unbiased.
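A small simulation makes the bias and its correction visible; θ, n, and the number of replications below are arbitrary choices of mine:

```python
import random

theta, n, reps = 10.0, 5, 200_000
random.seed(0)

mle_sum = 0.0
for _ in range(reps):
    y = max(random.uniform(0, theta) for _ in range(n))  # mle = max of n Uniform(0, theta) draws
    mle_sum += y

mle_mean = mle_sum / reps
print(mle_mean)                 # close to n/(n+1) * theta = 8.33, i.e. biased low
print((n + 1) / n * mle_mean)   # close to theta = 10: the corrected estimator is unbiased
```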
