
Applied Probability (OS2103) Lecture Notes

Prof. James Eagle Rev: July 27, 2006


3 Discrete Random Variables and Probability Distributions
3.1 Random Variables
Definition: For a given sample space S of some experiment, a random variable (rv) is a
rule (or function) which maps S into the real line.

The random variable quantifies the experiment by assigning numbers to all possible
outcomes. For example, an experiment could be rolling red and green dice. Example
outcomes are (R3,G2) or (G6,R3). There are 36 such singleton events. One possible
random variable X associated with this experiment is

    X = 1, if the sum of the top faces is ≥ 8
        0, if the sum of the top faces is ≤ 7.
Figure 1: Two-dice random variable.
In this case, X is a function mapping the sample space S to either 0 or 1,
with P(X = 1) = 15/36 and P(X = 0) = 21/36.
Definition: A random variable which takes on only the values 0 and 1 is a Bernoulli
random variable.
Definition: A set is discrete if it contains either a finite number of elements or a countably
infinite number of elements. Examples:

    {1, 2, 10, −4}              a finite, discrete set
    {2, 4, 6, 8, 10, . . .}     a countably infinite, discrete set

Definition: A random variable is discrete if its set of possible values is a discrete set.
3.2 Probability Distributions for Discrete Random Variables
Definition: For discrete random variable X and real number x, the probability distri-
bution function (pdf) (defined for both discrete and continuous random variables) or
probability mass function (pmf) (defined only for discrete random variables) is

    p(x) = P(X = x) = P(all events s ∈ S such that X(s) = x).    (1)
Examples:

1. Experiment: Roll two dice. The discrete random variable illustrated in Figure 1 is

       X = 1, if the sum of the top faces is ≥ 8
           0, if the sum of the top faces is ≤ 7,

   with an associated pmf of

       p(x) = 15/36, x = 1
              21/36, x = 0.
2. Experiment: Roll two dice, and let the random variable X be the sum of the top faces.
   Then the pmf is

       p(x) = 1/36, x = 2      5/36, x = 6      3/36, x = 10
              2/36, x = 3      6/36, x = 7      2/36, x = 11
              3/36, x = 4      5/36, x = 8      1/36, x = 12
              4/36, x = 5      4/36, x = 9
Definition: The cumulative distribution function (cdf) F(x) for a discrete rv X with
pmf p(x) is defined as

    F(x) = P(X ≤ x) = Σ_{y: y ≤ x} p(y).    (2)
Example:

    p(x) = .1, x = 0           F(x) = .1,  x ∈ [0, 2)
           .3, x = 2                  .4,  x ∈ [2, 3)
           .2, x = 3                  .6,  x ∈ [3, 5)
           .1, x = 5                  .7,  x ∈ [5, 10)
           .3, x = 10                 1.0, x ≥ 10

Figure 2 shows plots of p(x) and F(x).
For rv X,

    P(a ≤ X ≤ b) = F(b) − F(a−),    (3)

where a− is the largest possible X value less than a. So, for example,

    P(2 ≤ X ≤ 3) = F(3) − F(0) = .6 − .1 = .5.
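This calculation is easy to check numerically. The following is a minimal sketch in base
MATLAB (no toolbox needed); the variable names are ours, not the text's:

    xs = [0 2 3 5 10];              % support of X
    ps = [.1 .3 .2 .1 .3];          % pmf values p(x)
    F  = @(x) sum(ps(xs <= x));     % cdf (2): sum p(y) over support points y <= x
    P  = F(3) - sum(ps(xs < 2))     % F(3) - F(a-) = .6 - .1 = .5, matching (3)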
3.3 Expected Values of Discrete Random Variables
Definition: Let X be a discrete rv with pmf p(x1), p(x2), . . . . Then the expected value
of X is

    μ_X = E[X] = Σ_{x = x1, x2, ...} x p(x).    (4)
The mean is the average value of a random variable. It is one measure of the center
of a probability distribution.
Examples:

1. (from Devore [2004, Section 3.3])

       p(x) = .01, x = 1
              .03, x = 2
              .13, x = 3
              .25, x = 4
              .39, x = 5
              .17, x = 6
              .02, x = 7

   μ_X = E[X] = 1(.01) + 2(.03) + 3(.13) + 4(.25) + 5(.39) + 6(.17) + 7(.02) = 4.57.
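   Since (4) is just a weighted sum, a one-line MATLAB check (a sketch, with our own
   variable names) reproduces this mean:

       x  = 1:7;                           % support of X
       p  = [.01 .03 .13 .25 .39 .17 .02]; % pmf values
       mu = x*p'                           % dot product sum(x .* p) = 4.57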
Figure 2: Probability Mass Function (p(x)) and Cumulative Distribution Function (F(x)).
2. X ~ Bernoulli(p). So,

       p(x) = 1 − p, x = 0
              p,     x = 1.

   And μ_X = 0(1 − p) + 1(p) = p.
3. (Devore [2004, example 3.9]) It can be shown that Σ_{k=1}^∞ 1/k² = π²/6. So a random
   variable X can have this pmf:

       p(x) = (6/π²)(1/x²),  x = 1, 2, 3, . . . .

   And the expected value is

       E[X] = (6/π²) Σ_{x=1,2,...} x (1/x²) = (6/π²) Σ_{x=1,2,...} (1/x) = ∞,

   since the harmonic series diverges. So this heavy-tailed distribution has no mean (or
   a mean which is arbitrarily large).
Notes:
1. The mean can be thought of as the center of gravity of a distribution, as illustrated
in Figure 3, with p(x) values from Example 1.
Figure 3: Mean as Center of Gravity or Balance Point of a Distribution.
2. For a random variable X ≥ 0 with cdf F(x), E[X] is the area above F(x) and below
   1. That is,

       E[X] = ∫_0^∞ (1 − F(x)) dx.    (5)
   For example, if p(2) = .2, p(4) = .4, and p(5) = .4, then E[X] = 2(.2) + 4(.4) + 5(.4) =
   4, which is the area between F(x) and 1 in Figure 4; the sketch after the figure caption
   checks this numerically.
Figure 4: E[X] = 2p(2) + 4p(4) + 5p(5), which is the area between F(x) and 1.
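   The discrete analogue of (5) for an integer-valued X ≥ 0 is E[X] = Σ_{k=0}^∞ (1 − F(k)).
   A short MATLAB sketch (our variable names) verifies this for the pmf above:

       xs = [2 4 5];  ps = [.2 .4 .4];           % pmf of the example
       F  = @(k) sum(ps(xs <= k));               % cdf
       EX = sum(arrayfun(@(k) 1 - F(k), 0:4))    % 1 + 1 + .8 + .8 + .4 = 4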
3.3.1 The Expected Value of a Function of a Random Variable
For random variable X with pmf p(x1), p(x2), . . . and real function h(x),

    E[h(X)] = h(x1)p(x1) + h(x2)p(x2) + · · · = Σ_{x = x1, x2, ...} h(x) p(x).    (6)
Examples:

1. (Devore [2004, p. 114]) Let X have the following pmf:

       p(x) = .5, x = 4
              .3, x = 6
              .2, x = 8

   And let the function h(x) = 20 + 3x + .5x². We can consider h(X) as a rv in its
   own right and compute its mean as p(4)h(4) + p(6)h(6) + p(8)h(8) = 52. This is the
   expected cost of a tune-up when vehicles can have either 4, 6, or 8 cylinders. In terms
   of the vector dot product, we can write E[X] = p · x and E[h(X)] = p · h(x).
       x    p(x)   h(x)   x p(x)   h(x)p(x)
       4    .5     $40    2        $20
       6    .3     $56    1.8      $16.80
       8    .2     $76    1.6      $15.20
       (column sums)      5.4 (= E[X] = μ_X)   $52 (= E[h(X)] = μ_h(X))
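   The dot products mentioned above translate directly into MATLAB (a sketch; the
   vectors are ours):

       x   = [4 6 8];  p = [.5 .3 .2];   % support and pmf
       h   = 20 + 3*x + .5*x.^2;         % h evaluated on the support: [40 56 76]
       EX  = p*x'                        % E[X] = 5.4
       EhX = p*h'                        % E[h(X)] = 52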
2. For discrete rv X and real number m, E[(X − m)²] is the mean squared deviation
   of X from m. It is easy to demonstrate that this expression is minimized when
   m = μ_X. Let

       f(m) = E[(X − m)²]
            = Σ_x (x − m)² p(x)
            = Σ_x (x² − 2xm + m²) p(x).

   Taking a derivative with respect to m and setting it to 0 gives

       df(m)/dm = Σ_x (0 − 2x + 2m) p(x) = −2μ_X + 2m = 0,

   or m = μ_X. This lends justification to the statement that μ_X is a reasonable
   measure of the center of a probability distribution.
3.3.2 Special Case where h(x)=ax+b
Proposition:

    E[aX + b] = aE[X] + b    (7)

(or, μ_{aX+b} = aμ_X + b).

Proof:

    E[aX + b] = Σ_x (ax + b) p(x)
              = a Σ_x x p(x) + b Σ_x p(x)
              = aE[X] + b.
3.3.3 The Variance of a Random Variable
The variance is the expected squared deviation from the mean. It is a measure of how
spread out a probability distribution is about its mean value. A small variance means
the probability mass is largely in the vicinity of the mean. A large variance means the
distribution is more spread out.
Definition: Let X be a discrete rv with pmf p(x1), p(x2), . . . . The variance of X is

    σ²_X = V[X] = Var[X] = Σ_{x = x1, x2, ...} (x − μ_X)² p(x) = E[(X − μ_X)²].    (8)

And the standard deviation of X is σ_X = √(σ²_X).
Continuing the previous example, we have σ²_X = 2.44, and σ_X = √2.44 = 1.562.

    x    p(x)   (x − μ_X)²          (x − μ_X)² p(x)
    4    .5     (4 − 5.4)² = 1.96   (1.96)(.5) = .98
    6    .3     (6 − 5.4)² = .36    (.36)(.3) = .108
    8    .2     (8 − 5.4)² = 6.76   (6.76)(.2) = 1.352
    (column sum)                    σ²_X = 2.44
3.3.4 Shortcut Formula for Variance
Proposition:

    σ²_X = E[X²] − (μ_X)².    (9)

Proof: Dropping the subscripts on μ_X and σ_X for simplicity,

    σ² = Σ_x (x − μ)² p(x)
       = Σ_x (x² − 2xμ + μ²) p(x)
       = Σ_x x² p(x) + μ² Σ_x p(x) − 2μ Σ_x x p(x)
       = E[X²] + μ² − 2μ²
       = E[X²] − μ².
And continuing the same example,
    x    p(x)   x²    x² p(x)
    4    .5     16    (16)(.5) = 8
    6    .3     36    (36)(.3) = 10.8
    8    .2     64    (64)(.2) = 12.8
    (column sum)      E[X²] = 31.6

So E[X²] − (E[X])² = 31.6 − 5.4² = 2.44 (as before).
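The shortcut formula (9) is equally direct in MATLAB (a sketch with our own variable
names):

    x    = [4 6 8];  p = [.5 .3 .2];
    mu   = p*x';                 % E[X] = 5.4
    varX = p*(x.^2)' - mu^2      % E[X^2] - mu^2 = 31.6 - 29.16 = 2.44
    sd   = sqrt(varX)            % 1.562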
3.3.5 The Variance of a Linear Function of a Random Variable
For a general function h(x), we first determine μ_h(X) = E[h(X)] and then can com-
pute

    V[h(X)] = σ²_h(X) = E[(h(X) − μ_h(X))²]
            = Σ_x (h(x) − μ_h(X))² p(x).

And when h(x) = ax + b,

    V[h(X)] = σ²_{aX+b} = a² σ²_X,    (10)

and the standard deviation is

    σ_{aX+b} = |a| σ_X.    (11)
Example: Let C be a random variable for centigrade temperature. Possible values for C
are (15, 17, 21, 22), and the associated probabilities are (.2, .3, .1, .4). Let rv F = f(C) =
(9/5)C + 32. What are E[C], V[C], E[F], and V[F]?

    c    c²    p(c)   c p(c)   c² p(c)
    15   225   .2     3        45
    17   289   .3     5.1      86.7
    21   441   .1     2.1      44.1
    22   484   .4     8.8      193.6
    (column sums)     E[C] = 19.0    E[C²] = 369.4

So, E[C] = 19 and V[C] = E[C²] − (E[C])² = 369.4 − 19² = 8.4.
Also, E[F] = (9/5)E[C] + 32 = 66.2, and V[F] = (9/5)² V[C] = 27.22.
3.4 The Binomial Distribution
Definition: A binomial experiment is one which

1. consists of n independent, identical trials,
2. each trial results in either a success S or a failure F,
3. the probability of success for each trial is p.

And given a binomial experiment consisting of n trials, the binomial rv associated with
this experiment is X = # successes in n trials.
For binomial rv X ~ Bin(n, p),

    P(X = x) = b(x; n, p) = (n choose x) p^x (1 − p)^(n−x),  x = 0, 1, 2, . . . , n,
               and 0 otherwise.    (12)
Notes:

1. b(x; n, p) = (# ways x successes can occur in n trials) × (probability of any one par-
   ticular sequence of x successes in n trials).

2. In MATLAB with the Statistics Toolbox,

       binopdf(x, n, p) = b(x; n, p)
       binocdf(x, n, p) = Σ_{i=0}^{x} b(i; n, p).

   And in Excel,

       binomdist(x, n, p, 0) = b(x; n, p)
       binomdist(x, n, p, 1) = Σ_{i=0}^{x} b(i; n, p).
3.4.1 The Mean and Variance of a Binomial RV
    X ~ Bin(n, p)  ⇒  E[X] = np  and  V[X] = np(1 − p) = npq.    (13)
Examples:

1. A quality control test succeeds 95% of the time. If 10 tests are conducted, and rv X
   is the number of successes, then X ~ Bin(10, .95). Then the probability
   of 9 or 10 successes is

       P(X ≥ 9) = b(9; 10, .95) + b(10; 10, .95)
                = (10 choose 9)(.95)^9 (.05)^1 + (10 choose 10)(.95)^10 (.05)^0
                = (10)(.630)(.05) + (1)(.599)(1) = .914.

   The expected number of successes is E[X] = (10)(.95) = 9.5, and the variance is
   V[X] = (10)(.95)(.05) = .475.
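   Using the MATLAB functions from the note above, this calculation is a two-liner
   (a sketch, assuming the Statistics Toolbox):

       P = binopdf(9, 10, .95) + binopdf(10, 10, .95)   % .914
       P = 1 - binocdf(8, 10, .95)                      % same thing via the cdf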
2. A fair coin is flipped 3 times, and X = # heads. This is a binomial experiment with
   these 2³ possible outcomes:

       S = { HHH (X = 3), HHT (X = 2), HTH (X = 2), HTT (X = 1),
             THH (X = 2), THT (X = 1), TTH (X = 1), TTT (X = 0) }

   Each outcome is equally likely, so P(X = 2) = P(HHT ∪ HTH ∪ THH) = 3/8. The
   complete pmf is

       P(X = x) = b(x; 3, .5) = 1/8, x = 0
                                3/8, x = 1
                                3/8, x = 2
                                1/8, x = 3

   We can calculate E[X] either as np = (3)(.5) = 1.5 or as

       (1/8)(0) + (3/8)(1) + (3/8)(2) + (1/8)(3) = 1.5.
3. Six tests are performed. Each is independent with a probability of success of .2. Let
   X be the total number of successes. Then X ~ Bin(6, .2), and

       P(4 successes out of 6 trials) = b(4; 6, .2) = (6 choose 4)(.2)^4 (.8)^2
                                      = (15)(.0016)(.64) = .01536.
4. X ~ Bin(15, .2). Then,

       P(2 ≤ X ≤ 7) = P(X ≤ 7) − P(X < 2)
                    = P(X ≤ 7) − P(X ≤ 1)
                    = B(7; 15, .2) − B(1; 15, .2)
                    = .996 − .167 (from Cumulative Binomial Tables (Devore, Table A.1)), or
                    = binocdf(7, 15, .2) − binocdf(1, 15, .2) (in MATLAB), or
                    = binomdist(7, 15, .2, 1) − binomdist(1, 15, .2, 1) (in Excel)
                    = .829.

   Note: The trickiest part of this problem is to realize that because the binomial cdf
   is discontinuous at non-negative integers,

       P(2 ≤ X ≤ 7) = P(X ≤ 7) − P(X < 2) ≠ P(X ≤ 7) − P(X ≤ 2).
5. (Ghahramani [1996, p. 171]) A town of 100,000 is exposed to a biological agent.
   The probability of infection is .04. Random variable X is the number of inhabitants
   infected. Then X ~ Bin(100,000, .04). The capacity for medical care is 4,200 patients.

   (a) The expected number of infected patients is E[X] = (100,000)(.04) = 4,000.
   (b) The probability of exceeding the available medical care capacity is

           P(X ≥ 4201) = 1 − P(X ≤ 4200)
                       = 1 − B(4200; 10^5, .04)
                       = 1 − .9993 = .0007 (using MATLAB).
   Notes:

   i. For large n, it can be computationally challenging to compute binomial
      probabilities. For example,

          B(4200; 10^5, .04) = (10^5 choose 4200)(.04)^4200 (.96)^(10^5 − 4200)
                             + (10^5 choose 4199)(.04)^4199 (.96)^(10^5 − 4199)
                             + · · ·
                             + (10^5 choose 0)(.04)^0 (.96)^(10^5).

      The additive components in this sum are generally very large numbers mul-
      tiplied by very small numbers, which can lead to large roundoff errors. We
      will see during discussions of the Poisson and normal distributions that
      when n is large and p is small, good approximations are available which sim-
      plify calculation of the binomial cdf. Computational engines like MATLAB
      and Excel (unlike some handheld calculators) make use of these approxima-
      tions.

   ii. In this case, getting infected by the biological agent is a binomial "success."
      Those infected might disagree with this wording.
3.5 The Negative Binomial and Geometric Distributions
Definition: A negative binomial experiment consists of a sequence of independent trials
where:

1. Each trial results in either a success (S) or a failure (F).
2. For each trial, P(S) = p.
3. The experiment continues until r successes have been obtained.

Definition: The negative binomial rv is

    X = number of failures that precede the r-th success.

If X = x, then (r − 1) S's occur in the first (x + r − 1) trials, followed by a final S on the
(x + r)-th trial. So,

    P(X = x) = nb(x; r, p)
             = P((r − 1) S's in (x + r − 1) trials) × P(S)
             = (x + r − 1 choose r − 1) p^(r−1) (1 − p)^x × p
             = (x + r − 1 choose r − 1) p^r (1 − p)^x.    (14)

Also,

    E[X] = r(1 − p)/p    (15)
    V[X] = r(1 − p)/p².    (16)
Examples:

1. We want to roll three 4s with a fair die.

   Q1: What is the probability that we roll zero non-4s and then three 4s?

       nb(0; 3, 1/6) = (0 + 3 − 1 choose 3 − 1) (1/6)³ (5/6)^0 = (1/6)³.

   Q2: What is the probability of rolling a total of ten non-4s before rolling three 4s?

       nb(10; 3, 1/6) = (10 + 3 − 1 choose 3 − 1) (1/6)³ (5/6)^10 = .04935.
   Why does this make sense? One way of rolling ten non-4s and three 4s is

       FFFFFFFFFFSSS,

   where F is a non-4 and S is a 4. The last roll must be an S, but any rearrangement of
   the first 10 F's and 2 S's is OK. There are 12!/(10! 2!) = (12 choose 2) possible
   rearrangements of the first 12 F's and S's. Add the required final S, and then each
   legal 13-roll sequence occurs with probability (1/6)³(5/6)^10.
2. It takes 3 hits to disable a target. Each shot has a probability of .25 of hitting. What
   is the probability that 15 or fewer shots are required to disable the target? And what
   is the expected number of shots required? Let X be the random number of shots
   required. Then Y = (X − 3) is the number of misses required, and Y ~ nb(3, .25), so

       P(X ≤ 15) = P(Y ≤ 12) = Σ_{i=0}^{12} nb(i; 3, .25).

   Using MATLAB (see note below), nbincdf(12, 3, .25) = .7639. And the expected
   number of misses before disabling the target is E[Y] = (3)(1 − .25)/.25 = 9. So the
   expected number of shots is E[Y + 3] = E[Y] + 3 = 12.
Computational Note: In MATLAB with the Statistics Toolbox,

    nb(x; r, p) = nbinpdf(x, r, p),  and the cdf is  Σ_{i=0}^{x} nb(i; r, p) = nbincdf(x, r, p).

And in Excel,

    nb(x; r, p) = negbinomdist(x, r, p).

There is no Excel function for the negative binomial cdf.
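Putting the target example above in these terms (a sketch, assuming the Statistics
Toolbox):

    PX15 = nbincdf(12, 3, .25)    % P(Y <= 12) = P(X <= 15) = .7639
    EY   = 3*(1 - .25)/.25;       % expected misses, by (15): 9
    EX   = EY + 3                 % expected shots: 12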
3.5.1 The Geometric Distribution
Definition: A geometric experiment consists of a sequence of independent trials, each
trial resulting in either a success S or a failure F. P(S) = p, P(F) = (1 − p) = q, and the
experiment continues until the first success S.

Definition: The geometric rv X is the number of trials required to achieve the first suc-
cess. Then

    P(X = x) = P((x − 1) F's followed by one S) = q^(x−1) p,  x = 1, 2, 3, . . .    (17)
From our knowledge of the geometric series, we know that the pmf of X sums to 1, as it
should:

    Σ_{x=1}^∞ P(X = x) = p + qp + q²p + q³p + · · ·
                       = p(1 + q + q² + q³ + · · ·)
                       = p(1/(1 − q))
                       = p(1/p) = 1.
The mean of a geometric random variable is given by

    E[X] = 1·P(X = 1) + 2·P(X = 2) + 3·P(X = 3) + · · ·
         = p + 2qp + 3q²p + 4q³p + · · ·
         = p(1 + 2q + 3q² + 4q³ + · · ·)
         = p(1/(1 − q)²)
         = p(1/p²)
         = 1/p.    (18)

The formula for the variance is

    V[X] = q/p².    (19)
And the cdf is

    P(X ≤ x) = P(1 or more S's occur in x trials)
             = 1 − P(no S's occur in x trials)
             = 1 − q^x.    (20)
Examples: When rolling a fair die,

1. What is the mean number of rolls required to see a 4? X ~ geometric(1/6), so

       E[X] = 1/(1/6) = 6.

2. What is the probability of exactly 10 rolls being required to see the first 4?

       P(X = 10) = (5/6)^9 (1/6) = .0323.
3. What is the probability of more than 5 rolls being required to see the first 4?

       P(X ≥ 6) = 1 − P(X ≤ 5)
                = 1 − [1 − (1 − p)^5]
                = (5/6)^5 = .40188
                = P(rolling five non-4s in a row).
Notes:

1. The geometric rv counts the number of trials to the first success, while the nb(1, p)
   rv counts the number of failures to the first success. So,

       X ~ geometric(p)  ⟺  (X − 1) ~ nb(1, p).

2. In MATLAB with the Statistics Toolbox, geopdf(x,p) and geocdf(x,p) are the
   geometric rv pdf and cdf functions. Just to keep us on our toes, MATLAB uses the
   convention that the geometric rv X counts the number of failures required to achieve
   success, not the number of trials. So in MATLAB, geopdf(0,.1)=.1, which is the
   probability of a geometric rv with p = .1 achieving a success on the first trial. Excel
   does not have these functions, but the geometric pdf and cdf are easily computed
   directly from (17) and (20). A sketch follows.
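Here are the die-rolling examples restated in MATLAB's failure-counting convention
(a sketch, assuming the Statistics Toolbox):

    p   = 1/6;
    P10 = geopdf(9, p)        % P(X = 10): 9 failures then a success = .0323
    P6p = 1 - geocdf(4, p)    % P(X >= 6): more than 4 failures; = (5/6)^5 = .40188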
3.6 The Poisson Distribution
Definition: A rv X has a Poisson distribution with parameter λ > 0 if the pmf is

    p(x; λ) = e^(−λ) λ^x / x!,  x = 0, 1, 2, . . .    (21)

If X ~ Poisson(λ), then it can be shown that

    E[X] = V[X] = λ.    (22)
For example, when λ = 3,

    p(0; 3) = (e^(−3) 3^0)/0! = .0498
    p(1; 3) = (e^(−3) 3^1)/1! = .1494
    p(2; 3) = (e^(−3) 3^2)/2! = .2240
    p(3; 3) = (e^(−3) 3^3)/3! = .2240
    p(4; 3) = (e^(−3) 3^4)/4! = .1680
    p(5; 3) = (e^(−3) 3^5)/5! = .1008
    p(6; 3) = (e^(−3) 3^6)/6! = .0504
    ...
Figure 5 shows the Poisson pmf with parameters (i.e., means) λ = .5, 1, 2, 3, 4, and 5. The
Poisson rv is completely specified by the one parameter λ.

Figure 5: Poisson PMF with λ = .5, 1, 2, 3, 4, and 5.
Notes:

1. (e^(−λ) λ^x)/x! → 0 for large x. So factorials get larger faster than do exponentials.

2. e^λ = 1 + λ + λ²/2! + λ³/3! + λ⁴/4! + · · ·, so
   1 = e^(−λ) + e^(−λ)λ + e^(−λ)λ²/2! + e^(−λ)λ³/3! + · · ·. Thus,
   Σ_{x=0}^∞ p(x; λ) = 1, and p(x; λ) is a proper pdf.
3. In Excel,

       p(x; λ) = poisson(x, λ, 0),  and  Σ_{i=0}^{x} p(i; λ) = poisson(x, λ, 1).

   And in MATLAB with the Statistics Toolbox,

       p(x; λ) = poisspdf(x, λ),  and  Σ_{i=0}^{x} p(i; λ) = poisscdf(x, λ).
3.6.1 The Poisson Approximation to the Binomial Distribution
In a binomial experiment with n large enough and p small enough,

    b(x; n, p) ≈ p(x; λ),  where λ = np.    (23)

The rule of thumb in Devore [2004] is that when n ≥ 100, p ≤ .01, and np ≤ 20, then this
approximation can be used.
Examples:

1. If the probability of an error occurring on any single page of a book is .005, then

       P(a 400-page book contains exactly 3 pages with errors)
           = b(3; 400, .005) = (400 choose 3)(.005)^3 (.995)^397 = .1809
           ≈ p(x; λ = 400 × .005 = 2) = (e^(−2) 2^3)/3! = .1804.
2. A ship has a crew of 200. On the average, each crew member makes 3 visits to sick
   bay per year. Let X be the total number of sick bay visits during one day.

   (a) What is the pmf for X and its mean?

       X ~ Bin(200, 3/365 = .00822), and E[X] = np = (200)(3/365) = 1.64 visits/day.

   (b) What is the probability that the number of visits to sick bay in one day exceeds 4?

       P(X ≥ 5) = 1 − P(X ≤ 4) = 1 − binocdf(4, 200, 3/365) = .0256 (in MATLAB).

       Using the Poisson approximation, X ≈ Poisson(np = 1.64), and

       1 − P(X ≤ 4) ≈ 1 − poisscdf(4, 1.64) = .0262 (in MATLAB).
Note: Having Excel or MATLAB available to compute the binomial cdf directly makes the
Poisson approximation less critical. For instance, the sick bay example reduces to a direct
comparison (a sketch, assuming the Statistics Toolbox):
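    n = 200;  p = 3/365;
    exact  = 1 - binocdf(4, n, p)    % .0256, exact binomial
    approx = 1 - poisscdf(4, n*p)    % .0262, Poisson approximation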
3.6.2 Proof of the Poisson Approximation to the Binomial
    b(x; n, p) = [n!/((n − x)! x!)] p^x (1 − p)^(n−x)
               = [n!/((n − x)! x!)] (λ/n)^x (1 − λ/n)^(n−x),   since λ = np
               = [n(n − 1) · · · (n − x + 1)/n^x] (λ^x/x!) (1 − λ/n)^n (1 − λ/n)^(−x)
               ≈ (1)(λ^x/x!) e^(−λ)(1),   for large n and moderate λ
               = (λ^x/x!) e^(−λ) = p(x; λ).
3.6.3 Calculation of Poisson Probabilities Recursively
    p(x + 1; λ) / p(x; λ) = [e^(−λ) λ^(x+1)/(x + 1)!] / [e^(−λ) λ^x/x!] = λ/(x + 1).

So,

    p(x + 1; λ) = p(x; λ) [λ/(x + 1)],  x = 0, 1, . . . .    (24)

Starting with p(0; λ) = e^(−λ),

    p(1; λ) = p(0; λ)(λ/1)
    p(2; λ) = p(1; λ)(λ/2)
    p(3; λ) = p(2; λ)(λ/3)
    ...
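Recursion (24) is a natural small loop in MATLAB (a sketch; note the one-based indexing,
so p(1) holds p(0; λ)):

    lambda = 3;  N = 6;
    p = zeros(1, N+1);
    p(1) = exp(-lambda);                % p(0; lambda) = e^(-lambda)
    for x = 0:N-1
        p(x+2) = p(x+1)*lambda/(x+1);   % p(x+1; lambda) = p(x; lambda)*lambda/(x+1)
    end
    p                                   % matches poisspdf(0:N, 3): .0498 .1494 .2240 ...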
3.6.4 The Poisson Process
Suppose we have a process which creates events at a mean rate of λ events/unit time. An
example we have seen is the arrival of patients to sick bay on a ship, which occurred at an
average rate of 3 arrivals per year per crew member. This is the same rate as 3/365 arrivals
per day. The event rate λ has the following interpretation:

    A1: P(an event occurs in a small interval of length Δt) = λΔt + o(Δt).
Note: o(Δt) is any function which goes to 0 faster than linearly. That is,

    lim_{Δt→0} o(Δt)/Δt = 0.

For example, (Δt)² is o(Δt) since

    lim_{Δt→0} (Δt)²/Δt = 0.
We also assume that the process is such that

    A2: P(2 or more events occur in an interval of length Δt) = o(Δt).

And finally, we assume that

    A3: The occurrences of events in disjoint time periods are probabilistically independent.
If assumptions A1, A2 and A3 all hold, and N_t is the random number of events occurring
in an interval of length t, then it can be shown that

    N_t ~ Poisson(λt),    (25)

so

    P(N_t = k) = e^(−λt) (λt)^k / k!,    (26)

and

    E[N_t] = λt.    (27)
Examples:

1. Telephone calls to a 911 exchange arrive according to a Poisson process with rate
   λ = .1/min.

   (a) What is the probability that 5 calls will occur during the next hour?

       N_60 ~ Poisson((.1)(60) = 6), so P(N_60 = 5) = e^(−6) 6^5/5! = .1606.

   (b) What is the mean number of calls during a 1-hour period?

       N_60 ~ Poisson((.1)(60)) ⇒ E[N_60] = 6.

   (c) What is the probability of 0 calls arriving during a 30-minute period?

       N_30 ~ Poisson((.1)(30)) ⇒ P(N_30 = 0) = e^(−(.1)(30)) = .0498.
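   Parts (a) and (c) in MATLAB (a sketch, assuming the Statistics Toolbox):

       Pa = poisspdf(5, .1*60)    % P(N_60 = 5) = .1606
       Pc = poisspdf(0, .1*30)    % P(N_30 = 0) = e^(-3) = .0498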
2. An ASW search process is modeled as a Poisson process with detection rate λ de-
   tections/day. Let T be the random time of initial detection. For any specified t ≥ 0,
   what is P(T ≤ t)?

   N_t ~ Poisson(λt), where t is measured in days. So,

       P(T ≤ t) = P(one or more detections occur in the interval [0, t])
                = 1 − P(no detections in the interval [0, t])
                = 1 − P(N_t = 0)
                = 1 − e^(−λt).

   Figure 6 shows P(initial detection occurs on or before time t) when the average de-
   tection rate is λ = 1 detection/day.
Figure 6: Probability that initial detection occurs before time t.
References

Donald R. Barr and Peter W. Zehna. Probability. Brooks/Cole Publishing Co., 1971.

Kenneth N. Berk and Patrick Carey. Data Analysis with Microsoft Excel. Thomson-Brooks/Cole, 2004.

Jay L. Devore. Probability and Statistics for Engineering and the Sciences. Thomson-Brooks/Cole, 6th edition, 2004.

Saeed Ghahramani. Fundamentals of Probability. Prentice Hall, 1996.

Jim Pitman. Probability. Springer, 1993.

Sheldon M. Ross. A First Course in Probability. Prentice Hall, 5th edition, 1998.

Sheldon M. Ross. Introduction to Probability Models. Academic Press, 6th edition, 1997.

Sheldon M. Ross. Introduction to Probability and Statistics for Engineers and Scientists. Harcourt Academic Press, 2nd edition, 2000.