
# ISyE 3232C

YL. Chang

## Stochastic Manufacturing and Service Systems

Fall 2015

Probability Review
August 19th, 2015

Definitions
(1) Sample space (S): the set of all possible outcomes of an experiment.
Example: rolling a die S = {1, 2, 3, 4, 5, 6}.
(2) Event (E): Any subset of the sample space S.
Example: the event that an even number appears on the roll.
Union: E ∪ F, Intersection: E ∩ F, Complement of E: E^c.
(3) Probability of the event E, P(E), satisfies:
0 ≤ P(E) ≤ 1,
P(S) = 1,
If E ∩ F = ∅, then P(E ∪ F) = P(E) + P(F), where ∅ is the empty set.
Hence,
P(E^c) = 1 − P(E)
P(∅) = 0
If E ∩ F ≠ ∅, P(E ∪ F) = P(E) + P(F) − P(E ∩ F)
(4) Conditional Probability: P(E|F) is the probability that event E occurs given that F has occurred.
P(E|F) = P(E ∩ F) / P(F).

(5) Independence: Two events E and F are said to be independent if
P(E ∩ F) = P(E)P(F).
P(E|F) = P(E) if E and F are independent:
the occurrence of E is independent of whether or not F occurs.
(6) Bayes' Rule: Note, E = (E ∩ F) ∪ (E ∩ F^c). In order for a point to be in E, it must either be in both
E and F, or it must be in E and not in F.
P(E) = P(E ∩ F) + P(E ∩ F^c),
since E ∩ F and E ∩ F^c are mutually exclusive.
Generally, suppose that F_1, F_2, ..., F_n are mutually exclusive events such that ∪_{i=1}^n F_i = S. In other
words, exactly one of the events F_1, F_2, ..., F_n will occur.
P(E) = Σ_{i=1}^n P(E ∩ F_i) = Σ_{i=1}^n P(E|F_i) P(F_i)

Suppose E has occurred and we are interested in determining which one of the F_j also occurred.
P(F_j|E) = P(E ∩ F_j) / P(E) = P(E|F_j) P(F_j) / Σ_{i=1}^n P(E|F_i) P(F_i)    (Bayes' formula)

(7) Random Variables: A random variable is a variable whose value is subject to variations due to chance.

Discrete random variable: can take on at most a countable number of possible values
Continuous random variable: can take on a continuum of possible values.
(8) Cumulative distribution function (cdf) F(·) of the random variable X: for any real number
b, −∞ < b < ∞,
F(b) = P(X ≤ b).
Meaning: F(b) denotes the probability that the random variable X takes on a value that is less than or
equal to b.
Properties:
F(b) is a non-decreasing function of b,
lim_{b→∞} F(b) = F(∞) = 1,
lim_{b→−∞} F(b) = F(−∞) = 0.
All probability questions about X can be answered in terms of the cdf F(·).
P(a < X ≤ b) = F(b) − F(a).
(9) Probability mass function (pmf) of X: A function that gives the probability that a discrete random
variable X is exactly equal to some value. Denote pmf p(a) = P{X = a}.
Properties: If X must assume one of the values x_1, x_2, ..., then
p(x_i) > 0, i = 1, 2, 3, ... and p(x) = 0 for all other values of x.
Σ_{i=1}^∞ p(x_i) = 1
F(x) = P(X ≤ x) = Σ_{y ≤ x} P(X = y) = Σ_{y ≤ x} p(y).
(10) Probability density function (pdf) of X: A function that describes the relative likelihood for a
continuous random variable X to take on a given value.
f(a) is a measure of how likely it is that the random variable X will be near a.
Properties:
∫_{−∞}^{+∞} f(x)dx = P(X ∈ (−∞, ∞)) = 1
P(a ≤ X ≤ b) = ∫_a^b f(x)dx
P(X = a) = ∫_a^a f(x)dx = 0
F(a) = P(X ∈ (−∞, a]) = ∫_{−∞}^a f(x)dx
d/da F(a) = f(a)
P(a − ε/2 ≤ X ≤ a + ε/2) = ∫_{a−ε/2}^{a+ε/2} f(x)dx ≈ ε f(a)

(11) Expectation: Intuitively, expectation of a r.v. is the long-run average value of repetitions of the
experiment it represents.
(a) Discrete random variable
E[X] = Σ_{x: p(x)>0} x p(x)
E[g(X)] = Σ_{x: p(x)>0} g(x) p(x)
(b) Continuous random variable
E[X] = ∫_{−∞}^{+∞} x f(x)dx
E[g(X)] = ∫_{−∞}^{+∞} g(x) f(x)dx
(12) Variance: Var(X) = E[(X − E(X))^2] = E(X^2) − (E(X))^2

(13) Standard Deviation (sd): sd(X) = √(Var(X))
(14) Coefficient of Variation (cv): C_X = sd(X) / E(X)
(15) Squared CV: C_X^2 = Var(X) / (E(X))^2
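Definitions (11)–(15) are easy to verify numerically. Below is a minimal Python sketch using the fair-die sample space from (1); the variable names are illustrative:

```python
import math

# pmf of a fair die: p(x) = 1/6 for x in 1..6
pmf = {x: 1 / 6 for x in range(1, 7)}

# (11) Expectation: E[X] = sum of x * p(x)
mean = sum(x * p for x, p in pmf.items())

# (12) Variance: Var(X) = E[X^2] - (E[X])^2
ex2 = sum(x**2 * p for x, p in pmf.items())
var = ex2 - mean**2

# (13)-(15) standard deviation, cv, squared cv
sd = math.sqrt(var)
cv = sd / mean
scv = var / mean**2
```

For the fair die this gives E[X] = 3.5 and Var(X) = 35/12 ≈ 2.92, and the squared CV equals cv^2 as definition (15) requires.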

Common Distributions
Bernoulli: Outcome can be classified as either a success or as a failure.
Eg: Whether or not the next vehicle contains a child
Binomial: Models the number of successes in n trials, when the trials are independent with common
success probability p.
Eg: the number of cars among the next 20 cars that have more than two bumper stickers
Geometric: Models the number of trials required until the first success, when the trials are independent
with common success probability p.
Eg: the number of cars that go by until there is a car carrying 3 people.
Poisson X ∼ Poi(λ): Models the number of independent events that occur in a fixed amount of time
or space.
P(X = k) = λ^k e^{−λ} / k!,  for k = 0, 1, 2, ...
Eg: the number of cars in the next 10 minutes containing two or more children.

Normal X ∼ N(μ, σ^2): Models the distribution of a process that can be thought of as the sum of a
number of component processes.
pdf:
f(x) = (1 / (σ√(2π))) e^{−(x−μ)^2 / (2σ^2)}
Eg: the total number of people in the next 100 cars
Uniform X ∼ U(a, b): Models outcomes that are equally likely to be observed.
pdf:
f(x) = 1/(b − a) if a ≤ x ≤ b, and 0 otherwise
cdf:
F(x) = 0 if x ≤ a; (x − a)/(b − a) if a ≤ x ≤ b; 1 if x > b
Eg: throwing a fair die.

Exponential X ∼ exp(λ): Models the time between independent events, or a process time which is
memoryless.
pdf:
f(x) = λe^{−λx} if x ≥ 0, and 0 if x < 0
cdf:
F(x) = 1 − e^{−λx} if x ≥ 0, and 0 otherwise
Eg: the length of time until the next ambulance arrives.

Other distributions include the Gamma, lognormal, Erlang, etc.
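Two of the facts above are easy to check numerically: the Poisson pmf sums to 1, and the exponential distribution is memoryless. A short Python sketch (λ = 5 and the times s, t are arbitrary illustrative values):

```python
import math

lam = 5.0  # illustrative rate

# Poisson pmf: P(X = k) = lam^k e^{-lam} / k!
def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

# the pmf sums to (essentially) 1 over a long prefix of k values
total = sum(poisson_pmf(k, lam) for k in range(100))

# Exponential cdf: F(x) = 1 - e^{-lam x} for x >= 0
def exp_cdf(x, lam):
    return 1 - math.exp(-lam * x) if x >= 0 else 0.0

# memorylessness: P(X > s + t | X > s) = P(X > t)
s, t = 0.2, 0.3
lhs = (1 - exp_cdf(s + t, lam)) / (1 - exp_cdf(s, lam))
rhs = 1 - exp_cdf(t, lam)
```

The memoryless identity is what justifies using the exponential for "time until the next event" no matter how long you have already waited.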
Exercise
Define
min(x, y) = x ∧ y = x if x ≤ y, y if x > y
max(x, y) = x ∨ y = x if x ≥ y, y if x < y
x^+ = max(x, 0)
x^− = max(−x, 0)
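These operators translate directly into code. A minimal Python sketch (the function names wedge/vee/pos/neg are illustrative):

```python
def wedge(x, y):
    """min(x, y) = x ∧ y"""
    return x if x <= y else y

def vee(x, y):
    """max(x, y) = x ∨ y"""
    return x if x >= y else y

def pos(x):
    """x^+ = max(x, 0), the positive part"""
    return max(x, 0)

def neg(x):
    """x^- = max(-x, 0), the negative part"""
    return max(-x, 0)
```

Note that any x decomposes as x = x^+ − x^−, which is handy in the exercises below.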
1. Assume that D follows the distribution below. Fill in this table and calculate E[23 ∧ D] and
E[(23 − D)^+].

| d            | 15  | 20  | 30  | 35  |
|--------------|-----|-----|-----|-----|
| P[D = d]     | 0.2 | 0.3 | 0.2 | 0.3 |
| 23 ∧ D       | 15  | 20  | 23  | 23  |
| (23 − D)^+   | 8   | 3   | 0   | 0   |
| 30 ∧ D       | 15  | 20  | 30  | 30  |
| (D − 30)^+   | 0   | 0   | 0   | 5   |

E[23 ∧ D] = 15 · 0.2 + 20 · 0.3 + 23 · 0.2 + 23 · 0.3 = 20.5

E[(23 − D)^+] = 8 · 0.2 + 3 · 0.3 + 0 · 0.2 + 0 · 0.3 = 2.5
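The two expectations can be double-checked in a few lines of Python, mirroring the table row by row:

```python
# pmf of D from the table above
pmf = {15: 0.2, 20: 0.3, 30: 0.2, 35: 0.3}

# E[23 ∧ D]: truncate each outcome at 23, weight by its probability
e_min = sum(min(23, d) * p for d, p in pmf.items())

# E[(23 - D)^+]: only outcomes below 23 contribute
e_plus = sum(max(23 - d, 0) * p for d, p in pmf.items())
```

Both results agree with the hand calculation: 20.5 and 2.5.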
2. For X ∼ U(20, 40), evaluate E[X ∧ 25] and E[(25 − X)^+].

E[X ∧ 25] = ∫_{20}^{40} (x ∧ 25) f(x) dx = (1/20) ( ∫_{20}^{25} x dx + ∫_{25}^{40} 25 dx )

= (1/20) ( (1/2) x^2 |_{20}^{25} + 25 x |_{25}^{40} )

= (1/20) · (1/2) (25^2 − 20^2) + (1/20) · 25 (40 − 25) = 225/40 + 75/4 = 24.375

E[(25 − X)^+] = ∫_{20}^{40} (25 − x)^+ · (1/20) dx = (1/20) ∫_{20}^{25} (25 − x) dx

= (1/20) [ 25 (25 − 20) − (1/2) x^2 |_{20}^{25} ] = (1/20)(125 − 112.5) = 0.625
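A numeric check of both integrals, using a simple midpoint rule in Python (the grid size n is an arbitrary choice):

```python
# Midpoint-rule check of E[X ∧ 25] and E[(25 - X)^+] for X ~ U(20, 40)
a, b, n = 20.0, 40.0, 200_000
dx = (b - a) / n
density = 1.0 / (b - a)  # uniform pdf on [a, b]

e_min = 0.0   # accumulates E[min(X, 25)]
e_plus = 0.0  # accumulates E[max(25 - X, 0)]
for i in range(n):
    x = a + (i + 0.5) * dx  # midpoint of the i-th cell
    e_min += min(x, 25.0) * density * dx
    e_plus += max(25.0 - x, 0.0) * density * dx
```

Both sums converge to the closed-form answers, 225/40 + 75/4 = 24.375 and 0.625.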

3. For X ∼ Poi(5), evaluate
(a) P(X = 0)
(b) P(2 ≤ X ≤ 4)
(c) P(X > 2)

P(X = 0) = e^{−5} 5^0 / 0! = e^{−5}

P(2 ≤ X ≤ 4) = e^{−5} (5^2/2! + 5^3/3! + 5^4/4!)

P(X > 2) = Σ_{k=3}^∞ e^{−5} 5^k / k! = e^{−5} (5^3/3! + 5^4/4! + 5^5/5! + ...)

= 1 − P(X ≤ 2) = 1 − P(0) − P(1) − P(2)

= 1 − e^{−5} (5^0/0! + 5^1/1! + 5^2/2!) = 1 − e^{−5} (1 + 5 + 25/2)
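The three probabilities can be evaluated directly; a small Python sketch of the same computation:

```python
import math

lam = 5.0  # Poisson rate from the exercise

def pmf(k):
    # P(X = k) = lam^k e^{-lam} / k!
    return lam**k * math.exp(-lam) / math.factorial(k)

p_a = pmf(0)                          # (a) P(X = 0) = e^{-5}
p_b = pmf(2) + pmf(3) + pmf(4)        # (b) P(2 <= X <= 4)
p_c = 1 - (pmf(0) + pmf(1) + pmf(2))  # (c) P(X > 2), via the complement
```

Computing (c) through the complement avoids the infinite sum, exactly as in the derivation above.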

4. For X ∼ exp(7), evaluate E[max(X, 7)].

Recall integration by parts: ∫ f g′ dx = f g − ∫ f′ g dx

E[max(X, 7)] = ∫_0^∞ max(x, 7) f(x) dx = ∫_0^∞ max(x, 7) · 7e^{−7x} dx

= ∫_0^7 7 · 7e^{−7x} dx + ∫_7^∞ x · 7e^{−7x} dx

= 49 ∫_0^7 e^{−7x} dx + 7 ∫_7^∞ x e^{−7x} dx

= 49 ( −(1/7) e^{−7x} |_0^7 ) + 7 [ x (−(1/7) e^{−7x}) |_7^{+∞} − ∫_7^∞ (−(1/7) e^{−7x}) dx ]

= 7 (1 − e^{−49}) + 7 [ e^{−49} + (1/49) e^{−49} ]

= 7 + (1/7) e^{−49}
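Since e^{−49} ≈ 5 · 10^{−22}, the answer is numerically indistinguishable from 7, and a quick Monte Carlo sketch in Python confirms this (the seed and sample size are arbitrary):

```python
import random

random.seed(42)
lam = 7.0
n = 50_000

# draw X ~ exp(7) and average max(X, 7); since P(X > 7) = e^{-49},
# a draw essentially never exceeds 7, so the estimate is essentially 7
est = sum(max(random.expovariate(lam), 7.0) for _ in range(n)) / n
```

Note that `random.expovariate` takes the rate λ, matching the exp(λ) parameterization used above.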
5. For X ∼ exp(8), find x* such that F(x*) = 0.6.

1 − e^{−8x*} = 0.6
e^{−8x*} = 0.4
−8x* = ln(0.4)
x* = −(1/8) ln(0.4)
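Inverting the exponential cdf in code gives a one-line check of the algebra above:

```python
import math

lam, p = 8.0, 0.6

# x* solves 1 - exp(-lam * x*) = p, so x* = -ln(1 - p) / lam
x_star = -math.log(1 - p) / lam

# plugging x* back into the cdf recovers p
check = 1 - math.exp(-lam * x_star)
```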
6. For X ∼ U(10, 20), find x* such that F(x*) = 0.7.

F(x*) = (x* − 10) / (20 − 10) = 0.7
x* = 17.
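The same inverse-cdf step for the uniform case:

```python
a, b, p = 10.0, 20.0, 0.7

# F(x) = (x - a) / (b - a) on [a, b], so x* = a + p * (b - a)
x_star = a + p * (b - a)
```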
