UNIVERSITAS ANDALAS
STATISTICS and
PROBABILITY
by:
Purnawan, PhD ----- Lecture 4 -----
Chapter 4
Using Probability and
Probability Distributions
Chapter Goals
After completing this chapter, you should be
able to:
Explain three approaches to assessing
probabilities
Apply common rules of probability
Contingency Tables

          Ace   Not Ace   Total
Black      2       24       26
Red        2       24       26
Total      4       48       52
Tree Diagrams

Sample Space: full deck of 52 cards
  Black Card:  Ace (2),  Not an Ace (24)
  Red Card:    Ace (2),  Not an Ace (24)
Elementary Events
An automobile consultant records fuel type and
vehicle type for a sample of vehicles
2 fuel types: Gasoline, Diesel
3 vehicle types: Truck, Car, SUV

6 possible elementary events:
  e1 = Gasoline, Truck
  e2 = Gasoline, Car
  e3 = Gasoline, SUV
  e4 = Diesel, Truck
  e5 = Diesel, Car
  e6 = Diesel, SUV
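The six elementary events above can be enumerated mechanically as the Cartesian product of the two attribute lists. A minimal sketch (variable names are illustrative, not from the slides):

```python
from itertools import product

# Two fuel types and three vehicle types give 2 x 3 = 6 elementary events.
fuels = ["Gasoline", "Diesel"]
vehicles = ["Truck", "Car", "SUV"]

# Each (fuel, vehicle) pair is one elementary event e1..e6.
elementary_events = list(product(fuels, vehicles))

for i, (fuel, vehicle) in enumerate(elementary_events, start=1):
    print(f"e{i} = {fuel}, {vehicle}")
```

The ordering matches the list above: the first factor (fuel) varies slowest, so e1..e3 are the gasoline events and e4..e6 the diesel events.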
Probability Concepts
Dependent Events
E1 = rain forecasted on the news
E2 = take umbrella to work
Probability of the second event is affected by the
occurrence of the first event
Assigning Probability
Classical Probability Assessment
P(Ei) = (Number of ways Ei can occur) / (Total number of elementary events)

For any elementary event ei: 0 ≤ P(ei) ≤ 1

Σ P(ei) = 1 (summed over i = 1, ..., k)

where:
k = Number of elementary events in the sample space
ei = ith elementary event
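Applying the classical assessment to the 52-card deck from the contingency table, assuming a fair, well-shuffled deck so that all elementary events are equally likely (a sketch; the helper name is illustrative):

```python
# Classical probability assessment: favourable outcomes / total
# elementary events, valid when all elementary events are equally likely.
def classical_probability(ways_event_occurs, total_elementary_events):
    return ways_event_occurs / total_elementary_events

p_ace = classical_probability(4, 52)  # four aces in a 52-card deck
print(round(p_ace, 4))  # 0.0769
```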
Complement Rule

P(Ē) = 1 − P(E)
Or, P(E) + P(Ē) = 1
Addition Rule for Two Events

Addition Rule:
P(E1 or E2) = P(E1) + P(E2) − P(E1 and E2)

If E1 and E2 are mutually exclusive, then P(E1 and E2) = 0, so
P(E1 or E2) = P(E1) + P(E2)
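The addition rule can be checked numerically on the card-deck contingency table: "Ace" and "Red" are not mutually exclusive (there are two red aces), so the overlap must be subtracted. A sketch:

```python
# Addition rule on the 52-card deck: P(Ace or Red).
p_ace = 4 / 52
p_red = 26 / 52
p_ace_and_red = 2 / 52  # the two red aces are counted in both events

p_ace_or_red = p_ace + p_red - p_ace_and_red
print(round(p_ace_or_red, 4))  # 28/52 ≈ 0.5385
```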
Conditional Probability

P(E1 | E2) = P(E1 and E2) / P(E2)
          CD    No CD   Total
AC        .2     .5      .7
No AC     .2     .1      .3
Total     .4     .6     1.0
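Using the joint probabilities from the AC/CD table, the conditional probability formula gives, for example, the chance a car has a CD player given that it has air conditioning. A sketch:

```python
# Values read from the AC/CD joint probability table.
p_ac_and_cd = 0.2   # P(AC and CD)
p_ac = 0.7          # P(AC), the marginal total for the AC row

# Conditional probability: P(CD | AC) = P(AC and CD) / P(AC).
p_cd_given_ac = p_ac_and_cd / p_ac
print(round(p_cd_given_ac, 4))  # ≈ 0.2857
```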
Gasoline: P(E1) = 0.8
  Truck: P(E3|E1) = 0.2   →   P(E1 and E3) = 0.8 × 0.2 = 0.16
  Car:   P(E4|E1) = 0.5   →   P(E1 and E4) = 0.8 × 0.5 = 0.40
  SUV:   P(E5|E1) = 0.3   →   P(E1 and E5) = 0.8 × 0.3 = 0.24

Diesel: P(E2) = 0.2
  Truck: P(E3|E2) = 0.6   →   P(E2 and E3) = 0.2 × 0.6 = 0.12
  Car:   P(E4|E2) = 0.1   →   P(E2 and E4) = 0.2 × 0.1 = 0.02
  SUV:   P(E5|E2) = 0.3   →   P(E2 and E5) = 0.2 × 0.3 = 0.06
Bayes Theorem
P(Ei | B) = P(Ei) P(B|Ei) / [ P(E1)P(B|E1) + P(E2)P(B|E2) + ... + P(Ek)P(B|Ek) ]
where:
Ei = ith event of interest of the k possible events
B = new event that might impact P(Ei)
Events E1 to Ek are mutually exclusive and collectively
exhaustive
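Bayes' Theorem is straightforward to compute once the priors P(Ei) and likelihoods P(B|Ei) are listed; the denominator is the total probability P(B). A sketch with illustrative numbers (not from the slides):

```python
def bayes(priors, likelihoods, i):
    """P(Ei | B) for mutually exclusive, collectively exhaustive events E1..Ek.

    priors[j]      = P(Ej)
    likelihoods[j] = P(B | Ej)
    """
    # Denominator: total probability P(B) = sum of P(Ej) * P(B|Ej).
    total = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[i] * likelihoods[i] / total

# Illustrative: P(E1)=0.4, P(E2)=0.6, P(B|E1)=0.9, P(B|E2)=0.1.
posterior_e1 = bayes([0.4, 0.6], [0.9, 0.1], 0)
print(round(posterior_e1, 4))  # 0.36 / 0.42 ≈ 0.8571
```

Because the events are mutually exclusive and collectively exhaustive, the posteriors over all i sum to 1.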
Bayes Theorem Example
[Worked example table not recovered; Sum = .36]
Introduction to Probability
Distributions
Random Variable
Represents a possible numerical value from
a random event
Random
Variables
Discrete Continuous
Random Variable Random Variable
Discrete Probability Distribution

Example: toss 2 coins, x = number of heads
[Probability histogram: P(x=0) = .25, P(x=1) = .50, P(x=2) = .25]
Discrete Probability Distribution
A list of all possible [ xi , P(xi) ] pairs
xi = Value of Random Variable (Outcome)
P(xi) = Probability Associated with Value
xis are mutually exclusive
(no overlap)
xis are collectively exhaustive
(nothing left out)
0 ≤ P(xi) ≤ 1 for each xi
Σ P(xi) = 1
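These two requirements are easy to verify programmatically; a small tolerance absorbs floating-point rounding in the sum. A sketch (the helper name is illustrative):

```python
def is_valid_distribution(probs, tol=1e-9):
    """Check both requirements: each P(xi) in [0, 1], and they sum to 1."""
    return (all(0.0 <= p <= 1.0 for p in probs)
            and abs(sum(probs) - 1.0) <= tol)

# Two-coin example: P(0 heads)=.25, P(1)=.50, P(2)=.25.
print(is_valid_distribution([0.25, 0.50, 0.25]))  # True
print(is_valid_distribution([0.25, 0.50, 0.50]))  # False (sums to 1.25)
```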
Discrete Random Variable
Summary Measures
Expected Value of a discrete distribution
(Weighted Average)
E(x) = Σ xi P(xi)

Example: Toss 2 coins,
x = # of heads,
compute expected value of x:

   x    P(x)
   0    .25
   1    .50
   2    .25

E(x) = (0 × .25) + (1 × .50) + (2 × .25)
     = 1.0
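The weighted average above translates directly into code; a sketch (function name is illustrative):

```python
# Expected value of a discrete distribution: E(x) = sum of xi * P(xi).
def expected_value(values, probs):
    return sum(x * p for x, p in zip(values, probs))

# Two-coin toss, x = number of heads.
e_x = expected_value([0, 1, 2], [0.25, 0.50, 0.25])
print(e_x)  # 1.0
```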
Discrete Random Variable
Summary Measures
(continued)
σx = √[ Σ {x − E(x)}² P(x) ]
where:
E(x) = Expected value of the random variable
x = Values of the random variable
P(x) = Probability of the random variable having
the value of x
Discrete Random Variable
Summary Measures
(continued)
Example: Toss 2 coins, x = # heads,
compute standard deviation (recall E(x) = 1)

σx = √[ Σ {x − E(x)}² P(x) ]
   = √[ (0 − 1)²(.25) + (1 − 1)²(.50) + (2 − 1)²(.25) ]
   = √.50 = .707
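The same computation in code, following the formula term by term (a sketch; the function name is illustrative):

```python
from math import sqrt

def std_dev(values, probs):
    """sigma_x = sqrt( sum of (x - E(x))^2 * P(x) ) for a discrete distribution."""
    mean = sum(x * p for x, p in zip(values, probs))          # E(x)
    variance = sum((x - mean) ** 2 * p for x, p in zip(values, probs))
    return sqrt(variance)

# Two-coin toss: sqrt((0-1)^2(.25) + (1-1)^2(.50) + (2-1)^2(.25)) = sqrt(.5)
print(round(std_dev([0, 1, 2], [0.25, 0.50, 0.25]), 4))  # 0.7071
```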
Covariance

σxy = Σi Σj {xi − E(x)}{yj − E(y)} P(xi, yj)

where:
xi = possible values of the x discrete random variable
yj = possible values of the y discrete random variable
P(xi, yj) = joint probability of the values xi and yj occurring
Correlation Coefficient

ρ = σxy / (σx σy)

where:
ρ = correlation coefficient (rho)
σxy = covariance between x and y
σx = standard deviation of variable x
σy = standard deviation of variable y
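Covariance and the correlation coefficient can be computed from a joint probability table: marginals come from row and column sums, and ρ scales σxy by the two standard deviations. A sketch with an illustrative joint distribution (not from the slides):

```python
from math import sqrt

def marginals(xs, ys, joint):
    """Marginal P(xi) and P(yj) from joint[i][j] = P(xi, yj)."""
    px = [sum(row) for row in joint]
    py = [sum(joint[i][j] for i in range(len(xs))) for j in range(len(ys))]
    return px, py

def covariance(xs, ys, joint):
    """sigma_xy = sum over i,j of (xi - E(x))(yj - E(y)) P(xi, yj)."""
    px, py = marginals(xs, ys, joint)
    ex = sum(x * p for x, p in zip(xs, px))
    ey = sum(y * p for y, p in zip(ys, py))
    return sum((xs[i] - ex) * (ys[j] - ey) * joint[i][j]
               for i in range(len(xs)) for j in range(len(ys)))

def correlation(xs, ys, joint):
    """rho = sigma_xy / (sigma_x * sigma_y)."""
    px, py = marginals(xs, ys, joint)
    ex = sum(x * p for x, p in zip(xs, px))
    ey = sum(y * p for y, p in zip(ys, py))
    sx = sqrt(sum((x - ex) ** 2 * p for x, p in zip(xs, px)))
    sy = sqrt(sum((y - ey) ** 2 * p for y, p in zip(ys, py)))
    return covariance(xs, ys, joint) / (sx * sy)

# Illustrative joint table: x and y agree with probability 0.8.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(round(correlation([0, 1], [0, 1], joint), 4))  # 0.6
```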
Interpreting the
Correlation Coefficient
The correlation coefficient always falls
between −1 and +1

ρ = 0: x and y are not linearly related.

The farther ρ is from zero, the stronger the linear
relationship: