
Recitation 7: Discrete-Time Markov Chains

Hung-Bin (Bing) Chang and Yu-Yu Lin


Electrical Engineering Department University of California (UCLA), USA,
hungbin@seas.ucla.edu and skywoods2001@ucla.edu

Prof. Izhak Rubin (UCLA)

EE 132B

2014 Fall


Outline

Markov Chains

Transition Probability Function

Transient State Analysis

Steady State Analysis

Examples
Example 1
Example 2


Markov Chains

Stochastic Process
A stochastic process (SP) is a collection of random variables indexed by time.
X = {Xn , n = 0, 1, 2, . . . } is a discrete-time stochastic process.
The states assume values from a countable state space
S = {0, 1, 2, . . . }.
Figure: An example of a discrete-time stochastic process (a sample path Xn plotted against n)


Markov Chains

Markov Chains

A stochastic process X is a Markov chain if it satisfies the Markov
property: the future evolution of the process is independent of the
past, given the present.

Discrete-time Markov chain (DTMC):
P(Xn+1 = j | X0 , X1 , . . . , Xn ) = P(Xn+1 = j | Xn )

Continuous-time Markov chain (CTMC):
P(Xt+s = j | Xu , u ≤ t) = P(Xt+s = j | Xt ), for all t, s ≥ 0.
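The Markov property shows up directly in simulation: to draw the next state, only the current state's row of the transition matrix is needed, never the earlier history. A minimal sketch in Python (the 2-state matrix below is illustrative, not from the slides):

```python
import numpy as np

# Illustrative 2-state transition matrix (assumption, not from the slides):
# rows index the current state, columns the next state.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

rng = np.random.default_rng(seed=0)

def simulate_dtmc(P, x0, n_steps, rng):
    """Generate a sample path X_0, ..., X_n of a DTMC.

    The Markov property is visible here: the next state is drawn
    using only the row of P for the *current* state.
    """
    path = [x0]
    for _ in range(n_steps):
        current = path[-1]
        nxt = rng.choice(len(P), p=P[current])
        path.append(int(nxt))
    return path

path = simulate_dtmc(P, x0=0, n_steps=10, rng=rng)
print(path)
```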


Transition Probability Function

Transition Probability Function


For a time-homogeneous DTMC, the one-step transition probability is
independent of n:
Pn (i, j) = P(Xn+1 = j | Xn = i) = P(X1 = j | X0 = i) = P(i, j).

For a time-homogeneous DTMC with state space S = {1, 2, 3}, the
transition probability function (TPF) and state transition diagram
are as follows.

        [ P(1, 1)  P(1, 2)  P(1, 3) ]
    P = [ P(2, 1)  P(2, 2)  P(2, 3) ]
        [ P(3, 1)  P(3, 2)  P(3, 3) ]

Figure: State transition diagram
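A TPF is a row-stochastic matrix: each row i is the distribution of the next state given current state i, so every row must be non-negative and sum to 1. A quick sketch with a hypothetical 3-state matrix (the entries are illustrative assumptions):

```python
import numpy as np

# Hypothetical TPF on S = {1, 2, 3} (entries are illustrative).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Every row of a valid TPF is a probability distribution over S.
assert (P >= 0).all()
assert np.allclose(P.sum(axis=1), 1.0)
print("valid TPF")
```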



Transient State Analysis

Transient State Analysis


State distribution at time n:
    Pn (j) = P(Xn = j)

m-step transition probability function (TPF):
    P^(m) (i, j) = P(Xm = j | X0 = i)

Recursive computation:
    P^(m+1) (i, j) = Σ_{k∈S} P^(m) (i, k) P(k, j)

Initial state computation:
    Pn (j) = Σ_{k∈S} P0 (k) P^(n) (k, j)

Chapman-Kolmogorov equation:
    P^(m+n) (i, j) = Σ_{k∈S} P^(m) (i, k) P^(n) (k, j)
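These relations reduce to matrix algebra: P^(m) is the m-th matrix power of P, and Chapman-Kolmogorov is just P^(m+n) = P^(m) P^(n). A sketch, using the TPF that appears in Example 1 later in this recitation:

```python
import numpy as np

# TPF from Example 1 of this recitation.
P = np.array([[0.7, 0.2, 0.1],
              [0.0, 0.6, 0.4],
              [0.5, 0.0, 0.5]])

# m-step TPF as a matrix power: P^(m) = P @ P @ ... @ P (m times).
P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)

# Chapman-Kolmogorov: P^(2+1) = P^(2) @ P^(1).
assert np.allclose(P3, P2 @ P)

# State distribution at time n from an initial distribution P_0:
# P_n(j) = sum_k P_0(k) P^(n)(k, j), i.e., a row vector times P^n.
P0 = np.array([1.0, 0.0, 0.0])            # start in state 0
Pn = P0 @ np.linalg.matrix_power(P, 5)
assert np.isclose(Pn.sum(), 1.0)          # still a distribution
print(np.round(Pn, 4))
```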

Steady State Analysis

Steady State Analysis


π = {π(i), i ∈ S} is a stationary distribution iff
    It is a valid distribution, i.e., Σ_{i∈S} π(i) = 1.
    It satisfies π(j) = Σ_{i∈S} π(i) P(i, j) for every j ∈ S (matrix
    form: πP = π).

π = {π(i), i ∈ S} is a steady-state distribution iff
    It is a stationary distribution, and
    lim_{n→∞} P^(n) (i, j) = π(j), where P^(n) (i, j) is the
    probability of going from state i to state j in n steps.

Example:
If the state space S = {1, 2},
    π(1) = π(1) P(1, 1) + π(2) P(2, 1)
    π(2) = π(1) P(1, 2) + π(2) P(2, 2)


Figure: State transition diagram
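The stationary equations πP = π together with the normalization Σ π(i) = 1 form a linear system that can be solved directly. A sketch with an illustrative two-state TPF (the 0.8/0.2/0.3/0.7 entries are assumptions, not from the slides):

```python
import numpy as np

# Illustrative two-state TPF (assumption, not from the slides).
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])

# Solve pi P = pi together with sum(pi) = 1.
# Rewrite as A pi = b: the balance equations (P^T - I) pi = 0,
# stacked with the normalization row of ones.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

assert np.allclose(pi @ P, pi)     # pi is stationary
assert np.isclose(pi.sum(), 1.0)   # pi is a valid distribution
print(np.round(pi, 3))  # [0.6 0.4]
```

For this matrix the balance condition π(1)·0.2 = π(2)·0.3 gives π = (0.6, 0.4), which the solver reproduces.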

Examples

Example 1

Example 1
A discrete-time Markov chain (DTMC) has state space
S = {0, 1, 2} and its TPF is given as

        [ 0.7  0.2  0.1 ]
    P = [ 0    0.6  0.4 ] .
        [ 0.5  0    0.5 ]

(a) P(X8 = 0 | X7 = 2) = P(2, 0) = 0.5.

(b) P(X2 = 1, X3 = 1 | X1 = 0)
    = P(X3 = 1 | X2 = 1, X1 = 0) P(X2 = 1 | X1 = 0)
    = P(1, 1) P(0, 1) = 0.6 × 0.2 = 0.12.

(c) P(X2 = 1 | X0 = 0) = P^(2) (0, 1)
    = Σ_{k∈S} P(0, k ) P(k , 1)
    = P(0, 0) P(0, 1) + P(0, 1) P(1, 1) + P(0, 2) P(2, 1)
    = 0.7 × 0.2 + 0.2 × 0.6 + 0.1 × 0 = 0.26.
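Parts (a)–(c) can be checked numerically with the TPF above:

```python
import numpy as np

# TPF from Example 1.
P = np.array([[0.7, 0.2, 0.1],
              [0.0, 0.6, 0.4],
              [0.5, 0.0, 0.5]])

# (a) P(X8 = 0 | X7 = 2) is the one-step probability P(2, 0).
a = P[2, 0]

# (b) P(X2 = 1, X3 = 1 | X1 = 0) = P(0, 1) * P(1, 1) by the Markov property.
b = P[0, 1] * P[1, 1]

# (c) P(X2 = 1 | X0 = 0) is the two-step probability P^(2)(0, 1).
c = np.linalg.matrix_power(P, 2)[0, 1]

print(round(float(a), 2), round(float(b), 2), round(float(c), 2))
# 0.5 0.12 0.26
```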

Examples

Example 2

Example 2
A discrete-time Markov chain (DTMC) has state space S = {0, 1, 2}
and its TPF is given as

        [ 0.3  0.2  0.5 ]
    P = [ 0.5  0.1  0.4 ] .
        [ 0.5  0.2  0.3 ]

(a) Find the two-step transition probability function P^(2).

    P^(2) (i, j) = Σ_{k∈S} P(i, k ) P(k , j)
                 = [ P(i, 0)  P(i, 1)  P(i, 2) ] · [ P(0, j)  P(1, j)  P(2, j) ]^T ,

    so

              [ 0.44  0.18  0.38 ]
    P^(2) =   [ 0.40  0.19  0.41 ] .
              [ 0.40  0.18  0.42 ]
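The two-step TPF can be verified by squaring the one-step matrix:

```python
import numpy as np

# TPF from Example 2.
P = np.array([[0.3, 0.2, 0.5],
              [0.5, 0.1, 0.4],
              [0.5, 0.2, 0.3]])

# Two-step TPF: P^(2) = P @ P.
P2 = P @ P
print(np.round(P2, 2))
# [[0.44 0.18 0.38]
#  [0.4  0.19 0.41]
#  [0.4  0.18 0.42]]
```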


Examples

Example 2

Example 2 - Contd

(b) Find P(X2 = 0) given that the initial distribution is
    π0 = ( 1/2  1/2  0 ).

    P(X2 = 0) = Σ_{j∈S} P(X2 = 0, X0 = j)              (total probability theorem)
              = Σ_{j∈S} P(X2 = 0 | X0 = j) P(X0 = j)   (conditional probability)
              = Σ_{j∈S} P^(2) (j, 0) π0 (j)
              = 0.44 × 1/2 + 0.40 × 1/2 + 0.42 × 0 = 0.42.
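The same computation in code, combining π0 with the two-step TPF:

```python
import numpy as np

# TPF and initial distribution from Example 2.
P = np.array([[0.3, 0.2, 0.5],
              [0.5, 0.1, 0.4],
              [0.5, 0.2, 0.3]])
pi0 = np.array([0.5, 0.5, 0.0])

# P(X2 = 0) = sum_j pi0(j) P^(2)(j, 0): dot the initial
# distribution with column 0 of the two-step TPF.
P2 = np.linalg.matrix_power(P, 2)
p_x2_0 = pi0 @ P2[:, 0]
print(round(float(p_x2_0), 2))  # 0.42
```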

