
# Markov chains cheat sheet

Winter 2014

We write $p_{ij}$ for the one-step transition probability $p_{ij} = P[X_{n+1} = j \mid X_n = i]$, and $p_{ij}(m)$ to denote

$$p_{ij}(m) = P[X_{n+m} = j \mid X_n = i].$$
The transition probability matrix is the matrix $P$ whose entry in the $i$th row and $j$th column is $p_{ij}$. The $m$-step transition matrix $P(m)$ is the one whose entry in the $i$th row and $j$th column is $p_{ij}(m)$. They are related by

$$P(m) = P^m.$$
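The relation $P(m) = P^m$ is easy to check numerically. A minimal sketch with NumPy, using a made-up two-state chain (the matrix below is illustrative, not from the notes):

```python
import numpy as np

# Hypothetical 2-state chain (states 0 and 1), rows sum to 1.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# The m-step transition matrix P(m) is the m-th matrix power of P.
m = 3
P_m = np.linalg.matrix_power(P, m)

# Entry (i, j) of P_m is P[X_{n+m} = j | X_n = i].
print(P_m)
```

Each row of `P_m` is again a probability distribution, since a product of stochastic matrices is stochastic.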

The state $i$ communicates with the state $j$ if it is possible for the chain to go from state $i$ to state $j$ in some number of steps. If $i$ communicates with $j$ and $j$ communicates with $i$, then the two states intercommunicate.

A state $i$ is persistent if

$$P\bigl[\text{chain eventually returns to state } i \mid X_0 = i\bigr] = 1.$$

If a state is not persistent, it is transient.

Denote by $T_i$ the time it takes for the chain to return to state $i$, given that it begins in state $i$. That is,

$$T_i = \min\bigl\{\, n \geq 1 \text{ such that } X_n = i \mid X_0 = i \,\bigr\}.$$

If $i$ is a persistent state, then $T_i$ is a random variable on $\mathbb{Z}^+$, the set of positive integers. If $E[T_i]$ exists, we say that the state $i$ is non-null or positive, and we denote $E[T_i] = \mu_i$. Otherwise, $i$ is null persistent.

A state $j$ is persistent if and only if

$$\sum_{n=1}^{\infty} p_{jj}(n) = \infty,$$

so if $j$ is transient, the sum is finite and

$$\lim_{n \to \infty} p_{jj}(n) = 0.$$
The period of a state $i$ is

$$\gcd\bigl\{\, n \text{ such that } P[X_n = i \mid X_0 = i] > 0 \,\bigr\}.$$

A state with period 1 is called aperiodic.
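The period can be estimated by taking the gcd over the return times actually observed among the first few matrix powers. A sketch, using a made-up two-state chain that alternates deterministically (so every return takes an even number of steps):

```python
import numpy as np
from math import gcd
from functools import reduce

# Hypothetical chain that flips between states 0 and 1 on every step.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

def period(P, i, max_n=50):
    # Only the first max_n powers are examined, so this approximates
    # gcd{n : P[X_n = i | X_0 = i] > 0} by truncating the set of n.
    ns = []
    Q = np.eye(len(P))
    for n in range(1, max_n + 1):
        Q = Q @ P           # Q is now P^n
        if Q[i, i] > 0:
            ns.append(n)    # a return to i in n steps has positive probability
    return reduce(gcd, ns) if ns else 0

print(period(P, 0))  # 2
```

Here returns to state 0 occur only at even times, so the gcd is 2 and the state is periodic, hence not ergodic.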

A state is ergodic if it is positive (non-null) persistent and aperiodic.

A subset $C$ of the state space $S$ is called closed if $p_{ij} = 0$ for all $i \in C$ and all $j \in S \setminus C$.

Decomposition theorem: The state space $S$ can be partitioned into

$$S = T \cup C_1 \cup C_2 \cup \cdots,$$

where $T$ is the set of transient states and each $C_i$ is a closed and irreducible set of persistent states.

For finite state spaces, at least one state is persistent, and all persistent states are non-null.

A stationary distribution

$$\vec{\pi} = (\pi_0, \pi_1, \pi_2, \ldots)$$

is a probability distribution on the states such that if $X_n$ has distribution $\vec{\pi}$, then so does $X_{n+1}$. That is,

$$\vec{\pi} P = \vec{\pi}.$$
Stationary distributions exist uniquely whenever all states are positive persistent, in which case $\pi_j = \frac{1}{\mu_j}$.

For an irreducible ergodic Markov chain, $\lim_{n \to \infty} P^n$ exists and is equal to the matrix for which all rows are $\vec{\pi}$.
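Both facts can be checked numerically: solve $\vec{\pi} P = \vec{\pi}$ with $\sum_j \pi_j = 1$ as a linear system, then compare with a high matrix power of $P$. A sketch, again on a made-up two-state chain:

```python
import numpy as np

# Hypothetical irreducible, aperiodic 2-state chain.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution: solve pi (P - I) = 0 together with sum(pi) = 1.
# Transposing gives an ordinary (overdetermined) linear system A pi = b.
n = len(P)
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)

# The limit theorem: for large n, every row of P^n is close to pi.
P_big = np.linalg.matrix_power(P, 50)
print(P_big)
```

For this chain the convergence is fast, since the second eigenvalue of $P$ is $0.5$ and its 50th power is negligible; chains with eigenvalues near 1 converge much more slowly.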